How can I do projects on big data Hadoop from an IT/MNC organization?

4 Answers

Technical Lead & Architect - Azure & Databricks, Spark, Kafka, Snowflake, Scala, PySpark, AWS Cloud, NoSQL

Hi Neha, you need to share your CV with the HR department along with a letter from your college for the internship. If any company requires trainees, you will surely get leads. If you want, you can approach Ample7 Infra Soft Pvt Ltd for an internship; they may provide one.

Big Data, Hadoop Ecosystem, Spark, Python, Java Programming and Mathematics Expert

You can, provided you have knowledge of Hadoop and have done several sample or small projects in it. You need to give the company the confidence that you are able to deliver. If you have that skill, I can offer you a Hadoop project in my team, with real data and a real customer.

Application Developer

Every department of an organization, including marketing, finance and HR, is now getting direct access to its own data. This is creating huge job opportunities, and there is an urgent need for professionals to master Big Data and Hadoop skills. Organizations across the world are excited about big data and customer analytics, not just because the data are big but because the potential for companies using big data is huge.

Trainer

The best way to do projects on Big Data & Hadoop is to leverage the open datasets available across the web. These datasets cover many domains (the IMDb datasets, the NYC Taxi datasets, the Wikipedia dumps, etc.) and come in a range of sizes, from a few MB to hundreds of GB. Understanding a dataset's structure and then defining the problem you want to pursue is essential when planning the project. Another important factor to consider is which module you want to use (Hive, Spark or MapReduce) and whether any related projects such as Kafka or Flume are needed to execute it. Finally, the infrastructure at your disposal (a single VM, or a cluster on AWS or Azure) helps in identifying and executing the project. Hope this gives a high-level idea. Ravi
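To make the MapReduce option mentioned above concrete, here is a minimal sketch of the map, shuffle and reduce phases in plain Python (no Hadoop needed). The word-count task and the two sample lines are illustrative only, not taken from any of the datasets named in the answer.

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line
    for line in lines:
        for word in line.lower().split():
            yield (word, 1)

def shuffle(pairs):
    # Shuffle: group values by key, as Hadoop does between map and reduce
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["hadoop stores big data", "spark processes big data fast"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)
```

On a real cluster the same three phases run in parallel across nodes; the logic per record is the same, which is why small local prototypes like this transfer well to Hadoop or Spark jobs.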


Related Questions

What are some of the big data processing frameworks one should know about?
Apache Spark, Akka, Apache Flink, Apache Hadoop
Arun
I want to learn Hadoop admin.
Hi Suresh, I provide Hadoop administration training that will prepare you to clear the Cloudera Administrator Certification exam (CCA131). You can contact me for course details. Regards, Biswanath
Suresh
Is there a list of the world's largest Hadoop clusters on the web?
No. As of now, Yahoo has tested with 5,000 nodes, but there is no such published list.
Nishant


Related Lessons

Best way to learn any software Course
Hi. First, confirm whether you are learning from a real-time consultant. Get some case studies from the consultant and try to complete them with the help of Google, not the consultant, because the same situations will arise in real-time work. Thank you.

HDFS And Mapreduce
1. HDFS (Hadoop Distributed File System): makes a distributed filesystem look like a regular filesystem. It breaks files down into blocks and distributes the blocks to different nodes in the cluster based on...
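The block mechanics described in this lesson can be sketched in a few lines of Python. The 128 MB default block size and 3x replication are real HDFS defaults, but the tiny demo block size and the round-robin placement below are simplifications for illustration (real HDFS placement is rack-aware).

```python
def split_into_blocks(data: bytes, block_size: int):
    # HDFS splits a file into fixed-size blocks (default 128 MB);
    # a tiny block_size is used here so the demo stays readable.
    return [data[i:i + block_size] for i in range(0, len(data), block_size)]

def place_blocks(blocks, nodes, replication=3):
    # Simplified placement: each block is copied to `replication`
    # distinct nodes, chosen round-robin.
    placement = {}
    for i, _block in enumerate(blocks):
        placement[i] = [nodes[(i + r) % len(nodes)] for r in range(replication)]
    return placement

data = b"x" * 1000  # a 1000-byte stand-in for a large file
blocks = split_into_blocks(data, block_size=256)
placement = place_blocks(blocks, ["node1", "node2", "node3", "node4"])
print(len(blocks), placement[0])
```

Note how the last block is smaller than the rest; HDFS likewise does not pad the final block of a file to the full block size.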

How To Be A Hadoop Developer?
i. Becoming a Hadoop Developer: a Dice survey revealed that 9 out of 10 high-paying IT jobs require big data skills. A McKinsey research report on Big Data highlights that by the end of 2018 the demand for...

How can you recover from a NameNode failure in Hadoop cluster?
How can you recover from a NameNode failure in Hadoop? Why is the NameNode so important? The NameNode is the most important Hadoop service. It contains the location of all blocks in the cluster. It maintains the...
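The core recovery idea behind this lesson is that a NameNode rebuilds its in-memory namespace from the last checkpoint (the fsimage) plus a replay of the edit log. The sketch below illustrates that replay; the operation tuples and paths are invented for illustration, not the real fsimage or edit-log format.

```python
def recover_namespace(fsimage: dict, edit_log: list) -> dict:
    # Start from the last checkpoint (fsimage), then replay the
    # edit log to rebuild the path -> blocks mapping.
    namespace = dict(fsimage)
    for op, path, blocks in edit_log:
        if op == "create":
            namespace[path] = blocks
        elif op == "delete":
            namespace.pop(path, None)
    return namespace

# State at the last checkpoint, plus operations logged since then
fsimage = {"/data/a.txt": ["blk_1", "blk_2"]}
edit_log = [
    ("create", "/data/b.txt", ["blk_3"]),
    ("delete", "/data/a.txt", None),
]
ns = recover_namespace(fsimage, edit_log)
print(ns)
```

This is also why a long edit log makes NameNode restarts slow, and why the Secondary NameNode periodically merges the edit log into a fresh fsimage checkpoint.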

Biswanath Banerjee


How to create UDF (User Defined Function) in Hive
1. You can write a User Defined Function (UDF) in Hive using Java. 2. Download hive-0.4.1.jar and add it to lib -> Build Path -> Add jar to libraries. 3. Q: Find the cube of the number passed: import org.apache.hadoop.hive.ql.exec.UDF; public class Cube extends UDF { public int evaluate(int n) { return n * n * n; } }
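Before packaging the Java class into a jar, it can help to sanity-check the expected output locally. Here is the same evaluate() logic prototyped in Python (a local check only; Hive itself runs the Java version):

```python
def evaluate(n: int) -> int:
    # Mirrors the Hive UDF's evaluate(): returns the cube of n
    return n * n * n

print(evaluate(4))  # 64
```

Once the jar is built, it is registered in Hive with ADD JAR and exposed with CREATE TEMPORARY FUNCTION before it can be called in queries.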

Sachin Patil


Recommended Articles

Hadoop is a framework which has been developed for organizing and analysing big chunks of data for a business. Suppose you have a file larger than your system’s storage capacity and you can’t store it. Hadoop helps in storing bigger files than what could be stored on one particular server. You can therefore store very,...

Read full article >

Big data is a phrase which is used to describe a very large amount of structured (or unstructured) data. This data is so “big” that it gets problematic to be handled using conventional database techniques and software.  A Big Data Scientist is a business employee who is responsible for handling and statistically evaluating...

Read full article >

We have already discussed why and how "Big Data" is all set to revolutionize our lives, professions and the way we communicate. Data is growing by leaps and bounds. The Walmart database handles over 2.6 petabytes of data from several million customer transactions every hour. Facebook's database similarly handles...

Read full article >

In the domain of Information Technology, there is always a lot to learn and implement. However, some technologies are in relatively higher demand than others. So here are some popular IT courses for the present and the upcoming future: Cloud Computing. Cloud Computing is a computing technique which is used...

Read full article >
