Is Java necessary to learn Hadoop?


18 Answers


Yes

If you want to start learning Hadoop development, you must know Java and the basics of Linux.

Not necessary, but recommended. You can also learn Hadoop using Python.
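As a rough sketch of how that works: Hadoop Streaming lets any executable that reads lines from stdin and writes key/value lines to stdout act as a mapper or reducer, so Java is not strictly required. A minimal word-count pair in Python might look like the following (run locally here for illustration; on a real cluster you would pass the two scripts to the `hadoop-streaming` jar, and the sorting step below is what Hadoop's shuffle phase does for you):

```python
from itertools import groupby

def mapper(lines):
    # Map phase: emit one (word, 1) pair per word, as a streaming
    # mapper would emit "word\t1" lines on stdout.
    for line in lines:
        for word in line.strip().split():
            yield (word.lower(), 1)

def reducer(pairs):
    # Hadoop sorts mapper output by key before the reduce phase;
    # sorting here simulates that shuffle step locally.
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield (word, sum(count for _, count in group))

if __name__ == "__main__":
    sample = ["Hadoop is written in Java", "but Hadoop Streaming runs Python"]
    print(dict(reducer(mapper(sample))))
```

The same logic split into two stdin/stdout scripts is all Hadoop Streaming needs, which is why Python (or Perl, or shell) is enough to get started.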

Hadoop is written in Java, so knowledge of Java basics is essential to learn Hadoop.

Yes, it's necessary to know the basics of Java to learn Hadoop.

Training Center

Hi, yes, it is a must that you have knowledge of core Java. We teach the required Java basics first, and then Hadoop.

Tutor

Yes. If you want to become a Hadoop developer or contributor, you must know Java. In addition, Linux and networking knowledge are required for Hadoop administrator jobs.

It is not mandatory to have Java knowledge; other languages such as Python and Perl will also do the job. However, knowing Java helps you understand the daemon processes that run when Hadoop executes.

BTech with 5+ years of Java web application development

Just the basics of OOP concepts are sufficient; nevertheless, strong knowledge of SQL would help a lot.

Prefect Training

Hi, in place of Java, knowledge of SQL helps you with Hive, the SQL-like tool in the Hadoop ecosystem, along with basic Linux fundamentals and shell scripting.


Related Questions

Hello, I have completed a B.Com and an MBA (Fin & M), and have 5 years of working experience in SAP PLM: 1 - Engineering documentation management, 2 - Documentation management. Please suggest which IT course suits my career growth and has scope in the market. Thanks.
If you think you are strong in finance and costing, I would suggest a SAP FICO course, which is always in demand. If you have experience as an end user of SAP PLM / documentation etc., even a course on SAP PLM DMS should be good.
Priya
What does the term "data locality" mean in Hadoop?
Data locality in Hadoop refers to the practice of processing data on the same node where it is stored, reducing network traffic and improving performance.
Sabna
Hi everyone, what are Hadoop and Big Data, and what qualifications and work experience background are required for Hadoop/Big Data?
Hadoop is the core platform for structuring Big Data, and solves the problem of formatting it for subsequent analytics purposes. Hadoop uses a distributed computing architecture consisting of multiple...
Priya
Do I need to learn the Java-Hibernate framework to be a Hadoop developer?
Not at all. To be a Hadoop developer, you need knowledge of basic core Java programming along with SQL. No one will ask interview questions on Hibernate.
Pritam

Hi, I am currently working as a PHP developer with 5 years of experience. I want to change technologies; can anyone suggest which is better for me now and in the future: Hadoop, or Node with Angular JS?

Big Data is for data processing, whereas Angular is a UI framework. I would recommend learning Big Data technologies.
Srikanth


Related Lessons

Up, Up And Up of Hadoop's Future
The onset of Digital Architectures in enterprise businesses implies the ability to drive continuous online interactions with global consumers/customers/clients or patients. The goal is not just to provide...

Understanding Big Data
Introduction to Big Data. This blog is about Big Data, its meaning, and the applications currently prevalent in the industry. It's an accepted fact that Big Data has taken the world by storm and has become...
Mymirror

Why is Hadoop essential?
The capacity to store and process large amounts of any kind of data, quickly. With data volumes and varieties constantly increasing, especially from social media and the Internet of Things (IoT), that...

Lesson: Hive Queries
This lesson will cover the following topics: simple selects – selecting columns; simple selects – selecting rows; creating new columns; Hive functions. In SQL, of which...
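HiveQL is deliberately close to standard SQL, so the topics listed above can be previewed with any SQL engine before touching a cluster. Here is a small illustration using Python's built-in sqlite3 module (an analogy only: real Hive queries run over tables stored in HDFS, the table and column names below are invented for the example, and Hive's built-in function names can differ from SQLite's):

```python
import sqlite3

# An in-memory table standing in for a Hive table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary INTEGER)")
conn.executemany("INSERT INTO employees VALUES (?, ?, ?)",
                 [("Asha", "eng", 90), ("Ravi", "hr", 60), ("Meena", "eng", 80)])

# Simple select: choosing columns.
cols = conn.execute("SELECT name, salary FROM employees").fetchall()

# Simple select: choosing rows with a WHERE clause.
eng = conn.execute("SELECT name FROM employees WHERE dept = 'eng'").fetchall()

# Creating a new (derived) column in the result.
bonus = conn.execute(
    "SELECT name, salary * 1.1 AS with_bonus FROM employees").fetchall()

# Built-in functions (Hive has analogous ones, e.g. avg(), upper()).
avg = conn.execute("SELECT AVG(salary) FROM employees").fetchone()[0]

print(eng, avg)
```

This is why several answers above say SQL knowledge transfers directly to Hive: the query shapes are the same, only the execution engine changes.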

How to create UDF (User Defined Function) in Hive
1. A User Defined Function (UDF) in Hive is written in Java.
2. Download hive-0.4.1.jar and add it to lib -> Build Path -> Add jar to libraries.
3. Q: Find the cube of the number passed: import org.apache.hadoop.hive.ql.exec.UDF; public...
Sachin Patil

Recommended Articles

Big data is a phrase used to describe a very large amount of structured (or unstructured) data. This data is so "big" that it becomes problematic to handle using conventional database techniques and software. A Big Data Scientist is a business employee who is responsible for handling and statistically evaluating...


Hadoop is a framework which has been developed for organizing and analysing big chunks of data for a business. Suppose you have a file larger than your system’s storage capacity and you can’t store it. Hadoop helps in storing bigger files than what could be stored on one particular server. You can therefore store very,...


We have already discussed why and how “Big Data” is all set to revolutionize our lives, professions and the way we communicate. Data is growing by leaps and bounds. The Walmart database handles over 2.6 petabytes of massive data from several million customer transactions every hour. Facebook database, similarly handles...


In the domain of Information Technology, there is always a lot to learn and implement. However, some technologies are in relatively higher demand than the rest. So here are some popular IT courses for the present and the near future: Cloud Computing. Cloud Computing is a computing technique which is used...

