A question we often hear from those who want to get started with Hadoop is whether knowledge of Java is a prerequisite. The answer is both yes and no, depending on what the individual wants to do with Hadoop.

Why no? MapReduce provides the Map and Reduce primitives, which have existed as the Map and Fold primitives in the functional programming world, in languages like Lisp, for quite some time. Hadoop provides Java interfaces for coding against these primitives, but any language that can read from and write to standard I/O, such as Perl, Python, or PHP, can also be used through Hadoop's Streaming feature. There are also higher-level abstractions provided by Apache frameworks like Pig and Hive, for which familiarity with Java is not required: Pig is programmed in Pig Latin and Hive in HiveQL, and programs in both are automatically converted into MapReduce jobs. According to "What Do Real-Life Hadoop Workloads Look Like?", Pig and Hive constitute a majority of the workloads in a Hadoop cluster.

Why yes? Hadoop and its ecosystem can easily be extended with additional functionality, such as custom Input and Output Formats or UDFs (User-Defined Functions), and customizing Hadoop in this way requires knowledge of Java. It is also often necessary to dig into the Hadoop source code to understand why something behaves a particular way, or to learn more about the workings of a particular module; here again, Java knowledge comes in handy.

Hadoop projects come with many different roles (Architect, Developer, Tester, Linux/Network/Hardware Administrator), some of which require explicit knowledge of Java and some of which don't. My suggestion: if you are genuinely interested in Big Data and believe Big Data will make a difference, then dive deep into Big Data technologies, regardless of whether you know Java.
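To make the Streaming point concrete, here is a minimal sketch of a word-count job written in Python rather than Java. Hadoop Streaming pipes each input split to the mapper on stdin and pipes the mapper's sorted output to the reducer on stdin, so any language that handles standard I/O works. The function names and the `map`/`reduce` role flag below are illustrative choices, not part of the Hadoop API.

```python
#!/usr/bin/env python3
import sys
from itertools import groupby

def map_lines(lines):
    """Mapper: emit a (word, 1) pair for every word on every input line."""
    for line in lines:
        for word in line.strip().split():
            yield word, 1

def reduce_pairs(pairs):
    """Reducer: pairs arrive grouped (sorted) by key; sum counts per word."""
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    # The same script is launched twice by the Streaming job, once as the
    # mapper and once as the reducer; selecting the role via a command-line
    # argument is an assumption made for this sketch.
    role = sys.argv[1] if len(sys.argv) > 1 else "map"
    if role == "map":
        for word, count in map_lines(sys.stdin):
            print(f"{word}\t{count}")
    else:
        split = (line.rstrip("\n").split("\t") for line in sys.stdin)
        for word, total in reduce_pairs((w, int(c)) for w, c in split):
            print(f"{word}\t{total}")
```

Such a script is submitted with the hadoop-streaming jar, passing it as both `-mapper` and `-reducer`; Hadoop takes care of the shuffle and sort between the two phases, which is why the reducer can rely on its input being grouped by key.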