I underwent training in Hadoop Development at the ETL Hive Pimple Saudagar branch. The training experience was good; I learned many concepts that helped me gain knowledge of and command over the technology.
 
 
 
"Anything that can be automated, should be automated." I'm a System Software Engineer with 9 years of experience in Product (IoT, Embedded Systems), Quality (QA, Risk Management), Infrastructure (Automation, IaC, Cloud, DevOps), and Release Engineering (CI/CD, Branching, Packaging, Delivery, Software Configuration Management, Application Lifecycle Management).
AREAS OF EXPERTISE: C, Python, GNU/Linux, Embedded Linux, DevOps, Git/GitLab, CI/CD, Automation, Cloud (MS Azure, Google Cloud, AWS), Django, Distributed Systems, Embedded Systems.
Programming - C, C++, .NET
Scripting - Python, Bash
Virtualization and Cloud - VMware vSphere, Microsoft Hyper-V, Microsoft Azure, Docker, Kubernetes, QEMU
CI/CD - Jenkins, GitLab CI
Configuration Management - Ansible
Databases - MySQL, SQLite, MongoDB, Elasticsearch
Monitoring & Alerting (Infrastructure and Application) - Grafana, Prometheus, Alertmanager, Zabbix, ELK (Kibana)
I have 11 years of total IT experience, with extensive knowledge of Big Data, Hadoop, Data Science, Machine Learning, and Blockchain. My experience includes 4.5 years of teaching, both individual classroom/online sessions and corporate training. My profile is available on request.
I have 2 years of teaching experience in the Hadoop ecosystem, including Apache Spark, Apache HBase, Hive, Impala, MapReduce, Oozie, Sqoop, and Kafka, as well as Scala basics and shell scripting basics.
I am a Staff Data Engineer with over 10 years of experience in designing, developing, and optimizing large-scale Big Data and Cloud-based Data Engineering pipelines. I currently work at Altimetrik, where I lead strategic initiatives around data architecture, real-time streaming, and cloud-native data platforms. In addition to my engineering role, I'm a passionate educator and mentor, offering home/online tutoring in Big Data, Cloud, Python, PySpark, SQL, and Data Engineering concepts for professionals and students aspiring to break into the data field. I believe in simplifying complex concepts and empowering the next generation of data engineers.
Big Data Technologies: Apache Spark, Hadoop, Kafka, Hive, HBase
Cloud Platforms: AWS (Glue, EMR, Redshift, S3), GCP (BigQuery, Dataflow, Pub/Sub), Azure Data Lake
Data Engineering: ETL/ELT pipelines, batch and streaming data, Data Lakehouse, Delta Lake
Programming: Python, PySpark, SQL, shell scripting
Orchestration: Airflow, dbt, AWS Step Functions
DevOps & CI/CD: Docker, Terraform, Git, Jenkins
Data Modeling & Warehousing: dimensional modeling, SCDs, Snowflake, Redshift
Monitoring & Logging: Datadog, Prometheus, ELK Stack
I am a technical lead and big data engineer with 8 years of experience in the IT field. I have executed several projects in Java/J2EE along with AWS and big data. I am interested in teaching students these technologies and giving them practical, real-world scenarios. I also mentor at my company, guiding the learning of newly joined employees. I am an AWS Certified Developer - Associate as well.
I am working as a Big Data Engineer at a global financial firm, with 8+ years of experience. I am well versed and proficient in the following tech stack: Hadoop, Spark, Python, Scala, Hive, SQL, Linux, and AWS. Since I have given and taken numerous interviews at different organizations, I can help and guide my students to build an interview-ready profile that will help them crack any big data job.
Highly motivated and passionate data-driven ML practitioner. I have been discovering business insights by leveraging Data Science / Machine Learning expertise in diverse domains for the past 4 years. I am passionate about invigorating data science methodologies as well as helping students and professionals pursue their interest and build skills in this area of expertise.
Overall 20 years of IT experience at HCLT, Amdocs, and Cognizant. Until last year, I worked as a Senior Manager in a DWBI team. I have recently delivered 60 hours of training at Imarticus with 20 students. Almost all components are covered in the training, with industry-standard examples using Java APIs. Reporting is done with Apache Zeppelin; Spark SQL and Scala are also covered.
I am a technical trainer with expertise in Hadoop, Big Data, AWS, PySpark, Snowflake, shell scripting, and Oracle. I worked with MNCs for almost 10 years and have been delivering online and classroom trainings since 2014. Certified in Spark and AWS. I love to train people to industry standards with a practical, hands-on approach. Saif.
I am a pioneer in providing career-oriented corporate, customer, and online training. I focus on hands-on learning across varied technical skills, so as to enable professionals to keep pace with ever-changing technology. I provide training courses in Hadoop, Selenium, Data Science, Informatica, and AngularJS.
• I am a seasoned Big Data Hadoop and PySpark trainer. I have been teaching online for 3 years and have trained many individuals, making them capable of excelling in the big data world. • Extensive hands-on, working-level experience with Big Data and NoSQL (Not Only SQL) technologies, including Hadoop, Pig, Hive, HBase, Sqoop, Flume, Oozie, YARN, MapReduce using Java, HDFS, the master-slave architectural model, and clustering, managing, and monitoring Hadoop clusters using Ambari, along with Spark and Python.
We have certified trainers as well as industry experts with over a decade of experience in the IT industry. We have expertise in data warehousing tools, Business Intelligence, Big Data, Hadoop, middleware, databases, operating platforms, data center technology with virtualization, and mobile application development. What makes us different from others is effective pricing, the expertise of our teachers, and well-equipped rooms with projectors. We have more than 12 teachers at our institute and 2 branches in Pune.
Hi - I am an ETL developer at an IT company: an Ab Initio developer with 7+ years of experience developing various ETL projects in the insurance, banking, and market research domains. I have analyzed functional requirements and mapping documents, and assisted in problem solving and troubleshooting. I have developed and fixed various graphs for data cleansing, validation, transformation, and loading. I have worked on Ab Initio, UNIX, Hadoop, SAS, and QlikView.
I have 14 years of IT experience and 8 years of experience in Big Data technologies.
Industry experts with 15+ years of experience in DWBI. We are experts in the data warehousing technology stack. We provide extensive training on Big Data - Hadoop with the variety of technology stacks used across the industry. We also provide hands-on sessions with Spark, and we add a unique feature to the training with PySpark.
Hi, I deliver in-depth, hands-on, real-time training in PySpark with AWS: 4 months' duration with 175+ assignments and use cases, plus a CCA-175 practice test. 3 end-to-end projects for real-time knowledge. Airflow for orchestration; shell scripting for automation.
Big Data lead with over 9 years of IT experience. I have been providing online coaching and tuition for the last 2 years. I have completed big data courses from IIM Kashipur.
I will cover the Big Data Hadoop course content as mentioned:
Hadoop installation & setup
Introduction to Big Data Hadoop
Understanding HDFS & MapReduce
Deep dive into MapReduce
Introduction to Hive
Advanced Hive & Impala
Introduction to Pig
Flume, Sqoop & HBase
Writing Spark applications using Scala
Spark framework
RDDs in Spark
DataFrames and Spark SQL
Machine learning using Spark (MLlib)
Spark Streaming
Hadoop administration - multi-node cluster setup using Amazon EC2
Hadoop administration - cluster configuration
Hadoop administration - maintenance, monitoring, and troubleshooting
ETL connectivity with the Hadoop ecosystem
Project solution discussion and Cloudera certification tips & tricks
Hadoop application testing
Roles and responsibilities of a Hadoop testing professional
The MRUnit framework for testing MapReduce programs
Test execution, test plan strategy, and writing test cases for testing Hadoop applications
Hadoop projects you will be working on
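For readers new to the topics listed above, the MapReduce model at the heart of Hadoop can be sketched in a few lines of plain Python. This is only an illustration of the map, shuffle, and reduce phases on a word-count example, not actual Hadoop code; real jobs would use the Java API or Hadoop Streaming, and all names here are illustrative.

```python
from collections import defaultdict

def map_phase(lines):
    # Mapper: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    # Shuffle: group all values by key, as Hadoop does between map and reduce.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reducer: sum the counts collected for each word.
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["big data big ideas", "big clusters"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
print(counts["big"])  # 3
```

In a real cluster, each phase runs in parallel across many machines and HDFS blocks; the single-process version above only shows the data flow.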
Adroit has been an IT company for two decades, working in the IT domain. Its primary focus is applying Big Data and Data Analytics to our clients' problems, using information management and analytics techniques to help customers derive value from information and create business advantage. We have various trainers in our network who can deliver training in project management, IT infrastructure, and IT programming. All trainers have 15 years of teaching experience and can take technical subjects. Most have worked in industry for 15+ years, and some have studied in the US.
Big Data developer with 4+ years of experience and working knowledge of MapReduce, Sqoop, Hive, shell scripting, Oozie, Spark, Python, and Core Java. Successfully completed the Google Cloud Data Engineer certification. I also have experience in training.
I work as a freelancer and was previously a Data Scientist. Proficient in data science and analysis, deep learning, and Python, as well as data visualization tools like Tableau and QlikView. I have performed statistical analysis in R and conducted workshops on Python and R for machine learning.
This skill is very much a niche in the market, and not many people are pros in this field. I have good experience working on real-time projects as well as training people in my organization.
I have been a software developer, designer, and architect for the past 15 years and have worked for several Fortune 500 companies, predominantly on Java technologies. I have 2 years in big data and no prior teaching experience.
I am currently working on projects using Spark and have good knowledge of using Spark with Python. I can teach you Spark Core, Spark SQL, and DataFrames, all using Python.
You can browse the list of the best Hadoop tutors on UrbanPro.com. You can even book a free demo class to decide which tutor to start classes with.
The fee charged varies between online and offline classes. Generally, you get the best quality at the lowest cost in online classes, as the best tutors often prefer not to travel to the student's location.
It definitely helps to join a Hadoop class near you in Pune, India, as you get the desired motivation from a teacher to learn. If you need personal attention and your budget allows, select a 1-1 class. If you need peer interaction or have budget constraints, select a group class.