"Anything that can be automated, should be automated." I'm a System Software Engineer with 9 years of experience in Product (IoT, Embedded Systems), Quality (QA, Risk management), Infrastructure (Automation, IaC, Cloud, DevOps) and Release Engineering (CI/CD, Branching, Packaging, Delivery, Software Configuration Management, Application Lifecycle Management). AREAS OF EXPERTISE: C, Python, GNU/Linux, Embedded Linux, DevOps, Git/GitLab, CI/CD, Automation, Cloud (MS Azure, Google Cloud, AWS), Django, Distributed Systems, Embedded Systems. Programming - C, C++, .NET Scripting - Python, Bash Virtualization and Cloud - VMWare vSphere, Microsoft Hyper-V, Microsoft Azure, Docker, Kubernetes, QEMU CI /CD - Jenkins, GitlabCI Configuration Management - Ansible Database - Mysql, SQLite, MongoDB, Elasticsearch Monitoring & Alerting (Infrastructure and Application) - Grafana, Prometheus, AlertManager, Zabbix, ELK (Kibana)
"Anything that can be automated, should be automated." I'm a System Software Engineer with 9 years of experience in Product (IoT, Embedded Systems), Quality (QA, Risk management), Infrastructure (Automation, IaC, Cloud, DevOps) and Release Engineering (CI/CD, Branching, Packaging, Delivery, Software Configuration Management, Application Lifecycle Management). AREAS OF EXPERTISE: C, Python, GNU/Linux, Embedded Linux, DevOps, Git/GitLab, CI/CD, Automation, Cloud (MS Azure, Google Cloud, AWS), Django, Distributed Systems, Embedded Systems. Programming - C, C++, .NET Scripting - Python, Bash Virtualization and Cloud - VMWare vSphere, Microsoft Hyper-V, Microsoft Azure, Docker, Kubernetes, QEMU CI /CD - Jenkins, GitlabCI Configuration Management - Ansible Database - Mysql, SQLite, MongoDB, Elasticsearch Monitoring & Alerting (Infrastructure and Application) - Grafana, Prometheus, AlertManager, Zabbix, ELK (Kibana)
I have a total of 11 years of IT experience, with extensive knowledge of Big Data, Hadoop, Data Science, Machine Learning, and Blockchain. My experience includes 4.5 years of teaching, both as individual classroom/online instruction and in corporate training. Profile is available on request.
I have 2 years of teaching experience in the Hadoop ecosystem, including Apache Spark, Apache HBase, Hive, Impala, MapReduce, Oozie, Sqoop, and Kafka, as well as Scala basics and shell scripting basics.
I am a Staff Data Engineer with over 10 years of experience in designing, developing, and optimizing large-scale Big Data and Cloud-based Data Engineering pipelines. I currently work at Altimetrik, where I lead strategic initiatives around data architecture, real-time streaming, and cloud-native data platforms. In addition to my engineering role, I’m a passionate educator and mentor, offering home/online tutoring in Big Data, Cloud, Python, PySpark, SQL, and Data Engineering concepts for professionals and students aspiring to break into the data field. I believe in simplifying complex concepts and empowering the next generation of data engineers.
Big Data Technologies: Apache Spark, Hadoop, Kafka, Hive, HBase
Cloud Platforms: AWS (Glue, EMR, Redshift, S3), GCP (BigQuery, Dataflow, Pub/Sub), Azure Data Lake
Data Engineering: ETL/ELT pipelines, Batch and Streaming data, Data Lakehouse, Delta Lake
Programming: Python, PySpark, SQL, Shell Scripting
Orchestration: Airflow, DBT, AWS Step Functions
DevOps & CI/CD: Docker, Terraform, Git, Jenkins
Data Modeling & Warehousing: Dimensional Modeling, SCDs, Snowflake, Redshift
Monitoring & Logging: Datadog, Prometheus, ELK Stack
I am a technical lead and big data engineer with 8 years of experience in the IT field. I have executed several projects in Java/J2EE along with AWS and big data. I am interested in teaching students these technologies and giving them practical, real-world scenarios. I also mentor in my company, providing learning and teaching guidance to newly joined employees. I am also an AWS Certified Developer – Associate.
I work as a Big Data Engineer at a global financial firm and have 8+ years of experience. I am well versed and proficient in the following tech stack: Hadoop, Spark, Python, Scala, Hive, SQL, Linux, and AWS. Since I have given and taken numerous interviews at different organizations, I can help and guide my students to build an interview-ready profile that will help them crack any big data job.
Highly motivated and passionate data-driven ML practitioner. I have been discovering business insights by leveraging data science / machine learning expertise across diverse domains for the past 4 years. I am passionate about invigorating data science methodologies as well as helping students and professionals pursue their interest and build skills in this area of expertise.
Overall 20 years of IT experience at HCLT, AMDOCS, and COGNIZANT. Until last year, I worked as an SM in a DWBI team. I have currently delivered 60 hours of training at Imarticus to 20 students. Almost all components are covered in the training with industry-standard examples using Java APIs; reporting is done with Apache Zeppelin. Spark SQL and Scala are also covered.
I am a technical trainer with expertise in Hadoop, Big Data, AWS, PySpark, Snowflake, Shell Scripting, and Oracle. I have worked with MNCs for almost 10 years and have been delivering online and classroom trainings since 2014. I am certified in Spark and AWS. I love to train people technically as per industry standards with a practical, hands-on approach. Saif.
I am an undisputed pioneer in providing career-oriented corporate, customer, and online training. I focus on hands-on learning across varied technical skills to enable professionals to keep pace with ever-changing technology. I provide the best training courses in Hadoop, Selenium, Data Science, Informatica, and AngularJS.
I have worked in the IT industry for around 15 years, and my main domain has been IT training. I have delivered trainings in various domains such as Java, Python, operating systems, cloud computing, and AWS, at companies including Vodafone, Cognizant, Syntel, Quinnox, Tibco, and Amplify Mindware. I have now retired from a product-based company. In the AWS training I will cover the topics required for AWS certification; the total time required to complete the course would be 40 hours.
• I am a seasoned Big Data Hadoop and PySpark trainer. I have been teaching online for 3 years, have trained many individuals, and have made them capable of excelling in the big data world. • Extensive hands-on, working-level experience with Big Data and NoSQL (Not Only SQL) technologies, including Hadoop, Pig, Hive, HBase, Sqoop, Flume, Oozie, YARN, MapReduce using Java, HDFS, and the master-slave architectural model, as well as clustering, managing, and monitoring Hadoop clusters using Ambari, with Spark and Python.
We have certified trainers as well as industry experts with over a decade of experience in the IT industry. We have expertise in Data Warehousing tools, Business Intelligence, Big Data, Hadoop, Middleware, Databases, Operating Platforms, Data Center Technology with Virtualization, and Mobile Application Development. What makes us different from others is effective pricing, the expertise of our teachers, well-equipped rooms with projectors, and a well-equipped faculty. We have more than 12 teachers in our institute and 2 branches in Pune.
The training was thorough, included numerous practicals, and was well up to industry standards.
Industry experts with 15+ years of experience in DWBI. We are experts in the data warehousing technology stack. We will provide extensive training on Big Data - Hadoop with the variety of technology stacks used across the industry, and we will also provide hands-on sessions with Spark. We are adding a unique feature to the training with PySpark.
Hi, I deliver in-depth, real-time PySpark with AWS training with hands-on practice. It runs for 4 months and includes 175+ assignments and use cases, a CCA-175 practice test, 3 end-to-end projects for real-time knowledge, Airflow for orchestration, and shell scripting for automation.
I am a Python trainer with more than 5 years of experience in Python and Python-based frameworks. I have worked in the finance, telecommunication, and healthcare fields. My skills are Python, Django, Flask, Docker, Elasticsearch, Kubernetes, data science, AI, and ML. I am available only for online (Hangouts/Skype/Zoom) training sessions, and I provide on-project training sessions as well as assessments.
Big Data lead with over 9 years of IT experience. I have been providing online coaching and tuition for the last 2 years. I have completed big data courses from IIM Kashipur.
I will cover the Big Data Hadoop course content listed below:
Hadoop Installation & Setup
Introduction to Big Data Hadoop
Understanding HDFS & MapReduce
Deep Dive into MapReduce
Introduction to Hive
Advanced Hive & Impala
Introduction to Pig
Flume, Sqoop & HBase
Writing Spark Applications using Scala
Spark Framework
RDDs in Spark
DataFrames and Spark SQL
Machine Learning using Spark (MLlib)
Spark Streaming
Hadoop Administration – Multi-Node Cluster Setup using Amazon EC2
Hadoop Administration – Cluster Configuration
Hadoop Administration – Maintenance, Monitoring and Troubleshooting
ETL Connectivity with the Hadoop Ecosystem
Project Solution Discussion and Cloudera Certification Tips & Tricks
Hadoop Application Testing
Roles and Responsibilities of a Hadoop Testing Professional
The MRUnit Framework for Testing MapReduce Programs
Test Execution
Test Plan Strategy and Writing Test Cases for Testing a Hadoop Application
Hadoop Projects You Will Be Working On
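To give a concrete flavour of one syllabus item above (streaming with DataFrames), here is a minimal, self-contained PySpark Structured Streaming sketch; it is an illustrative assumption, not part of the course material, and uses the built-in "rate" source as a stand-in for a real stream such as Kafka, with an arbitrary window size and run duration:

from pyspark.sql import SparkSession
from pyspark.sql.functions import window

spark = SparkSession.builder.appName("streaming-demo").getOrCreate()

# The built-in "rate" source generates (timestamp, value) rows, handy for local practice.
stream = spark.readStream.format("rate").option("rowsPerSecond", 5).load()

# Count the generated events per 10-second window.
counts = stream.groupBy(window(stream.timestamp, "10 seconds")).count()

# Print each updated result table to the console.
query = (counts.writeStream
               .outputMode("complete")
               .format("console")
               .start())

query.awaitTermination(30)  # let the query run for about 30 seconds
query.stop()
spark.stop()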
Big Data Developer with 4+ years of experience and working knowledge of MapReduce, Sqoop, Hive, shell scripting, Oozie, Spark, Python, and Core Java. I have successfully completed the Google Cloud Data Engineer certification and also have experience in training.
I have been a software developer, designer, and architect for the past 15 years and have worked for several Fortune 500 companies, predominantly on Java technologies. I have 2 years of experience in big data and no prior teaching experience.
I am currently working on Spark projects and have good knowledge of using Spark with Python. I can teach you Spark Core, Spark SQL, and DataFrames, all using Python.
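As a rough idea of what the "Spark Core, Spark SQL, and DataFrames with Python" material above covers, here is a minimal PySpark sketch; the sample sentences and application name are invented for illustration and this is not any tutor's own material:

from pyspark.sql import SparkSession
from pyspark.sql.functions import explode, split, col

spark = SparkSession.builder.appName("core-df-sql-demo").getOrCreate()
sc = spark.sparkContext
lines = ["spark makes big data simple", "spark sql runs on dataframes"]

# 1. Spark Core: word count with RDD transformations.
counts_rdd = (sc.parallelize(lines)
                .flatMap(lambda line: line.split())
                .map(lambda word: (word, 1))
                .reduceByKey(lambda a, b: a + b))
print(counts_rdd.collect())

# 2. DataFrame API: the same aggregation with columnar operations.
df = spark.createDataFrame([(line,) for line in lines], ["line"])
counts_df = (df.select(explode(split(col("line"), " ")).alias("word"))
               .groupBy("word").count())
counts_df.show()

# 3. Spark SQL: register a temporary view and query it with SQL.
counts_df.createOrReplaceTempView("word_counts")
spark.sql("SELECT word, `count` FROM word_counts ORDER BY `count` DESC").show()

spark.stop()

All three approaches produce the same word counts; choosing between them usually comes down to readability and how much low-level control is needed.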
3+ years of experience working as a Data Engineer, applying analytical thinking and relevant expertise to help the organization achieve its long-term goals with the help of Hadoop, Spark, Pig, Hive, HBase, and Scala.
I have been working on big data projects for the last 2.5 years. I am currently working on my 5th big data project at a client location.
I have experience developing big data applications on the AWS cloud and Databricks, and I have provided training and job assistance to many people on AWS and big data.
I have 4 years of professional experience in Hadoop and Spark. I have 15 years of overall IT experience.
You can browse the list of the best Apache Spark tutors on UrbanPro.com and even book a free demo class to decide which tutor to start classes with.
The fee charged varies between online and offline classes. Generally, you get the best quality at the lowest cost in online classes, as the best tutors don’t like to travel to the student’s location.
It definitely helps to join an Apache Spark class near you in Pune, India, as you get the desired motivation from a teacher to learn. If you need personal attention and your budget allows, select a 1-1 class. If you need peer interaction or have budget constraints, select a group class.