UrbanPro

Big Data Training near me in Prestige Blue Chip Software Park, Bangalore

Find Best Big Data Training in Bangalore


How would you like to attend?

Recommended
Highly Rated Tutors | Free Demo Class
Excellent (5.0)

1,093 Student Reviews


Online and Offline Big Data Training

Select from 807 Online & Offline Big Data Training in your city (Last updated: 09 May 2026)

Amit Raj, Big Data trainer in Bangalore
Sponsored
(240)
Kannuru, Bangalore
18 yrs of Exp · 104 students
₹ 1,750 per hour
TECHNICAL ARCHITECT & LEAD ENGINEER – GEN AI & DATABRICKS, CLOUD (AZURE, AWS), DEVOPS, DBT
Online Classes
Tutor's home

Objectives – With 18.5 years of experience as a Technical Lead and Architect, I am passionate about leveraging my expertise in Databricks, dbt (Data Build Tool), Spark, Confluent Kafka, Data Lake, Lakehouse and cloud solutions to drive innovation and efficiency. With a solid background in data architecture and IT infrastructure, I aim to contribute to the growth and vision of your company by providing robust technical solutions that align with strategic business goals. My goal is to enhance data-driven decision-making, optimize big data pipelines and implement secure, scalable cloud architectures that propel the organization forward in the ever-evolving technical landscape.

Certifications & Achievements –
1. Confluent Certified Administrator for Apache Kafka (expires July 2026).
2. Databricks Certified Data Engineer Professional (expires August 2026).
3. Databricks Academy Accreditation – Lakehouse Fundamentals (expires June 2025).
4. Microsoft Certified: Azure Administrator Associate (AZ-104), certification ID 1100039942, expires August 3, 2025.
5. Microsoft Certified: Azure Security Engineer Associate (AZ-500), certification ID 1100039942, expires August 6, 2025.
6. Recognition certificate from Fidelity for designing global solutions for data exchange.
7. Achievement medal from DIB (client) with appreciation for designing an event-based enterprise architecture – EventHub.

SUMMARY
• 18.5+ years of overall experience in application design, development and deployment of Hadoop ecosystem/Java/J2EE systems, with good exposure to enterprise architecture.
• 9.2 years of relevant experience in Big Data technologies, working with multiple clients and domains.
• Experienced in Cassandra data modelling, cluster setup and data management.
• Experienced with Spark SQL, Spark Structured Streaming and MLlib to process and analyse data.
• Experienced in designing solutions using Spark Streaming and Kafka Streams for payment-gateway and point-of-sale events.
• Individual contribution (Kafka Architect): delivered UAT and PROD Kafka clusters within the timeline using Cloudera 6.x and CSP 2.0.
• Implemented a unified data platform that gathers data from different sources using Kafka producers and consumers written in Scala and Java.
• Solid background in object-oriented analysis & design, UML and design patterns.
• Worked with Azure (Blob, Event Hubs), Kubernetes and Docker alongside Spark, Scala, Schema Registry and Avro schemas on a home-security application for Honeywell.
• Implemented KSQL, KTable and KStream with Confluent Kafka and Kafka Connect.
• Hands-on with Databricks: clusters, Data Lakehouse, Delta Lake, DBFS; explore, analyse, clean, transform and load data using Databricks.
• Experience with Azure: Azure Synapse Analytics, ADLS, ADF, Cosmos DB, Azure Functions, Stream Analytics, Power BI.
• Experience with SQL and NoSQL databases including MySQL, Oracle, Cassandra, PostgreSQL and Bigtable.
• Experience building and optimizing big data pipelines.
• Experience with Azure DevOps, CI/CD pipelines, Kubernetes and Docker.
• Motivated Technical Architect with 5 years of progressive architecture experience.
• Experience with AWS (EC2, S3).
• Experience with Snowflake: designed a data lake and loaded data from multiple sources into the Snowflake database.
• Effectively manages assignments and team members.
• Dedicated to self-development and customer-focused service, contributing to company profits by improving team efficiency and productivity.
• Uses excellent organizational skills to enhance efficiency and lead teams to outstanding delivery.

SKILLS
Database architecture · Data architecture · Big Data ETL · Technical solution development · Azure data solutions · Data insight provision · Technical guidance · IT architecture · Big data frameworks

Technical skills: Hortonworks 2.5, Cloudera 5/6, Apache Hadoop 2/3, Spark 2/3, Apache Kafka, Confluent Kafka, Hive 2/3, Impala, Sqoop, Oozie, ZooKeeper, Snowflake, dbt (Data Build Tool), HBase, Apache Cassandra/DataStax Cassandra, Databricks, Azure Cloud, AWS Cloud, Talend, Airflow, etc.
Programming languages: Python, Scala & Java.
Other tools: Kibana, Logstash, Elasticsearch (ELK).

PROJECT UNDERTAKEN
Project: Implementation of a data warehouse and reporting platform
Role: Databricks Architect & Engineer
Team: 12 members
Technical skills: Azure Cloud, Azure Data Factory (ADF), ADLS, Databricks, Spark 3.x, Python, Scala 2.15, DB2, Oracle 12g, Azure SQL

My contribution – Databricks infrastructure solution:
- Configured unified data access control using Unity Catalog for the E1 & BY systems: granted specific permissions (e.g. read-only or write-only) to specific groups of users on one or more Delta tables, down to the row or column level for data containing personally identifiable information (PII).
- Provided centralized data governance for TAI: administering and auditing access to the data.
- Applied data lineage for E1 & BY tables and their look-up tables using Unity Catalog.
- Implemented a data-sharing protocol for secure downstream data sharing using Unity Catalog.
- Designed the Unity Catalog architecture so it can be linked to multiple Databricks workspaces across the DEV, UAT and PROD environments.
- Created the metastore for Unity Catalog.
- Applied user management in Unity Catalog for the TAI Lakehouse project: users, groups, service principals and their permissions.
- Configured Databricks clusters with Spark 3.x for DEV, UAT & PROD for the TAI E1 & BY systems.
- Designed and applied the medallion architecture: set up a data lakehouse with Bronze, Silver and Gold storage layers on Azure Data Lake Gen2.

Azure cloud infrastructure and security:
- Installed the self-hosted integration runtime for DB2 on DEV, UAT & PROD, and for the on-prem Oracle cluster on the source system.
- Installed the Azure Virtual Network managed IR on DEV, UAT & PROD.
- Installed the Db2 connector on DEV, UAT & PROD.
- Created the linked services lnk_BY_Azure_SQL, lnk_E1_Azure_SQL and lnk_Db2_E1.
- Installed and configured Azure Key Vault, adding all credentials (Azure SQL, ADLS, Databricks, users, global users, linked services) on DEV, UAT & PROD.
- Created a 3-node DEV cluster and a 5-node PROD cluster to migrate data.
- Set up Azure Active Directory to provide team access policies for the Databricks cluster, Azure Data Factory, Azure SQL and the Azure data lakehouse.
- Coordinated with the TAI client and Microsoft support to resolve throughput issues.

As Azure & Databricks Data Engineer:
- Developed the most critical data-ingestion pipelines using Azure Data Factory (ADF) for E1, migrating 12.8 TB across 120 tables from Db2 to ADLS RAW as Parquet files; several large tables held 2–4 TB of data and 400 to 800 million records.
- Built initial and incremental migration pipelines for both the E1 and BY sources, with a watermark based on Julian date & time.
- Designed an audit table (process log) and a control table (system) to drive dynamic pipelines and capture audit information for master and child pipelines.
- Designed the architecture to handle deletes for PKSNAPSHOT – E1 & BY.
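The watermark-based incremental load described above can be sketched in plain Python. This is an illustrative sketch only: the table, field and variable names are invented, and the real pipeline implements this with ADF copy activities and a control table rather than in-memory lists. Each run extracts only rows changed since the last recorded watermark, then advances the watermark.

```python
# Hypothetical sketch of a watermark-based incremental extract.
# `rows` stands in for the source table; `modified` for its change timestamp.

def incremental_extract(rows, last_watermark):
    """Return rows newer than the watermark, plus the advanced watermark."""
    fresh = [r for r in rows if r["modified"] > last_watermark]
    # Advance the watermark to the newest change seen; keep it if nothing changed.
    new_watermark = max((r["modified"] for r in fresh), default=last_watermark)
    return fresh, new_watermark

source = [{"id": 1, "modified": 100},
          {"id": 2, "modified": 205},
          {"id": 3, "modified": 310}]

batch, wm = incremental_extract(source, last_watermark=200)
print([r["id"] for r in batch], wm)   # -> [2, 3] 310
```

The advanced watermark would be written back to the control table at the end of a successful run, so a failed run simply re-reads the same window.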
- Built a dynamic delete pipeline using ADF (loading PKTBL) and Databricks PySpark at daily, weekly, on-demand and yearly frequencies to delete records from the target (Gold/analytics layer) based on the source system's delete column and delete table.
- Built transformations using Databricks Spark with Scala for E1: applied look-up-table transformations into the Silver layer.
- Built transformations into the analytics (Gold) layer using Databricks Spark & Scala.
- Implemented UPSERT using Spark Structured Streaming with a 5-minute trigger on the analytics layer.
- Designed the master/child pipeline architecture with distinct activity IDs, pipeline IDs, master pipeline IDs and run IDs to ensure a smooth audit trail.
- Built logic in PySpark on Databricks (applied on DEV, UAT & PROD) to check whether the master pipeline is IN PROGRESS, so that pipeline executions do not overlap.
- Passed pipeline parameters to insert or update the audit/control tables using Databricks PySpark.
- Monitored performance in DEV & PROD and worked with the team to reduce run times.
- Milestone: achieved a 10-minute end-to-end SLA for incremental loads on E1 & BY.
- Milestone: loaded 400 million records (2.3 TB) to RAW as Parquet in 1:53:45 hours using an ADF pipeline.
- Worked with the Azure DevOps engineer to build a CI/CD pipeline for DEV, UAT & PROD.
- Developed a POC pipeline using Databricks Workflows, compared its cost with Azure Pipelines, and presented the results to the client.

My contribution to a past project:
Project: Data Exchange (security framework)
Role: Technical Lead & Architect – Confluent KStream & KSQL
Clients: Fidelity & Westpac
Team: 9 members
Technical skills: Azure DevOps, JDK 19.0, Confluent Kafka, KStream, KSQL, Azure Databricks, DBFS, Delta Lake, Azure Data Factory, ADLS Gen2, Confluent Schema Registry, AES algorithm, hash algorithm, Kubernetes cluster (AKS).
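The Structured Streaming UPSERT mentioned above boils down to Delta Lake's MERGE semantics: update rows whose key matches, insert the rest. Below is a minimal plain-Python illustration of that logic with invented table and column names; on Databricks the same pattern would run as a per-micro-batch MERGE (whenMatchedUpdateAll / whenNotMatchedInsertAll) against the Gold-layer Delta table.

```python
# Hypothetical sketch of the MERGE-based upsert applied to each micro-batch.
# `target` stands in for the Gold Delta table, keyed by `id`.

def merge_upsert(target: dict, batch: list, key: str = "id") -> dict:
    """Apply one micro-batch: update matched keys, insert unmatched ones."""
    for row in batch:
        target[row[key]] = row   # matched key -> update; new key -> insert
    return target

gold = {1: {"id": 1, "amount": 10}}
batch = [{"id": 1, "amount": 99},   # existing key: row is updated
         {"id": 2, "amount": 20}]   # new key: row is inserted
merge_upsert(gold, batch)
print(gold[1]["amount"], gold[2]["amount"])   # -> 99 20
```

Because the operation is keyed, replaying the same micro-batch is idempotent, which is what makes the 5-minute trigger safe to restart from a checkpoint.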

Skills: Apache Spark, Scala and more
Also teaches: Python Training and more
Dharmaram College, Bangalore
Tutor
Online Classes
Student's Home
Tutor's home

I am an IT professional working in a reputed software company, as both a developer and an administrator. In my six-year career I have handled multiple projects in both roles, and I also deliver training on these technologies. In my tenure I have found that learning a new technology is not difficult; in fact it is one of the easiest things to do, provided you understand its fundamentals. My experience tells me that everything in computer training relates closely to the real world, and once you can connect the two, any computing technology becomes very easy to learn. Let's start.

Jayanagar East End Main Road, Bangalore
10 yrs of Exp
SAP BI Architect Trainer
Online Classes
Student's Home
Tutor's home

15 years of experience in the US and Singapore in SAP BI, Big Data, Cloud & Visualization. Hands-on SAP experience implementing SAP HANA, SAP BI (Business Intelligence), BW, SAP BusinessObjects (BOBJ) and BPC, plus Cloud (AWS and SAP HANA Cloud), NoSQL, Tableau & QlikView, Hadoop-HANA integration and Predictive Analytics. We provide excellent-quality training in the technologies below: 1) SAP HANA, SAP BW on HANA or BW/4HANA, SAP BO (BusinessObjects). 2) Amazon Web Services (AWS): how to run SAP systems on AWS. 3) SAP BusinessObjects Cloud & HANA Cloud Platform (HCP) (move to cloud). 4) Big Data, Hadoop, NoSQL, Tableau & QlikView. By attending this training you will gain industry experience and real-time scenarios for handling issues. I will teach you how these tools are implemented in real time, and real-time documents are provided with all of these trainings. All teaching/learning options, such as online classes and one-to-one classes, are available. I have taught these technologies to hundreds of students in the US, Singapore, Thailand and around the world. Complete end-to-end training in the above areas with industry functionality.

Also teaches: SAP
Taverekere, Bangalore
1 yrs of Exp
Online Classes
Student's Home
Tutor's home

I work as a Senior Data Scientist. I love sharing my knowledge and experience by teaching Data Science, Analytics, AI and ML. As a freelancer I also mentor learners on different courses and capstone projects. Feel free to reach out to me.


Madiwala, Bangalore
5 yrs of Exp
Online Classes

I am a certified trainer working in a leading IT company, with 10+ years of experience in Finacle core banking and strong functional as well as technical knowledge. I can also teach manual testing (banking) and Finacle Treasury.

Koramangala 8th Block, Bangalore
14 yrs of Exp
Tutor
Online Classes

Python programming, machine learning, data science, analytics.


Tavarekere Main Road, Bangalore
9 yrs of Exp
Online Classes
Student's Home

I have worked as a Hadoop trainer and am a Cloudera Certified Hadoop Developer. I have 9+ years of total experience, including 3+ years in Hadoop. I can cover all concepts of the Hadoop ecosystem and an introduction to Spark.


Tavarekere Main Road, Bangalore
Tutor's home

SpiritSofts is a global training institute based in Hyderabad, India and California, USA. Our trainers are experts in Hyperion, and we provide online training courses in Oracle HFM (Hyperion Financial Management), Hyperion Essbase, Hyperion Planning, Hyperion FDM/FDMEE, ODI, DRM, OBIEE, Tableau, MSBI, Hadoop and Salesforce.

Koramangala 6th Block, Bangalore
Tutor's home

CloudThat is focused on quickly empowering IT professionals and organizations to leverage Cloud and Big Data. It was co-founded by Bhavesh Goswami, an ex-Amazonian who was part of the AWS product development team, and Himanshu Mody, who brings 12 years of experience in the IT training and consulting business, allowing us to deliver high-quality training with best practices. We have a presence in Mumbai and Bengaluru, and we offer on-site and pre-scheduled public batches in different IT-centric cities in India and abroad.

Koramangala 6th Block, Bangalore
Online Classes
Tutor's home

We are the best workshop and training providers in Bangalore. We believe in transferring knowledge in Big Data Hadoop, Hive, Pig, MongoDB, MapReduce, Cloud Computing, mobile application development and mobile application testing.

Also teaches: Mobile App Development and more
Bhavani Nagar, Bangalore
2 yrs of Exp
Online Classes
Student's Home
Tutor's home
Koramangala 5th Block, Bangalore
7 yrs of Exp
Online Classes
Student's Home
Tutor's home

How to Find the Best Big Data Training in Bangalore in 3 Simple Steps?

Search Tutor

Step 1: Find a Big Data Tutor

Browse hundreds of experienced Big Data tutors across Bangalore. Compare profiles, teaching styles, reviews and class timings to find the one that fits your goals, whether that's Apache Spark, Hadoop, Scala or more.

Book Demo

Step 2: Book & Attend a Free Demo Class

Select your preferred tutor and book a free demo session. Experience their teaching style, ask questions, and understand the class flow before you commit.

Pay & Start

Step 3: Pay & Start Learning

Once you're satisfied, make the payment securely through UrbanPro and start your Big Data journey! Learn at your own pace — online or in-person — and track your progress easily.

Find the best Big Data Training

Selected Location

    Reviews for top Big Data Training

    Aditya attended Big Data
    ★★★★★ Verified Student
    "I'm really impressed with Anil's Big Data training. He's a great teacher who explains..."
    Deepak attended Big Data
    ★★★★★
    "Soumitra possesses excellent knowledge and expertise. I had a highly enriching learning..."
    Tejasri attended Big Data
    ★★★★★ Verified Student
    "Made Big Data concepts easier to understand and hands on experience with coding also..."
    Gaurav attended Big Data
    ★★★★★ Verified Student
    "This Scala course equipped me with valuable skills in big data development. The curriculum..."
    Rushi attended Big Data
    ★★★★★ Verified Student
    "I would like to share my experience with my Scala teacher, Amit. His unique teaching..."
    Kirti attended Big Data
    ★★★★★ Verified Student
    "I truly appreciate the dedication and expertise Amit brought to every session. Their..."
    Srihari attended Big Data
    ★★★★★ Verified Student
    "Kiran Sir was a humble person and an amazing mentor. He helped me restart my career..."
    Sumit attended Big Data
    ★★★★★ Verified Student
    "I enrolled in the full stack development course at Colgstack and landed a job at..."

    Do you offer Big Data Training?

    Create Free Profile >>

    Key highlights about Big Data Training

    Free Demo Class
    :
    Available
    Average Price
    :
    INR 650/hr
    Total Available
    :
    807
    Class Format
    :
    Online or Offline classes

    FAQ

    How do I find the best Big Data Training near me in Prestige Blue Chip Software Park, Bangalore?

    You can browse the list of best Big Data tutors on UrbanPro.com. You can even book a free demo class to decide which Tutor to start classes with.

    What is the typical Fee charged for Big Data Training near me in Prestige Blue Chip Software Park, Bangalore?

    The fee charged varies between online and offline classes. Generally, online classes offer the best quality at the lowest cost, as the best tutors often prefer not to travel to the student's location.

    Monthly Fee for 1-1 Classes

    INR 6,000 - INR 9,600 for 12 classes per month

    Hourly Fee for 1-1 Classes

    INR 500 - INR 800

    Monthly Fee for Group Classes

    INR 4,800 - INR 7,680 for 12 classes per month

    Hourly Fee for Group Classes

    INR 400 - INR 640

    Monthly Fee for Big Data Training at home

    INR 6,000 - INR 12,000 for 12 classes per month

    Hourly Fee for Big Data Training at home

    INR 500 - INR 1,000

    Monthly Fee for Online Big Data Training

    INR 6,000 - INR 9,600 for 12 classes per month

    Hourly Fee for Online Big Data Training

    INR 500 - INR 800

    Does joining Big Data Training help?

    It definitely helps to join Big Data Training near me in Prestige Blue Chip Software Park, Bangalore, as you get the desired motivation from a Teacher to learn. If you need personal attention and if your budget allows, select 1-1 Class. If you need peer interaction or have budget constraints, select a Group Class.

    Where can I find Big Data Training near me?

    UrbanPro has a list of the best Big Data Training options near you.

    Big Data Questions

    How do I move from a testing role with 1 year of experience to a Big Data development engineer role, given that I am good at Java and have learnt Hadoop and its associated tools?

    If you are good at Java and have already learnt Hadoop and other technologies in its eco-system, then...

    I work for a big company and have 2.9 years of experience (Linux, Sybase, Java) but I'm not interested...

    It will be very easy for you to learn Big Data technologies, as you have hands-on experience in Java, Linux & RDBMS....

    What is the best way to learn BigData?

    Learning Big Data involves gaining proficiency in various tools, frameworks, and concepts related...

    What are the new seminar topic related to bigdata?

    Here are some topics. "Edge Computing in Big Data Architectures": Explore how edge computing is...

    How are you using BigData and with which tools?

    Organizations often use big data technologies for tasks such as: Data Storage and Management: Hadoop...


    Big Data Lessons

    Big Data Hadoop Training

    What is Big Data? Big Data refers to collections of large datasets that cannot be processed using traditional computing techniques....
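    The lesson teaser above introduces the idea that Hadoop processes such datasets with the MapReduce model: a map phase emits key-value pairs and a reduce phase aggregates them. A minimal pure-Python sketch of the classic word-count example often used to teach this pattern (the function names here are illustrative, not part of any Hadoop API):

    ```python
    from collections import Counter
    from itertools import chain

    def map_phase(line):
        # Map: emit a (word, 1) pair for each word in a line
        return [(word.lower(), 1) for word in line.split()]

    def reduce_phase(pairs):
        # Reduce: sum the counts for each distinct word
        counts = Counter()
        for word, n in pairs:
            counts[word] += n
        return dict(counts)

    lines = ["big data means big datasets", "big datasets need new tools"]
    pairs = chain.from_iterable(map_phase(line) for line in lines)
    word_counts = reduce_phase(pairs)
    print(word_counts["big"])       # 3
    print(word_counts["datasets"])  # 2
    ```

    In a real Hadoop cluster the same two phases run in parallel across many machines, with the framework shuffling the intermediate pairs between them; the single-process sketch only shows the shape of the computation.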

    REFERENCE BOOKS FOR DATA SCIENCE

    Dear All, you can use the following books to master Data Science concepts: 1) A First Course in Probability - Sheldon Ross 2) Applied Regression Analysis - Draper...

    Microsoft Power BI

    Microsoft Power BI is a free, self-service business intelligence cloud service that provides non-technical business users with tools for aggregating, analyzing,...

    Things to learn in Python before choosing any Technological Vertical

    Day 1: Python Basics Objective: Understand the fundamentals of Python programming language. Variables and Data Types (Integers, Strings, Floats,...
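    The Day 1 outline above lists variables and core data types as the starting point; a quick illustrative snippet of what that first lesson covers (the variable names are just examples):

    ```python
    # Day 1 basics: variables and core data types
    age = 30                # int
    price = 99.5            # float
    name = "Hadoop"         # str
    is_open_source = True   # bool

    # type() reports each value's type; __name__ gives it as a string
    print(type(age).__name__)    # int
    print(type(price).__name__)  # float
    print(f"{name} is open source: {is_open_source}")
    ```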

    Loading Hive tables as a parquet File

    Hive tables are very important when it comes to Hadoop and Spark as both can integrate and process the tables in Hive. Let's see how we can create a hive...

    Find Best Big Data Training?

    Find Now »



    UrbanPro.com is India's largest network of most trusted tutors and institutes. Over 55 lakh students rely on UrbanPro.com to fulfill their learning requirements across 1,000+ categories. Using UrbanPro.com, parents and students can compare multiple Tutors and Institutes and choose the one that best suits their requirements. More than 7.5 lakh verified Tutors and Institutes are helping millions of students every day and growing their tutoring business on UrbanPro.com. Whether you are looking for a tutor to learn mathematics, a German language trainer to brush up your German language skills, or an institute to upgrade your IT skills, we have got the best selection of Tutors and Training Institutes for you.