J P Nagar, Bangalore, India - 560078.
10 Years of Experience
Verified details of Nandish R
Identity
Education
Know how UrbanPro verifies Tutor details
Identity is verified by matching the details uploaded by the Tutor against government databases.
Kannada Mother Tongue (Native)
Telugu Proficient
English Proficient
BITM, 2017
Bachelor of Computer Science (B.Sc. (Computer Science))
J P Nagar, Bangalore, India - 560078
Phone Verified
Email Verified
Class Location
Online class via Zoom
Student's Home
Tutor's Home
Years of Experience in Python Training classes
10
Course Duration provided
1-3 months
Seeker background catered to
Individual, Corporate company, Educational Institution
Certification provided
Yes
Python applications taught
Data Extraction with Python, Assignment Help, Automation with Python, Core Python, Data Visualization with Python, Data Analysis with Python, Data Science with Python, PySpark
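As a small illustration of the "Data Extraction with Python" and "Data Analysis with Python" topics listed above, here is a minimal standard-library sketch; the CSV contents and city names are made up for the example:

```python
# Hypothetical illustration of extracting rows from CSV text and
# aggregating them, using only the Python standard library.
import csv
import io
from collections import defaultdict

RAW = """city,amount
Bangalore,120
Mysore,80
Bangalore,40
"""

def totals_by_city(raw_csv: str) -> dict:
    """Extract rows from CSV text and sum the amount per city."""
    totals = defaultdict(int)
    for row in csv.DictReader(io.StringIO(raw_csv)):
        totals[row["city"]] += int(row["amount"])
    return dict(totals)

print(totals_by_city(RAW))  # {'Bangalore': 160, 'Mysore': 80}
```

In a real class the same extract-then-aggregate pattern would typically be shown again with pandas or PySpark once the stdlib version is understood.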
Class Location
Online class via Zoom
Student's Home
Tutor's Home
Years of Experience in Google Cloud Platform
6
Teaching Experience in detail in Google Cloud Platform
Experienced in teaching Google Cloud Platform (GCP) for Data Engineering with a strong practical focus:
- Provide hands-on training on Google Cloud Storage (GCS) for data ingestion and data lake architecture.
- Teach BigQuery, covering data warehousing concepts, analytical SQL, partitioning, clustering, and cost optimization.
- Train learners on Cloud Composer (Apache Airflow) for workflow orchestration, DAG design, scheduling, retries, and monitoring.
- Cover Pub/Sub for real-time and event-driven data pipeline implementations.
- Provide hands-on sessions on Dataproc with Spark and PySpark for distributed data processing on GCP.
- Teach Cloud SQL for relational data storage and integration with analytical systems.
- Explain end-to-end ETL and ELT pipeline architecture using GCP services.
- Focus on batch and streaming data pipelines, error handling, and best practices.
- Emphasize real-world use cases, production-ready design, and scalability.
- Follow a step-by-step, beginner-friendly, and industry-aligned teaching approach, helping learners gain job-ready skills for modern cloud data engineering roles.
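The publish/subscribe decoupling idea behind Cloud Pub/Sub can be sketched without any cloud dependency; this stand-in uses only Python's `queue` module (real code would use the google-cloud-pubsub client, and the message fields here are invented):

```python
# A minimal stand-in for the publish/subscribe pattern taught with
# Cloud Pub/Sub, using only the standard library. It illustrates the
# decoupling of producers from consumers, nothing more.
import queue

topic = queue.Queue()  # stands in for a Pub/Sub topic

def publish(message: dict) -> None:
    """Producer side: enqueue a message without knowing who consumes it."""
    topic.put(message)

def pull_all() -> list:
    """Subscriber side: drain all pending messages."""
    messages = []
    while not topic.empty():
        messages.append(topic.get())
    return messages

publish({"event": "file_uploaded", "path": "gs://bucket/data.csv"})
publish({"event": "file_uploaded", "path": "gs://bucket/more.csv"})
print(len(pull_all()))  # 2
```

The point of the pattern is that producers and consumers never call each other directly, which is what makes event-driven pipelines easy to scale and extend.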
Class Location
Online class via Zoom
Student's Home
Tutor's Home
Years of Experience in ETL Training
6
Teaching Experience in detail in ETL Training
I have extensive teaching experience in ETL (Extract, Transform, Load) processes, with a strong focus on designing, building, and maintaining reliable data pipelines used in real-world data engineering environments. My training helps learners clearly understand how data moves from source systems to analytics-ready platforms.

I teach data extraction techniques for various sources such as relational databases, files, APIs, and streaming systems, and explain best practices for handling incremental loads, full loads, schema changes, and data validation during extraction.

My transformation training focuses on data cleaning, standardization, enrichment, aggregation, and business rule implementation. I teach how to apply transformations using SQL, Python, Spark, and modern transformation tools, with emphasis on performance, reusability, and maintainability.

I cover loading strategies for data warehouses and analytical systems, including bulk loads, incremental loads, upserts, and partitioned data loading. I also teach how to manage historical data, slowly changing dimensions, and data consistency.

I provide detailed guidance on ETL pipeline architecture, including batch vs streaming pipelines, error handling, logging, monitoring, retries, and dependency management, and I train learners to build scalable, fault-tolerant pipelines that can run in production environments.

Throughout the training I emphasize data quality checks, validation frameworks, and testing strategies to ensure trustworthy data. My teaching approach is step-by-step, practical, and use-case driven, enabling learners to build ETL pipelines confidently and apply these skills in real data engineering projects.
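The upsert-based incremental load mentioned above can be sketched with an in-memory SQLite database; the table, columns, and rows here are hypothetical, and a real warehouse would use its own MERGE/upsert syntax:

```python
# A simplified sketch of a full load followed by an incremental upsert
# batch, using SQLite's ON CONFLICT ... DO UPDATE clause.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT)"
)

def upsert(rows):
    """Load a batch: insert new ids, overwrite changed ones."""
    conn.executemany(
        "INSERT INTO customers (id, name, updated_at) VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name, "
        "updated_at = excluded.updated_at",
        rows,
    )

upsert([(1, "Asha", "2024-01-01"), (2, "Ravi", "2024-01-01")])     # full load
upsert([(2, "Ravi K", "2024-02-01"), (3, "Meena", "2024-02-01")])  # incremental batch

print(conn.execute("SELECT id, name FROM customers ORDER BY id").fetchall())
# [(1, 'Asha'), (2, 'Ravi K'), (3, 'Meena')]
```

Because the load is keyed on `id`, re-running the same batch is idempotent, which is the property that makes retries safe in production pipelines.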
Class Location
Online class via Zoom
Student's Home
Tutor's Home
Years of Experience in Big Data Training
6
Big Data Technology
Apache Spark, Hadoop
Teaching Experience in detail in Big Data Training
- Teach core Big Data fundamentals, including distributed computing, scalability, fault tolerance, and data partitioning.
- Explain when and why Big Data technologies are required compared with traditional data processing systems.
- Provide hands-on training in Apache Hadoop, covering HDFS architecture and large-scale data storage concepts.
- Teach Apache Spark and PySpark for distributed data processing using RDDs, DataFrames, and Spark SQL.
- Explain Spark concepts such as transformations, actions, lazy evaluation, and execution plans.
- Train learners on performance optimization techniques, including partitioning, caching, and resource tuning.
- Cover Kafka fundamentals for real-time data ingestion and streaming pipelines.
- Explain batch vs streaming processing and real-world Big Data use cases.
- Teach integration of Big Data tools with cloud platforms and data warehouses.
- Guide learners in designing scalable, fault-tolerant Big Data pipelines.
- Emphasize hands-on labs and production-like scenarios, following a step-by-step, beginner-friendly, and industry-aligned teaching approach.
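Spark's lazy evaluation, mentioned above, can be previewed without a cluster using plain Python generators as an analogy (the data here is made up, and real Spark code would use PySpark's DataFrame API):

```python
# Analogy for Spark's lazy evaluation: generator pipelines, like Spark
# transformations, only build a plan; nothing executes until an
# "action" consumes the result.
data = range(1, 6)

doubled = (x * 2 for x in data)        # "transformation": nothing runs yet
large = (x for x in doubled if x > 4)  # another lazy transformation

result = sum(large)                    # "action": the whole pipeline runs now
print(result)  # 6 + 8 + 10 = 24
```

The analogy is deliberately loose: Spark also optimizes the plan and distributes the work across executors, but the build-a-plan-then-trigger-it shape is the same.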
Class Location
Online class via Zoom
Student's Home
Tutor's Home
Years of Experience in Oracle Training
6
Oracle Database Versions
Oracle 11g DBA
Oracle Products taught
Oracle Developer, Oracle Database, Oracle PL/SQL
Teaching Experience in detail in Oracle Training
I provide specialized Oracle Database training designed specifically for data engineering and analytics applications. My courses cover both core and advanced Oracle SQL, including the DDL, DML, and DCL operations commonly used in production data pipelines.

I guide learners through complex joins, subqueries, and Common Table Expressions (CTEs), and explain window (analytic) functions such as ROW_NUMBER, RANK, LEAD, LAG, and running totals. I also teach aggregation techniques and data summarization for analytical workloads, as well as the date and string functions frequently used in ETL transformations.

Learners gain practical knowledge of indexes, execution plans, and query optimization for large datasets, along with PL/SQL basics for procedural transformations and batch processing. The course includes incremental data extraction and loading strategies from Oracle databases, data modeling concepts, normalization, and relational design principles.

I emphasize data quality checks, constraints, and validations, and show how Oracle integrates with ETL tools, Big Data platforms, and cloud data warehouses. All sessions are hands-on, scenario-based, and geared toward real-world data engineering tasks, with a focus on job-ready skills and interview preparation.
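The analytic functions named above (ROW_NUMBER, LAG, running totals) can be demonstrated on SQLite, which shares this window-function syntax with Oracle; the sales table and figures are hypothetical:

```python
# ROW_NUMBER, LAG, and a running total via SUM() OVER, run on an
# in-memory SQLite database (window functions need SQLite 3.25+).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day INTEGER, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [(1, 100), (2, 150), (3, 120)])

rows = conn.execute("""
    SELECT day,
           amount,
           ROW_NUMBER() OVER (ORDER BY day) AS rn,
           LAG(amount)  OVER (ORDER BY day) AS prev_amount,
           SUM(amount)  OVER (ORDER BY day) AS running_total
    FROM sales
""").fetchall()

for r in rows:
    print(r)
# (1, 100, 1, None, 100)
# (2, 150, 2, 100, 250)
# (3, 120, 3, 150, 370)
```

Note that `LAG` returns NULL (Python `None`) for the first row, and `SUM() OVER (ORDER BY ...)` with the default frame accumulates up to the current row, giving the running total.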
Class Location
Online class via Zoom
Student's Home
Tutor's Home
Years of Experience in SQL Programming Training
6
Teaching Experience in detail in SQL Programming Training
I provide specialized Oracle Database training designed specifically for data engineering and analytics applications. My courses cover both core and advanced Oracle SQL, including the DDL, DML, and DCL operations commonly used in production data pipelines.

I guide learners through complex joins, subqueries, and Common Table Expressions (CTEs), and explain window (analytic) functions such as ROW_NUMBER, RANK, LEAD, LAG, and running totals. I also teach aggregation techniques and data summarization for analytical workloads, as well as the date and string functions frequently used in ETL transformations.

Learners gain practical knowledge of indexes, execution plans, and query optimization for large datasets, along with PL/SQL basics for procedural transformations and batch processing. The course includes incremental data extraction and loading strategies from Oracle databases, data modeling concepts, normalization, and relational design principles.

I emphasize data quality checks, constraints, and validations, and show how Oracle integrates with ETL tools, Big Data platforms, and cloud data warehouses. All sessions are hands-on, scenario-based, and geared toward real-world data engineering tasks, with a focus on job-ready skills and interview preparation.
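A small Common Table Expression (CTE) of the kind covered in these SQL sessions can be run on SQLite from Python; the orders table and amounts are invented for the example:

```python
# A CTE that pre-aggregates per city, then filters the aggregate —
# the classic reason to reach for WITH instead of a nested subquery.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (city TEXT, amount INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("Bangalore", 200), ("Mysore", 50), ("Bangalore", 300)])

rows = conn.execute("""
    WITH city_totals AS (
        SELECT city, SUM(amount) AS total
        FROM orders
        GROUP BY city
    )
    SELECT city, total
    FROM city_totals
    WHERE total > 100
    ORDER BY city
""").fetchall()

print(rows)  # [('Bangalore', 500)]
```

The same query runs unchanged on Oracle, which is why SQLite makes a convenient zero-setup practice environment for SQL classwork.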
1. Which classes do you teach?
I teach Big Data, ETL, Google Cloud Platform, Oracle Training, Python Training and SQL Programming Classes.
2. Do you provide a demo class?
Yes, I provide a free demo class.
3. How many years of experience do you have?
I have been teaching for 10 years.
Certified
The Certified badge indicates that the Tutor has received a good amount of positive feedback from Students.