I came to the centre to improve my knowledge of Python for my college course. Within a relatively short period of time, I was able to make a really big improvement in my knowledge of syntax and how to use efficient logic. Thanks, Ravi.
2,329 Student Reviews
Objectives – With 18.5 years of experience as a Technical Lead and Architect, I am passionate about leveraging my expertise in Databricks, Data Build Tool, Spark, Confluent Kafka, Data Lake, Lakehouse and cloud solutions to drive innovation and efficiency. With a solid background in data architecture and IT infrastructure, I aim to contribute to the growth and vision of your company by providing robust technical solutions that align with strategic business goals. My goal is to enhance data-driven decision-making processes, optimize big data pipelines, and implement secure and scalable cloud architectures that propel the organization forward in the ever-evolving technical landscape.

Certification & Achievements –
1. Confluent Certified Administrator for Apache Kafka: expires July 2026.
2. Databricks Certified Data Engineer Professional: expires Aug 2026.
3. Databricks Accredited Lakehouse Fundamentals (Academy Accreditation): expires June 2025.
4. Microsoft Certified: Azure Administrator Associate (AZ-104): Microsoft certification ID 1100039942, expires August 3, 2025.
5. Microsoft Certified: Azure Security Engineer Associate (AZ-500): Microsoft certification ID 1100039942, expires August 6, 2025.
6. Recognition certificate from Fidelity for designing global solutions for Data Exchange.
7. Achievement medal from DIB (client), with appreciation for designing an event-based enterprise architecture (EventHub).

SUMMARY
• 18.5+ years of overall experience in application design, development and deployment of Hadoop ecosystem/Java/J2EE systems, with good exposure to enterprise architecture.
• 9.2 years of relevant experience in Big Data technologies, working with multiple clients and domains.
• Experienced in Cassandra data modelling, cluster setup and data management.
• Experienced in Spark SQL, Spark Structured Streaming and MLlib to process and analyse data.
• Experienced in designing solutions using Spark Streaming and Kafka Streams for payment gateway/point-of-sale events.
• Individual contribution (Kafka Architect): delivered UAT and PROD Kafka clusters within the timeline using Cloudera 6.x and CSP 2.0.
• Implemented a unified data platform to gather data from different sources using Kafka producers and consumers in Scala and Java.
• Solid background in object-oriented analysis and design, UML and various design patterns.
• Worked with Azure cloud (Blob, Event Hubs), Kubernetes and Docker, alongside Spark, Scala, Schema Registry and Avro schemas, on a home-security application for Honeywell.
• Implemented KSQL, KTable and KStream using Confluent Kafka along with Kafka Connect.
• Hands-on with Databricks: clusters, Data Lakehouse, Delta Lake, DBFS; explore, analyse, clean, transform and load data using Databricks.
• Experience with Azure: Azure Synapse Analytics, ADLS, ADF, Cosmos DB, Azure Functions, Stream Analytics, Power BI.
• Experience with SQL and NoSQL databases including MySQL, Oracle, Cassandra, PostgreSQL and BigTable.
• Experience building and optimizing big data pipelines.
• Experience with Azure DevOps, CI/CD pipelines, Kubernetes and Docker.
• Motivated Technical Architect with 5 years of progressive architecture experience.
• Experience with AWS (EC2, S3).
• Experience with Snowflake: designed a data lake and loaded data from multiple sources into the Snowflake database.
• Effectively manages assignments and team members.
• Dedicated to self-development to provide expectation-exceeding service. Customer-focused, successfully contributing to company profits by improving team efficiency and productivity.
• Utilizes excellent organizational skills to enhance efficiency and lead teams to outstanding delivery.

SKILLS
Database architecture and development, data architecture, Big Data, ETL, technical solution development, Azure data solutions, data insight provision, technical guidance, IT architecture, big data frameworks.
Technical skills: Hortonworks 2.5, Cloudera 5/6, Apache Hadoop 2/3, Spark 2/3, Apache Kafka, Confluent Kafka, Hive 2/3, Impala, Sqoop, Oozie, ZooKeeper, Snowflake, Data Build Tool (dbt), HBase, Apache Cassandra/DataStax Cassandra, Databricks, Azure Cloud, AWS Cloud, Talend, Airflow, etc.
Programming languages: Python, Scala and Java.
Other tools: Kibana, Logstash, Elasticsearch (ELK).

PROJECT UNDERTAKEN
Project: Implementation of Data Warehouse and reporting platform
Role: Databricks Architect & Engineer
Team: 12 members
Technical skills: Azure Cloud, Azure Data Factory (ADF), ADLS, Databricks, Spark 3.x, Python, Scala 2.15, DB2, Oracle 12g, Azure SQL

My contribution – Databricks infrastructure solution:
- Configured unified data access control using Unity Catalog for the E1 & BY systems: granted specific permissions (e.g. read-only or write-only) to specific user groups on selected Delta tables, down to row or column level for columns containing personally identifiable information (PII).
- Provided centralized data governance (TAI): administering and auditing access to the data.
- Applied data lineage for E1 & BY tables and their lookup tables using Unity Catalog.
- Implemented a data-sharing protocol for secure downstream sharing using Unity Catalog.
- Designed the Unity Catalog architecture so it can be linked to multiple Databricks workspaces across the DEV, UAT and PROD environments.
- Created the metastore for the Unity Catalog.
- Applied Unity Catalog user management for the TAI Lakehouse project: users, groups and service principals, and the permissions they hold.
- Configured Databricks clusters with Spark 3.x on DEV, UAT and PROD for the TAI E1 & BY systems.
- Designed and applied the medallion architecture: set up a Data Lakehouse with Bronze, Silver and Gold storage layers on Azure Data Lake Gen2.

Azure cloud infrastructure and security:
- Installed the self-hosted integration runtime for DB2 on DEV, UAT and PROD, and for the Oracle on-prem cluster on the source system.
- Installed the Azure virtual-network managed IR on DEV, UAT and PROD.
- Installed the DB2 connector on DEV, UAT and PROD.
- Created linked services lnk_BY_Azure_SQL, lnk_E1_Azure_SQL and lnk_Db2_E1.
- Installed and configured Azure Key Vault; added all credentials for Azure SQL, ADLS, Databricks, users, global users and linked services on DEV, UAT and PROD.
- Created a 3-node DEV cluster and a 5-node PROD cluster to migrate data.
- Set up and configured Azure Active Directory to provide team access policies for the Databricks cluster, Azure Data Factory, Azure SQL and the Azure Data Lakehouse.
- Coordinated with the TAI client and the Microsoft support team to resolve throughput issues.

As Azure & Databricks Data Engineer:
- Developed the most critical data ingestion pipelines using Azure Data Factory (ADF) for E1, migrating 12.8 TB across 120 tables from DB2 to the ADLS RAW zone as Parquet files. Many large tables carried 2-4 TB of data with 400 to 800 million records each.
- Built initial and incremental migration pipelines for both the E1 and BY sources, with a watermark based on Julian date and time.
- Designed an Audit table (process log) and a Control table (system) to drive dynamic pipelines and capture audit information for master and child pipelines.
- Designed the architecture to handle deletes for PKSNAPSHOT (E1 & BY).
- Built a dynamic delete pipeline using ADF (loading PKTBL) and Databricks PySpark, at daily, weekly, on-demand and yearly frequencies, to delete records from the target (Analytics/Gold layer) based on the source system's delete column and delete table.
- Built transformations in Databricks Spark with Scala for E1, applying lookup-table transformations into the Silver layer.
- Built transformations into the Analytics (Gold) layer using Databricks Spark and Scala.
- Implemented UPSERT using Spark Structured Streaming with a 5-minute trigger on the Analytics layer (a sketch of this pattern follows this profile).
- Designed the pipeline architecture for master and child pipelines with distinct activity IDs, pipeline IDs, master pipeline IDs and run IDs to guarantee a clean audit trail.
- Built logic in PySpark on Databricks (applied on DEV, UAT and PROD) to check whether the master pipeline is IN PROGRESS, so that pipeline executions do not overlap.
- Passed pipeline parameters to insert or update the Audit/Control tables using Databricks PySpark.
- Monitored performance in DEV and PROD and worked with the team to reduce run times.
- Milestone: achieved a 10-minute end-to-end SLA for the incremental load on E1 & BY.
- Milestone: loaded 400 million records (2.3 TB) into RAW as Parquet in 1 hour 53 minutes using an ADF pipeline.
- Worked with the Azure DevOps engineer to build a CI/CD pipeline for DEV, UAT and PROD.
- Developed a proof-of-concept pipeline using Databricks Workflows, compared its cost with Azure Pipelines, and presented the results to the client.

Past project:
Project: Data Exchange (Security Framework)
Role: Technical Lead & Architect – Confluent KStream & KSQL
Clients: Fidelity & Westpac
Team: 9 members
Technical skills: Azure DevOps, JDK 19.0, Confluent Kafka, KStream, KSQL, Azure Databricks, DBFS, Delta Lake, Azure Data Factory, ADLS Gen2, Confluent Schema Registry, AES algorithm, hash algorithm, Kubernetes cluster (AKS).
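For readers curious what the streaming UPSERT mentioned above looks like in practice, here is a minimal PySpark sketch of a Delta Lake MERGE driven by Structured Streaming with a 5-minute trigger. It assumes the delta-spark package is available; the table names (analytics.silver_events, analytics.gold_events), the key column (event_id) and the checkpoint path are hypothetical placeholders, not details taken from the project.

from delta.tables import DeltaTable  # requires the delta-spark package
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("gold-upsert").getOrCreate()

def upsert_to_gold(batch_df, batch_id):
    # MERGE each micro-batch into the gold Delta table: update rows whose
    # key already exists, insert the rest. All names are placeholders.
    gold = DeltaTable.forName(spark, "analytics.gold_events")
    (gold.alias("t")
         .merge(batch_df.alias("s"), "t.event_id = s.event_id")
         .whenMatchedUpdateAll()
         .whenNotMatchedInsertAll()
         .execute())

(spark.readStream.table("analytics.silver_events")
      .writeStream
      .foreachBatch(upsert_to_gold)
      .option("checkpointLocation", "/mnt/checkpoints/gold_events")
      .trigger(processingTime="5 minutes")  # the 5-minute cadence noted above
      .start())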
I am a software professional with 19 years of experience, and a freelance Data Science and Machine Learning coach. I hold a master's degree in Data Science and Machine Learning.
Python is the most sought-after language among both students and working professionals. Data Analysis using Python helps you analyse data. The course helps you understand EDA using Python and algorithms such as Linear Regression, Logistic Regression, KNN, Decision Trees and Random Forests.
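As a taste of what those algorithms look like in code, here is a short, illustrative scikit-learn sketch (not course material): it fits the named classifiers on a built-in toy dataset. Linear Regression follows the same fit/score pattern on a regression target.

# Illustrative only: fit the classifiers named above on a toy dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

models = {
    "Logistic Regression": LogisticRegression(max_iter=5000),
    "KNN": KNeighborsClassifier(),
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Random Forest": RandomForestClassifier(random_state=42),
}
for name, model in models.items():
    model.fit(X_train, y_train)                         # train
    print(name, round(model.score(X_test, y_test), 3))  # test accuracy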
We help you learn Python with practical exposure. We have also helped more than 50 people land high-paying software jobs.
Experienced Chief Technology Officer with a demonstrated history of working in the e-learning and travel industries. Skilled in cutting-edge technologies; has worked on Blockchain, Search Engine Optimization (SEO), web servers, Django, Node.js, React, Angular and Selenium WebDriver. Strong entrepreneurship professional with a dual degree (B.Tech + M.Tech) in Computer Science from the Indian Institute of Technology, Kanpur.
I am an IT professional working as a full-stack developer. My skills are core Java, Spring Boot, Python and Angular 2+.
I'm an engineer with a degree in computer science. All of my projects, including my final-year project, were built with Python, machine learning and data science, which got me interested in helping other people learn and use them too. I received many offers, but I wanted to teach online only.
I am a Software Engineer with strong experience in Python, Data Science, and Machine Learning. I have professional IT industry experience, including hands-on work in Python programming, data analysis, and building real-world applications. I mentor students and working professionals in Python, Data Science fundamentals, and Machine Learning, helping them build strong foundations and practical understanding. My teaching style focuses on clear explanations, real-life examples, and hands-on practice. I guide learners from basic to intermediate levels, covering Python fundamentals, OOP concepts, data handling with Pandas and NumPy, basic Machine Learning, and interview preparation. I also help students understand real-world industry applications. I am patient, structured, and goal-oriented, ensuring student confidence and clarity. Classes can be conducted online or offline based on mutual convenience.
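For instance, the kind of Pandas/NumPy data handling such a course covers might look like the short example below; the data is invented purely for illustration.

# Invented data; shows missing-value handling and a vectorized column.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "student": ["Asha", "Ravi", "Meena", "Kiran"],
    "score": [82, np.nan, 91, 74],
})
df["score"] = df["score"].fillna(df["score"].mean())  # fill the missing score
df["grade"] = np.where(df["score"] >= 80, "A", "B")   # vectorized condition
print(df.sort_values("score", ascending=False))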
I am working as a Manager, Analytics. In my job role I wrangle data and solve business problems according to business requirements. I have been working as a trainer for the past four years and have mentored many students in data science and business analytics.
Skillcone was founded by an IIT Madras alumnus, with industry professionals focusing on technology training and consulting. We currently train in Angular 2, 4 & 6, Artificial Intelligence, Machine Learning, MEAN Stack, Robotics, Virtual Reality, Blockchain, Spring & Hibernate, JavaScript, NodeJS, Modern Web Development, Java J2EE, Testing, Python, UI and related technologies. We have trained over 10,000 people and have conducted workshops and training programs for corporates, colleges and freshers alike. Our courses are created by IIT alumni and industry professionals, so you get the latest and most current concepts. We don't believe in blackboard/presentation-based learning; we train you to code. Our coding-first methodology and real-time, coding-based classes are what set us apart. Our training sessions are completely hands-on and project-based right from the start. If you are looking to learn a technology and start working at industry level, do join us. Training is done on live projects so you gain complete working knowledge of the technology.
I have been working in a service-based company in Bangalore since last year, and I teach Python, machine learning and deep learning. I offer home, online and tutor-home tuition.
I have completed my B.Tech in CSE, have 4 years and 10 months of experience in the IT industry, and am good at teaching and helping students learn in an easy way.
Besant Technologies - No.1 rated IT training institute in Bangalore and Chennai. We specialize in Database Developer training, Database Administration training, Data Warehousing training, Web Designing training, SAP training, Java training, Software Testing training, Microsoft Technologies training, Oracle Applications training, Mobile Applications training, Oracle Fusion Middleware training, Cloud Computing training, IBM training and more.
I completed Python training in Velachery. The training was very good. The trainer, Mr. Ganesh, shared a lot of practical experience and gave us many insights into a digital marketing career. He covered the complete training in detail and shared many useful tools with us.
I am a React/JavaScript tutor from Bangalore with a year of experience, available on weekends. I have 7 months of experience teaching at Masai School.
I have 25+ years of experience in the IT industry, empowering job seekers to become good employees who work effectively in every opportunity. Please visit www.ibridge360.com for more details.
I can teach each and every topic very easily and make students stronger in it.
I am a software development engineer with 10+ years of experience. My domain knowledge spans Machine Learning, Data Modeling, Data Warehousing, ETL and System Management & Monitoring. I provide training in Machine Learning with Python; Data Analysis, Extraction and Visualization in Python; Oracle SQL and PL/SQL; and Core Java. I have a Master's degree in Computer Science and Engineering from IIT Delhi.
I am an engineer with a passion for teaching programming. I hold certifications in Python and MySQL, and I finished my M.Tech degree in 2019. I offer home tuition as well as online classes.
A professional with extensive experience in ML and AI and 12 years in the software industry overall. I have completed certificate and PGD programs at prestigious institutes such as IIMB, IIT Madras and IIITB.
I am a software engineer. I give online training because I am passionate about teaching, and I received excellent feedback while working as a freelancer.
I'm a software engineer working in Bangalore. My key skill is backend web development using Django. I have 4 years of overall experience in this field.
I have five years of teaching experience across various subjects and all standards. My teaching experience includes 1) Python programming and 2) databases (SQL). I currently work as a software developer, and I bring my real-time IT experience into my teaching of various IT technologies.
I have 11 years of experience, of which 2 years are in Data Science. I work as an architect for a reputed software firm. I have completed 3 batches of Python training and 1 batch of Python with Data Science training. I try to cover simple real-time examples and case studies in my training, which is supported by well-defined assignments and notes.
I have done a PG specialization in Big Data Analytics & Data Science and am currently working at a Fortune 100 company. I have 4+ years of experience in solving core big data and machine learning problems. Skillset: 1. R 2. Python 3. Machine Learning (with R & Python) 4. Hadoop (architecture, MapReduce, Hive, Pig, Spark, HBase, Flume, Kafka, Phoenix, Hadoop ETL + reporting) 5. Data Visualisation (using R packages, Python packages, Tableau) 6. Business Analytics & Viz in Excel. I can train aspiring data scientists and Hadoop enthusiasts in the above skills.
I am truly passionate about my work and always eager to connect with people. While I enjoy all aspects of the work, my favorite stage is understanding a student's objectives and helping them technically. As we go through that collaborative process, the ideas start to flow, and that's always the fun part.
Browse hundreds of experienced Python tutors across Bangalore. Compare profiles, teaching styles, reviews, and class timings to find the one that fits your goals, whether that's Automation with Python, Core Python, Data Analysis with Python, or more.
Select your preferred tutor and book a free demo session. Experience their teaching style, ask questions, and understand the class flow before you commit.
Once you're satisfied, make the payment securely through UrbanPro and start your Python journey! Learn at your own pace, online or in-person, and track your progress easily.
Find the best Python Training classes
You can browse the list of the best Python tutors on UrbanPro.com. You can even book a free demo class to decide which tutor to start classes with.
The fee charged varies between online and offline classes. Generally, you get the best quality at the lowest cost in online classes, as the best tutors prefer not to travel to the student's location.
It definitely helps to join Python Training classes near me in Prestige Blue Chip Software Park, Bangalore, as a teacher gives you the motivation you need to learn. If you need personal attention and your budget allows, select a 1-1 class. If you want peer interaction or have budget constraints, select a group class.
UrbanPro has a list of the best Python Training classes
Are you looking for the best online coaching for Python Training? UrbanPro.com is your one-stop destination...
Both languages have their own importance.
It is similar to Python 2.7 Standard version
There isn’t a widely recognized industry-standard certification specifically for Django. However,...
First of all, most companies have criteria of a graduate degree or equivalent. Second, if you want to go with...
I have downloaded a 5MB database consisting of words and their meanings in a JSON file. I want to write a Python program which will ask the user to...
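The question is truncated, but assuming the JSON file maps each word directly to its meaning and the goal is to look up a word the user types, a minimal sketch could look like this (the filename data.json and the flat word-to-meaning layout are assumptions):

import json

# Assumption: data.json maps each word to its meaning, e.g.
# {"python": "a large snake; also a programming language"}
with open("data.json", encoding="utf-8") as f:
    dictionary = json.load(f)

word = input("Enter a word: ").strip().lower()
meaning = dictionary.get(word)
if meaning is not None:
    print(f"{word}: {meaning}")
else:
    print(f"Sorry, '{word}' was not found.")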
Python for Beginners – Anyone can learn Python. Prerequisites to learn Python: to learn Python, we don't require the knowledge of any other...
The capacity to store and process large amounts of any kind of data, quickly. With data volumes and varieties always increasing, particularly from...
Day 1: Python Basics. Objective: understand the fundamentals of the Python programming language. Variables and Data Types (Integers, Strings, Floats,...
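A few illustrative lines covering those Day 1 topics (variables and basic data types):

# Variables and basic data types, as named in the Day 1 outline.
count = 10       # int
price = 99.5     # float
name = "Python"  # str
is_open = True   # bool
print(type(count), type(price), type(name), type(is_open))
print(f"{name} batch: {count} seats at Rs {price}, open: {is_open}")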
When you move up in your career from beginner to experienced in the programming world, your team will start looking at 'how you write'...