Helped me build my basics, then prepared me for advanced topics, ensuring I have a strong understanding of everything.
2,327 Student Reviews
Objectives – As a Technical Lead and Architect with 18.5 years of experience, I am passionate about leveraging my expertise in Databricks, Data Build Tool (dbt), Spark, Confluent Kafka, Data Lake, Lakehouse, and cloud solutions to drive innovation and efficiency. With a solid background in data architecture and IT infrastructure, I aim to contribute to the growth and vision of your company by providing robust technical solutions that align with strategic business goals. My goal is to enhance data-driven decision-making, optimize big data pipelines, and implement secure, scalable cloud architectures that propel the organization forward in an ever-evolving technical landscape.

Certifications & Achievements (credential holder: Amit Raj; Microsoft certification ID 1100039942, Microsoft Learn profile AmitRaj-8869) –
1. Confluent Certified Administrator for Apache Kafka (expires July 2026)
2. Databricks Certified Data Engineer Professional (expires August 2026)
3. Databricks Academy Accreditation – Lakehouse Fundamentals (expires June 2025)
4. Microsoft Certified: Azure Administrator Associate (AZ-104), expires August 3, 2025
5. Microsoft Certified: Azure Security Engineer Associate (AZ-500), expires August 6, 2025
6. Recognition certificate from Fidelity for designing global solutions for data exchange
7. Achievement medal from DIB (client) with appreciation for designing an event-based enterprise architecture (Event Hubs)

SUMMARY
• 18.5+ years of overall experience in the design, development, and deployment of Hadoop ecosystem/Java/J2EE systems, with good exposure to enterprise architecture.
• 9.2 years of relevant experience in big data technologies, working with multiple clients and domains.
• Experienced in Cassandra data modelling, cluster setup, and data management.
• Experienced with Spark SQL, Spark Structured Streaming, and MLlib for processing and analysing data.
• Experienced in designing Spark Streaming and Kafka streaming solutions for payment gateway/point-of-sale events.
• Individual contribution (Kafka Architect): delivered UAT and PROD Kafka clusters on schedule using Cloudera 6.x and CSP 2.0.
• Implemented a unified data platform gathering data from different sources using Kafka producers and consumers in Scala and Java.
• Solid background in object-oriented analysis and design, UML, and common design patterns.
• Worked with Azure (Blob Storage, Event Hubs), Kubernetes, and Docker alongside Spark, Scala, Schema Registry, and Avro schemas on a home security application for Honeywell.
• Implemented KSQL, KTable, and KStream using Confluent Kafka along with Kafka Connect.
• Hands-on with Databricks: clusters, Data Lakehouse, Delta Lake, and DBFS; explore, analyse, clean, transform, and load data using Databricks.
• Experience with Azure: Synapse Analytics, ADLS, ADF, Cosmos DB, Azure Functions, Stream Analytics, Power BI.
• Experience with SQL and NoSQL databases including MySQL, Oracle, Cassandra, PostgreSQL, and Bigtable.
• Experience building and optimizing big data pipelines.
• Experience with Azure DevOps, CI/CD pipelines, Kubernetes, and Docker.
• Motivated Technical Architect with 5 years of progressive experience.
• Experience with AWS (EC2, S3).
• Experience with Snowflake: designed a data lake and loaded data from multiple sources into the Snowflake database.
• Effectively manages assignments and team members.
• Dedicated to self-development to provide expectation-exceeding service; customer-focused, contributing to company profits by improving team efficiency and productivity.
• Uses excellent organizational skills to enhance efficiency and lead teams to outstanding delivery.

SKILLS
Database architecture and development, data architecture, big data, ETL, technical solution development, Azure data solutions, data insights, technical guidance, IT architecture, big data frameworks.
Technical skills: Hortonworks 2.5, Cloudera 5/6, Apache Hadoop 2/3, Spark 2/3, Apache Kafka, Confluent Kafka, Hive 2/3, Impala, Sqoop, Oozie, ZooKeeper, Snowflake, Data Build Tool (dbt), HBase, Apache Cassandra/DataStax Cassandra, Databricks, Azure, AWS, Talend, Airflow, etc.
Programming languages: Python, Scala, and Java.
Other tools: Kibana, Logstash, Elasticsearch (ELK stack).

PROJECT UNDERTAKEN:
Project: Implementation of a data warehouse and reporting platform
Role: Databricks Architect & Engineer
Team: 12 members
Technical skills: Azure, Azure Data Factory (ADF), ADLS, Databricks, Spark 3.x, Python, Scala 2.15, DB2, Oracle 12g, Azure SQL

My Contribution – Databricks infrastructure solution:
- Configured unified data access control using Unity Catalog for the E1 and BY systems: granted specific permission sets (e.g., read-only or write-only) to specific user groups on selected Delta tables, down to row or column level for tables containing personally identifiable information (PII).
- Provided centralized data governance: administering (TAI) and auditing access to the data.
- Applied data lineage for E1 and BY tables and their lookup tables using Unity Catalog.
- Implemented a data-sharing protocol for secure downstream sharing using Unity Catalog.
- Designed the Unity Catalog architecture so it can be linked to multiple Databricks workspaces across the DEV, UAT, and PROD environments.
- Created the metastore for Unity Catalog.
- Applied Unity Catalog user management for the TAI Lakehouse project: users, groups, service principals, and the permissions they hold.
- Configured Databricks clusters with Spark 3.x for DEV, UAT, and PROD for the TAI E1 and BY systems.
- Designed and applied a medallion architecture: set up a data lakehouse with Bronze, Silver, and Gold storage layers on Azure Data Lake Storage Gen2.

Azure cloud infrastructure and security:
- Installed self-hosted integration runtimes for the DB2 and on-premises Oracle source systems on DEV, UAT, and PROD.
- Installed the Azure virtual network managed IR on DEV, UAT, and PROD.
- Installed the DB2 connector on DEV, UAT, and PROD.
- Created linked services lnk_BY_Azure_SQL, lnk_E1_Azure_SQL, and lnk_Db2_E1.
- Installed and configured Azure Key Vault, adding all credentials (Azure SQL, ADLS, Databricks, users, global users, linked services) on DEV, UAT, and PROD.
- Created a 3-node DEV cluster and a 5-node PROD cluster to migrate data.
- Set up and configured Azure Active Directory to provide team access policies for the Databricks cluster, Azure Data Factory, Azure SQL, and the Azure data lakehouse.
- Coordinated with the TAI client and the Microsoft support team to resolve throughput issues.

As Azure & Databricks data engineer:
- Developed the most critical data ingestion pipelines using Azure Data Factory (ADF) for E1, migrating 12.8 TB across 120 tables from DB2 to the ADLS RAW layer as Parquet files; many large tables held 2–4 TB of data with 400 to 800 million records.
- Built initial and incremental migration pipelines for both the E1 and BY sources, with a watermark based on Julian date and time.
- Designed an audit table (process log) and a control table (system) to drive dynamic pipelines and capture audit information for master and child pipelines.
- Designed the architecture for handling deletes via PKSNAPSHOT for E1 and BY.
- Built a dynamic delete pipeline using ADF (loading PKTBL) and Databricks PySpark, on daily, weekly, on-demand, and yearly frequencies, to delete records from the target (Gold/analytics layer) based on the source system's delete column and delete table.
- Built transformations using Databricks Spark with Scala for E1, applying lookup-table transformations into the Silver layer.
- Built transformations into the analytics (Gold) layer using Databricks Spark and Scala.
- Implemented UPSERT using Spark Structured Streaming with a 5-minute trigger on the analytics layer.
- Designed the pipeline architecture for master and child pipelines with distinct activity IDs, pipeline IDs, master pipeline IDs, and run IDs to ensure a clean audit trail.
- Built PySpark logic on Databricks, applied on DEV, UAT, and PROD, to check whether the master pipeline is IN PROGRESS so that pipeline executions do not overlap.
- Passed pipeline parameters to insert or update the audit/control tables using Databricks PySpark.
- Monitored performance in DEV and PROD and worked with the team to reduce run times.
- Milestone: achieved a 10-minute SLA for the incremental load on E1 and BY (end-to-end completion time).
- Milestone: loaded 400 million records (2.3 TB) into RAW as Parquet in 1:53:45 hours using an ADF pipeline.
- Worked with the Azure DevOps engineer to build a CI/CD pipeline for DEV, UAT, and PROD.
- Developed a proof-of-concept pipeline using Databricks Workflows, compared its cost against Azure Pipelines, and presented the results to the client.

My contribution to a past project:
Project: Data Exchange (security framework)
Role: Technical Lead & Architect – Confluent KStream & KSQL
Clients: Fidelity & Westpac
Team: 9 members
Technical skills: Azure DevOps, JDK 19.0, Confluent Kafka, KStream, KSQL, Azure Databricks, DBFS, Delta Lake, Azure Data Factory, ADLS Gen2, Confluent Schema Registry, AES algorithm, hash algorithm, Kubernetes cluster (AKS).
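The watermark-based incremental loads described in the project above follow a common pattern: extract only the rows changed since the last recorded high-water mark, then advance the mark after a successful load. A minimal sketch in Python, with hypothetical table and column names (the actual E1/BY tables and Julian-timestamp format are not specified here):

```python
def build_incremental_query(table, watermark_column, last_watermark):
    """Build an incremental-extract query that pulls only rows changed
    since the previous run's watermark (names are hypothetical)."""
    return (
        f"SELECT * FROM {table} "
        f"WHERE {watermark_column} > '{last_watermark}'"
    )

# The last watermark would be read from a control table before each run;
# after a successful load it is updated to MAX(watermark_column) of the batch.
query = build_incremental_query("E1.ORDERS", "JULIAN_TS", "2024150.120000")
```

In an ADF setup like the one above, the same idea is typically realized with a Lookup activity reading the control table and a parameterized source query in the Copy activity.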
By profession, I work as a software developer at a multinational company, and I also have 2 years of offline experience teaching Python and resolving students' doubts.
I'm a fourth-year Computer Science Engineering undergraduate at PES University, Bengaluru. I've been teaching for a few years now, sharing my passion for technology and learning.
I am currently working as an IT professional with 10 years of experience in software development. I teach students mainly Python and Java programming and prepare them for job interviews and exams. My students are excelling in their programming fields.
I have 13 years of industry experience. In my experience, students want more hands-on practice and need a tutor who can explain things well. I believe in both, and I make sure students gain a clear understanding of the subject along with practical, hands-on experience.
Free training for students.
I am a Software Development Manager by profession with 20+ years of experience in the technical delivery of large-scale projects. I have worked at the MNCs Wipro Technologies and Cisco Systems.
I work in middle management at a highly reputed IT company and have 12 years of total experience in the IT field. To make use of my spare time, and also to earn some money, I would like to resume teaching. I have previously taught in schools and at home. I always secured high grades through school and college, and I will strive to bring my students to the same level of knowledge so that they can shine.
I have taught individuals and delivered MNC, FDP, and STTP training for various academic institutions over the last 10 years. Most of my students have found my courses instrumental in landing well-paying jobs. I teach Python, advanced Python, machine learning, data science, deep learning, and NLP using Python libraries such as Pandas, NumPy, scikit-learn, TensorFlow, and Keras.
I am an AI researcher with 8+ years of experience in the design and development of NLP algorithms for document understanding, speech/text translation, question answering, and dialog systems, and in delivering research findings into end products. I develop end-to-end NLP solutions, from requirement gathering to product deployment. Specialties: deep neural networks, natural language processing (question answering, dialog systems, machine translation), speech processing (speech recognition, speech translation), and computer vision (object recognition and detection). Computational tools: Python, TensorFlow, scikit-learn, PyTorch.
I'm an engineer with 2 years of experience, certified in Python for data science. I offer online tutoring in the same to share the knowledge I have. Data science is the future, so this will definitely be a good addition.
I am a software engineer living just outside Bangalore, and I practice yoga and meditation. I help young people build technology skills and teach them software development, ethical hacking, drone making, electric unicycle making, etc. I ride a single-wheel electric unicycle. I currently run HackerFarm, a place to come and engage with the internet while doing meaningful social projects.
Assistant professor at an engineering college and placement trainer. Certified by Wipro, EMC, and NPTEL (MHRD). Passionate about teaching and expert in Java, Python, and data science/machine learning.
Learn new skills to get ahead in your career. ipython is a product-driven group working on state-of-the-art projects for our domestic and international clients, with deep expertise in product development across Python programming, cloud, front-end and back-end development, and full-stack Linux. We at www.ipython.live believe that classroom training alone is not adequate, so we combine theoretical knowledge with hands-on practical exposure through student contributions to our in-house projects, providing a deeper, richer online learning experience.
I am a computer engineer offering online tutoring over Zoom. I am certified in Python and hold a B.Tech degree in computer science and systems engineering. My key skills are Python, MySQL, Arduino, project management, and Power BI data analysis.
I have given private tuitions for programming and testing in Chandigarh.
I am an engineer offering online tuition in the Python full stack. I hold a certified degree from Visveshvaraya University and currently teach an online Python full-stack course.
Hello, my name is Arjun M. I am currently working as a software engineer at Genpact; previously I was a software engineer at itilite and a senior software engineer at Agathsya Technologies Pvt Ltd, Bangalore. I have 6 years of overall experience in the Python Django framework and Angular 4+. I am very interested in working with Python Django, as I have good working experience with it, and my dream is to become an outstanding AI developer. As a quick learner and an innovative person, I can quickly adapt to new technology, enhancing my skills and capabilities, and I am well suited for this opportunity. Kindly find my resume for additional information. I would be highly obliged to receive an interview call letter. Thank you for your time and consideration. I look forward to speaking with you about this opportunity.
I am a postgraduate in big data analytics with a "Desire to Inspire", and I believe teaching is the best way to achieve it. I am an enthusiastic coding and math educator with 1.5+ years of experience teaching young minds at BrightChamps and Bhanzu.
I'm an English and Coding tutor who blends human teaching with AI tools like ChatGPT to make learning fast and fun. I help students master grammar, communication, and spoken English with confidence. I also teach Python, Web Development, and Data Structures in an easy-to-follow, hands-on way. My classes are online, interactive, and fully personalized. Book a trial and experience smart learning that adapts to you!
I am a technology enthusiast who has worked in the big data domain for the last 6 years. My key skills include Python, Spark, Snowflake, and the big data tech stack.
I am a web developer, currently employed, with more than 3 years of experience in Python, Django, GraphQL, Postgres, AWS, and other technologies. I can help you start from scratch even if you don't know programming yet and are keen to get into it.
I am a senior program manager handling multiple technologies and large-scale teams. I hold an MBA and a B.E./B.Tech from marquee colleges in India. I like to teach and train students at a steady pace so that they grasp enough to become self-sufficient; I don't take the typical approach of briefing and rushing away. I want to groom more people to become technologically independent. I expect learners to revise in their free time and understand the business application.
Our institute empowers learners with practical skills in data science and coding. We offer industry-oriented training in Python, SQL, analytics, and real-world projects, guided by experienced mentors to help students build strong careers in today’s technology-driven world.
I have 5 years of experience in programming, including 3 years of experience in Python and Java.
Browse hundreds of experienced Python tutors across Bangalore. Compare profiles, teaching styles, reviews, and class timings to find the one that fits your goals, whether it's Automation with Python, Core Python, Data Analysis with Python, and more.
Select your preferred tutor and book a free demo session. Experience their teaching style, ask questions, and understand the class flow before you commit.
Once you're satisfied, make the payment securely through UrbanPro and start your Python learning journey! Learn at your own pace, online or in person, and track your progress easily.
Find the best Python Training Tutor classes
Do you offer Python Training classes in the selected location?
You can browse the list of best Python tutors on UrbanPro.com. You can even book a free demo class to decide which tutor to start classes with.
The fee charged varies between online and offline classes. Generally, you get the best quality at the lowest cost in online classes, as the best tutors don't like to travel to the student's location.
It definitely helps to join Python Training classes near me in Eva Mall, Bangalore, as you get the desired motivation from a Teacher to learn. If you need personal attention and if your budget allows, select 1-1 Class. If you need peer interaction or have budget constraints, select a Group Class.
UrbanPro has a list of best Python Training classes
Hi Sai, to find the right career path you need to try things (which is the long way). I would suggest you to...
In Python 2, there are two different types: “str” and “unicode”. Since these are fundamentally different...
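To see the distinction this answer refers to, note that in Python 3 the same split became `str` (Unicode text) and `bytes` (raw binary data). A minimal sketch, runnable under Python 3:

```python
# Python 3's str/bytes mirror Python 2's unicode/str split.
text = "café"               # str: a sequence of Unicode code points
raw = text.encode("utf-8")  # bytes: the UTF-8 encoded representation

print(type(text).__name__)  # str
print(type(raw).__name__)   # bytes
print(len(text))            # 4 code points
print(len(raw))             # 5 bytes ('é' takes two bytes in UTF-8)
print(raw.decode("utf-8") == text)  # True: decoding round-trips
```

Keeping text and encoded bytes as separate types is exactly what prevents the implicit-conversion bugs common in Python 2.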
Python with Machine Learning
You can learn MongoDB on YouTube; there are many YouTubers who teach MongoDB, and you can watch their tutorials.
Python is hands down one of the easiest programming languages for learning and developing logic. It has a variety...
Operation      Syntax   Comment
Assignment     =        Control flow right to left
Add AND        +=       Incremental
Subtract AND   -=       Decremental
Multiply...
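The assignment and augmented-assignment operators listed above (=, +=, -=, *=) can be demonstrated with a short, self-contained snippet:

```python
x = 10      # plain assignment (right-hand side evaluated first)
x += 3      # add AND assign: x = x + 3  -> 13
x -= 5      # subtract AND assign: x = x - 5 -> 8
x *= 2      # multiply AND assign: x = x * 2 -> 16
print(x)    # 16
```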
Python can be used in many futuristic technologies:
Analytics
Data Science
Artificial Intelligence (AI)
Neural Networks (NN)
Natural Language Processing (NLP)
Computer...
1. Download Python from the official site (search "python download" in Google).
2. Install it on your machine.
3. Verify using the "python --version" command.
4. ...
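The verification in step 3 can also be done from inside the interpreter itself; a minimal sketch using the standard library:

```python
import sys

# Equivalent in spirit to running "python --version" on the
# command line: report the interpreter's version string.
print("Python", sys.version.split()[0])
```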
Assume how the Facebook application will store millions of customer records in real time:
facebook = {
    'jose': {
        'name': 'jose',
        'age': 33,
        'hobby': ...
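A runnable version of the nested-dictionary idea above; the record values beyond what the truncated snippet shows are hypothetical, filled in only to make the example complete:

```python
# Hypothetical user records keyed by username; the 'hobby' values and
# the second user are assumptions to complete the truncated example.
facebook = {
    'jose': {'name': 'jose', 'age': 33, 'hobby': 'reading'},
    'ana':  {'name': 'ana',  'age': 28, 'hobby': 'cycling'},
}

# Dictionary lookups are O(1) on average, which is why dicts suit
# fast record retrieval even at very large scale.
print(facebook['jose']['age'])    # 33
print(facebook['ana']['hobby'])   # cycling
```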
import pandas as pd
from datetime import datetime
import matplotlib.dates as dates
import matplotlib.pyplot as plt

def gantt_chart(df_phase):
    ...