Very good teacher. Explains concepts very well. Clears all doubts. Very kind and patient teacher. Very satisfied with the learning experience
2,327 Student Reviews
Objectives – As a Technical Lead and Architect with 18.5 years of experience, I am passionate about leveraging my expertise in Databricks, Data Build Tool (dbt), Spark, Confluent Kafka, Data Lake, Lakehouse and cloud solutions to drive innovation and efficiency. With a solid background in data architecture and IT infrastructure, I aim to contribute to the growth and vision of your company by providing robust technical solutions that align with strategic business goals. My goal is to enhance data-driven decision-making processes, optimize big data pipelines and implement secure, scalable cloud architectures that propel the organization forward in the ever-evolving technical landscape.

Certification & Achievements –
1. Confluent Certified Administrator for Apache Kafka: expiry July 2026. Confluent Certified Administrator for Apache Kafka • Amit Raj • Confluent
2. Databricks Certified Data Engineer Professional: expiry Aug 2026. Databricks Certified Data Engineer Professional • Amit Raj • Databricks Badges
3. Databricks Accredited Lakehouse Fundamentals: expiry June 2025. Academy Accreditation - Databricks Lakehouse Fundamentals • Amit Raj • Databricks Badges
4. Microsoft Certified: Azure Administrator Associate (AZ-104): Microsoft certification ID: 1100039942; expires August 3, 2025. Credentials - AmitRaj-8869 | Microsoft Learn
5. Microsoft Certified: Azure Security Engineer Associate (AZ-500): Microsoft certification ID: 1100039942; expires August 6, 2025. Credentials - AmitRaj-8869 | Microsoft Learn
6. Recognition certificate from Fidelity for designing global solutions for data exchange.
7. Achievement medal from DIB (client) with appreciation for designing an event-based enterprise architecture & contribution – EventHub

SUMMARY
• Overall 18.5+ years of experience in application design, development & deployment of Hadoop ecosystem/Java/J2EE systems, with good exposure to enterprise architecture.
• Relevant experience of 9.2 years in Big Data technologies, working with multiple clients and domains.
• Experienced in Cassandra data modelling, cluster setup and data management.
• Experienced in working with Spark SQL, Spark Structured Streaming and MLlib to process and analyse data.
• Experienced in designing solutions using Spark Streaming and Kafka Streams for payment gateway/point-of-sale events.
• Individual contribution (Kafka Architect): delivered UAT and PROD Kafka clusters within the timeline using Cloudera 6.x, CSP 2.0.
• Implemented a unified data platform to gather data from different sources using Kafka producers and consumers in Scala and Java.
• Solid background in object-oriented analysis & design, UML and various design patterns.
• Worked with Azure cloud (Blob, Event Hubs), Kubernetes and Docker, alongside Spark, Scala, Schema Registry and Avro schemas, on a home-security application for Honeywell.
• Implemented KSQL, KTable and KStream using Confluent Kafka along with Kafka Connect.
• Hands-on Databricks: Databricks clusters, Data Lakehouse, Delta Lake, DBFS; explore, analyze, clean, transform and load data using Databricks.
• Experience with Azure: Azure Synapse Analytics, ADLS, ADF, Cosmos DB, Azure Functions, Stream Analytics, Power BI.
• Experience with SQL and NoSQL databases including MySQL, Oracle, Cassandra, PostgreSQL and Bigtable.
• Experience building and optimizing big data pipelines.
• Experience with Azure DevOps, CI/CD pipelines, Kubernetes and Docker.
• Motivated Technical Architect with 5 years of progressive experience.
• Experience with AWS (EC2, S3).
• Experience with Snowflake: designed a data lake and loaded data from multiple sources into the Snowflake database.
• Effectively manages assignments and team members.
• Dedicated to self-development to provide expectation-exceeding service. Customer-focused, successfully contributing to company profits by improving team efficiency and productivity.
• Utilizes excellent organizational skills to enhance efficiency and lead teams to achieve outstanding delivery.

SKILLS
Database architecture, database architecture development, data architecture, Big Data, ETL, technical solution development, Azure data solutions, data insight provision, technical guidance, IT architecture, technical solutions, big data frameworks.
Technical Skills: Hortonworks 2.5, Cloudera 5/6, Apache Hadoop 2/3, Spark 2/3, Apache Kafka, Confluent Kafka, Hive 2/3, Impala, Sqoop, Oozie, ZooKeeper, Snowflake, Data Build Tool (dbt), HBase, Apache Cassandra/DataStax Cassandra, Databricks, Azure Cloud, AWS Cloud, Talend, Airflow, etc.
Programming Languages: Python, Scala & Java
Other Tools: Kibana, Logstash, Elasticsearch (ELK).

PROJECT UNDERTAKEN:
Project: Implementation of Data Warehouse and reporting platform
Roles: Databricks Architect & Engineer
Team: 12 members
Technical Skills: Azure Cloud, Azure Data Factory (ADF), ADLS, Databricks, Spark 3.x, Python, Scala 2.15, DB2, Oracle 12g, Azure SQL

My Contribution
Databricks Infrastructure Solution:
- Configured unified data access control using Unity Catalog for the E1 & BY systems: granted specific permissions (e.g. read-only or write-only) to specific groups of users on one or more Delta tables, down to the row or column level for columns containing personally identifiable information (PII).
- Provided data governance from a centralized place: administering (TAI) and auditing access to the data.
- Applied data lineage for E1 & BY tables with look-up tables using Unity Catalog.
- Implemented a data-sharing protocol to apply secure data sharing downstream using Unity Catalog.
- Designed the Unity Catalog architecture so it can be linked to multiple Databricks workspaces – DEV, UAT and PROD environments.
- Created the metastore for Unity Catalog.
- Applied user management in Unity Catalog for the TAI Lakehouse project: users, groups and service principals, and the permissions they hold.
- Configured Databricks clusters with Spark 3.x for DEV, UAT & PROD for the TAI E1 & BY systems.
- Designed & applied the medallion architecture: set up a data lakehouse with Bronze, Silver and Gold storage layers using Azure Data Lake Gen2.

Azure Cloud Infra and Security:
- Installed the self-hosted integration runtime for DB2 on DEV, UAT & PROD, and for the Oracle on-prem cluster on the source system.
- Installed the Azure Virtual Network managed IR on DEV, UAT & PROD.
- Installed the Db2 connector on DEV, UAT & PROD.
- Created linked services lnk_BY_Azure_SQL, lnk_E1_Azure_SQL, lnk_Db2_E1.
- Installed and configured Azure Key Vault; added all credentials for Azure SQL, ADLS, Databricks, users, global users and linked services to Azure Key Vault on DEV, UAT & PROD.
- Created a 3-node DEV cluster and a 5-node PROD cluster to migrate data.
- Set up and configured Azure Active Directory to provide team access policies for the Databricks cluster, Azure Data Factory, Azure SQL and the Azure data lakehouse.
- Coordinated with the TAI client and the Microsoft support team to resolve throughput issues.

As Azure & Databricks Data Engineer:
- Developed the most critical data ingestion pipelines using Azure Data Factory (ADF) for E1 to migrate 12.8 TB across 120 tables from Db2 to ADLS RAW as Parquet files, including many large tables of 2-4 TB containing 400 to 800 million records.
- Built initial & incremental migration pipelines for both the E1 and BY sources with a watermark based on Julian date & time.
- Designed the Audit table (process log) and Control table (system) to drive dynamic pipelines and capture audit information for master and child pipelines.
- Designed the architecture solution to achieve deletes for PKSNAPSHOT – E1 & BY.
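The watermark-based incremental load described above can be sketched in plain Python. This is a minimal stand-in for the ADF/Databricks control-table mechanism, not the actual pipeline; the table name, column names and timestamps are hypothetical:

```python
from datetime import datetime

# control_table stands in for the pipeline's Control table: it stores the
# last timestamp successfully loaded per source table (names are illustrative).
control_table = {"E1.ORDERS": datetime(2024, 1, 1)}

def incremental_load(source_rows, table_name):
    """Return only rows newer than the stored watermark, then advance it."""
    watermark = control_table[table_name]
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    if new_rows:
        # Advance the watermark so the next run skips the rows just loaded.
        control_table[table_name] = max(r["updated_at"] for r in new_rows)
    return new_rows

rows = [
    {"id": 1, "updated_at": datetime(2023, 12, 31)},  # before watermark: skipped
    {"id": 2, "updated_at": datetime(2024, 1, 2)},    # after watermark: loaded
]
loaded = incremental_load(rows, "E1.ORDERS")
```

In the real pipeline the same idea is expressed as an ADF lookup against the Control table followed by a filtered copy activity, with the watermark update committed only after a successful load.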
- Built a dynamic delete pipeline using ADF (load PKTBL) and Databricks PySpark, with daily, weekly, on-demand and yearly frequencies, to delete records from the target (Analytics/Gold layer) based on the source system's delete column and delete table.
- Built transformations using Databricks Spark with Scala for E1: applied transformations with lookup tables into the Silver layer.
- Built transformations into the Analytics layer (Gold) using Databricks Spark & Scala.
- Implemented UPSERT using Spark Structured Streaming with a 5-minute trigger on the Analytics layer.
- Designed the pipeline architecture for master and child pipelines with distinct activity IDs, pipeline IDs, master pipeline IDs and run IDs to ensure a smooth audit trail.
- Built logic in PySpark on Databricks, applied on DEV, UAT & PROD, to check whether the master pipeline counter is IN PROGRESS so that pipeline executions do not overlap.
- Passed pipeline parameters to insert or update the Audit/Control tables using Databricks PySpark.
- Monitored performance in DEV & PROD and worked with the team to reduce run time.
- Milestone: achieved a 10-minute SLA for the incremental load on E1 & BY (end-to-end completion time).
- Milestone: achieved 1:53:45 hrs to load 400 million records (2.3 TB) into RAW as Parquet files using an ADF pipeline.
- Interacted with the Azure DevOps engineer to build a CI/CD pipeline for DEV, UAT & PROD.
- Developed a pipeline as a POC using Databricks Workflows, compared the cost with Azure Pipelines, and presented it to the client.

My Contribution to Past Project:
Project: Data Exchange (Security Framework)
Roles: Technical Lead & Architect – Confluent KStream & KSQL
Client: Fidelity & Westpac
Team: 9 members
Technical Skills: Azure DevOps, JDK 19.0, Confluent Kafka, KStream, KSQL, Azure Databricks, DBFS, Delta Lake, Azure Data Factory, ADLS Gen2, Confluent Schema Registry, AES algorithm, hash algorithm, Kubernetes cluster (AKS).
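The streaming UPSERT mentioned above boils down to MERGE semantics applied per micro-batch. A plain-Python sketch of that behavior, with a dict keyed by primary key standing in for the Gold Delta table (column names are illustrative, not from the project):

```python
# MERGE/UPSERT semantics per micro-batch: rows whose key already exists in
# the target are updated; rows with a new key are inserted.
def upsert(target, batch, key="id"):
    """Merge a micro-batch into target: update matched keys, insert the rest."""
    for row in batch:
        target[row[key]] = row  # matched -> update, not matched -> insert
    return target

gold = {1: {"id": 1, "amount": 100}}
micro_batch = [{"id": 1, "amount": 150},  # matched: amount updated
               {"id": 2, "amount": 75}]   # not matched: inserted
gold = upsert(gold, micro_batch)
```

In Delta Lake the same per-batch merge is typically done inside a `foreachBatch` handler, where each micro-batch is merged into the target table on the primary key.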
I have four years of working experience in the IT industry, mainly as a Python developer. I am also into data science and machine learning, and I hold certifications in Python, data science and machine learning. In addition, I have two years of experience teaching Python. I am very passionate about teaching, and I believe Python is one of the most powerful, fast and easy-to-learn languages out there; I want to reach a vast audience through it. Python has facilitated a tremendous number of data science projects, and I wish to open that door for more people who are interested in this field.
Python basics and advanced topics with real-time scenarios. Daily assignments and interview preparation.
I have 5 years of experience in Python scripting, including NumPy and SciPy for data analysis. Training is based on real-time scenarios and projects. I will provide 2 real-time projects for practice.
I am a Software Engineer who has worked in the field of Artificial Intelligence for the past 4 years. I provide home/online classes for Python and Artificial Intelligence. I have worked on projects in collaboration with Microsoft, Intel, SMR, Flipkart, McDonald's and Reliance.
I am an experienced and qualified AI, Machine Learning, and Python trainer with over 24 years of industry and teaching experience. I have trained thousands of professionals, developers, and students, equipping them with practical AI and ML skills. As a patent holder in AI and a recipient of awards like 'Women in AI' and 'divHERsity Champion,' I blend innovation with hands-on learning. Passionate about simplifying complex technologies, I have empowered countless learners to excel in tech-driven roles. My experience spans corporate leadership and entrepreneurship.
I am a seasoned technology professional with 19+ years of experience in AI, Machine Learning, Data Science, Cloud Computing, and Software Engineering. Over my career, I have led and mentored teams across EdTech, Banking, and Global Enterprises, delivering successful projects in Python, .NET, Java, React, SQL, and Cloud platforms (AWS, Azure, GCP). I enjoy teaching and mentoring students/professionals to build strong foundations in programming, data analytics, AI/ML, and modern software development practices. My teaching style is practical and industry-oriented, with real-world examples from my projects like predictive maintenance, Generative AI assistants, and cloud-native applications. I provide online training, doubt clarification, and career guidance for students and working professionals, covering topics like: Python, Java, C#, SQL, and data structures; AI/ML and data science fundamentals; Cloud (AWS, Azure, GCP) and DevOps basics; and software engineering best practices and interview preparation. My goal is to help learners gain confidence, hands-on skills, and career-ready knowledge.
I am an AI engineer with 5.5 years of experience, currently working at Mercedes-Benz, with expertise in Machine Learning, Natural Language Processing, Deep Learning and MySQL databases, along with real-time project deployment.
Learn from an IITian with 25 years in the IT industry and an MTech in Computer Science from IIT Kharagpur. Hands-on Python coding with strong fundamentals, along with data science, machine learning and AI. Get hands-on experience from an expert.
I have worked as an Assistant Professor in engineering colleges and have 3 years of teaching experience. I have completed my post-graduation in Computer Science & Engineering.
I have completed a course in Java full-stack development and got a job at an IT company in the role of full-stack Java developer. During the course I built many projects; two worth mentioning are a banking application built with the Java full stack, and a portfolio site built with frontend technologies (HTML, CSS, JS).
I have been working on AWS for more than 2.8 years and have worked on several client projects. As a DevOps engineer, my job has been to build and optimize infrastructure and to build CI/CD pipelines that ensure continuous integration and deployment with zero downtime. I also have extensive experience working with Kubernetes clusters and Docker containers. I am an AWS Certified Developer – Associate.
I have trained batchmates with proper study material.
Provides Python training from basic to expert level.
I have one year of experience teaching at a coaching institute. I taught mathematics, science and information technology to CBSE and state board students. I personally help students who struggle to follow. I enjoy teaching.
I am currently working as a Python Developer and have 7 years of experience. Before my current job I worked with Oracle, McAfee and ST-Ericsson. I have provided training to students at a training institute and have also worked with startups to provide basic Python training. I have been involved in designing and developing Python applications from scratch, and we have recently developed a web-based reporting/calculation application. Apart from Python, the training helps with logically solving a given problem.
• AI/ML expert with around 12 years' experience in managing project delivery, planning, analysis, design & core program coding, installation, implementation & testing for large & complex systems in the Banking & Retail domains.
• 4 years' experience as a project manager in implementation, change management & capacity planning, managing & controlling large and complex change projects.
• PRINCE2 Practitioner, ITIL Intermediate Service Delivery (SD), Lean Six Sigma – Green Belt, DB2 admin, and MS Project 2013 certified.
• 5 years' hands-on experience in business analytics, analyzing both qualitative & quantitative data with data visualization and predictive analysis using SAS, R, Python, SPSS, IBM Watson Analytics & Excel.
• 5 years' hands-on experience in data visualization with Tableau, QlikView and IBM Watson Analytics.
• 5 years' experience with Machine Learning/AI models using R & Python, such as regression models (linear, polynomial, SVR, decision tree, random forest), classification models (logistic regression, K-NN, SVM), clustering models, association rule models (Apriori, Eclat), natural language processing (NLP) & deep learning models (neural networks).
Greetings from CAREER. CAREER is a best-in-class learning solutions organization in India's IT capital, Bengaluru. We offer a wide range of software courses and give job aspirants a platform to build a career in the growing IT sector. We skill, reskill, and upskill freshers and working professionals. Get trained & become a certified expert. We provide real-time hands-on training on AWS DevOps, Azure DevOps, Artificial Intelligence, Machine Learning, Python, Linux Administration, GCP, RPA, Hadoop, Data Science, Data Analytics, Full Stack Development, Salesforce, Cybersecurity, Power BI, Tableau, Microsoft .NET, SQL and other in-demand courses. Real-time hands-on training with lab sessions. Well-experienced trainers with good hands-on knowledge. Weekday and weekend batches available, with one hour of hands-on real-time training every weekday. We help provide placements for candidates: 100% placement-oriented training, plus job-related query/discussion sessions. Our professional trainers have 10+ years of hands-on experience in the IT domain and deliver real-time hands-on training with lab sessions. We provide a demo session for students before joining. Each course's minimum duration is 2 months. Both offline and online training are provided to students.
Experienced Software Engineer with a demonstrated history of working in the pharmaceuticals industry. Skilled in Data Science, Machine Learning, Business Intelligence and SQL. Strong engineering professional with a Bachelor of Engineering (BE) focused in Electrical, Electronics and Communications Engineering from Siddhartha Institute of Engineering & Technology.
I worked as a software developer for 2+ years and, as you would expect of a developer, have strong computer skills and experience with programming languages such as Java, Python, and JavaScript. I have a BE degree in Computer Science.
I am working as a data scientist in one of the world's reputed companies. I have good experience in Machine Learning, Data Science, Deep Learning, Natural Language Processing, SQL, HTML, CSS and Angular; my expertise is in Machine Learning, time series and forecasting.
Experienced software engineer with a solid background in Python, web development, data scraping, and analysis. Holding a bachelor's degree in mechanical engineering, I possess a strong aptitude for problem-solving and a keen eye for detail. Beyond my technical prowess, I am also passionate about sharing knowledge and have successfully taught mathematics, physics, and programming. Whether you need assistance in honing your skills or building a job-ready profile, I am here to provide expert guidance and support.
I have completed my B.Tech in CSE and have 4.10 years of experience in the IT industry. I am good at teaching and at helping students learn in an easy way.
Browse hundreds of experienced Python tutors across Bangalore. Compare profiles, teaching styles, reviews, and class timings to find the one that fits your goals, whether it's Automation with Python, Core Python, Data Analysis with Python, or more.
Select your preferred tutor and book a free demo session. Experience their teaching style, ask questions, and understand the class flow before you commit.
Once you're satisfied, make the payment securely through UrbanPro and start your Python learning journey! Learn at your own pace, online or in person, and track your progress easily.
Find the best Python Training Tutor classes
You can browse the list of best Python tutors on UrbanPro.com. You can even book a free demo class to decide which tutor to start classes with.
The fee charged varies between online and offline classes. Generally, you get the best quality at the lowest cost in online classes, as the best tutors often prefer not to travel to the student's location.
It definitely helps to join Python Training classes near you in Mico Layout Police Station, Bangalore, as you get the desired motivation from a teacher to learn. If you need personal attention and your budget allows, select a 1-1 class. If you need peer interaction or have budget constraints, select a group class.
UrbanPro has a list of the best Python Training classes
Python’s main weaknesses include slower performance compared to compiled languages, limited support...
It depends on the learner. Let me assume a person who doesn't know anything related to programming. ...
total = 0
with open('input.txt', 'r') as inp, open('output.txt', 'w') as outp:
    for line in inp:
        ...
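The snippet above is truncated, but it appears to accumulate a running total from input.txt and write the result to output.txt. A minimal runnable sketch under that assumption (the one-number-per-line format and the summing logic are assumptions, not from the original answer):

```python
def sum_numbers(in_path, out_path):
    """Sum one number per line from in_path and write the total to out_path.

    Assumed behaviour: blank lines are skipped; each non-blank line
    holds a single number. The original question is truncated, so this
    is only one plausible reading.
    """
    total = 0.0
    with open(in_path, 'r') as inp, open(out_path, 'w') as outp:
        for line in inp:
            line = line.strip()
            if line:                    # skip blank lines
                total += float(line)
        outp.write(str(total) + '\n')
    return total
```

Calling `sum_numbers('input.txt', 'output.txt')` on a file containing `1`, `2`, `3` on separate lines writes `6.0` to output.txt.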
As an experienced tutor registered on UrbanPro.com, I'd be delighted to provide you with comprehensive...
Here are some good books on advanced topics in Python: 1. **"Fluent Python" by Luciano Ramalho**: Focuses...
Operation               Syntax    Function
Bitwise And             a & b     and_(a, b)
Bitwise Exclusive Or    a ^ b     xor(a, b)
Bitwise Inversion       ~a        invert(a)
Bitwise...
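The table maps Python's bitwise operators to their function equivalents in the standard `operator` module; a quick check that each pair agrees:

```python
import operator

a, b = 0b1100, 0b1010   # 12 and 10

# Each operator-module function mirrors the corresponding operator.
print(operator.and_(a, b) == (a & b))    # bitwise AND  -> True
print(operator.xor(a, b) == (a ^ b))     # bitwise XOR  -> True
print(operator.invert(a) == ~a)          # bitwise NOT  -> True
```

These function forms are handy when an operation must be passed around as a value, e.g. `functools.reduce(operator.and_, masks)`.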
Day 1: Python Basics
Objective: Understand the fundamentals of the Python programming language.
Variables and Data Types (Integers, Strings, Floats,...
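A tiny warm-up illustrating the Day 1 topics listed above (variables and the core data types); the variable names are just examples:

```python
# Variables are created by assignment; the type is inferred.
count = 3            # int
price = 19.99        # float
name = "Python"      # str

# type() reports the runtime type of each value.
print(type(count).__name__)  # int
print(type(price).__name__)  # float
print(type(name).__name__)   # str
```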
Question: As a Network Administrator, one of my trainees wants to know the following: he has a certain number of machines (IPs). On a daily basis he...
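The question above is truncated, but given the setup (a fixed set of machine IPs handled daily), one common building block is validating and de-duplicating the IP list before doing anything else with it. A minimal sketch using Python's standard `ipaddress` module (the helper name and the sample addresses are made up for illustration):

```python
import ipaddress

def clean_ip_list(raw_ips):
    """Validate, de-duplicate and sort a list of IP address strings.

    Invalid entries are silently dropped; sorting uses numeric
    address order, not string order.
    """
    valid = set()
    for item in raw_ips:
        try:
            valid.add(ipaddress.ip_address(item.strip()))
        except ValueError:
            pass  # not a valid IPv4/IPv6 address
    return [str(ip) for ip in sorted(valid)]

# Hypothetical machine list with a duplicate and a bad entry:
machines = ["10.0.0.2", "10.0.0.1", "10.0.0.2", "not-an-ip"]
print(clean_ip_list(machines))  # ['10.0.0.1', '10.0.0.2']
```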
Microsoft Project contains project work and project groups, schedules and finances. Microsoft Project permits its users to set realistic goals for project...
Currently, in the programming world, Python is one of the languages with the fastest-rising demand. And this article will explain why that isn't...