Overall, 14 years of experience in the fields of Big Data/BI and GCP
Certified Google Cloud Professional Cloud Architect with 3 years of experience
Extensive experience in IT data analytics projects, with hands-on experience migrating on-premises ETLs to Google Cloud Platform (GCP) using cloud-native tools such as BigQuery, Dataflow, Data Fusion, Cloud Functions, Pub/Sub, Cloud Composer (Airflow), and Google Cloud Storage (a minimal load-job sketch follows below)
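A minimal sketch of the kind of GCS-to-BigQuery load step such a migration typically involves, assuming the google-cloud-bigquery client library; all project, bucket, dataset, and table names below are illustrative:

```python
# Minimal sketch (illustrative, not a specific project's pipeline):
# load a CSV file from GCS into a BigQuery table.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the CSV header row
    autodetect=True,      # let BigQuery infer the schema
)

# Start the load job and block until it completes.
load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/orders.csv",  # hypothetical GCS path
    "my-project.analytics.orders",        # hypothetical destination table
    job_config=job_config,
)
load_job.result()

table = client.get_table("my-project.analytics.orders")
print(f"Loaded {table.num_rows} rows")
```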
Good experience in data migration from on-premises MS SQL Server to Snowflake on Azure
Completed Azure DP-900 certification
Knowledge of various ETL and data integration development tools such as Informatica, Ab Initio, and IBM DataStage
Experience with data validation automation tools
Experience in Business Intelligence testing of various reports using Tableau, Power BI, and Cognos framework tools
Good experience with management tools such as Azure DevOps, Jira, ALM, and VSTS
Experience in preparing test strategies, test plans, and test estimations
Worked in Agile and Waterfall models
Good knowledge of test automation tools
Expertise in analyzing and reviewing business, functional, and high-level technical requirements; designing detailed technical components for complex applications utilizing high-level architecture, design patterns, and reusable code.
Gained expertise in the design/architecture of Big Data platforms and cloud technologies, building infrastructure with secure solutions for multi-site data centers and protecting/securing large volumes of data.
Good experience with multi-cluster architectures, such as on-premises-to-cloud and cloud-to-cloud
Strong experience in delivering Big Data projects using open-source technologies such as Hadoop, PySpark, Sqoop, Hive, HBase, Kafka, and Oozie, along with GCP services such as BigQuery and GCS
Extensive work experience across infrastructure domains, e.g., public cloud (Google Cloud Platform) and operating systems such as UNIX and Windows
Extensive experience in implementing DevOps methodologies on cloud platforms, with hands-on experience in designing and creating CI/CD pipelines using tools such as Jenkins, Git, and GitHub.
Good project management skills covering initiating, planning, executing, monitoring, and controlling
Familiar with data architecture, including data ingestion pipeline design, Hadoop information architecture, data modeling and data mining, machine learning, and advanced data processing.
Experience in optimizing ETL workflows.
Good experience with Hadoop data warehousing tools such as Hive, including extracting data from the cluster using the PySpark JDBC API (see the sketch below).
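A minimal sketch of a PySpark JDBC read against HiveServer2, assuming the Hive JDBC driver JAR is available on the Spark classpath; host, port, database, and table names are illustrative:

```python
from pyspark.sql import SparkSession

# The Hive JDBC driver must be on the Spark classpath (e.g. via --jars).
spark = SparkSession.builder.appName("hive-jdbc-extract").getOrCreate()

# Read a Hive table over JDBC through HiveServer2.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:hive2://hive-host:10000/default")  # hypothetical host
    .option("driver", "org.apache.hive.jdbc.HiveDriver")
    .option("dbtable", "sales")                             # hypothetical table
    .load()
)
df.show(5)
```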
Skilled in developing code for intermediate to complex modules following development standards, and in planning and conducting code reviews for changes and enhancements to ensure standards compliance and systems interoperability.
Hands-on experience working with JobTracker, TaskTracker, NameNode, DataNode, ResourceManager, NodeManager, ApplicationMaster, YARN, and MapReduce concepts.
Excellence in managing the Hive data warehouse tool: creating tables, distributing data via partitioning and bucketing, and writing and optimizing HiveQL queries (a short DDL sketch follows below).
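A minimal sketch of a partitioned and bucketed Hive table created from PySpark; table, column, and staging-table names are illustrative:

```python
from pyspark.sql import SparkSession

# Hive support lets spark.sql() run HiveQL DDL against the metastore.
spark = (
    SparkSession.builder.appName("hive-ddl")
    .enableHiveSupport()
    .getOrCreate()
)

# Partitioned and bucketed Hive table (illustrative schema).
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales (
        order_id BIGINT,
        amount   DOUBLE
    )
    PARTITIONED BY (order_date STRING)
    CLUSTERED BY (order_id) INTO 8 BUCKETS
    STORED AS PARQUET
""")

# Dynamic-partition insert from a hypothetical staging table.
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
spark.sql("""
    INSERT OVERWRITE TABLE sales PARTITION (order_date)
    SELECT order_id, amount, order_date
    FROM staging_sales
""")
```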
Extensive expertise in extracting data from and loading data into various targets, including Oracle, MS SQL Server, Teradata, flat files, and XML files, using Talend.
Extensive expertise in developing XSDs and XSLTs, and preparing XML files that conform to the XSD, in order to parse XML data into flat files for processing in HDFS (see the sketch below).
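A minimal sketch of XSD validation plus XSLT flattening in Python, assuming the lxml library; the file names are illustrative and the XSLT stylesheet is assumed to emit delimited text suitable for landing in HDFS:

```python
from lxml import etree

# Validate the XML against the XSD, then flatten it with an XSLT.
schema = etree.XMLSchema(etree.parse("orders.xsd"))   # hypothetical schema
doc = etree.parse("orders.xml")                       # hypothetical input
schema.assertValid(doc)  # raises if the XML does not match the XSD

transform = etree.XSLT(etree.parse("orders_to_csv.xslt"))
flat = transform(doc)

# Write the flattened output as a flat file.
with open("orders.csv", "w", encoding="utf-8") as out:
    out.write(str(flat))
```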
Good experience working with SerDes and file formats such as Avro and Parquet.