Over 7 years of experience designing, implementing, and optimizing data pipelines and solutions with cutting-edge technologies to extract, transform, and load (ETL) large datasets efficiently. Collaborated with cross-functional teams to understand business requirements and translate them into scalable, reliable data architectures. Skilled in programming languages such as SQL and Python, with experience in big data processing frameworks including PySpark and Apache Spark SQL. Hands-on experience with cloud services such as Azure Data Factory, Databricks, and AWS Glue. Known for quickly acquiring new skill sets to contribute significantly to diverse projects and broaden the scope of impact within analytics and application development.
Over 4 years of experience in the hospital sector, combining software development and analytics processes to enhance operational efficiency. Collaborated extensively with cross-functional teams and functional heads to create essential daily reports for higher management, medical practitioners, and financial stakeholders. This data-driven approach has consistently supported financial decision-making and insightful lab analyses.
Client: Narayana Health Hospital | Health City Cayman Islands | Aarthi Scans
Role: Data Engineer
Responsibilities:
• Extracted data from multiple sources (PI Server, Oracle, SAP) into staging tables, applying the required transformation business logic, and loaded it from staging into the data warehouse for unconventional energy data using the interfaces created.
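The staging-to-warehouse flow above can be sketched roughly as follows. This is a minimal illustration only: the table names, columns, and filter rule are hypothetical, and the real sources were PI Server, Oracle, and SAP rather than an in-memory database.

```python
import sqlite3

# In-memory database standing in for the staging/warehouse layers.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table populated by the source extracts (hypothetical schema).
cur.execute("CREATE TABLE stg_energy (well_id TEXT, reading REAL, unit TEXT)")
cur.executemany(
    "INSERT INTO stg_energy VALUES (?, ?, ?)",
    [("W1", 10.0, "bbl"), ("W2", None, "bbl"), ("W3", 7.5, "bbl")],
)

# Warehouse table; the "interface" applies transformation logic while
# loading -- here simply dropping null readings and tagging the batch.
cur.execute("CREATE TABLE dw_energy (well_id TEXT, reading REAL, load_batch TEXT)")
cur.execute(
    """
    INSERT INTO dw_energy (well_id, reading, load_batch)
    SELECT well_id, reading, 'batch_001'
    FROM stg_energy
    WHERE reading IS NOT NULL
    """
)
conn.commit()

rows = cur.execute("SELECT well_id, reading FROM dw_energy ORDER BY well_id").fetchall()
print(rows)  # the null-reading row is filtered out during the load
```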
Client: Royal Dutch Shell
Role: ETL Developer
Responsibilities:
• Collaborated with onshore/offshore functional analysts to gather business requirements, then translated and applied business rules for data transformations in ETL functions.
• Developed ETL packages for enhancement requests, change requests, and new development as scoped by the business; deployed and optimized them while maintaining data integrity.
• Performed root cause analysis, debugging, and fixes for defects raised by the business.
• Created RFCs and config files for deployments across environments (UAT/INT/PROD).
• In-depth knowledge of writing T-SQL queries, stored procedures, CTEs, triggers, and views.
• Implemented logging and error handling for SSIS packages using event handlers, custom logging, and exception data handling.
• Prepared unit test cases, performed code reviews, provided application production support, and monitored runs of critical packages.
• Created complex SSIS packages to load data from staging tables into the data warehouse using incremental loads, batch processing, versioned tables, a skeleton approach, and data quality columns.
• Delivered large volumes of changes on short timelines without compromising quality.
• Experienced with breakpoints; containers (For Loop, Foreach Loop, Sequence); tasks and data flow transformations (Lookup, Pivot, Script Task, Execute SQL Task, Merge, Merge Join, Union All, Conditional Split, Derived Column); and loading data from Excel and flat file sources into SQL Server.
• Handled production data for the business, performing production build verification, data reconciliation, validation, and error handling after extracting data into SQL Server.
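The incremental-load pattern mentioned in the SSIS work above can be sketched as a high-water-mark load: only staging rows newer than the last recorded watermark are moved to the warehouse, and the watermark then advances. This is an illustrative sketch only; table names, columns, and timestamps are hypothetical.

```python
import sqlite3

# In-memory database standing in for staging, warehouse, and ETL metadata.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE stg_orders (order_id INTEGER, amount REAL, modified_at TEXT);
CREATE TABLE dw_orders  (order_id INTEGER, amount REAL, modified_at TEXT);
CREATE TABLE etl_watermark (table_name TEXT PRIMARY KEY, last_loaded TEXT);
INSERT INTO etl_watermark VALUES ('dw_orders', '2024-01-01T00:00:00');
""")
cur.executemany(
    "INSERT INTO stg_orders VALUES (?, ?, ?)",
    [
        (1, 100.0, "2023-12-31T09:00:00"),  # older than watermark: skipped
        (2, 250.0, "2024-01-02T10:00:00"),  # newer: loaded
        (3, 75.0,  "2024-01-03T11:00:00"),  # newer: loaded
    ],
)

def incremental_load(cur, target="dw_orders"):
    """Load staging rows newer than the stored watermark, then advance it."""
    (wm,) = cur.execute(
        "SELECT last_loaded FROM etl_watermark WHERE table_name = ?", (target,)
    ).fetchone()
    cur.execute(
        "INSERT INTO dw_orders SELECT * FROM stg_orders WHERE modified_at > ?",
        (wm,),
    )
    cur.execute(
        "UPDATE etl_watermark SET last_loaded = "
        "(SELECT MAX(modified_at) FROM dw_orders) WHERE table_name = ?",
        (target,),
    )

incremental_load(cur)
loaded = [r[0] for r in cur.execute("SELECT order_id FROM dw_orders ORDER BY order_id")]
print(loaded)  # only the rows newer than the watermark are loaded
```

Re-running `incremental_load` after the watermark advances loads nothing new, which is what makes the pattern safe to schedule repeatedly.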
Programming Languages: SQL, Python (PySpark)
Big Data Processing: PySpark, Spark SQL
Cloud Platforms: Microsoft Azure (Azure Data Factory, Storage Accounts, Key Vault, Logic Apps)
ETL Tools: SQL Server Integration Services (SSIS), Databricks
Databases: MySQL, Oracle, PostgreSQL