Senior ETL Engineer Job at Accelirate Softech

Senior ETL Engineer

Job Summary

Senior ETL Engineer
Location: Pune

Role Overview:

We are seeking a Senior ETL Engineer to design, develop, and optimize our end-to-end data infrastructure, including ETL pipelines, data lakes, and data warehouses. This role is instrumental in enabling reliable, scalable, and high-performance data operations while aligning with data governance best practices. You will be an integral part of a collaborative data team, working closely with Data Engineers, Architects, Analysts, Scientists, and other stakeholders to modernize and migrate our data systems to the cloud.

Key Responsibilities:

  • Convert SSIS stored procedures into Matillion mappings, and develop associated job schedules and audit frameworks.
  • Design, build, and maintain efficient ETL pipelines to extract, transform, and load data from multiple sources into the data lake and data warehouse.
  • Implement both full and incremental loads using SCD (Slowly Changing Dimensions) and CDC (Change Data Capture) methodologies for real-time and batch data processing.
  • Develop and maintain robust error handling, job monitoring, and job audit mechanisms.
  • Conduct data profiling, data cleansing, and root cause analysis to resolve data inconsistencies and improve data quality.
  • Optimize data pipelines for performance, scalability, and data quality, using benchmarking and tuning techniques.
  • Build dashboards or reports for ETL job monitoring, performance tracking, and system health checks.
  • Support data governance practices, ensuring accuracy, availability, and security of production data.
  • Contribute to cloud migration initiatives, transitioning data workloads from on-premises systems to modern cloud data platforms.
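The SCD/CDC responsibility above can be illustrated in miniature. The sketch below is a hypothetical example, not Accelirate's actual pipeline: a watermark-based CDC extract followed by an SCD Type 2 merge, using plain Python lists in place of real source and warehouse tables (all names and structures are illustrative assumptions).

```python
def incremental_extract(source_rows, last_watermark):
    """CDC-style extract: pick up only rows changed since the last run,
    using an updated_at high-watermark (a common CDC simplification)."""
    return [r for r in source_rows if r["updated_at"] > last_watermark]

def scd2_merge(dim_rows, changes, load_ts):
    """SCD Type 2 merge: expire the current version of a changed row
    and append a new current version, preserving full history."""
    current = {r["key"]: r for r in dim_rows if r["is_current"]}
    for chg in changes:
        old = current.get(chg["key"])
        if old and old["value"] == chg["value"]:
            continue  # attribute unchanged; nothing to version
        if old:
            old["is_current"] = False   # close out the old version
            old["valid_to"] = load_ts
        dim_rows.append({
            "key": chg["key"], "value": chg["value"],
            "valid_from": load_ts, "valid_to": None, "is_current": True,
        })
    return dim_rows

# Hypothetical run: only the row updated after watermark 3 is extracted,
# then versioned into the dimension table.
src = [{"key": 1, "value": "A", "updated_at": 2},
       {"key": 2, "value": "B", "updated_at": 5}]
delta = incremental_extract(src, last_watermark=3)
dim = [{"key": 2, "value": "old", "valid_from": 0,
        "valid_to": None, "is_current": True}]
dim = scd2_merge(dim, delta, load_ts=6)
```

In a production tool such as Matillion or SSIS, the same pattern would typically be expressed as a watermark-filtered extract component feeding a `MERGE`-style update into the warehouse, with the watermark and row counts written to an audit table for job monitoring.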

Required Skills & Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience as an ETL Engineer, Data Engineer, or in a similar data engineering role.
  • Experience with cloud-based data platforms such as AWS, Azure, or GCP.
  • Proficiency with data warehousing technologies like Snowflake, Redshift, Azure Synapse, or Databricks.
  • Hands-on experience with ETL tools such as Matillion, SSIS, Informatica, AWS Glue, Azure Data Factory, StreamSets, Striim, Oracle ODI, Coalesce, or Talend.
  • Solid understanding of data modeling principles (E/R and Dimensional Modeling) and tools like ER/Studio or ERwin.
  • Proficiency with SQL and programming/scripting languages such as Python, Scala, or Java.
  • Experience with database tools such as Microsoft SSMS or Oracle SQL Developer.
  • Strong analytical, problem-solving, and data troubleshooting skills.
  • Excellent communication and collaboration abilities.
  • Ability to work efficiently in a fast-paced, agile, and dynamic environment.

Qualification:
Bachelor's degree in Computer Science, Engineering, or a related field.

Experience Required:
Minimum 5 years

Vacancy:
2-4 hires
