We are looking for a highly experienced Senior Data Engineer to lead the design, development, and optimization of scalable data pipelines, APIs, and cloud data platforms. This pivotal role will focus on ETL/ELT processes using Matillion and Snowflake, as well as integration with Databricks and machine learning workflows. The ideal candidate is a data engineering expert with deep knowledge of modern data architecture, cloud platforms, and API development.
Key Responsibilities:
Design, develop, and manage ETL/ELT pipelines using Matillion and Snowflake.
Build and maintain scalable, secure RESTful APIs for data integration.
Collaborate with Data Scientists and ML Engineers to integrate data workflows with ML pipelines on Databricks.
Optimize Snowflake data warehouse performance and maintain data models.
Apply best practices for data quality, governance, and security.
Automate data validation and reconciliation processes.
Document architecture, processes, and technical designs.
Minimum Qualifications:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
8+ years of experience in data engineering, data warehousing, and data integration.
Required Skills and Experience:
Advanced expertise with Matillion ETL and Snowflake.
Proficiency in Databricks and machine learning workflow integration.
Strong programming skills in Python, Java, or Scala.
Experience with API development frameworks and data platform integration.
Deep understanding of data modeling, warehousing, and ELT best practices.
Proficiency with SQL, CI/CD pipelines, Git, and DevOps methodologies.
Familiarity with cloud environments such as AWS, Azure, or GCP.
Strong understanding of data governance, security, and compliance frameworks.