Job Description
The ideal candidate for this role will have a strong background in data engineering principles, database management, and programming. They will be responsible for developing, maintaining, and optimizing the data pipelines and systems that support the acquisition, storage, transformation, and analysis of large volumes of data.
Key Responsibilities
* Design, develop, and maintain data pipelines in Azure for ingesting, transforming, and loading data from various sources into centralized Azure data lakes, Databricks Delta Lake, and Snowflake.
* Implement efficient ELT/ETL processes to ensure data quality, consistency, and reliability.
* Work closely with cross-functional teams to understand data requirements and translate them into technical solutions.
Requirements
* Bachelor's degree (or equivalent qualification) in computer science or an IT-related subject.
* 10+ years of overall IT industry experience.
* 8+ years of overall big data pipeline experience.
* 8+ years of experience as a Data Engineer, with a focus on designing and implementing data solutions on Azure Databricks.