Job Overview
We are seeking a skilled Data Engineer to contribute to the company's data architecture, pipeline development, and data quality assurance.
Key Responsibilities
* Lead data engineering projects to design, build, and maintain scalable data pipelines
* Collaborate with cross-functional teams to drive business outcomes through data-driven insights
* Analyze data quality and develop strategies to improve data accuracy and reliability
Requirements
* 5+ years of experience in data engineering with expertise in AWS-native tools (Glue, Lambda, Step Functions)
* Strong proficiency in Python and SQL
* Familiarity with data warehousing, ETL/ELT processes, and cloud architecture principles
* Experience with orchestration tools like Airflow or Step Functions
* Knowledge of modern data platforms such as Databricks or Snowflake
* Understanding of CI/CD and version control best practices for data pipelines
Bonus Points
* Experience working with large-scale datasets and high-performance computing environments
* Expertise in machine learning algorithms and their applications in data engineering