INTRODUCTION
ATOM is the end-to-end insurance operating system, enabling any entity in the insurance lifecycle to run all of its activities and to cooperate and interact with third parties. Its distinctive advantage is that it covers not only insurance underwriting and claims handling, but also all associated support and corporate activities.
ATOM Technologies is the team behind the platform. Based in the DIFC Innovation Hub, ATOM Technologies is a team of more than 25 passionate and dedicated insurance and software professionals on a mission to upgrade the insurance industry from the inside out.
SUMMARY
The Data Engineering Lead is responsible for designing, building, and operating the company's data platform on AWS, including data lakes, data marts, and analytical/reporting solutions. The role combines hands-on engineering with technical leadership, mentoring, and best-practice enforcement across the data engineering team. It is based at our headquarters in the DIFC, Dubai, United Arab Emirates; Dubai is the mandatory base location.
KEY RESPONSIBILITIES
Architecture & Data Platform
* Design and evolve the cloud data architecture on AWS (S3 data lake, Glue ETL, data marts, Athena, Redshift/other warehouses, BI tools).
* Define and implement data modelling standards (dimensional models, star/snowflake schemas, canonical data models) to support analytics, ML, and reporting use cases; a schema sketch follows this list.
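By way of illustration, the modelling standards above translate into concrete schema definitions. The following is a minimal sketch of registering a star-schema fact table in Athena via boto3; the database, table, columns, and bucket names are all hypothetical:

```python
import boto3

# Region and credentials are taken from the environment / AWS config.
athena = boto3.client("athena")

# Hypothetical star-schema fact table: one row per policy transaction,
# keyed to dimension tables (e.g. dim_policy, dim_date) and partitioned
# by transaction date so queries can prune partitions.
DDL = """
CREATE EXTERNAL TABLE IF NOT EXISTS fact_policy_transactions (
    policy_key      BIGINT,
    date_key        INT,
    premium_amount  DECIMAL(18, 2),
    claim_amount    DECIMAL(18, 2)
)
PARTITIONED BY (transaction_date STRING)
STORED AS PARQUET
LOCATION 's3://example-data-lake/marts/fact_policy_transactions/'
"""

athena.start_query_execution(
    QueryString=DDL,
    QueryExecutionContext={"Database": "marts"},
    ResultConfiguration={"OutputLocation": "s3://example-data-lake/athena-results/"},
)
```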
Hands‑on Engineering & Delivery
* Develop, test, and maintain robust ETL/ELT pipelines using AWS Glue, Python, SQL, and related services to ingest and transform data from multiple sources into the data lake and data marts.
* Optimise performance and cost of data workloads (partitioning, compression, file formats such as Parquet/ORC, query tuning in Athena and the data warehouse); see the pipeline sketch after this list.
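To make this concrete, here is a minimal sketch of an AWS Glue job that reads a raw dataset from the Glue Data Catalog, deduplicates it, and writes it back as partitioned Parquet; the database, table, key column, and S3 path are hypothetical:

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the raw source data via the Glue Data Catalog (hypothetical database/table).
raw = glue_context.create_dynamic_frame.from_catalog(database="raw", table_name="claims")

# Convert to a Spark DataFrame, drop duplicate claims, and write the curated
# output as Parquet (snappy compression by default), partitioned by year so
# Athena queries can prune partitions instead of scanning everything.
df = raw.toDF().dropDuplicates(["claim_id"])
df.write.mode("overwrite").partitionBy("claim_year").parquet(
    "s3://example-data-lake/curated/claims/"
)

job.commit()
```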
Analytics, BI, ML & AI Enablement
* Partner with data scientists, ML engineers, and analytics teams to productionise ML/AI and LLM-based solutions, ensuring reliable data pipelines and feature stores.
* Enable self‑service reporting and BI by exposing curated, well‑documented datasets for dashboards and analytical tools.
Governance, Quality, and Security
* Define and enforce data governance, data quality, and metadata management practices across the platform.
* Implement and oversee security best practices (IAM roles and policies, encryption at rest and in transit, row/column-level security, auditing and monitoring); an example policy sketch follows this list.
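As an illustration of the least-privilege approach implied above, the following is a minimal sketch that creates an IAM policy granting read-only access to a single curated prefix of the data lake; the bucket, prefix, and policy name are hypothetical:

```python
import json

import boto3

iam = boto3.client("iam")

# Hypothetical least-privilege policy: read objects under one curated
# prefix and list only that prefix, nothing else.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::example-data-lake/curated/*",
        },
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": "arn:aws:s3:::example-data-lake",
            "Condition": {"StringLike": {"s3:prefix": ["curated/*"]}},
        },
    ],
}

iam.create_policy(
    PolicyName="DataLakeCuratedReadOnly",
    PolicyDocument=json.dumps(policy_document),
)
```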
Leadership, Mentoring & Best Practices
* Lead and mentor a team of data engineers, setting standards for coding, testing, documentation, code review, and DevOps/CI‑CD practices.
* Conduct regular code reviews, provide technical guidance, and drive adoption of engineering best practices (clean code, reusable components, version control, testing, observability).
Collaboration & Stakeholder Management
* Work closely with product, engineering, and business stakeholders to understand data requirements and translate them into scalable technical solutions.
* Communicate technical concepts clearly to both technical and non‑technical audiences, supporting planning, estimation, and roadmap definition.
REQUIRED EXPERIENCE
* Strong hands‑on experience as a data engineer with recent focus on AWS (S3, Glue, Athena, Lambda, IAM; Redshift or similar warehouse).
* Excellent programming skills in Python and strong SQL across large datasets.
* Proven experience building and operating data lakes, data marts, and ETL/ELT pipelines in production.
* Solid background in data modelling for analytics and BI (dimensional modelling, slowly changing dimensions, fact/measure design).
* Exposure to ML/AI and LLM use cases (supporting feature pipelines, model inputs/outputs, monitoring).
* Strong understanding of data governance, security, and compliance in cloud environments.
* Demonstrated experience in team leadership, mentoring, code review, and driving engineering best practices.
* Ability to deeply understand complex data structures, schemas, and relationships across multiple source systems.
* Ability to identify data quality issues, inconsistencies, and gaps, and drive remediation strategies.
* Ability to translate raw data into business-relevant narratives, metrics, and KPIs.
* Ability to define data product vision, roadmap, and success metrics aligned with company strategy.
* Experience with additional AWS analytics/ML services (e.g. Redshift, EMR, SageMaker, MSK/Kafka).
* Experience with modern BI tools (e.g. Power BI, Tableau, Looker, QuickSight).
* Experience setting up CI/CD for data pipelines and infrastructure-as-code (e.g. Terraform, CloudFormation); the testing sketch below illustrates the level of rigour expected.
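For a flavour of the engineering rigour we expect, pipeline transformations should be covered by unit tests that run in CI. The following is a minimal pytest sketch; the transform under test (normalise_premium) and its contract are hypothetical:

```python
# test_transforms.py -- run with `pytest` in the CI pipeline.
from decimal import Decimal


def normalise_premium(record: dict) -> dict:
    """Hypothetical transform: coerce the premium field to two decimal places."""
    record["premium_amount"] = Decimal(str(record["premium_amount"])).quantize(
        Decimal("0.01")
    )
    return record


def test_normalise_premium_rounds_to_two_places():
    record = {"policy_id": "P-1001", "premium_amount": 199.999}
    assert normalise_premium(record)["premium_amount"] == Decimal("200.00")
```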