Dublin 2, Ireland | Posted on 10/03/2026
If you enjoy building data platforms that directly power business decisions and financial reporting, this role offers the opportunity to make a real impact.
At Engineering Insights, we are looking for a Principal Cloud Data Engineer to help design and implement modern data platforms that support complex financial reporting environments. This is a highly hands‑on role where you will build scalable data pipelines, evolve the platform architecture, and deliver production‑grade solutions that enable reliable, audit‑ready data operations.
You will collaborate closely with senior architects, finance and data stakeholders, and a small remote engineering team to design and implement robust data systems. The role combines deep technical work with technical leadership, ensuring solutions are both architecturally sound and practical for real‑world production environments.
This position also involves close collaboration with our client’s teams, working from the Dublin office to ensure strong alignment between engineering solutions and business requirements. At Engineering Insights, we believe in practical engineering excellence, designing systems that are scalable, transparent, and built to support long‑term business growth.
Key Responsibilities
Design, build, extend, and maintain scalable data pipelines (batch and event‑driven)
Implement new platform features and architectural extensions as business and technical requirements evolve
Develop proof‑of‑concept (POC) solutions to validate new technical or business capabilities
Build and maintain layered data architectures (Raw → Standardised → Data Mart)
Centralise and productionise complex business logic and financial metrics
Ensure auditability, traceability, and governance across all data transformations
Lead and mentor a small remote engineering team (3–5 engineers)
Define and promote engineering standards and best practices
Contribute to infrastructure setup and CI/CD processes for data workloads
Work directly with finance, FP&A, and data stakeholders to translate business logic into technical solutions
Troubleshoot and optimise performance, reliability, and scalability of data systems
Participate in client meetings, representing engineering solutions clearly and professionally
Requirements
10+ years of experience in related Data Platform Engineering roles
Strong Python and/or SQL skills (advanced level)
Proven experience building and maintaining production ETL/ELT pipelines
Experience with data modelling including fact/dimension and finance‑oriented models
Experience handling multi‑source data ingestion (databases and file‑based sources such as Excel)
Strong understanding of pipeline monitoring, logging, retry logic, and failure handling
Experience with AI‑assisted coding tools (e.g., GitHub Copilot)
Experience working with Git and Git submodules
Hands‑on experience with AWS data stack (e.g., S3, Redshift, Glue, Lambda, Step Functions)
Experience with Infrastructure as Code (Terraform or similar)
Strong understanding of layered data architecture principles
Ability to design systems that scale with increasing data volumes and complexity
Experience with performance optimisation
Understanding of data lineage, governance, and traceability principles
Nice to Have
Experience in insurance, reinsurance, or financial services
Familiarity with financial reporting and accounting concepts
Experience in integrating data platforms with PowerBI or similar BI tools
Experience with Azure Repos or Azure Pipelines
Exposure to commission or premium calculation logic
Experience leading small, distributed engineering teams
Experience working in regulated environments
What to Expect
Competitive salary with performance‑based bonuses
Hybrid work flexibility
Opportunities for professional development through training and seminars