Data team lead

Dublin
Brightwater Recruitment
Posted: 27 April
Offer description

About the Company
OAK Global is a leading risk partner, embracing the global reach afforded by operating in the Lloyd’s market and underwriting through Lloyd’s Syndicates 2843 and 1440.
Headquartered in London, OAK Global operates across key global markets, addressing complex risks with expertise across diverse industries. The company prides itself on fostering trusted partnerships, leveraging advanced analytics and underwriting expertise, and maintaining financial stability.
OAK Global is shaping the future of (re)insurance with innovation, reliability, and a forward-looking vision.
About the Position
OAK Global offers a rare opportunity to work within a true greenfield environment. Free from legacy systems, complex migrations, or transformation constraints, we are building a modern, best-in-class technology architecture designed to deliver exceptional business value from day one.
As a Data Engineering Lead at OAK Global, reporting to the Head of Risk Analytics & Data Science in London, you will lead the build and implementation of OAK’s Data Platform from the ground up within a small multidisciplinary team of actuaries, data scientists, AI engineers and platform engineers. The platform will integrate a wide range of internal and external datasets to support capital and risk analytics, complex modelling workflows and proprietary applications across the business.
The platform is being built in AWS using modern data engineering practices and cloud-native infrastructure.
This is a senior contributor role focused on platform development. As the first engineer on OAK Global’s Data Platform, you will play a key role in shaping the architecture, tooling and engineering standards that underpin it.
Key Responsibilities

Play a central role in the design and evolution of OAK’s Data Platform
Lead the build and implementation of key data platform components including pipeline frameworks, data models and shared libraries
Set standards for quality, reliability and maintainability
Design schemas and storage structures across different layers of the platform
Translate platform architecture and requirements into robust, scalable implementations
Design and implement data ingestion and transformation pipelines combining fragmented datasets including exposure data, modelling outputs, internal analytics and unstructured information
Build shared libraries to ensure consistent data access across applications
Contribute to datasets supporting AI and modelling workflows across OAK Global

Data Governance, Quality & Reliability

Define data definitions and calculation logic for core datasets
Design and implement a data catalogue and metadata framework
Implement data reliability practices including validation, testing and monitoring
Ensure pipelines meet requirements around auditability, consistency and reproducibility
Collaborate with Capital & Risk, Underwriting and Finance & Operations teams to ensure data is reliable and well-defined
Work closely with the Head of Data & Integration to establish and uphold best-practice engineering standards across the platform including CI/CD, infrastructure-as-code and automated testing
Provide technical guidance and mentorship to engineers and analysts working with data

Experience / Requirements
Experience & Attributes:

Lead data engineering experience (Lead/Staff Data Engineer or similar)
Experience working across ingestion, transformation, storage and consumption layers, with a track record of building production ETL pipelines in AWS
Experience designing and contributing to data platform architecture
BSc/MSc in Computer Science, Engineering, Mathematics, Physics or a related discipline, or equivalent practical experience

Technical Skills:

Python, SQL
Experience with data processing libraries (e.g., Polars, Pandas, Dask, PyArrow)
S3, Glue, Lambda, ECS, DynamoDB
Redshift or similar analytical databases
ETL / ELT pipeline design and optimisation
Data modelling and schema design
Working with APIs and external data providers
Data quality, validation and monitoring

Software Engineering & DevOps

Git / GitHub
Infrastructure-as-code (IaC)
Strong understanding of modern software engineering and cloud architecture principles

Nice to Have:

Experience with open table formats (e.g., Iceberg, Delta Lake or similar)

What You’ll Build

A cloud-native data platform on AWS
Scalable data ingestion and transformation pipelines integrating internal systems, vendor models, APIs and unstructured data
Canonical data models and definitions used across analytics and AI workflows
Shared data access libraries and platform components
A data catalogue and metadata framework describing core datasets and lineage
Reliable pipelines with strong validation, monitoring and traceability

Why Join OAK Global?
Join a culture that values Creativity, Excellence, Determination, Authenticity, and Respect.
You’ll work closely with industry-leading talent, shape OAK Global’s technology direction, and directly enable the company’s strategic growth.
Remuneration Package
Highly competitive remuneration package
Location

Dublin 2 – exact location TBC
