Overview
Our client is a leading US healthcare company with offices in Dublin city centre. We are seeking a Senior Data Engineer to design, build, and maintain robust data pipelines and platforms. This role focuses on extracting, transforming, and loading (ETL) data from diverse sources into centralised systems such as data warehouses or data lakes, ensuring data quality, integrity, and availability for analytics, reporting, and advanced use cases.
This is a full-time position with a hybrid working model: a mix of office-based collaboration and remote working.
Responsibilities
* Data Integration: Ingest and integrate data from multiple structured and unstructured sources (databases, APIs, log files, streaming platforms, external providers). Consolidate data into unified, reliable datasets for analysis and reporting.
* Data Transformation & Processing: Develop routines to clean, normalise, and aggregate data. Apply processing techniques to handle large-scale and complex datasets, ensuring readiness for analytics, reporting, or machine learning.
* Engineering Best Practice: Contribute to shared frameworks, automation, and standards for code development, deployment, and pipeline orchestration.
* Data Governance: Implement controls and governance practices to ensure compliance with organisational and regulatory standards.
* Collaboration: Work with analytics, product, and infrastructure teams to define and implement best practices for scalable and reliable data solutions. Explore new tools and technologies to evolve the data platform, particularly in cloud-based environments.
* Monitoring & Support: Develop monitoring systems, alerts, and automated error-handling to maintain reliability and integrity of pipelines. Proactively resolve issues to ensure data availability.
Experience / Qualifications
* Degree in Computer Science, Information Technology, Data/Database Management, or a related field.
* Strong background in designing and delivering data solutions, including data modelling.
* Proficiency in PySpark and Python for data processing, transformation, and analysis.
* Experience with orchestration tools such as Azure Data Factory or Airflow.
* Advanced SQL skills, including complex queries and window functions.
* Hands-on experience with DevOps tools and CI/CD pipelines.
* Familiarity with applying data governance frameworks in regulated environments.
Preferred Skills & Experience
* Experience with Azure Databricks and Snowflake.
* Exposure to machine learning/AI model deployment in production environments.
* Self-motivated and proactive, with strong ownership of deliverables.
Soft Skills
* Strong communication and collaboration abilities.
Remuneration Package
A strong salary is on offer, along with a generous benefits package.
Contact
Please contact Derek Smyth on 01 5927861 or click the apply button.