        Role: Data Engineer
Location: Dublin
Employment: 6 month fixed term contract initially
Superb opportunity for a hands-on Data Engineer to design, build, and optimise scalable data solutions on Databricks and AWS within a global financial services company. You'll work at the intersection of data engineering, analytics, and cloud architecture, building robust pipelines, supporting data products, and contributing to the modernisation of enterprise data platforms.
Responsibilities
 1. Develop, enhance, and maintain Data Lakehouse and Data Warehouse environments on AWS using Databricks.
 2. Design and implement ETL/ELT pipelines for large-scale data processing (Python, SQL, Spark).
 3. Create and optimise database schemas, tables, indexes, and stored procedures.
 4. Work closely with business stakeholders to gather requirements and deliver reliable, production-grade data solutions.
 5. Collaborate with cross-functional teams to develop and maintain data pipelines from source systems to analytics layers.
 6. Manage orchestration, version control (Git), and CI/CD processes within an Agile delivery model.
 7. Apply best practices in data modelling, security, and performance tuning.
Skills & Experience
 1. 5+ years' experience in Data Engineering / BI / DW development.
 2. 3+ years' hands-on experience with Databricks on AWS.
 3. Strong skills in Python, SQL, and modern data frameworks (Spark, Delta Lake).
 4. Solid understanding of data modelling, ETL, and data architecture principles.
 5. Experience with AWS services (S3, Glue, Lambda, Step Functions, IAM, VPC).
 6. Exposure to machine learning / AI initiatives is a plus.
 7. Background in financial services or regulated data environments desirable.