I have a brand new role for a Lead Data Engineer to join a payments company based in Dublin. This is an initial 12-month day-rate contract, onsite 3 days a week, paying up to €600 per day.
Requirements:
* Advanced SQL developer with hands-on experience in Databricks, Snowflake, Python, and PySpark
* Skilled in building reusable data models and pipelines using Hadoop, Apache NiFi, and cloud lakehouse architectures
* Experienced in orchestration tools such as Airflow, Databricks Workflows, and Step Functions
* Strong knowledge of AWS data services including S3, IAM, Glue, Lake Formation, networking, and encryption
* Proficient in CI/CD workflows using Git, Terraform/CloudFormation, and automated deployment practices
* Familiar with data governance and lineage tools such as Unity Catalog, OpenLineage, or Atlas
* Proven track record of building and optimizing Spark/PySpark pipelines in production environments (see the sketch after this list)
* Experience with Python packaging, OpenTelemetry, or financial crime analytics
* Comfortable working across Azure, AWS, and cloud data warehouse platforms
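To give a flavour of the hands-on side of the role, here is a minimal sketch of the kind of PySpark pipeline involved. Every path, dataset, and column name (the transactions data, merchant_id, amount, and so on) is an illustrative assumption on my part, not a detail from the client's stack.

```python
# Minimal PySpark pipeline sketch. All paths, tables, and columns below
# are hypothetical examples, not details from the client's environment.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("payments-daily-agg").getOrCreate()

# Read raw payment events (hypothetical S3 path and schema).
raw = spark.read.parquet("s3://example-bucket/raw/transactions/")

# Basic cleansing: drop malformed rows, normalise currency codes.
clean = (
    raw
    .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
    .withColumn("currency", F.upper(F.col("currency")))
)

# Daily aggregate per merchant -- the kind of reusable data model the role calls for.
daily = (
    clean
    .groupBy(F.to_date("event_ts").alias("day"), "merchant_id")
    .agg(
        F.count("*").alias("txn_count"),
        F.sum("amount").alias("total_amount"),
    )
)

# Write partitioned output for downstream consumers.
daily.write.mode("overwrite").partitionBy("day").parquet(
    "s3://example-bucket/curated/daily_merchant_totals/"
)
```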
Key Responsibilities:
* Define and enforce lakehouse architecture standards (bronze/silver/gold layers; see the sketch after this list), schema governance, data lineage, SLAs, and cost controls
* Integrate diverse datasets using ingestion frameworks such as Apache NiFi, SFTP/FTPS, and APIs
* Architect secure and compliant AWS infrastructure using services like S3, IAM, KMS, Glue, Lake Formation, EC2/EKS, Lambda, Step Functions, CloudWatch, and Secrets Manager
* Champion data quality through anomaly detection, reconciliation, contract testing, and reliability metrics (SLIs/SLOs)
* Embed metadata and lineage using tools like Unity Catalog, Glue, or OpenLineage to support audit and compliance needs
* Drive CI/CD practices for data assets including infrastructure as code, test automation, versioning, and environment promotion
* Mentor engineers on distributed data performance, Delta Lake optimization, caching strategies, and cost-performance trade-offs
* Collaborate with data science, product, and compliance teams to translate analytical needs into scalable data models and serving layers
* Conduct code reviews and lead technical design discussions to ensure consistency and efficiency
* Establish secure data access controls including secrets management, key rotation, masking, and row/column-level permissions
* Support continuous improvement through backlog management, delivery tracking, and stakeholder engagement
* Participate in incident response including root cause analysis and preventative engineering
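As a rough illustration of the bronze/silver/gold layering mentioned above, here is a minimal Delta Lake sketch. It assumes the delta-spark package is available on the cluster, and every path, table, and column name is hypothetical; the actual standards would be defined by the team.

```python
# Sketch of bronze/silver/gold promotion on Delta Lake. Assumes delta-spark
# is installed; all paths, tables, and columns are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("lakehouse-medallion").getOrCreate()

# Bronze: land raw files as-is, with ingestion metadata for lineage/audit.
bronze = (
    spark.read.json("s3://example-bucket/landing/payments/")
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").save("s3://example-bucket/bronze/payments/")

# Silver: enforce basic schema/quality contracts before promotion.
silver = (
    spark.read.format("delta").load("s3://example-bucket/bronze/payments/")
    .filter(F.col("transaction_id").isNotNull())
    .dropDuplicates(["transaction_id"])
)
silver.write.format("delta").mode("overwrite").save("s3://example-bucket/silver/payments/")

# Gold: business-level aggregates served to analytics and data science teams.
gold = (
    silver.groupBy("merchant_id")
    .agg(F.sum("amount").alias("lifetime_amount"))
)
gold.write.format("delta").mode("overwrite").save("s3://example-bucket/gold/merchant_totals/")
```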
For more info -