Data and Databricks Test Automation Engineer
Our client is a global financial services organisation that builds large-scale data platforms for major international customers.
They are expanding their data engineering capability and need a Test Automation Engineer with strong Databricks and data pipeline experience to ensure the data they deliver is accurate, secure and reliable.
You will work with the data engineering team to test Databricks notebooks, Delta Lake tables, streaming jobs and ETL pipelines, and to build robust automated test coverage around them.
This is a hands-on role.
What you will do
Design and build automated tests for Databricks notebooks, workflows and Delta Lake tables
Test data pipelines end to end including ETL and ELT jobs
Implement data quality checks, reconciliation and performance tests for Spark jobs
Create test dashboards, reports and documentation
Collaborate with engineers to resolve issues and improve test coverage
Must have
At least 2 years' experience working with Databricks and Spark in a data environment
Strong Python skills, ideally including PySpark, and good SQL
Experience testing data pipelines or ETL processes
Good knowledge of Delta Lake or lakehouse-style architecture
Experience using Git or another version control system
Experience working in Agile teams
Nice to have
Experience with AWS data services such as S3, Glue or Lambda
Experience with data quality tools or frameworks
Experience testing Unity Catalog or access and governance rules
Experience building test automation frameworks for data platforms
Experience in financial services or other regulated environments
If you are a data-focused test engineer who wants to work on modern Databricks and AWS platforms in a global environment, send your CV for immediate review.
Seniority level
Mid-Senior level
Employment type
Full-time
Job function
Information Technology and Engineering
Industries
Banking, Investment Banking, and Financial Services
Location
Dublin, County Dublin, Ireland