Our client is a global financial services organisation that builds large-scale data platforms for major international customers. They are growing their data engineering capability and now need a Test Automation Engineer who knows Databricks and data pipelines to make sure the data they deliver is accurate, secure and reliable.

You will work with the data engineering team to test Databricks notebooks, Delta Lake tables, streaming jobs and ETL pipelines, and to put proper automated testing around them. It is a hands-on role.

What you will do
- Design and build automated tests for Databricks notebooks, workflows and Delta Lake tables
- Test data pipelines end to end, including ETL and ELT jobs
- Put in place data quality checks, reconciliation and performance tests for Spark jobs
- Create test dashboards, reports and documentation
- Work with engineers to fix issues and improve test coverage

Must have
- At least 2 years' experience working with Databricks and Spark in a data environment
- Strong Python experience, ideally with PySpark, and good SQL
- Experience testing data pipelines or ETL processes
- Good knowledge of Delta Lake or Lakehouse-style architecture
- Experience using Git or other version control
- Experience working in agile teams

Would like
- Experience with AWS data services such as S3, Glue or Lambda
- Experience with data quality tools or frameworks
- Experience testing Unity Catalog or access and governance rules

Desirable
- Experience building test automation frameworks for data platforms
- Experience in financial services or other regulated environments

If you are a data-focused test engineer who wants to work on modern Databricks and AWS platforms in a global environment, send your CV for immediate review.