ETL Tester (Databricks)

Your new company

The program involves building a new enterprise data platform using the Medallion Architecture (Bronze, Silver, Gold) to support regulatory, analytical, and reporting requirements. Due to the highly regulated nature of the data, there is a critical need for specialist data testing and quality assurance to ensure data accuracy, completeness, performance, and reliability.
The Databricks ETL Tester will play a key role in validating ETL pipelines, data transformations, aggregations, and migrated datasets, supporting confidence in the new platform and enabling safe decommissioning of legacy systems.


Your new role

The Databricks ETL Tester is responsible for designing and executing complex data testing activities across Databricks-based data solutions. The role ensures the integrity and quality of data across all layers of the platform and supports the migration of large-scale historical datasets from legacy environments.
The tester will work closely with Databricks Engineers responsible for backend pipeline development, while front-end and reporting solutions (e.g. Power BI) are delivered by a separate BAU team.

Technology Environment
  • Platform: Databricks on Microsoft Azure
  • Processing Framework: Apache Spark
  • Architecture: Medallion Framework
    • Bronze: Raw ingestion layer using watermarking for incremental loads
    • Silver: Transformation layer (joins, calculations, facts and dimensions)
    • Gold: Aggregated and curated datasets for consumption (e.g. Power BI)
  • Languages & Tools:
    • SQL (primary testing language)
    • Python
    • Databricks notebooks (hundreds in scope)
    • Databricks Genie (AI-assisted development)
  • Data Volumes: Very large datasets, including 10+ years of historical data
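To illustrate the kind of SQL-based testing this role centres on, the sketch below shows a minimal Bronze-to-Silver reconciliation check: distinct-row counts and amount totals should agree once raw-layer duplicates (a common side effect of incremental loads) are removed. SQLite stands in for Databricks SQL here, and the table names and sample data (`bronze_orders`, `silver_orders`) are hypothetical.

```python
import sqlite3

# Minimal sketch of a layer-to-layer reconciliation test. SQLite stands in
# for Databricks SQL; table names and data are hypothetical examples.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Bronze: raw ingestion layer -- incremental loads can re-deliver rows.
cur.execute("CREATE TABLE bronze_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO bronze_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5), (2, 25.5), (3, 7.25)])

# Silver: transformed, deduplicated layer.
cur.execute("CREATE TABLE silver_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO silver_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5), (3, 7.25)])

# Reconcile distinct-row counts and amount totals across the two layers.
bronze = cur.execute(
    "SELECT COUNT(*), SUM(amount) FROM "
    "(SELECT DISTINCT order_id, amount FROM bronze_orders)"
).fetchone()
silver = cur.execute(
    "SELECT COUNT(*), SUM(amount) FROM silver_orders"
).fetchone()

print("PASS" if bronze == silver else f"FAIL: bronze={bronze} silver={silver}")
```

In practice the same pattern scales to checksum and aggregate comparisons across very large historical datasets, with the expected values derived from documented transformation logic rather than hard-coded samples.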

What you'll need to succeed

Required Skills & Experience (Essential)
  • Proven experience in data testing and data quality assurance
  • Very strong SQL skills (mandatory)
  • Hands-on experience testing ETL / ELT pipelines
  • Strong understanding of data modelling (facts and dimensions)
  • Ability to analyse complex data transformations and lineage
  • Experience working in Agile delivery teams
  • Strong attention to detail and high-quality defect documentation skills


What you'll get in return

You are a highly detail-oriented data testing specialist, comfortable working with complex data models, large-scale ETL pipelines, and long-running historical datasets. You collaborate effectively with engineers, translate technical logic into robust SQL-based tests, and take pride in delivering high-confidence data outcomes in regulated environments.

What you need to do now


If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now.

If this job isn't quite right for you, but you are looking for a new position, please contact us for a confidential discussion on your career.



LHS 297508

Summary

Job Type: Contract
Industry: Government & Public Services
Location: NSW - Sydney CBD
Specialism: Data & Advanced Analytics
Ref: 2992679

Talk to a consultant

Talk to Philip Beacom, the specialist consultant managing this position, located in Sydney
Level 14, Chifley Tower, 2 Chifley Square
