- Mid-level to Senior Data Engineer opportunity with local government
- 6-month contract with extensions
- Demonstrated experience designing, building, and maintaining data pipelines and ETL processes, plus experience with cloud platforms (e.g., Azure, AWS, GCP) and related data storage and processing services
Your new company
This role is with a local government organisation that impacts the lives of Australians and is seeking an experienced Data Engineer to work as part of a passionate, dynamic team.
The Data Engineer is responsible for designing, building, and maintaining scalable data pipelines that facilitate the collection, storage, and processing of large volumes of data. The Data Engineer works closely with data scientists, analysts, and other teams to ensure that high-quality, reliable data is available for business intelligence and analytics purposes.
Your new role
- Design, build, and maintain scalable data pipelines to efficiently collect, process, and store data from multiple sources.
- Ensure data quality, consistency, and availability by implementing data validation and cleansing processes.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver appropriate solutions.
- Develop and optimise ETL (Extract, Transform, Load) processes to integrate data from various systems and platforms.
- Work with cloud platforms (e.g., Azure) to design and implement data storage solutions and processing frameworks.
- Implement data governance practices to ensure compliance with data security, privacy, and regulatory standards.
- Monitor and troubleshoot data pipelines and workflows, ensuring high availability and performance.
- Document data models, workflows, and processes to provide clear visibility and maintainability.
- Continuously optimise and automate data workflows to improve efficiency and reduce manual intervention.
- Adhere to data governance best practices, ensuring data security, privacy, and compliance with organisational and regulatory standards.
- Stay up-to-date with the latest industry trends, technologies, and best practices in data engineering and cloud solutions.
What you'll need to succeed
- 3-5 years’ experience in data engineering, data architecture, or a related field.
- Proven experience in designing, building, and maintaining data pipelines and ETL processes.
- Strong experience with cloud platforms (e.g., Azure, AWS, GCP) and related data storage and processing services.
- Advanced skills in SQL and experience with relational and non-relational databases.
- Experience with programming languages such as Python, Java, or Scala for data manipulation and automation.
- Familiarity with data warehousing concepts and solutions.
- Strong understanding of data governance, data security, and compliance practices.
- Experience working with data modelling tools and creating data architectures that support business needs.
- Proven ability to collaborate with cross-functional teams, including data analysts, scientists, and business stakeholders.
- Experience in automating and optimising data workflows for efficiency and scalability.
- Experience with version control systems (e.g., Git) and CI/CD pipelines for data engineering workflows.
- Strong problem-solving skills and ability to troubleshoot and optimise data pipelines.
- Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders.
What you'll get in return
An excellent opportunity to work with one of Victoria's prominent government organisations, offering fantastic employee support and career growth.
What you need to do now
If you're interested in this role, click 'apply now' or forward an up-to-date copy of your resume to Kanika.Behl@hays.com.au
If this job isn't quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.
LHS 297508