Data Engineer

  • New Zealand
  • Auckland
  • Permanent
  • Great benefits package on offer

Overview:

Join a leading financial services organization as a Data Engineer.

Based in Auckland, this forward-thinking company is on a mission to redefine excellence through innovative technology and exceptional service.

In this pivotal role, you will design, build, and operate reliable data ingestion and transformation pipelines in Azure Databricks, playing a crucial role in delivering actionable insights and high-priority business metrics. If you’re passionate about data quality and operational resilience, and ready to contribute to their Data and AI strategy, we want to hear from you!

Required Skills:

  • Proven experience with Azure Databricks, Spark, Python, and SQL.
  • Strong understanding of medallion architecture and practical ELT patterns.
  • Familiarity with data quality monitoring and the lifecycle of Databricks SQL alerts.
  • Ability to collaborate effectively with cross-functional teams and external vendors to resolve data-related issues.
  • Excellent documentation skills for pipelines, runbooks, and change records.

Nice to Have Skills:

  • Experience with Power BI for data visualization and reporting.
  • Background in agile project methodologies.
  • Knowledge of data governance and compliance best practices.
  • A proactive mindset in managing risks and escalating issues as necessary.

Preferred Education and Experience:

  • Bachelor’s degree in Computer Science, Data Science, Information Technology, or a related field.
  • A minimum of 3 years of professional experience in data engineering or a related discipline.

If you are ready to embrace the challenge of transforming data into high-quality insights for clients and stakeholders, apply with an updated CV.

** Please note that applications will not be actioned until the new year. Have a safe and happy holiday. **

Apply now

Submit your details and attach your resume below. Hint: make sure all relevant experience is included in your CV and keep your message to the hiring team short and sweet - 2000 characters or less is perfect.

Data Engineer

  • New Zealand
  • Auckland
  • Contract
  • Negotiable

Opportunity knocks:

We’re on the hunt for a talented Data Engineer to join our client on an exciting 6-month contract! In this role, you’ll dive into BAU work, design efficient data pipelines, and lead the way in migrating to new platforms. If you’re passionate about harnessing the power of data to drive business insights and ready for a thrilling new challenge, we want to hear from you!

About You:

  • Proficiency in Microsoft technologies, including SSIS, SQL, and Power BI
  • 3-5 years of hands-on experience as a Data Engineer
  • A quick learner who adapts effortlessly to existing systems and processes
  • Strong analytical and problem-solving skills

Nice to Have Skills:

  • Experience in regulated or enterprise-scale environments
  • Familiarity with data migration best practices and methodologies

If this role sounds like YOU, don’t wait! Hit APPLY and share your CV with us and let’s get chatting!

Apply now


Senior Data Engineer

  • Australia
  • Melbourne
  • Contract
  • AU$113 - AU$130 per hour + inc super

Our client is a federal government organisation with offices throughout Australia. Due to growth, they are seeking an APS6 AWS Data Engineer to join their team in Richmond or Geelong.

  • 12-month contract
  • Hybrid with 3 days per week onsite

Key duties and responsibilities

  • Experience with AWS Cloud services (AWS S3, AWS Glue) or similar tools within the cloud environment
  • Provide level 2/3 technical support for AWS, Control-M, Teradata (legacy EDW), Snowflake, and ETL tool-based data integration solutions
  • DevOps – Understand the DevOps process and use DevOps tools in accordance with it
  • Programming – High level of competency in programming, including knowledge of supplementary languages such as Python
  • Experience with Control-M or similar scheduling applications
  • Experience with file transfer tools, e.g. GoAnywhere or similar
  • Version control – Ability to demonstrate knowledge of version control and its appropriate uses
  • Facilitate continuous improvement in delivery principles, coding standards, and documentation, and provide training sessions to the team
  • Prioritise work items and add them to a work queue
  • Understand, analyse, and size user requirements
  • Development and maintenance of SQL analytical and ETL code
  • Development and maintenance of system documentation
  • Work within a state-of-the-art, greenfield DevOps environment
  • Collaboration with data consumers, database developers, testers, and IT support teams

To apply you will need the following skills and experience:

Essential criteria

  • Proficiency in AWS services related to data engineering: AWS Glue, S3, Lambda, EC2, RDS
  • Data pipeline design and development using ETL/ELT frameworks
  • Proficiency with Snowflake (Cloud EDW) and Teradata (on-prem EDW)
  • Proficiency in programming languages: Python (preferred)
  • Control-M orchestration/monitoring or similar applications
  • Strong experience in Operational Support processes and working in a BAU environment

Desirable criteria

  • Experience with infrastructure-as-code tools: CloudFormation or Terraform
  • Exposure to at least one ETL tool such as DBT, Talend, or Informatica
  • Strong SQL proficiency
  • SAS (Base, EG, SAS Viya)

Federal government role – proof of Australian citizenship required

APPLY:
Submit your resume or contact Shelley at shelley.harrison@talentinternational.com or call 0418 572 482 for further information. Shortlisted candidates will be contacted.

For over 30 years, Talent has been redefining the contracting experience with industry-leading support, exclusive contractor benefits, and a world-class digital platform, ENGAGE, to access it all. Apply today to see how we can elevate your career.

Apply now


APS6 Senior Data Engineer

  • Australia
  • Sydney
  • Contract
  • AU$700 - AU$800 per day

Position: APS6 Senior Data Engineer (2 Positions)
Location: NSW (Hybrid – minimum 3 days in office per week)
Salary/Rate: $700-$800/day
Contract: 12 months
Start Date: Monday, 9 February 2026
Closing Date: Friday, 16 January 2026, 11:59pm (Canberra time)

About the Role:
We are seeking two experienced APS6 Senior Data Engineers to join a leading federal agency on a Labour Hire contract. You will be responsible for supporting and maintaining data assets across Cloud Data Lake, Cloud EDW, Legacy EDW, and SAS analytics platforms. This role focuses on operational support, ensuring data pipelines meet SLAs, and supporting all operational processes within the data platform.

Key Responsibilities:

  • Provide operational support for Cloud and on-prem data platforms (AWS, Snowflake, Teradata, SAS)
  • Develop, maintain, and optimise ETL/ELT pipelines and SQL analytical code
  • Support daily data delivery processes, month-end processing, and BAU data requests
  • Work with scheduling and orchestration tools (Control-M or similar)
  • Collaborate with stakeholders, data consumers, developers, testers, and IT support teams
  • Facilitate continuous improvement, maintain system documentation, and provide team training

Essential Skills & Experience:

  • Strong proficiency in AWS services: Glue, S3, Lambda, EC2, RDS
  • Experience designing and developing ETL/ELT data pipelines
  • Proficiency with Snowflake (Cloud EDW) and Teradata (on-prem EDW)
  • Programming experience, preferably in Python
  • Experience with orchestration/scheduling tools such as Control-M
  • Strong operational support experience in BAU environments

Desirable Skills:

  • Experience with infrastructure-as-code tools (CloudFormation or Terraform)
  • Exposure to ETL tools such as DBT, Talend, Informatica
  • Strong SQL proficiency
  • Experience with SAS (Base, EG, SAS Viya)

Requirements:

  • Australian citizenship is mandatory (proof required prior to engagement)
  • Labour Hire Licence required for NSW contracts
  • Maximum 37.5 hours per week
  • Security clearance not required

Why Apply:

  • Work in a state-of-the-art, greenfield DevOps environment
  • Gain hands-on experience with modern Cloud and on-prem data platforms
  • Hybrid working arrangements with flexible in-office days
  • Be part of a federal agency making a meaningful impact for Australians

How to Apply:
Interested candidates must submit:

  1. Resume/CV
  2. Responses to all essential and desirable criteria (max 3000 characters per criterion)

Applications to: priya.gabriel@talentinternational.com

Apply now
