
Data Engineer
We’re excited to be working with a large insurance client who has a requirement for a Data Engineer to join the team.
This role offers the chance to work on a flagship data platform built on Google Cloud Platform (GCP), driving data transformation at enterprise scale and contributing to one of the most advanced cloud data environments in the region.
What you’ll be working on
- Designing and building enterprise-grade ELT pipelines on GCP.
- Developing in dbt, including advanced features such as exception handling, quarantining, snapshots, and reusable ELT patterns.
- Leveraging BigQuery and other big-data platforms (Snowflake, Databricks) for large-scale, high-performance analytics.
- Driving SQL and Python-based solutions, with a focus on optimisation and performance tuning.
- Contributing to a data mesh environment where domain teams own production-ready data products.
What makes this role exciting
- Enterprise scale – Deliver solutions for one of the largest insurance data landscapes in ANZ.
- Cutting-edge platform – Work within a modern Google Data Platform, with strong adoption of dbt + BigQuery.
- Collaborative ways of working – Join agile delivery squads leveraging Jira/Confluence, CI/CD pipelines, and best-practice engineering standards.
About You
We’re looking for a Data Engineer with experience in:
- GCP / Google Data Platform (BigQuery, dbt, SQL, Python)
- ELT design and performance tuning for large enterprises
- Working with big data platforms (BigQuery, Snowflake, Databricks)
- Agile environments with CI/CD practices
If this role is of interest to you, click apply now or email David at david.reynolds@talentinternational.com