Senior Machine Learning Engineer

  • Australia
  • Australian Capital Territory
  • Contract
  • Negotiable
  • 6-month contract with potential extensions
  • Multiple locations available across ACT, NSW, VIC, QLD, SA, and TAS with a hybrid working model
  • Baseline security clearance required (Australian citizenship essential)
  • Strong machine learning engineering experience including Python, ML frameworks, and cloud-based ML solutions

This role presents an exciting opportunity for a Senior Machine Learning Engineer to lead the delivery of advanced, secure, and scalable machine learning solutions within a complex enterprise environment. You will play a key role in enabling data-driven decision-making and delivering innovative AI capabilities, while working closely with technical and business stakeholders to support strategic objectives.

Your duties will include:

  • Designing and implementing machine learning models for complex datasets
  • Developing and maintaining end-to-end ML pipelines for training and deployment
  • Optimising models for performance, scalability, and accuracy in production
  • Collaborating with data scientists, engineers, and stakeholders to translate requirements into solutions
  • Ensuring compliance with government security frameworks and ethical AI principles
  • Providing technical leadership, mentorship, and ongoing model monitoring

Skills and Experience we are looking for:

  • Proven experience in machine learning engineering within complex environments
  • Strong proficiency in Python and ML frameworks such as TensorFlow, PyTorch, or Scikit-learn
  • Experience with data engineering, feature engineering, and model optimisation
  • Familiarity with MLOps practices and tools including MLflow, Kubeflow, or Airflow
  • Knowledge of cloud platforms and containerisation technologies
  • Excellent problem-solving, communication, and stakeholder engagement skills

Application Process:
If you would like to apply, please contact Sanat on 0400 016 163 or email sanat.anmadwar@talentinternational.com.

For over 30 years, Talent has been redefining the contracting experience with industry-leading support, exclusive contractor benefits & a world-class digital platform, ENGAGE, to access it all. Apply today to see how we can elevate your career.


Senior Data Engineer

  • Australia
  • Melbourne
  • Contract
  • AU$113 - AU$130 per hour including super

Our client is a federal government organisation with offices throughout Australia. Due to growth, they are seeking an APS6 AWS Data Engineer to join their team in Richmond or Geelong.

  • 12-month contract
  • Hybrid with 3 days per week onsite

Key duties and responsibilities

  • Work with AWS Cloud services including AWS S3, AWS Glue, or similar tools within the cloud environment
  • Provide level 2/3 technical support for AWS, Control-M, Teradata (legacy EDW), Snowflake, and ETL tool-based data integration solutions
  • DevOps – understand the DevOps process and use DevOps tools in accordance with it
  • Programming – demonstrate a high level of competency in programming, including knowledge of supplementary languages such as Python
  • Use Control-M or similar scheduling applications
  • Work with file transfer tools such as GoAnywhere or similar
  • Version control – demonstrate knowledge of version control and its appropriate uses
  • Facilitate continuous improvement in delivery principles, coding standards, and documentation, and provide training sessions to the team
  • Prioritise work items and add them to a work queue
  • Understand, analyse, and size user requirements
  • Develop and maintain SQL analytical and ETL code
  • Develop and maintain system documentation
  • Work within a state-of-the-art, greenfield DevOps environment
  • Collaborate with data consumers, database developers, testers, and IT support teams

To apply you will need the following skills and experience:

Essential criteria

  • Proficiency in AWS services related to data engineering: AWS Glue, S3, Lambda, EC2, RDS
  • Data pipeline design and development using ETL/ELT frameworks
  • Proficiency with Snowflake (Cloud EDW) and Teradata (On-Prem EDW)
  • Proficiency in programming languages, Python preferred
  • Control-M orchestration/monitoring, or similar applications
  • Strong experience in operational support processes and working in a BAU environment

Desirable criteria

  • Experience with infrastructure-as-code tools: CloudFormation or Terraform
  • Exposure to at least one ETL tool such as dbt, Talend, or Informatica
  • Strong SQL proficiency
  • Experience with SAS (Base, EG, SAS Viya)

Federal government role – proof of Australian citizenship required

APPLY:
Submit your resume, or contact Shelley at shelley.harrison@talentinternational.com or on 0418 572 482 for further information. Shortlisted candidates will be contacted.


APS6 Senior Data Engineer

  • Australia
  • Sydney
  • Contract
  • AU$700 - AU$800 per day

Position: APS6 Senior Data Engineer (2 Positions)
Location: NSW (Hybrid – minimum 3 days in office per week)
Salary/Rate: $700-$800/day
Contract: 12 months
Start Date: Monday, 9 February 2026
Closing Date: Friday, 16 January 2026, 11:59pm (Canberra time)

About the Role:
We are seeking two experienced APS6 Senior Data Engineers to join a leading federal agency on a Labour Hire contract. You will be responsible for supporting and maintaining data assets across Cloud Data Lake, Cloud EDW, Legacy EDW, and SAS analytics platforms. This role focuses on operational support, ensuring data pipelines meet SLAs, and supporting all operational processes within the data platform.

Key Responsibilities:

  • Provide operational support for Cloud and on-prem data platforms (AWS, Snowflake, Teradata, SAS)
  • Develop, maintain, and optimise ETL/ELT pipelines and SQL analytical code
  • Support daily data delivery processes, month-end processing, and BAU data requests
  • Work with scheduling and orchestration tools (Control-M or similar)
  • Collaborate with stakeholders, data consumers, developers, testers, and IT support teams
  • Facilitate continuous improvement, maintain system documentation, and provide team training

Essential Skills & Experience:

  • Strong proficiency in AWS services: Glue, S3, Lambda, EC2, RDS
  • Experience designing and developing ETL/ELT data pipelines
  • Proficiency with Snowflake (Cloud EDW) and Teradata (On-Prem EDW)
  • Programming experience, preferably in Python
  • Experience with orchestration/scheduling tools such as Control-M
  • Strong operational support experience in BAU environments

Desirable Skills:

  • Experience with infrastructure-as-code tools (CloudFormation or Terraform)
  • Exposure to ETL tools such as dbt, Talend, or Informatica
  • Strong SQL proficiency
  • Experience with SAS (Base, EG, SAS Viya)

Requirements:

  • Australian citizenship is mandatory (proof required prior to engagement)
  • Labour Hire Licence required for NSW contracts
  • Maximum 37.5 hours per week
  • Security clearance not required

Why Apply:

  • Work in a state-of-the-art, greenfield DevOps environment
  • Gain hands-on experience with modern Cloud and on-prem data platforms
  • Hybrid working arrangements with flexible in-office days
  • Be part of a federal agency making a meaningful impact for Australians

How to Apply:
Interested candidates must submit:

  1. Resume/CV
  2. Responses to all essential and desirable criteria (max 3000 characters per criterion)
Applications to: priya.gabriel@talentinternational.com


Senior PII Analyst

  • Australia
  • Melbourne
  • Contract
  • Negotiable
  • Initial 12 Month Contract | Potential For Extensions
  • Clayton Location | 3 Days On-Site & 2 Days WFH
  • Data Transformation Program | Data Privacy Laws | Data Loss Prevention

The Role: The Senior PII Analyst is responsible for conducting detailed data discovery and mapping exercises to identify, classify, and document Personally Identifiable Information (PII) across the digital landscape.

The Responsibilities:

  • Contribute to strategic planning and implementation of PII analysis activities for programs supporting the achievement of data security goals, including leading the identification and cataloguing of PII assets across core systems.
  • Provide a range of analysis services, including working with stakeholders to identify vulnerabilities that could lead to data breaches or non-compliance; aligning practices with regulations such as GDPR, CCPA, HIPAA, and other data protection laws; and preparing reports for audits and regulatory reviews.
  • Undertake risk assessments to locate and classify data that qualifies as PII, including mapping the flow of sensitive data between systems to identify unauthorised replication, “shadow” spreadsheets, and integration risks, and reviewing data collection points (web forms, manual uploads, applications) to identify where unneeded PII is entering the environment.
  • Implement data protection measures by recommending encryption, masking, or anonymisation techniques, and work with IT and security teams to enforce access controls.
  • Collaborate with business owners to document the specific business justification for retaining sensitive fields, challenging “just-in-case” collection practices while building partnerships and networks within the team and other relevant business units.

Skills & Experience Required:

  • Minimum 7 years of experience as a Senior PII Analyst.
  • Strong knowledge of data privacy laws and frameworks, with familiarity with data discovery tools and data loss prevention (DLP) solutions.
  • Demonstrated experience in business analysis, governance, and compliance, and their application to IT infrastructure and service improvements to meet business goals.
  • Highly developed planning and organisational skills, with experience establishing priorities, implementing improvements, and meeting deadlines.
  • High-level ability to analyse, isolate, and interpret business needs, and to develop and document appropriate functional specifications and solutions.

What’s in it for you:

  • Initial 12 Month Contract | Potential For Extensions
  • Clayton Location | 3 Days On-Site & 2 Days WFH
  • Data Transformation Program | Data Privacy Laws | Data Loss Prevention

Apply today and Peter Li will reach out with further information.


Senior Data Analyst

  • Australia
  • Melbourne
  • Contract
  • Negotiable
  • Initial 9 Month Contract | Potential For Extensions
  • Clayton Location | 3 Days On-Site & 2 Days WFH
  • Data Transformation Program | Databricks Lakehouse (Delta Lake/Spark)

The Role: The Senior Data Analyst is responsible for safeguarding system-agnostic business information models by modelling and stitching data within the existing enterprise data ecosystem, ensuring consistency and continuity.

The Responsibilities:

  • Develop and maintain advanced data transformations and analytical workflows using SQL, Python, and Databricks, operating on large-scale datasets within a Lakehouse (Delta Lake/Spark) architecture
  • Design and document business-aligned data models using best-practice modelling principles
  • Contribute to the definition and implementation of internal data modelling standards across the team
  • Design and build well-structured, efficient reports and dashboards in Power BI that are tailored to business needs and based on trusted, modelled datasets
  • Investigate complex data problems and independently prototype and implement analytical solutions within Databricks
  • Conduct data profiling, validation, and quality remediation to ensure accuracy, completeness, and consistency

Skills & Experience Required:

  • Minimum 7 years of experience as a Data Analyst delivering high-quality analytics using SQL and Python within Databricks.
  • Strong understanding of Lakehouse architecture, particularly working with large-scale structured datasets.
  • Demonstrated ability to build and maintain robust data models and curated datasets to support reporting and analysis.
  • Experience working with semantic modelling tools (e.g. Lucidchart or equivalent) to visualise data models.
  • Extensive hands-on experience designing, building, and maintaining Power BI dashboards aligned with best practices.

What’s in it for you:

  • Initial 9 Month Contract | Potential For Extensions
  • Clayton Location | 3 Days On-Site & 2 Days WFH
  • Data Transformation Program | Databricks Lakehouse (Delta Lake/Spark)

Apply today and Peter Li will reach out with further information.


Data Engineer

  • New Zealand
  • Auckland
  • Permanent
  • Great benefits package on offer

Overview:

Join a leading financial services organisation as a Data Engineer.

Based in Auckland, this forward-thinking company is on a mission to redefine excellence through innovative technology and exceptional service.

In this pivotal role, you will design, build, and operate reliable data ingestion and transformation pipelines in Azure Databricks, playing a crucial role in delivering actionable insights and high-priority business metrics. If you’re passionate about data quality and operational resilience, and ready to contribute to their Data and AI strategy, we want to hear from you!

Required Skills:

  • Proven experience with Azure Databricks, Spark, Python, and SQL.
  • Strong understanding of medallion architecture and practical ELT patterns.
  • Familiarity with data quality monitoring and the lifecycle of Databricks SQL alerts.
  • Ability to collaborate effectively with cross-functional teams and external vendors to resolve data-related issues.
  • Excellent documentation skills for pipelines, runbooks, and change records.

Nice to Have Skills:

  • Experience with Power BI for data visualization and reporting.
  • Background in agile project methodologies.
  • Knowledge of data governance and compliance best practices.
  • A proactive mindset in managing risks and escalating issues as necessary.

Preferred Education and Experience:

  • Bachelor’s degree in Computer Science, Data Science, Information Technology, or a related field.
  • A minimum of 3 years of professional experience in data engineering or a related discipline.

If you are ready to embrace the challenge of transforming data into high-quality insights for clients and stakeholders, apply with an updated CV.

** Please note that applications will not be actioned until the new year. Have a safe and happy holiday. **
