Data Architect(s)

  • Australia
  • Adelaide
  • Contract
  • Negotiable

Role Purpose

The Data and Information Architect is accountable for leading the design and development of the Department’s data and information architecture, ensuring alignment with the organisation’s overall technology strategy and digital capability priorities.

The role contributes to the enterprise technology roadmap and drives initiatives that enable the organisation to become an insight-driven organisation. Working closely with internal stakeholders and external partners, the role translates business needs into scalable, secure, and maintainable data and information solutions.


Key Outcomes & Responsibilities

Architecture Strategy & Leadership

* Lead the design and development of the organisation’s data and information architecture across multiple viewpoints and levels of abstraction to support business, information, security, and technology decision-making.
* Assess current and target state architectures, identify gaps, and develop recommendations aligned to strategic and tactical objectives.
* Define transition roadmaps and architecture plans that align organisational mission and vision with innovative, value-driven outcomes.
* Act as a trusted architectural adviser to senior management, providing direction to delivery teams and thought leadership on emerging technology trends.

Design, Modelling & Standards

* Present architecture roadmaps and landscape views to senior stakeholders.
* Develop UML, ER, and other architectural models for application and database teams.
* Translate business and application requirements into component, interface, and integration specifications.
* Ensure solutions adhere to enterprise architectural principles including reusability, modularity, loose coupling, security, performance, scalability, availability, and integrity.

Data Management & Governance

* Design and implement data models, data integration patterns, and data governance frameworks to ensure data quality, consistency, and security.
* Develop and maintain data dictionaries, taxonomies, and data lineage artefacts to support governance and compliance requirements.
* Provide guidance, coaching, and training to teams on data management standards and best practices.

Technology Evaluation & Delivery

* Evaluate technologies, software, processes, and development methodologies to define best-practice approaches for application, data, and infrastructure solutions.
* Lead the implementation of data and information technologies including data warehousing, analytics platforms, and big data solutions.
* Collaborate with IT operations and development teams to design, implement, and maintain infrastructure and network solutions aligned with business and enterprise strategy.

Collaboration & Stakeholder Engagement

* Work closely with Data, Information and Analytics teams to deliver data visualisation and reporting solutions that enable insight-driven decision-making.
* Represent the organisation in cross-agency forums and oversee specialist consultancy services related to application and data architecture.
* Initiate, lead, and manage collaborative relationships with internal and external stakeholders including executives, clients, consultants, and service providers.
* Provide expert advice to Executive Leadership on solution boundaries, scope, and architectural impacts.

Work Health & Safety

* Contribute to a safe and healthy work environment by identifying and reporting hazards, incidents, and injuries in accordance with organisational policies and procedures.
* Comply with reasonable WHS instructions from line management and WHS Officers.

Apply now

Submit your details and attach your resume below. Hint: make sure all relevant experience is included in your CV and keep your message to the hiring team short and sweet - 2000 characters or less is perfect.

Data Engineer

  • Australia
  • Adelaide
  • Contract
  • Negotiable
  • 12 months initial contract; with possible extension
  • Adelaide Based Position

Our client is looking for a highly proficient and dedicated data engineer professional who thrives in a fast-paced, supportive environment to assist with the development of a new data-driven, flagship IT system with high visibility in the sector.

Key Competencies:

  • Minimum of 10 years’ experience as a Data Engineer designing, developing and deploying domain-specific data solutions with high-volume, complex, disparate datasets on cloud platforms, preferably Microsoft Azure.
  • Demonstrated experience designing production, operational data solutions that underpin business critical applications.
  • Experience developing data pipelines that meet availability, latency, and reliability requirements for business-critical processes.
  • Proven ability to independently identify and translate complex requirements from a variety of sources (e.g. functional and non-functional requirements, user stories, etc.) into technical data solution designs.
  • Proven experience implementing data quality controls, validation rules, reconciliation processes, and exception handling across different layers.
  • Experience designing auditability, traceability, and reprocessing mechanisms to support operational assurance and compliance.
  • Experience implementing data security controls, including role-based access control (RBAC).
  • Experience implementing monitoring, logging, and alerting for data pipelines and data services to ensure operational visibility and rapid issue resolution.
  • Experience managing schema evolution, backward compatibility, and change impacts across multiple environments (development, test, and production).
  • Proven experience designing, implementing and maintaining scalable ETL/ELT pipelines with complex transformations across Azure Data Factory, Databricks and SQL services, including ingestion of data from disparate sources (e.g. Databricks, SQL, APIs) and various file formats (e.g. CSV, JSON, Excel, Parquet).
  • Advanced SQL, Python, Apache Spark, Databricks and Azure service engineering skills.
  • Extensive understanding of data lake and lakehouse architectures, schema evolution and data-quality frameworks.
  • Experience creating data models, including those associated with software/application design, administration functionality, data process (e.g. data mastery/MDM, data lake) and presentation in reports and dashboards.
  • Experience with Azure DevOps (e.g. repositories, branching strategies, CI/CD, etc.).
  • Understanding of Australian government data governance, security and compliance standards, policies and industry best practice.
  • Experience with lineage mapping (source to target).
  • Experience developing and maintaining dbt models that follow engineering best practice.
  • Experience collaborating with multidisciplinary internal and external teams, including technical staff (e.g. developers, data engineers, testers, etc.), product owners, and senior leadership to inform design decisions and to validate development outcomes.
  • Experience working in agile, product-led delivery teams is preferred but not essential.
  • Demonstrated ability to set and manage your own work activities to ensure timelines are met and outputs are fit for purpose and of high quality.
  • Communicates orally and in writing in a manner that is clear, fluent and holds the audience’s attention, and prepares and delivers information in a logical, sequential and succinct manner.
  • Ability and willingness to work at a rapid pace in a supportive environment with no existing system to refer to and evolving requirements.
  • Integrity, honesty, fairness, impartiality and commitment to values-based leadership.
  • Relevant qualifications preferred (e.g. Computer Science, Data Engineering, etc.).
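The data quality controls and reconciliation processes called out above follow a common shape. As a minimal, library-free sketch (dataset and field names are hypothetical; in practice the same checks would run between, say, an Azure SQL source and a Databricks table):

```python
# Sketch of a source-to-target reconciliation control: compare row counts,
# key sets, and a control total between two pipeline layers.

def reconcile(source_rows, target_rows, key="id", amount="amount"):
    """Return a list of reconciliation issues (empty list means the layers match)."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    if missing := src_keys - tgt_keys:
        issues.append(f"keys missing from target: {sorted(missing)}")
    src_total = sum(r[amount] for r in source_rows)
    tgt_total = sum(r[amount] for r in target_rows)
    if src_total != tgt_total:
        issues.append(f"control total mismatch: {src_total} vs {tgt_total}")
    return issues

source = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]
target = [{"id": 1, "amount": 100}]
print(reconcile(source, target))
```

Any non-empty result would typically be routed to the exception-handling and alerting layer rather than silently logged.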

Apply now or reach out to Ivan Aureus at 0480 806 152.


Contract Data Analyst - Finance/Costing

  • New Zealand
  • Auckland
  • Contract
  • Negotiable
  • 12-month contract | Hamilton | Hybrid, 3 days in the Hamilton office
  • Intermediate level skills required + excellent communication skills
  • Available to start ASAP – 1-2 weeks

Opportunity knocks…

We’re seeking a Data Analyst for a 12-month contract to support commercial and costing decision-making across the end-to-end value chain. This role is about more than producing numbers – it’s about turning complex data into clear, business-ready insights that leaders can act on.
If you’re a Data Analyst with strong Excel skills, commercial curiosity, and confidence engaging stakeholders, this role offers variety, visibility, and impact.


The inside word…

  • High-impact commercial / costing analytics role
  • Work with large, complex datasets across multiple business areas
  • Focus on insight, storytelling, and decision support
  • Fast-paced, varied workload (not repetitive)
  • Regular engagement with senior stakeholders
  • Hybrid working: typically 3 days in office

You’re a legend because…

  • You have a strong analytical mindset and naturally spot patterns and anomalies
  • You are advanced in Excel (complex formulas, large models, heavy data manipulation) – essential
  • You can translate analysis into clear insights and recommendations
  • You’re comfortable engaging stakeholders and discussing ambiguity
  • You bring commercial curiosity and care about business outcomes

Nice to have:

  • Power BI experience
  • Finance or costing exposure


Commercial Business and Data Analyst

  • Australia
  • Melbourne
  • Permanent
  • AU$100800 - AU$135000 per annum

Opportunity

  • Permanent career opportunity: $90,000 to $120,000 + superannuation
  • Junior to intermediate commercially focused Business and Data Analyst
  • Inner Melbourne (less than 3km from the CBD), access to parking

As the Commercial BA/Data Analyst you will own and support a range of commercial pricing and estimation databases and tools which support this expanding business. You will work closely with the IT Manager, Operations Manager, Estimators, and Finance along with the Executive Leadership Team (ELT).

Based in Docklands, you will be joining a family-friendly, tight-knit national team of 30-50+ staff, which offers a social culture, on-site gym, walking groups, on-site parking, regular social events, and a commitment to working at the bleeding edge of AI and automation technology.

Role
The three main drivers for this role are to: improve project execution through accurate estimation, support standardised supplier onboarding, and drive and own pricing processes and systems.

Day-to-day this will involve:

  • Maintaining the integrity, accuracy and usability of the organisation’s pricing, estimation, booking and logistics, subcontractor registration, and claims databases and applications, which are a combination of off-the-shelf API-based applications, Metabase, and bespoke SQL and Excel based applications
  • Ensuring data accuracy and governance
  • Developing dashboards and reports for the business
  • Analysing and managing processes and workflows, developing SOPs for pricing and estimation, and seeking out opportunities for continuous improvement in these areas
  • Driving projects as well as providing operational support to the business for the pricing and estimation ecosystem
  • Assisting with the migration from Power BI to Metabase

About YOU

  • 2-5 years’ experience as a Business and Data Analyst or Data Analyst in a small to medium enterprise (approx. 50-100 staff)
  • Hands-on expertise in Data, Reporting and Analytics
  • Strong SQL development, advanced Excel, report design and development, and the ability to extract, transform and load data
  • Experience with Azure SQL, Excel, and reporting dashboards (e.g. Tableau, Power BI, Metabase)
  • Understanding of API requirements
  • Any experience migrating from or to Power BI or Metabase would be very highly regarded
  • Ability to pivot from end-user support to planning and driving projects in a varied, hybrid role
  • Strong business process analysis and systems analysis skills – the ability to identify, document and drive user requirements including business process maps and standard operating procedures (SOPs)
  • Tertiary qualifications in ICT, Data, or Business Systems (ERP)

You’re a highly collaborative Data Analyst able to work with a range of people from the CEO to Estimators with strong business and data analysis experience.

Above all you have a thirst for knowledge and developing your skills and experience in a varied role combining BA, Data Analysis, and some Data Engineering.

Any experience within insurance, construction or engineering organisations would be highly regarded.

Application
For a confidential conversation about this great career opportunity, please contact:

Kylie.McManus@talentinternational.com // 0408 388 680
Liam.Lasslett@talentinternational.com // 0407 542 822


Salesforce Reporting Analyst

  • Australia
  • Melbourne
  • Contract
  • AU$65.03 - AU$73.85 per hour

Our client is a large, purpose-driven Federal Government Agency seeking an APS6 Salesforce Reporting Analyst to enable strategic reporting and visualisation within Salesforce Health Cloud.

Australian Citizenship required (MANDATORY – Federal Government)

  • 12-month initial contract with the possibility of further 12-month extension
  • APS6 hourly rate: $65.03 – $73.85 per hour + super
  • Hybrid working arrangement
  • Location: Melbourne CBD
  • Already holding or ability to obtain Baseline AGSVA clearance

Responsibilities may include but are not limited to:

  • Build and maintain robust datasets and reports that offer strategic insights.
  • Prepare reports within Salesforce in response to time-critical bodies of work, such as requests for data for Senate Estimates briefs.
  • Create interactive dashboards within Salesforce to support business decision-making.
  • Interpret data and analyse results using standardised techniques.
  • Ensure data accuracy, integrity and security across the Department’s instance of Salesforce.
  • Prepare SOPs and reports for executive and project teams.
  • Assist with system administration, including flows, objects, permissions and enhancement requests.
  • Implement ideas and strategies to achieve service delivery efficiencies and/or cost reductions within accountabilities.
  • Support the Director / Assistant Director with ad-hoc unscheduled tasks and data requests when required.
  • Partner with internal stakeholders to support projects and continuous improvement initiatives.
  • Contribute to continuous improvement and other support activities across the project, including business-as-usual and training activities, by providing leadership, knowledge and skills transfer to cross-functional team members.

Essential criteria:

  • 2+ years’ experience in Salesforce reporting and analytics is essential. Experience with Salesforce Health Cloud would be highly desirable
  • Experience in drawing insights, building narratives and reporting on complex data sets
  • Experience in data visualisation
  • Experience in Tableau highly desirable
  • Ability to manage and prioritise multiple tasks in a fast-paced environment
  • Desirable: experience working with clinical/health data
  • Military experience/knowledge is desirable but not essential.

APPLICATIONS CLOSE WEDNESDAY, 11th FEBRUARY.

APPLY:
Submit your resume, or for further information please contact Liam.Lasslett@talentinternational.com


Databricks SME

  • Australia
  • Sydney
  • Permanent
  • AU$200000 - AU$210000 per annum

Our client is seeking an experienced AWS Databricks Subject Matter Expert (SME) to support the delivery, optimisation, and ongoing enhancement of enterprise-scale data solutions on AWS. This is a hands-on role, focused on building and improving Databricks workloads, supporting data pipelines, and working within established platform and architectural standards.

You will work closely with data engineers, analytics teams, and stakeholders to deliver reliable, secure, and high-performing data solutions, rather than owning end-to-end architecture. Ideally, the successful candidate can start ASAP or with a 2-week notice period.

Responsibilities

  • Build, maintain, and optimise Databricks solutions on AWS to support analytics and data engineering use cases

  • Develop and support batch and streaming data pipelines using PySpark, SQL, and Databricks workflows

  • Contribute to solution design and technical decisions within existing architectural frameworks

  • Implement performance tuning, cost optimisation, and operational improvements across Databricks workloads

  • Integrate Databricks with AWS services including S3, IAM, Glue, Redshift, Athena, Lambda, and Kinesis

  • Apply established security, governance, and access controls using Unity Catalog, IAM, and encryption

  • Support data ingestion, transformation, and consumption layers across the platform

  • Collaborate with cloud, security, and platform teams to ensure solutions meet enterprise standards

  • Participate in code reviews, documentation, and knowledge sharing

  • Support CI/CD pipelines and automation for Databricks deployments

Requirements

  • Proven hands-on experience as an AWS Databricks SME in enterprise environments

  • Strong working knowledge of Databricks, Apache Spark, Delta Lake, and Unity Catalog

  • Advanced skills in PySpark and SQL, including performance tuning

  • Solid experience with AWS services such as S3, IAM, Glue, Redshift, EMR, and streaming services

  • Understanding of data security, governance, and compliance in regulated environments

  • Experience working within predefined architecture and platform standards

  • Familiarity with REST integrations, data APIs, and downstream analytics / BI tools

  • Experience with CI/CD pipelines, Git-based workflows, and automated deployments

  • Strong communication skills and ability to collaborate with both technical and non-technical stakeholders
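The Delta Lake and Unity Catalog work described above centres on patterns like the upsert ("MERGE INTO"). As a library-free sketch of the semantics (on Databricks this would be a PySpark `DeltaTable.merge(...)` call or Spark SQL; table and column names here are hypothetical):

```python
# Sketch of Delta-style MERGE semantics: rows whose key matches an existing
# row overwrite it; rows with new keys are inserted.

def merge_upsert(target, updates, key="id"):
    """Apply updates to target: matched keys are overwritten, new keys appended."""
    merged = {row[key]: row for row in target}      # current table state, keyed
    for row in updates:
        merged[row[key]] = row                      # update-or-insert by key
    return sorted(merged.values(), key=lambda r: r[key])

table = [{"id": 1, "status": "open"}, {"id": 2, "status": "open"}]
batch = [{"id": 2, "status": "closed"}, {"id": 3, "status": "open"}]
print(merge_upsert(table, batch))
```

The Spark SQL equivalent on Databricks would be along the lines of `MERGE INTO target USING updates ON target.id = updates.id WHEN MATCHED THEN UPDATE SET * WHEN NOT MATCHED THEN INSERT *`.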

If you think you have the above skills and experiences, click the ‘Apply’ button or send your resume to alex.nguyen@talentinternational.com

Please note: Visa sponsorship is not available. Only shortlisted applicants will be contacted for this role.


Data Engineer

  • New Zealand
  • Auckland
  • Permanent
  • Great benefits package on offer

Overview:

Join a leading financial services organisation as a Data Engineer.

Based in Auckland, this forward-thinking company is on a mission to redefine excellence through innovative technology and exceptional service.

In this pivotal role, you will design, build, and operate reliable data ingestion and transformation pipelines in Azure Databricks, playing a crucial role in delivering actionable insights and high-priority business metrics. If you’re passionate about data quality and operational resilience, and ready to contribute to their Data and AI strategy, we want to hear from you!

Required Skills:

  • Proven experience with Azure Databricks, Spark, Python, and SQL.
  • Strong understanding of medallion architecture and practical ELT patterns.
  • Familiarity with data quality monitoring and the lifecycle of Databricks SQL alerts.
  • Ability to collaborate effectively with cross-functional teams and external vendors to resolve data-related issues.
  • Excellent documentation skills for pipelines, runbooks, and change records.
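The medallion architecture and ELT patterns listed above describe a bronze (raw) → silver (cleaned) → gold (business-ready) flow. A minimal, library-free sketch of the idea (in Azure Databricks each layer would be a Delta table; records and field names here are hypothetical):

```python
# Sketch of the medallion (bronze/silver/gold) ELT pattern.

def to_silver(bronze):
    """Clean and conform raw records: drop invalid rows, normalise types."""
    return [
        {"customer": r["customer"].strip().lower(), "amount": float(r["amount"])}
        for r in bronze
        if r.get("customer") and r.get("amount") not in (None, "")
    ]

def to_gold(silver):
    """Aggregate conformed records into a business-ready metric."""
    totals = {}
    for r in silver:
        totals[r["customer"]] = totals.get(r["customer"], 0.0) + r["amount"]
    return totals

bronze = [
    {"customer": " Acme ", "amount": "10.5"},
    {"customer": "acme", "amount": "4.5"},
    {"customer": None, "amount": "99"},        # rejected in the silver layer
]
print(to_gold(to_silver(bronze)))
```

Keeping the raw bronze layer immutable is what makes the reprocessing and data-quality monitoring mentioned above practical: silver and gold can always be rebuilt from it.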

Nice to Have Skills:

  • Experience with Power BI for data visualization and reporting.
  • Background in agile project methodologies.
  • Knowledge of data governance and compliance best practices.
  • A proactive mindset in managing risks and escalating issues as necessary.

Preferred Education and Experience:

  • Bachelor’s degree in Computer Science, Data Science, Information Technology, or a related field.
  • A minimum of 3 years of professional experience in data engineering or a related discipline.

If you are ready to embrace the challenge of transforming data into high-quality insights for clients and stakeholders, apply with an updated CV.

** Please note that applications will not be actioned until the new year, have a safe and happy holiday. **


Data Engineer

  • New Zealand
  • Auckland
  • Contract
  • Negotiable

Opportunity knocks:

We’re on the hunt for a talented Data Engineer to join our client on an exciting 6-month contract! In this role, you’ll dive into BAU work, design efficient data pipelines, and lead the way in migrating to new platforms. If you’re passionate about harnessing the power of data to drive business insights and ready for a thrilling new challenge, we want to hear from you!

About You:

* Proficiency in Microsoft technologies, including SSIS, SQL, and Power BI
* 3-5 years of hands-on experience as a Data Engineer
* A quick learner who adapts effortlessly to existing systems and processes
* Strong analytical and problem-solving skills

Nice to Have Skills:

* Experience in regulated or enterprise-scale environments
* Familiarity with data migration best practices and methodologies

If this role sounds like YOU, don’t wait! Hit APPLY and share your CV with us and let’s get chatting!


Senior Data Engineer

  • Australia
  • Melbourne
  • Contract
  • AU$113 - AU$130 per hour + inc super

Our client is a federal government organisation with offices throughout Australia. Due to growth, they are seeking an APS6 AWS Data Engineer to join their team in Richmond or Geelong.

  • 12-month contract
  • Hybrid with 3 days per week onsite

Key duties and responsibilities

  • Experience in AWS Cloud, AWS S3, AWS Glue or similar tools within the cloud environment
  • Provide level 2/3 technical support for AWS, Control-M, Teradata (legacy EDW), Snowflake, and ETL tool-based data integration solutions.
  • DevOps – understand the DevOps process and use DevOps tools in accordance with it
  • Programming – a high level of competency in programming, including knowledge of supplementary languages such as Python
  • Experience in Control-M or similar scheduling applications.
  • Experience with file transfer tools such as GoAnywhere or similar
  • Version control – demonstrated knowledge of version control and its appropriate use
  • Facilitate continuous improvement measures in delivery principles, coding standards and documentation, and provide training sessions to the team.
  • Prioritise work items and add them to a work queue.
  • Understand, analyse and size user requirements.
  • Develop and maintain SQL analytical and ETL code.
  • Develop and maintain system documentation.
  • Work within a state-of-the-art, greenfield DevOps environment.
  • Collaborate with data consumers, database developers, testers and IT support teams.

To apply you will need the following skills and experience:

Essential criteria

  • Proficiency in AWS services related to data engineering: AWS Glue, S3, Lambda, EC2, RDS
  • Data pipeline design and development using ETL/ELT frameworks
  • Proficiency with Snowflake (cloud EDW) and Teradata (on-prem EDW)
  • Proficiency in programming languages: Python (preferred)
  • Control-M orchestration/monitoring or similar applications
  • Strong experience in operational support processes and working in a BAU environment

Desirable criteria

  • Experience with infrastructure-as-code tools: CloudFormation or Terraform
  • Exposure to at least one ETL tool such as dbt, Talend, or Informatica
  • Strong SQL proficiency
  • SAS (Base, EG, SAS Viya)
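A staple of the ETL/ELT pipeline work described in the criteria above is the incremental (watermark-based) load. As a library-free sketch (in practice the extract might be an AWS Glue job reading from S3, with the watermark kept in a control table; names here are hypothetical):

```python
# Sketch of a watermark-based incremental load: pick up only rows changed
# since the previous run, and advance the watermark for the next run.

def incremental_load(source_rows, watermark, ts="updated_at"):
    """Return rows newer than the watermark, plus the new watermark value."""
    fresh = [r for r in source_rows if r[ts] > watermark]
    new_watermark = max((r[ts] for r in fresh), default=watermark)
    return fresh, new_watermark

rows = [
    {"id": 1, "updated_at": "2026-01-01"},
    {"id": 2, "updated_at": "2026-01-03"},
]
fresh, wm = incremental_load(rows, watermark="2026-01-02")
print(fresh, wm)
```

Re-running with an unchanged source returns nothing and leaves the watermark in place, which is what makes daily scheduled runs (e.g. under Control-M) idempotent.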

Federal government role – proof of Australian citizenship required

APPLY:
Submit your resume or contact Shelley at shelley.harrison@talentinternational.com or call 0418 572 482 for further information. Shortlisted candidates will be contacted.

For over 30 years Talent has been redefining the contracting experience with industry-leading support, exclusive contractor benefits, and ENGAGE, a world-class digital platform to access it all. Apply today to see how we can elevate your career.


APS6 Senior Data Engineer

  • Australia
  • Sydney
  • Contract
  • AU$700 - AU$800 per day

Position: APS6 Senior Data Engineer (2 Positions)
Location: NSW (Hybrid – minimum 3 days in office per week)
Salary/Rate: $700-$800/day
Contract: 12 months
Start Date: Monday, 9 February 2026
Closing Date: Friday, 16 January 2026, 11:59pm (Canberra time)

About the Role:
We are seeking two experienced APS6 Senior Data Engineers to join a leading federal agency on a Labour Hire contract. You will be responsible for supporting and maintaining data assets across Cloud Data Lake, Cloud EDW, Legacy EDW, and SAS analytics platforms. This role focuses on operational support, ensuring data pipelines meet SLAs, and supporting all operational processes within the data platform.

Key Responsibilities:

  • Provide operational support for Cloud and on-prem data platforms (AWS, Snowflake, Teradata, SAS)

  • Develop, maintain, and optimise ETL/ELT pipelines and SQL analytical code

  • Support daily data delivery processes, month-end processing, and BAU data requests

  • Work with scheduling and orchestration tools (Control-M or similar)

  • Collaborate with stakeholders, data consumers, developers, testers, and IT support teams

  • Facilitate continuous improvement, maintain system documentation, and provide team training

Essential Skills & Experience:

  • Strong proficiency in AWS services: Glue, S3, Lambda, EC2, RDS

  • Experience designing and developing ETL/ELT data pipelines

  • Proficiency with Snowflake (Cloud EDW) and Teradata (On-Prem EDW)

  • Programming experience, preferably in Python

  • Experience with orchestration/scheduling tools such as Control-M

  • Strong operational support experience in BAU environments

Desirable Skills:

  • Experience with infrastructure-as-code tools (CloudFormation or Terraform)

  • Exposure to ETL tools such as DBT, Talend, Informatica

  • Strong SQL proficiency

  • Experience with SAS (Base, EG, SAS Viya)

Requirements:

  • Australian citizenship is mandatory (proof required prior to engagement)

  • Labour Hire Licence required for NSW contracts

  • Maximum 37.5 hours per week

  • Security clearance not required

Why Apply:

  • Work in a state-of-the-art, greenfield DevOps environment

  • Gain hands-on experience with modern Cloud and on-prem data platforms

  • Hybrid working arrangements with flexible in-office days

  • Be part of a federal agency making a meaningful impact for Australians

How to Apply:
Interested candidates must submit:

  1. Resume/CV

  2. Responses to all essential and desirable criteria (max 3000 characters per criterion)

Applications to: priya.gabriel@talentinternational.com
