Senior PII Analyst

  • Australia
  • Melbourne
  • Contract
  • Negotiable
  • Initial 12 Month Contract | Potential For Extensions
  • Clayton Location | 3 Days On-Site & 2 Days WFH
  • Data Transformation Program | Data Privacy Laws | Data Loss Prevention

The Role: The Senior PII Analyst is responsible for conducting detailed data discovery and mapping exercises to identify, classify, and document Personally Identifiable Information (PII) across the digital landscape.

The Responsibilities:

  • Contribute to the strategic planning and implementation of PII analysis activities for programs supporting data security goals, including leading the identification and cataloguing of PII assets across core systems.
  • Provide a range of analysis services, including: working with stakeholders to identify vulnerabilities that could lead to data breaches or non-compliance; aligning practices with regulations such as GDPR, CCPA, HIPAA, or other data protection laws; and preparing reports for audits and regulatory reviews.
  • Undertake risk assessments to locate and classify data that qualifies as PII, including mapping the flow of sensitive data between systems to identify unauthorised replication, “shadow” spreadsheets, and integration risks, and reviewing data collection points (web forms, manual uploads, applications) to identify where unneeded PII is entering the environment.
  • Implement data protection measures by recommending encryption, masking, or anonymisation techniques, and work with IT and security teams to enforce access controls.
  • Collaborate with business owners to document the specific business justification for retaining sensitive fields, challenging “just-in-case” collection practices while building partnerships and networks within the team and other relevant business units.

Skills & Experience Required:

  • Minimum 7 years of experience as a Senior PII Analyst.
  • Strong knowledge of data privacy laws and frameworks, and familiarity with data discovery tools and data loss prevention (DLP) solutions.
  • Demonstrated experience in business analysis, governance and compliance and its application to the IT infrastructure and service improvements to meet business goals.
  • Highly developed planning and organisational skills, with experience establishing priorities, implementing improvements and meeting deadlines.
  • High-level ability to analyse, isolate and interpret business needs, and to develop and document appropriate functional specifications and solutions.

What’s in it for you:

  • Initial 9 Month Contract | Potential For Extensions
  • Clayton Location | 3 Days On-Site & 2 Days WFH
  • Data Transformation Program | Data Privacy Laws | Data Loss Prevention

Apply today and Peter Li will reach out with further information.

Apply now

Submit your details and attach your resume below. Hint: make sure all relevant experience is included in your CV and keep your message to the hiring team short and sweet - 2000 characters or less is perfect.

Senior Data Analyst

  • Australia
  • Melbourne
  • Contract
  • Negotiable
  • Initial 9 Month Contract | Potential For Extensions
  • Clayton Location | 3 Days On-Site & 2 Days WFH
  • Data Transformation Program | Databricks | Lakehouse (Delta Lake/Spark)

The Role: The Senior Data Analyst is responsible for safeguarding system-agnostic business information models by modelling and stitching data within the existing enterprise data ecosystem, ensuring consistency and continuity.

The Responsibilities:

  • Develop and maintain advanced data transformations and analytical workflows using SQL, Python, and Databricks, operating on large-scale datasets within a Lakehouse (Delta Lake / Spark) architecture.
  • Design and document business-aligned data models using best practice modelling principles.
  • Contribute to the definition and implementation of internal data modelling standards across the team.
  • Design and build well-structured, efficient reports and dashboards in Power BI that are tailored to business needs and based on trusted, modelled datasets.
  • Investigate complex data problems and independently prototype and implement analytical solutions within Databricks.
  • Conduct data profiling, validation, and quality remediation to ensure accuracy, completeness, and consistency.

Skills & Experience Required:

  • Minimum 7 years of experience as a Data Analyst delivering high-quality analytics using SQL and Python within Databricks.
  • Strong understanding of Lakehouse architecture, particularly working with large-scale structured datasets.
  • Demonstrated ability to build and maintain robust data models and curated datasets to support reporting and analysis.
  • Experience working with semantic modelling tools (e.g. Lucidchart or equivalent) to visualise data models.
  • Extensive hands-on experience designing, building, and maintaining Power BI dashboards aligned with best practices.

What’s in it for you:

  • Initial 9 Month Contract | Potential For Extensions
  • Clayton Location | 3 Days On-Site & 2 Days WFH
  • Data Transformation Program | Databricks | Lakehouse (Delta Lake/Spark)

Apply today and Peter Li will reach out with further information.


Data Engineer

  • New Zealand
  • Auckland
  • Permanent
  • Great benefits package on offer

Overview:

Join a leading financial services organisation as a Data Engineer.

Based in Auckland, this forward-thinking company is on a mission to redefine excellence through innovative technology and exceptional service.

In this pivotal role, you will design, build, and operate reliable data ingestion and transformation pipelines in Azure Databricks, playing a crucial role in delivering actionable insights and high-priority business metrics. If you’re passionate about data quality and operational resilience, and ready to contribute to their Data and AI strategy, we want to hear from you!

Required Skills:

  • Proven experience with Azure Databricks, Spark, Python, and SQL.
  • Strong understanding of medallion architecture and practical ELT patterns.
  • Familiarity with data quality monitoring and the lifecycle of Databricks SQL alerts.
  • Ability to collaborate effectively with cross-functional teams and external vendors to resolve data-related issues.
  • Excellent documentation skills for pipelines, runbooks, and change records.

Nice to Have Skills:

  • Experience with Power BI for data visualization and reporting.
  • Background in agile project methodologies.
  • Knowledge of data governance and compliance best practices.
  • A proactive mindset in managing risks and escalating issues as necessary.

Preferred Education and Experience:

  • Bachelor’s degree in Computer Science, Data Science, Information Technology, or a related field.
  • A minimum of 3 years of professional experience in data engineering or a related discipline.

If you are ready to embrace the challenge of transforming data into high-quality insights for clients and stakeholders, apply with an updated CV.

** Please note that applications will not be actioned until the new year. Have a safe and happy holiday! **


Senior Platform Manager AI/ML

  • Australia
  • Melbourne
  • Permanent
  • Negotiable

Join one of Australia’s leading national retailers as an AI/ML Platform Engineering Manager, where you will play a pivotal role in shaping and scaling the organisation’s next-generation machine learning capabilities. This is a permanent leadership opportunity overseeing a high-performing team of six engineers and driving end-to-end ownership of the AI/ML platform.

About the Role

In this role, you will lead the design, development, and delivery of critical AI/ML platform components supporting enterprise-wide machine learning initiatives. You will partner closely with data scientists, ML engineers, software developers, and platform teams to translate business needs into scalable and secure platform capabilities.

Key Responsibilities

  • Lead the architecture and development of AI/ML platform components including model training environments, feature stores, model registries, deployment pipelines, and monitoring frameworks.
  • Manage end-to-end delivery of platform enhancements, ensuring scalability, reliability, and high performance across all ML workloads.
  • Collaborate cross-functionally with technical and non-technical stakeholders to gather requirements and enable seamless delivery of AI/ML solutions.
  • Implement and champion best practices in MLOps, including model versioning, lifecycle management, CI/CD, observability, and platform security.
  • Drive continuous improvement by integrating new tools, frameworks, and automation to streamline and accelerate ML workflows.
  • Oversee platform reliability, incident response, operational support, and capacity planning to maintain enterprise-grade availability.
  • Ensure all platform capabilities adhere to organisational data governance, security standards, and relevant regulatory requirements.
  • Provide regular reporting, insights, and strategic recommendations to senior leadership on platform performance, risks, and future opportunities.

About You

You are a technical leader with a passion for building robust, scalable AI/ML platforms and empowering teams to deliver impactful machine learning outcomes. You thrive in a fast-paced, collaborative environment and bring deep experience in MLOps, automation, and cloud-native engineering.

What’s on Offer

  • Permanent leadership role in a high-growth AI/ML environment
  • Opportunity to shape a major retailer’s enterprise ML ecosystem
  • Lead and develop a team of six talented engineers
  • Competitive salary, benefits, and long-term career pathways

For more information contact Melissa Haddad at melissa.haddad@talentinternational.com


Salesforce Data Migration Designer

  • Australia
  • Sydney
  • Contract
  • AU$800 - AU$825 per day

Contract & Location:

  • 6-month initial contract (possible extension)
  • Hybrid role, Sydney-based (2 days on-site)
  • Australian citizens eligible for Baseline clearance only

Role Summary:
Responsible for planning, designing, and delivering Salesforce data migration strategies, ensuring data integrity, accuracy, and a seamless transition from legacy systems. Collaborate with business and technical teams to achieve project objectives.

Key Responsibilities:

  • Plan and execute Salesforce data migrations using ETL processes.
  • Lead data mapping workshops and create clear SQL/SOQL mapping documentation.
  • Analyse data, resolve anomalies, and ensure data quality and readiness for cutover.
  • Work closely with Salesforce developers, testers, and business analysts throughout testing and migration phases.

Requirements & Skills:

  • Strong SQL or query language skills
  • Experience with Salesforce (SOQL, data architecture, objects)
  • Understanding of ETL/migration processes and data models
  • Ability to produce clear mapping documentation
  • Experience in large-scale migrations, Apex, Lightning, or AWS Athena is a plus
  • Strong collaboration and communication skills

If you’re an Australian citizen eligible for Baseline clearance and ready to take on this hybrid Sydney-based role, apply now!
