Senior PII Analyst

  • Australia
  • Melbourne
  • Contract
  • Negotiable
  • Initial 12 Month Contract | Potential For Extensions
  • Clayton Location | 3 Days On-Site & 2 Days WFH
  • Data Transformation Program | Data Privacy Laws | Data Loss Prevention

The Role: The Senior PII Analyst is responsible for conducting detailed data discovery and mapping exercises to identify, classify, and document Personally Identifiable Information (PII) across the digital landscape.

The Responsibilities:

  • Contribute to the strategic planning and implementation of PII analysis activities for programs that support data security goals, including leading the identification and cataloguing of PII assets across core systems.
  • Provide a range of analysis services, including: working with stakeholders to identify vulnerabilities that could lead to data breaches or non-compliance; aligning practices with regulations such as the GDPR, CCPA, and HIPAA, and other data protection laws; and preparing reports for audits and regulatory reviews.
  • Undertake risk assessments to locate and classify data that qualifies as PII, including mapping the flow of sensitive data between systems to identify unauthorised replication, “shadow” spreadsheets, and integration risks, and reviewing data collection points (web forms, manual uploads, applications) to identify where unneeded PII enters the environment.
  • Implement data protection measures by recommending encryption, masking, or anonymisation techniques, and work with IT and security teams to enforce access controls (see the sketch after this list).
  • Collaborate with business owners to document the specific business justification for retaining sensitive fields, challenging “just-in-case” collection practices while building partnerships and networks within the team and with other relevant business units.
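
For illustration only, here is a minimal Python sketch of the kind of masking and pseudonymisation work described above. The column names, salt value, and masking rules are assumptions made for the example, not details taken from the role or its systems.

```python
import hashlib

import pandas as pd

# Hypothetical sample records; the column names are illustrative only.
records = pd.DataFrame({
    "customer_id": [101, 102],
    "email": ["jane@example.com", "sam@example.com"],
    "phone": ["0400 000 001", "0400 000 002"],
})

def pseudonymise(value: str, salt: str = "rotate-this-salt") -> str:
    """Replace a PII value with a truncated, salted SHA-256 digest."""
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()[:16]

def mask_phone(value: str) -> str:
    """Mask all but the last three digits of a phone number."""
    digits = value.replace(" ", "")
    return "*" * (len(digits) - 3) + digits[-3:]

# Produce a copy that is safe to share outside the trusted zone.
safe = records.assign(
    email=records["email"].map(pseudonymise),
    phone=records["phone"].map(mask_phone),
)
print(safe)
```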

Skills & Experience Required:

  • Minimum 7 years of experience as a Senior PII Analyst.
  • Strong knowledge of data privacy laws and frameworks, and familiarity with data discovery tools and data loss prevention (DLP) solutions.
  • Demonstrated experience in business analysis, governance, and compliance, and in applying them to IT infrastructure and service improvements to meet business goals.
  • Highly developed planning and organisational skills, with experience establishing priorities, implementing improvements, and meeting deadlines.
  • High-level ability to analyse, isolate, and interpret business needs, and to develop and document appropriate functional specifications and solutions.

What’s in it for you:

  • Initial 9 Month Contract | Potential For Extensions
  • Clayton Location | 3 Days On-Site & 2 Days WFH
  • Data Transformation Program | Data Privacy Laws | Data Loss Prevention

Apply today and Peter Li will reach out with further information.

Senior Data Analyst

  • Australia
  • Melbourne
  • Contract
  • Negotiable
  • Initial 9 Month Contract | Potential For Extensions
  • Clayton Location | 3 Days On-Site & 2 Days WFH
  • Data Transformation Program | Databricks | Lakehouse (Delta Lake/Spark)

The Role: The Senior Data Analyst is responsible for safeguarding the system-agnostic business information models by modelling and stitching data within the existing enterprise data ecosystem, ensuring consistency and continuity.

The Responsibilities:

  • Develop and maintain advanced data transformations and analytical workflows using SQL, Python, and Databricks, operating on large-scale datasets within a Lakehouse (Delta Lake / Spark) architecture (an illustrative sketch follows this list).
  • Design and document business-aligned data models using best practice modelling principles.
  • Contribute to the definition and implementation of internal data modelling standards across the team.
  • Design and build well-structured, efficient reports and dashboards in Power BI that are tailored to business needs and based on trusted, modelled datasets.
  • Investigate complex data problems and independently prototype and implement analytical solutions within Databricks.
  • Conduct data profiling, validation, and quality remediation to ensure accuracy, completeness, and consistency.
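
For illustration only, a minimal PySpark sketch of the kind of Databricks transformation and data-profiling work described above. The table names, columns, and Delta layout are assumptions for the example, not details of the actual environment.

```python
from pyspark.sql import SparkSession, functions as F

# In a Databricks notebook a SparkSession is provided; this line keeps the sketch self-contained.
spark = SparkSession.builder.getOrCreate()

# Hypothetical source table in the Lakehouse.
orders = spark.read.table("bronze.orders")

# Curate a modelled dataset: de-duplicate, derive a reporting column, and write to Delta.
curated = (
    orders
    .dropDuplicates(["order_id"])
    .withColumn("order_month", F.date_trunc("month", F.col("order_ts")))
)
curated.write.format("delta").mode("overwrite").saveAsTable("silver.orders_curated")

# Lightweight profiling: row count, distinct keys, and null counts as a quality check.
profile = curated.select(
    F.count(F.lit(1)).alias("rows"),
    F.countDistinct("order_id").alias("distinct_orders"),
    F.sum(F.col("order_ts").isNull().cast("int")).alias("null_order_ts"),
)
profile.show()
```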

Skills & Experience Required:

  • Minimum 7 years of experience as a Data Analyst delivering high-quality analytics using SQL and Python within Databricks.
  • Strong understanding of Lakehouse architecture, particularly working with large-scale structured datasets.
  • Demonstrated ability to build and maintain robust data models and curated datasets to support reporting and analysis.
  • Experience working with semantic modelling tools (e.g. Lucidchart or equivalent) to visualise data models.
  • Extensive hands-on experience designing, building, and maintaining Power BI dashboards aligned with best practices.

What’s in it for you:

  • Initial 9 Month Contract | Potential For Extensions
  • Clayton Location | 3 Days On-Site & 2 Days WFH
  • Data Transformation Program | Databricks | Lakehouse (Delta Lake/Spark)

Apply today and Peter Li will reach out with further information.

Data Engineer

  • New Zealand
  • Auckland
  • Permanent
  • Great benefits package on offer

Overview:

Join a leading financial services organization as a Data Engineer.

Based in Auckland, this forward-thinking company is on a mission to redefine excellence through innovative technology and exceptional service.

In this pivotal role, you will design, build, and operate reliable data ingestion and transformation pipelines in Azure Databricks, playing a crucial role in delivering actionable insights and high-priority business metrics. If you’re passionate about data quality and operational resilience, and ready to contribute to their Data and AI strategy, we want to hear from you!
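
For illustration only, a minimal PySpark sketch of a bronze-to-silver step in a medallion-style ELT pipeline of the kind described above. The storage path, table names, and schema are assumptions, not details of the organisation's platform.

```python
from pyspark.sql import SparkSession, functions as F

# Supplied automatically by the Databricks runtime; included here for completeness.
spark = SparkSession.builder.getOrCreate()

# Bronze: land raw files as-is, stamping an ingestion timestamp for traceability.
raw = (
    spark.read.format("json")
    .load("abfss://landing@examplestorage.dfs.core.windows.net/transactions/")  # hypothetical path
    .withColumn("ingested_at", F.current_timestamp())
)
raw.write.format("delta").mode("append").saveAsTable("bronze.transactions")

# Silver: clean and conform the bronze data for downstream metrics.
silver = (
    spark.read.table("bronze.transactions")
    .filter(F.col("amount").isNotNull())
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropDuplicates(["transaction_id"])
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.transactions")
```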

Required Skills:

  • Proven experience with Azure Databricks, Spark, Python, and SQL.
  • Strong understanding of medallion architecture and practical ELT patterns.
  • Familiarity with data quality monitoring and the lifecycle of Databricks SQL alerts.
  • Ability to collaborate effectively with cross-functional teams and external vendors to resolve data-related issues.
  • Excellent documentation skills for pipelines, runbooks, and change records.

Nice to Have Skills:

  • Experience with Power BI for data visualization and reporting.
  • Background in agile project methodologies.
  • Knowledge of data governance and compliance best practices.
  • A proactive mindset in managing risks and escalating issues as necessary.

Preferred Education and Experience:

  • Bachelor’s degree in Computer Science, Data Science, Information Technology, or a related field.
  • A minimum of 3 years of professional experience in data engineering or a related discipline.

If you are ready to embrace the challenge of transforming data into high-quality insights for clients and stakeholders, apply with an updated CV.

** Please note that applications will not be actioned until the new year. Have a safe and happy holiday. **
