AWS Data Engineer

  • Australia
  • Australian Capital Territory
  • Contract
  • Negotiable
  • Strong experience with AWS data engineering including AWS Glue, PySpark, Lambda, and modern data lake technologies
  • 12-month contract with 4 x 12-month extension options
  • Work from Canberra or remotely from anywhere in Australia
  • Baseline security clearance required

Our Client

This Federal Government agency works in close collaboration with business and the Australian Government to register and administer intellectual property rights and legislation, including patents, trade marks, and plant breeder's rights. With a vision to build prosperity through innovation, our client is working to build a world-class intellectual property system that gives customers access to the services they need in order to innovate.

The Role

We are seeking an AWS Data Engineer to play a key role in designing, building, and maintaining large-scale data workflows within a modern cloud data platform. In this role, you will contribute to the development of a robust enterprise data lake environment, enabling high-quality data integration, transformation, and accessibility across the organisation.

Your duties will include:

  • Designing and building scalable data ingestion pipelines from databases, APIs, flat files, and XML sources into a cloud data lake environment
  • Developing and maintaining data transformation workflows, including complex XML transformations and cross-agency data exchange processes
  • Supporting and maintaining production data workflows across the platform to ensure reliability and performance
  • Managing platform infrastructure components such as CI/CD pipelines and AWS data processing services
  • Collaborating with SQL developers, business analysts, and DevOps engineers to deliver data solutions and integrations
  • Providing technical leadership, mentoring team members, and documenting workflows to support knowledge sharing and operational continuity

Skills and Experience we are looking for:

  • Strong experience designing and developing ETL/ELT pipelines and data workflows within cloud environments
  • Hands-on experience with AWS services including Glue, Lambda, DMS, OpenSearch, and S3
  • Advanced data engineering capabilities using PySpark, Python, and SQL
  • Experience working with modern data formats such as Parquet and Iceberg within data lake architectures
  • Proven ability to work within Agile delivery teams, collaborating with both technical and business stakeholders
  • Strong problem-solving, communication, and documentation skills with the ability to mentor and guide technical teams

Application Process

If you would like to apply for this opportunity, please click 'APPLY'. For further information, please contact Jaela Smith on 02 6129 6302 or email jaela.smith@talentinternational.com.

For over 30 years, Talent has been redefining the contracting experience with industry-leading support, exclusive contractor benefits, and a world-class digital platform, ENGAGE, to access it all. Apply today to see how we can elevate your career.

Apply now

Submit your details and attach your resume below. Hint: make sure all relevant experience is included in your CV and keep your message to the hiring team short and sweet - 2,000 characters or fewer is perfect.