Credit Analyst

  • Australia
  • Sydney
  • Permanent
  • AU$80000 - AU$90000 per annum + equity + private health

Role Overview
We’re hiring a Credit Analyst to join a high-growth FinTech redefining how eCommerce brands access capital. You’ll assess financial and business data to evaluate risk, support funding decisions, and work cross-functionally to shape better lending outcomes. You’ll get hands-on with deal analysis, customer insights, risk strategy, and portfolio performance. This is a career-defining opportunity to learn from global teams, influence core business outcomes, and grow with a scale-up backed by tier-one investors.
Key Responsibilities

  • Analyse financial and non-financial data of eCommerce brands

  • Identify credit risk and evaluate lending opportunities

  • Support regular client calls to understand funding needs

  • Help shape new risk models, tools and product initiatives

  • Partner with Sales and Ops to understand customer dynamics

  • Track portfolio health and contribute to risk mitigation strategy

What We’re Looking For

  • 1-2 years’ experience in credit, finance, investment or accounting

  • Strong academic record and a degree in Business, Finance, Law, Engineering or similar

  • Skilled in financial analysis and comfortable working with data

  • Strong communication and stakeholder engagement skills

  • Commercial mindset with a balance of growth and risk thinking

  • Self-starter, detail-focused, curious and analytical

  • Experience in Excel; SQL is a bonus

  • Interest in FinTech, eCommerce or lending is a strong plus

Why You’ll Love It

  • 25 days annual leave plus public holidays

  • Private health, life, and critical illness cover

  • Pension scheme to secure your future

  • Generous paid parental and adoption leave

  • Work from anywhere globally for up to 60 days per year

  • Equity scheme – share in the company’s success

  • Hybrid working across global offices (Sydney, Dublin, London, US)

  • Join a collaborative and ambitious team backed by leading banks and VCs

Be part of a global company changing how the world’s best online brands fund their growth. Apply now and help shape smarter financial solutions.


Enterprise Data Architect

  • Australia
  • Adelaide
  • Permanent
  • Negotiable
  • Adelaide-based position with hybrid working setup

The Enterprise Data Architect plays a key role in shaping, implementing, and governing the organisation’s enterprise data architecture and long-term data strategy. The position is responsible for driving the adoption and maturity of a unified data platform, using Databricks and related technologies to deliver scalable, governed, and cost-effective data solutions.

Through advanced data modelling, governance frameworks, and secure analytics delivery, this role supports improved decision-making across business functions including engineering, finance, operations, and project management. It also underpins the shift toward AI-driven insights and data-centric practices across the organisation.

The position requires strong technical expertise, the ability to bridge business and technology needs, and adaptability to evolving priorities and emerging opportunities.

Responsibilities:

  • Define and maintain enterprise data architecture, frameworks, and roadmaps.

  • Lead the design and delivery of scalable, secure, and cost-effective data solutions on Databricks and cloud platforms.

  • Oversee enterprise-level data governance, metadata management, and master data practices.

  • Establish and enforce standards for data modelling, warehousing, and medallion architectures (Bronze/Silver/Gold) – see the illustrative sketch after this list.

  • Design and maintain high-quality data pipelines to support analytics, BI, and AI/ML use cases.

  • Provide expert guidance in database and data warehouse design to meet both transactional and analytical needs.

  • Collaborate with business stakeholders, data scientists, analysts, and engineers to align data solutions with organisational priorities.

  • Evaluate and recommend tools, methods, and technologies to enhance data engineering and analytics capabilities.

  • Act as a subject matter expert on data governance, security, privacy, and compliance.
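
To illustrate the medallion (Bronze/Silver/Gold) layering referenced above, here is a minimal Delta Live Tables sketch in Python. It is illustrative only: the dataset names, paths and data-quality expectation are hypothetical, and it assumes a Databricks DLT pipeline where the dlt module and spark session are provided by the runtime.

```python
# Minimal Bronze -> Silver medallion flow as a Delta Live Tables pipeline.
# Illustrative only: names and paths are hypothetical; `dlt` and `spark`
# are provided by the Databricks DLT runtime.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Bronze: raw readings exactly as ingested")
def bronze_readings():
    return spark.read.format("json").load("/mnt/landing/readings/")

@dlt.table(comment="Silver: validated, de-duplicated readings")
@dlt.expect_or_drop("valid_value", "value IS NOT NULL")
def silver_readings():
    return (
        dlt.read("bronze_readings")
        .dropDuplicates(["reading_id"])
        .withColumn("processed_at", F.current_timestamp())
    )
```

A Gold layer would follow the same pattern, aggregating Silver tables into business-level views.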

Qualifications:

  • Significant background in architecting and implementing enterprise-scale data platforms using Databricks alongside major cloud environments (Azure, AWS, or GCP).

  • Track record of delivering robust, secure, and cost-efficient data solutions that scale with business needs.

  • Hands-on expertise with Databricks Lakehouse, Unity Catalog, Delta Live Tables, dbt, Python, SQL, CI/CD pipelines (Azure DevOps or GitHub), and infrastructure automation (Terraform or Bicep).

  • Deep understanding of data modelling techniques, modern warehousing practices, and layered architectures such as Bronze/Silver/Gold.

  • Strong history of working with cross-disciplinary teams to deliver integrated enterprise data outcomes.

  • Advanced knowledge of governance, security, privacy, and compliance standards for enterprise data environments.

  • Degree qualifications in Computer Science, IT, Engineering, or an equivalent field.

Desirable:

  • Industry certifications in Databricks, Azure, AWS, or GCP (e.g., Certified Data Engineer, Solutions Architect).

  • Familiarity with data visualisation and semantic modelling tools including Power BI or Tableau.

  • Experience gained in consulting or project-driven settings, with an appreciation for aligning technology with business outcomes.

If the above role sounds of interest, please click on “Apply Now”, or get in touch with Ivan via 0480 806 152 / (08) 8228 1502


EL1 Lead Actuary

  • Australia
  • Melbourne
  • Contract
  • Negotiable

The opportunity
Our client is a community-focused federal government agency. They have an exciting opportunity for an EL1-level Lead Actuary to join their Analytics, Data & Actuarial Division.

  • Richmond or Geelong location plus hybrid work-from-home
  • 12-month initial contract + 12-month extension, rates fully negotiable
  • Role only open to Australian Citizens – Federal government client

The role
As an EL1 Lead Actuary, your responsibilities will include:

  • Providing statistical and actuarial advice
  • Providing oversight and management of reporting, analysis and data management activities as required, including regular performance monitoring
  • Designing actuarial, statistical and mathematical models to undertake analytical work that responds to business issues, including actuarial monitoring and analyses, data tabulations, scheme projects and cost benefit analyses
  • Undertaking ad hoc modelling requests and report production
  • Presenting the outcomes of work undertaken, tailored to context via verbal and written communication

About you
To be successful in this role you will have:

  • Extensive experience as an Actuary, including leadership experience
  • Strong proficiency in R or Python (with a preference for R), including cleaning, transforming and summarising large datasets (e.g. data.table, dplyr)
  • Functional programming practices – training, evaluating and running inference with machine learning models, e.g. xgboost (see the illustrative sketch after this list)
  • Proficiency in SQL
  • Experience building statistical or machine learning models
  • Relevant academic qualifications, including tertiary qualifications in Actuarial Studies or Mathematics/Statistics, and progress towards Associate and/or Fellowship qualifications
  • Experience with agile delivery
  • Experience with version control systems, e.g. Git
  • Experience with package development in R (devtools, roxygen2) and dashboard development in R (Shiny)
  • Excellent communication and stakeholder engagement skills, ideally proven in large corporate, government or insurance organisations
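
To give a concrete flavour of the train/evaluate/infer criterion above, here is a minimal sketch in Python with xgboost. It is illustrative only and not part of the selection criteria: the role itself prefers R, and the data here is synthetic.

```python
# Minimal train / evaluate / infer loop with xgboost on synthetic data.
# Illustrative only; not part of the selection criteria.
import numpy as np
import xgboost as xgb
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1_000, 5))                 # synthetic features
y = 2.0 * X[:, 0] + rng.normal(size=1_000)      # synthetic target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = xgb.XGBRegressor(n_estimators=200, max_depth=3)
model.fit(X_train, y_train)                     # train

preds = model.predict(X_test)                   # infer
print("MAE:", mean_absolute_error(y_test, preds))  # evaluate
```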

Please note that as our client is a federal government organisation, you must be an Australian Citizen to be eligible for this role.

APPLY
Submit your resume, or for further information please contact Jarrodd.edwards@talentinternational.com

For over 30 years, Talent has been redefining the contracting experience with industry-leading support, exclusive contractor benefits and a world-class digital platform, ENGAGE, to access it all. Apply today to see how we can elevate your career.


Reporting Specialist

  • Australia
  • Melbourne
  • Contract
  • Inner Western suburbs | ASAP start | $$

Join this dynamic team as a Reporting Specialist supporting a high-profile project for a major utilities client. In this critical role, you’ll be responsible for ensuring consistent, accurate, and insightful reporting to support project delivery and strategic decision-making.
This is your opportunity to contribute to a visible and impactful initiative, empowering teams with data-driven insights while promoting collaboration and operational excellence.

Key Responsibilities:

  • Develop, manage, and deliver regular reports to internal and external stakeholders.
  • Ensure data integrity and consistency across all reporting outputs.
  • Collaborate cross-functionally to gather, interpret, and analyse project data.
  • Provide insights that support project tracking, issue resolution, and continuous improvement.

Skills and experience:

  • Previous experience in a similar role working in a regulated environment
  • Strong analytical and data management capabilities
  • Proficiency in reporting tools (e.g., Excel, Tableau, Power BI)
  • Ability to collaborate across multiple teams and departments
  • High attention to detail and strong organisational skills
  • Excellent written and verbal communication skills
  • Experience with project management tools and methodologies (e.g., Jira, Asana, MS Project) highly regarded

Apply now or contact Alistair Barr on 0480 804 583 for a confidential discussion.


ABBYY Consultant // Data mining & Task mining

  • Australia
  • Sydney
  • Contract
  • AU$1000 - AU$1100 per day

Our client is seeking a highly skilled Consultant with expertise in ABBYY AI technologies to support initiatives in data mining and task mining. The successful candidate will play a key role in analyzing complex processes, identifying automation opportunities, and leveraging ABBYY’s AI-driven tooling to deliver actionable insights and efficiency improvements.

Responsibilities

  • Work with stakeholders to gather business requirements and translate them into ABBYY AI Tooling solutions.

  • Configure, deploy, and optimize ABBYY Timeline and related tools for process discovery, data mining, and task mining.

  • Perform in-depth analysis of business processes using ABBYY to identify inefficiencies, bottlenecks, and automation opportunities (a generic illustration of this kind of analysis follows this list).

  • Build data models and visualizations to present findings and recommendations to business and technical stakeholders.

  • Collaborate with cross-functional teams, including business analysts, RPA developers, and process owners, to implement data-driven improvements.

  • Provide guidance and training on ABBYY AI Tooling best practices.

  • Stay current with ABBYY product updates and AI/automation trends to continuously enhance solution delivery.
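
To give a flavour of the process-analysis work described above, here is a minimal, generic event-log sketch in Python/pandas. It is illustrative only: it does not use ABBYY’s APIs, and the columns and data are hypothetical.

```python
# Generic bottleneck analysis over a process event log.
# Not ABBYY's tooling: a plain pandas sketch with hypothetical data.
import pandas as pd

events = pd.DataFrame({
    "case_id": [1, 1, 1, 2, 2, 2],
    "step": ["received", "reviewed", "approved"] * 2,
    "timestamp": pd.to_datetime([
        "2024-01-01 09:00", "2024-01-01 11:30", "2024-01-02 10:00",
        "2024-01-03 08:00", "2024-01-03 08:45", "2024-01-03 09:10",
    ]),
})

events = events.sort_values(["case_id", "timestamp"])

# Time taken to reach each step from the previous event in the same case.
events["duration"] = events.groupby("case_id")["timestamp"].diff()

# Long average transition times flag likely bottlenecks.
print(events.groupby("step")["duration"].mean())
```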

Requirements

  • Proven experience in ABBYY AI Tooling (Timeline, Process Intelligence, or equivalent modules).

  • Strong background in data mining, process/task mining, or process intelligence solutions.

  • Proficiency in working with large datasets, data modeling, and process visualization.

  • Knowledge of business process management (BPM), robotic process automation (RPA), and intelligent automation frameworks.

  • Hands-on experience with SQL, Python, or other scripting languages (preferred).

  • Excellent problem-solving, analytical, and communication skills.

  • Ability to engage with both technical and non-technical stakeholders to translate insights into business value.

  • ABBYY certifications (e.g., ABBYY Timeline or Process Intelligence Specialist) are highly desirable.

  • Experience in consulting, automation, or digital transformation projects is a plus.

If you think you have the above skills and experiences, click the ‘Apply’ button or send your resume to alex.nguyen@talentinternational.com


Data Engineer // Data Lake & Databricks

  • Australia
  • Sydney
  • Contract
  • AU$800 - AU$850 per day

Our client is seeking a skilled and motivated Data Engineer to join their dynamic team. The ideal candidate will possess extensive experience with Microsoft Azure, Databricks, real-time integrations, and data streaming (Kafka). You will be responsible for designing, building, and maintaining their data infrastructure to support data analytics and business intelligence needs. Your expertise in data lakes, SQL procedures, and CI/CD pipelines will be critical to ensuring efficient and reliable data processes.

Responsibilities:

  1. Azure Data Solutions:

    • Design, implement, and manage data solutions using Azure Blob Storage, Azure Kubernetes Service (AKS), and Azure Data Factory (ADF).
    • Ensure the scalability and reliability of the client’s Azure-based data infrastructure.
  2. Databricks Development:

    • Develop and maintain data pipelines using Databricks, with a focus on PySpark and Python.
    • Optimize data workflows for performance and cost-efficiency within the Databricks environment.
  3. Real-time Data Integration:

    • Design and implement real-time data integration solutions using data streaming technologies such as Kafka, Azure Functions, ADF, and Flink (see the illustrative sketch after this list).
    • Develop and maintain CI/CD pipelines to automate deployment and monitoring of data streaming processes.
  4. Data Lake Construction:

    • Build and maintain data lakes on Databricks to support scalable and flexible data storage and analytics.
    • Ensure data quality, consistency, and security within the data lake environment.
  5. SQL Development:

    • Write, debug, and optimize complex PL/SQL and T-SQL procedures.
    • Collaborate with data analysts and other stakeholders to meet their data querying and reporting needs.
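
To illustrate the kind of real-time ingestion described above, here is a minimal Spark Structured Streaming sketch in Python. It is illustrative only: the broker, topic, schema and paths are hypothetical, and it assumes a Databricks-style environment where a spark session is available and the Kafka connector is installed.

```python
# Minimal Kafka -> Delta streaming ingestion sketch.
# Illustrative only: broker, topic, schema and paths are hypothetical;
# assumes `spark` is provided (e.g. a Databricks notebook).
from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
])

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.com:9092")
    .option("subscribe", "orders")
    .load()
)

parsed = (
    stream.select(F.from_json(F.col("value").cast("string"), schema).alias("o"))
    .select("o.*")
    .filter(F.col("amount") > 0)  # drop obviously invalid events
)

(parsed.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/orders")
    .start("/mnt/lake/bronze/orders"))
```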

Requirements:

  • 6-10 years of experience in data engineering or a related role.
  • Strong expertise in Microsoft Azure services, including Azure Blob Storage, AKS, and ADF – A MUST
  • Proficiency in Databricks with a focus on PySpark and Python.
  • Hands-on experience with real-time data integration and streaming technologies (Kafka, Azure Functions, ADF, Flink).
  • Proven experience building and maintaining data lakes on Databricks – A MUST
  • Strong knowledge of PL/SQL and T-SQL, with hands-on experience in writing and debugging SQL procedures.
  • Excellent problem-solving skills and the ability to work in a fast-paced, collaborative environment.
  • Strong communication skills and the ability to work effectively with cross-functional teams.

Preferred Qualifications:

  • Master’s degree in Computer Science, Information Technology, or a related field.
  • Certifications in Microsoft Azure and/or Databricks.
  • Experience with other data integration and ETL tools.
  • Familiarity with additional programming languages and data processing frameworks.

If you think you have the above skills and experience, click the ‘Apply’ button or send your resume to alex.nguyen@talentinternational.com


Data Engineer (APS6)

  • Australia
  • Melbourne
  • Contract
  • AU$115 - AU$135 per hour

Our client is a federal government organisation with offices throughout Australia. Due to growth, they are seeking an APS6 Data Engineer to join their team in Richmond or Geelong.

  • 12-month initial contract plus 12-month extension
  • Hybrid with 3 days per week onsite
  • Federal government role – Australian citizenship required

Key duties and responsibilities

  • Experience in AWS Cloud, AWS S3, AWS Glue or similar tools within the cloud environment
  • Provide level 2/3 technical support for AWS, Control-M, Teradata (legacy EDW), Snowflake, and ETL tool-based data integration solutions.
  • DevOps – understanding of DevOps processes and the ability to use DevOps tools in accordance with them
  • Programming – a high level of competency in programming, including knowledge of supplementary languages such as Python
  • Experience in Control-M or similar scheduling applications.
  • Experience with file transfer tools, e.g. GoAnywhere or similar
  • Version control – demonstrated knowledge of version control systems and their appropriate use
  • Facilitate continuous improvement measures in delivery principles, coding standards and documentation, and provide training sessions to the team.
  • Prioritise work items and add them to a work queue.
  • Understand, analyse and size user requirements.
  • Development and maintenance of SQL analytical and ETL code.
  • Development and maintenance of system documentation.
  • Work within a state-of-the-art, greenfield DevOps environment.
  • Collaboration with data consumers, database developers, testers and IT support teams.

To apply you will need the following skills and experience:

Essential criteria
1. Proficiency in AWS services related to data engineering: AWS Glue, S3, Lambda, EC2, RDS (see the illustrative sketch after these criteria)
2. Data pipeline design and development using ETL/ELT frameworks
3. Proficiency with Snowflake as a Cloud EDW/Teradata as On-Prem EDW
4. Proficiency in programming languages: Python (preferred)
5. Control-M orchestration/monitoring or similar applications
6. Strong experience in Operational Support processes and working in a BAU environment
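
To illustrate essential criterion 1, here is a minimal AWS Glue PySpark sketch. It is illustrative only: the bucket names and formats are hypothetical, and the script assumes it runs inside an AWS Glue job where the awsglue libraries are provided.

```python
# Minimal AWS Glue ETL step: CSV in S3 -> Parquet in S3.
# Illustrative only: bucket paths are hypothetical; the awsglue
# libraries are provided by the Glue job runtime.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw CSV files from a hypothetical landing bucket.
frame = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-landing/claims/"]},
    format="csv",
    format_options={"withHeader": True},
)

# Write curated Parquet for downstream analytics.
glue_context.write_dynamic_frame.from_options(
    frame=frame,
    connection_type="s3",
    connection_options={"path": "s3://example-curated/claims/"},
    format="parquet",
)

job.commit()
```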

Desirable criteria
1. Experience with infrastructure-as-code tools: CloudFormation or Terraform
2. Exposure to at least one ETL tool, e.g. dbt, Talend or Informatica
3. Strong SQL proficiency
4. SAS (Base, EG, SAS Viya)

APPLY:
Submit your resume, or contact Shelley at shelley.harrison@talentinternational.com or call 0418 572 482 for further information. Shortlisted candidates will be contacted.
