
AMP (amp.com.au)

Senior Data Engineer

Posted 2 Days Ago
Remote
Hiring Remotely in Australia
Senior level
The Senior Data Engineer will engineer and improve enterprise-scale data platforms on AWS, design and build end-to-end data pipelines, and optimize data processing solutions using Apache Spark and Python. Responsibilities include data governance, compliance controls, and collaboration with cross-functional teams to support analytics and data integration needs.

If you live in Australia or New Zealand, you’ve likely heard of AMP. But at a time when society is changing, we are too. We’re now a nimbler business with new leadership and thinking. 

For us, these are exciting times. There’s a real potential for big thinkers to help us redefine what financial services could be. And turn our legacy into something even more positive and powerful for the future.

Help people create their tomorrow, while you create yours

We help people with their banking, super, retirement and finances. Through upturns, downturns, recessions, and major life transitions. Every day, we help people see and make more of their financial potential, so that they can create their tomorrow. And we’ve been doing it for over 170 years. 

If we do our job well, we genuinely add to the prosperity of our country and its people. 

How you'll make an impact

  • Engineer, operate, and continuously improve enterprise‑scale data platforms and data pipelines on AWS, enabling secure, scalable, and highly reliable analytics and data products across the organisation.
  • Design and build end‑to‑end data pipelines across the full SDLC, translating business and analytical requirements into robust technical designs, development, testing, deployment, and ongoing optimisation.
  • Take end‑to‑end ownership of deliverables with minimal supervision, proactively identifying risks, clarifying requirements, and driving work through to production outcomes.
  • Develop high‑performance big data processing solutions using Apache Spark (extensive Scala/Spark expertise is required; PySpark also desirable), Python, and SQL, optimising for large‑scale batch and streaming workloads.
  • Build, maintain, and optimise data pipelines using the AWS data stack, including Amazon S3, AWS Glue, EMR, Athena, and Amazon Redshift, ensuring performance, scalability, and cost efficiency.
  • Apply AI‑driven software engineering practices to accelerate delivery and improve quality, including AI‑assisted code generation, refactoring, automated testing, documentation, and operational troubleshooting across data pipelines.
  • Apply AI‑driven design techniques to data platform and pipeline architecture, leveraging AI tools to evaluate architectural options, optimise data flows, improve schema and partition design, and enhance performance and cost outcomes.
  • Design and automate batch and streaming data pipelines, implementing resilient orchestration, dependency management, and error‑handling patterns aligned with enterprise standards and best practices.
  • Define and operate CI/CD and automation practices for data pipelines and infrastructure, including Git‑based version control, automated testing, deployment pipelines, and infrastructure‑as‑code, with a strong focus on repeatability and reliability.
  • Establish and enforce data governance controls across the AWS data ecosystem, including data access management, schema evolution, data quality frameworks, metadata management, and data lifecycle and retention policies.
  • Implement secure data integration patterns, including IAM‑based access controls, secrets management, encryption at rest and in transit, and secure connectivity between AWS services and upstream and downstream systems.
  • Embed observability and reliability practices into data pipelines, including structured logging, metrics, data quality monitoring, alerting, runbooks, and operational dashboards to enable proactive issue detection and resolution.
  • Support incident and problem management for data platforms and pipelines, performing root cause analysis, implementing remediation actions, and driving continuous improvement informed by operational insights and AI‑assisted analysis.
  • Maintain visibility of enterprise data assets, including datasets, tables, pipelines, jobs, models, and usage patterns, and proactively identify and remediate data quality issues, pipeline failures, or unmanaged assets.
  • Communicate clearly and work collaboratively with engineering, analytics, architecture, security, risk, and business stakeholders, translating requirements into delivery plans and ensuring alignment on scope, trade‑offs, and outcomes.
  • Partner with security, risk, and compliance teams to embed data security, privacy, and regulatory controls into data platform designs and pipeline implementations from design through to production.
  • Implement compliance and data protection controls, including data classification, retention, auditing, and access reviews, in line with enterprise policies and regulatory obligations.
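The orchestration, error-handling, and observability patterns listed above can be sketched in miniature. The following is an illustrative, stdlib-only Python sketch, not code from any AMP system — the names (`with_retries`, `transform`) and the retry policy are invented for this example. It shows two of the patterns the role calls for: retrying a transient pipeline step with structured log events, and quarantining malformed records instead of failing the whole batch.

```python
import json
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def with_retries(task, max_attempts=3, backoff_s=0.0):
    """Run a pipeline step, retrying on failure with linear backoff.

    Emits a structured (JSON) log event per failed attempt, then
    re-raises once max_attempts is exhausted.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception as exc:
            log.warning(json.dumps(
                {"event": "step_failed", "attempt": attempt, "error": str(exc)}
            ))
            if attempt == max_attempts:
                raise
            time.sleep(backoff_s * attempt)

def transform(records):
    """Toy transform: keep well-formed records, quarantine the rest."""
    good, bad = [], []
    for r in records:
        (good if "id" in r and "amount" in r else bad).append(r)
    return good, bad

# A malformed record is quarantined rather than aborting the batch.
good, bad = with_retries(
    lambda: transform([{"id": 1, "amount": 10.0}, {"amount": 5.0}])
)
```

In a real Spark/Glue pipeline the retry and quarantine logic would live in the orchestration layer (e.g. the job framework or scheduler) rather than inline like this; the sketch only shows the shape of the pattern.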

What you'll bring to the role

  • Strong experience engineering, operating, and scaling enterprise data platforms on AWS or comparable cloud ecosystems, with a focus on reliability, security, performance, and operational excellence.
  • Extensive experience with big data platforms and distributed data processing, with deep, hands‑on knowledge of Scala and Apache Spark (design, performance tuning, and production operations) across large‑scale datasets.
  • Hands‑on experience designing and building end‑to‑end data pipelines, including batch and streaming workloads, using AWS services such as Amazon S3, AWS Glue, EMR, Athena, Redshift, and associated orchestration frameworks.
  • Practical experience applying AI‑assisted software engineering techniques across the full data pipeline SDLC, including AI‑driven design, code generation, refactoring, automated testing, optimisation, documentation, and operational troubleshooting.
  • Experience designing data models, curated data layers, and data products that support analytics, reporting, and downstream consumption at scale.
  • Strong experience implementing data governance, security, and compliance controls across cloud data platforms, including data access management, encryption at rest and in transit, data classification, retention, and auditing.
  • Proven experience defining and operating CI/CD pipelines for data engineering workloads, including environment management, release automation, and infrastructure‑as‑code using tools such as GitHub Actions or equivalent CI/CD platforms.
  • Solid understanding of identity and access management in cloud environments, including IAM‑based access control, role separation, secrets management, and secure service‑to‑service integration.
  • Experience implementing monitoring, observability, and operational reliability practices for production data platforms and pipelines, including logging, metrics, alerting, data quality monitoring, and incident management.
  • Demonstrated ability to operate and support enterprise‑grade data platforms, including incident response, root cause analysis, performance tuning, and continuous improvement.
  • Strong written and verbal communication skills, with the ability to explain complex technical concepts to different audiences and collaborate effectively within a team environment.
  • Highly self‑organised and delivery‑focused, with a demonstrated ability to work autonomously end‑to‑end (from problem definition through build, release, and production support), proactively unblocking yourself and minimising dependency on others while collaborating effectively when needed.
  • Exposure to responsible data and AI practices, including data ethics, model risk awareness, data usage controls, and governance frameworks applied within analytics and data platform environments.
  • Experience working with enterprise data governance tooling and practices, such as data cataloguing, metadata management, data lineage, data quality frameworks, and policy enforcement across cloud data platforms.
  • Relevant cloud or data engineering certifications, preferably in AWS (Data Analytics, DevOps, or Solutions Architecture) or comparable cloud data and DevOps certifications.
  • Strong collaboration and stakeholder engagement skills, with the ability to work effectively in enablement‑driven and governance‑led environments involving security, risk, compliance, architecture, and analytics stakeholders.
  • Experience designing and implementing AI‑augmented data patterns, including vector storage, embeddings, semantic search, and knowledge grounding techniques to support advanced analytics, search, or downstream AI use cases.
  • Familiarity with agentic and workflow‑driven design patterns as applied to data engineering, including multi‑step pipeline orchestration, tool integration, automated decision logic, and guardrails embedded into data processing workflows.
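The data-quality frameworks mentioned above often reduce to a set of named, row-level predicates plus a failure report that feeds monitoring and alerting. A minimal illustrative sketch in plain Python — the helper name `run_quality_checks`, the field names, and the check names are all hypothetical, invented for this example:

```python
def run_quality_checks(rows, checks):
    """Apply named data-quality checks to each row.

    `checks` maps a check name to a predicate over a row dict.
    Returns a dict of per-check failure counts, suitable for
    publishing as metrics or gating a pipeline stage.
    """
    failures = {name: 0 for name in checks}
    for row in rows:
        for name, predicate in checks.items():
            if not predicate(row):
                failures[name] += 1
    return failures

rows = [
    {"customer_id": "c1", "balance": 120.5},
    {"customer_id": None, "balance": -3.0},
]
checks = {
    "customer_id_not_null": lambda r: r.get("customer_id") is not None,
    "balance_non_negative": lambda r: r.get("balance", 0) >= 0,
}
report = run_quality_checks(rows, checks)
```

At enterprise scale this same pattern is usually expressed in a dedicated framework (and pushed down into Spark rather than iterated in Python), but the contract — named checks in, failure counts out — is the same.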

You’ll thrive here if…

If you can adapt from BAU to the ambiguous with ease, you’ll do well here. Change is never easy, so bring your commitment, grit and growth mindset.

Because we run lean, you’ll be expected to jump in and deliver across a variety of areas. Meaning, you’ll be closer to the action and executive decisions that influence where we go next.

If you’re someone who can hold their own, you’ll find AMP quite liberating. 

Why we think you’ll love working at AMP

Doing what we’ve always done is not an option, so your clever ideas will get airtime here. You’ll be encouraged to speak up and try new things. If they don’t work, we move on – better for it.

We know there’s no one way of doing things. So, you won’t have to sacrifice who you are or how you work to fit in here. We’re inclusive and flexible in many of the ways you’d expect. And in some of the ways you wouldn’t. As long as your health and wellbeing come first - at home and at work. 

In fact, most of what makes AMP such a welcoming, enjoyable place to work are our people. Wherever you go, you’ll find moments to connect, feel valued and do meaningful work. 

Whether it’s through our first-class leaders who are invested in you and your success. Through year-round opportunities to volunteer, fundraise and give back to the community. Or in the everyday challenges you face as we work together to strengthen this great organisation. Challenges that will stretch you, amplify your potential and compound the impact you have. 

We believe in the power of inclusion and diversity

We’re dedicated to fostering inclusion, diversity, and a warm feeling of belonging at AMP. It sparks creativity, ignites innovation, and turns up the dial on the quality of our decisions and performance. This not only makes our workplace more engaged, but also leads to better connections with our customers. 

We're your allies in the search for the perfect fit - when you apply, let us know how we can support you to put your best self forward during our selection process. 

We're also committed to enhancing employment opportunities for Aboriginal and Torres Strait Islander people, so we enthusiastically encourage candidates from these backgrounds to apply and explore our Reconciliation Plan on our website.

Ready to create your tomorrow?

If you’re someone who sees opportunity where others see challenge, come and work with us in smart, progressive ways as we transform an iconic Australian brand for the future. And, through a series of career-defining moments, create your own tomorrow.

Don’t procrastinate! We review applications when we get them, and if we discover the ideal candidate, we may close the role earlier than the advertised close date.

Looking forward to meeting you.

HQ

AMP (amp.com.au) Sydney, New South Wales, AUS Office

50 Bridge St, Sydney, New South Wales, Australia, 2000

AMP (amp.com.au) Sydney, New South Wales, AUS Office

12 Macquarie St, Sydney, New South Wales, Australia, 2150


