
RDC.AI

Data Platform Engineer

Posted 10 Days Ago
In-Office
Sydney, New South Wales
Senior level

About RDC

Rich Data Co (RDC): Delivering the Future of Credit, Today! We believe credit should be accessible, fair, inclusive, and sustainable. We are passionate about AI and about developing new techniques that leverage traditional and non-traditional data to reach the right decision in a clear and explainable way. Leading global financial institutions use RDC’s AI decisioning platform to offer credit in a way that aligns with customers’ needs and expectations. RDC uses explainable AI to give banks deeper insight into borrower behaviour, enabling more accurate and efficient lending decisions for businesses.

Purpose of Role 

The Data Platform Engineer is responsible for developing, operating, and supporting scalable and reliable data pipelines, dashboards, and reporting tools that enable data-driven decision-making across the organisation. This hybrid role combines core data engineering skills (such as pipeline development with Apache Airflow, data transformation in Python, and query optimisation in SQL) with operational responsibilities (such as troubleshooting, validation, and ensuring the uptime and integrity of production data workflows). You will work across cloud-based platforms, relational and NoSQL databases, and modern visualisation tools to ensure high data quality, availability, and performance. As part of a collaborative, fast-moving data team, you will support both project-based data initiatives and the day-to-day stability of the data platform, ensuring that business users have access to timely, accurate, and actionable insights. This is a hands-on role requiring strong technical skills and a proactive approach to both building and supporting the data infrastructure that powers RDC’s analytics and operational reporting environment.

Accountability & Outcomes

  • Design, develop, and maintain ETL pipelines using Apache Airflow, ensuring scalable and efficient data processing (see the sketch following this list).
  • Build and troubleshoot Python-based data processing scripts for transformation, ingestion, and automation.
  • Support the daily operation of data systems, including pipeline health monitoring, incident response, and root cause analysis.
  • Collaborate with internal teams to define data requirements, integrate multiple sources, and ensure end-to-end data accuracy.
  • Perform data validation, profiling, and QA checks across pipelines and environments.
  • Develop and maintain dashboards and reports using JavaScript, SQL, and BI tools such as Power BI or Tableau.
  • Document data workflows, standards, and pipeline logic for operational consistency.
  • Contribute to the continuous improvement of reliability, observability, and performance in data systems.
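
To give a flavour of the pipeline work described above, here is a minimal, hypothetical Airflow 2.x DAG sketch. The DAG id, file paths, schedule, and the toy extract/transform logic are illustrative assumptions only, not RDC’s actual pipelines.

```python
# Minimal, hypothetical Airflow 2.x DAG: extract a small dataset, validate it,
# and fail the run (surfacing in monitoring/alerting) if a basic check does not pass.
# DAG id, file paths, and the toy data are assumptions for illustration.
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

RAW_PATH = "/tmp/raw_loans.csv"
CLEAN_PATH = "/tmp/clean_loans.csv"


def extract():
    # In practice this would read from an RDBMS, an API, or object storage.
    df = pd.DataFrame({"loan_id": [1, 2, 3], "amount": [1000.0, 2500.0, 400.0]})
    df.to_csv(RAW_PATH, index=False)


def transform():
    df = pd.read_csv(RAW_PATH)
    # Basic validation: a failed check fails the task and is visible to operations.
    if df["amount"].isna().any() or (df["amount"] <= 0).any():
        raise ValueError("Invalid loan amounts in raw feed")
    df["amount_aud_k"] = df["amount"] / 1000
    df.to_csv(CLEAN_PATH, index=False)


with DAG(
    dag_id="example_loans_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```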

Capabilities

Experience

Essential

  • At least five years of hands-on experience in data engineering, data operations, or data platform roles.
  • Proven experience with Apache Airflow for building, scheduling, and monitoring ETL pipelines.
  • Strong Python programming skills for data processing, automation, and pipeline support.
  • Experience supporting and debugging production-grade data pipelines in cloud environments (preferably AWS).
  • Solid SQL skills with the ability to write and optimise complex queries for reporting and validation.
  • Familiarity with monitoring tools and logs (e.g., CloudWatch, log parsers, pipeline alerting frameworks) — see the sketch following this list.
  • Experience working with RDBMS (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., DynamoDB).
  • Demonstrated ability to respond to operational issues, analyse data quality anomalies, and implement remediation.
  • Experience with agile project delivery, managing both sprint-based tasks and day-to-day support incidents.
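
As an illustration of the monitoring side of the role, the sketch below pulls recent error events from a CloudWatch log group with boto3. The log group name and time window are hypothetical assumptions, not RDC’s configuration.

```python
# Illustrative sketch: scan a CloudWatch Logs group for recent ERROR events.
# The log group name is hypothetical; requires AWS credentials with
# logs:FilterLogEvents permission.
import time

import boto3


def recent_pipeline_errors(log_group: str, minutes: int = 60) -> list:
    """Return messages of ERROR log events from roughly the last `minutes` minutes."""
    logs = boto3.client("logs")
    start_ms = int((time.time() - minutes * 60) * 1000)
    response = logs.filter_log_events(
        logGroupName=log_group,
        filterPattern="ERROR",
        startTime=start_ms,
    )
    return [event["message"] for event in response.get("events", [])]


if __name__ == "__main__":
    # Hypothetical Airflow task-log group name.
    for message in recent_pipeline_errors("/aws/airflow/task-logs"):
        print(message)
```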

Desirable

  • Experience in cloud environments such as AWS, GCP, or Azure.
  • Exposure to DevOps practices, including CI/CD pipelines for deploying data workflows.
  • Experience supporting dashboard reliability and data refresh monitoring.
  • Familiarity with data warehouse solutions and dimensional modelling concepts.
  • Background supporting data governance, audit logs, and regulatory compliance for operational data.

Knowledge and Skill

Essential

  • A deep understanding of ETL/ELT processes, including pipeline dependencies, failure points, and logging.
  • Strong proficiency in SQL and Python, including pandas and PySpark (if applicable).
  • Familiarity with JavaScript and frontend dashboard development frameworks (e.g., React or Angular).
  • Understanding of data quality checks, schema evolution, and operational alerting (see the sketch following this list).
  • Experience using BI/reporting tools such as Power BI, Tableau, or similar for visualisation.
  • Comfortable working with APIs and integrating real-time or batch data feeds.
  • Ability to document data architecture, dependencies, and support runbooks clearly.
  • Excellent communication and collaboration skills, especially across engineering, product, and analytics teams.
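
To make the data quality checks mentioned above concrete, here is a small pandas-based validation sketch. The column names, rules, and sample data are illustrative assumptions rather than RDC’s actual schema.

```python
# Illustrative pandas data quality check for a hypothetical loan feed.
# Column names and rules are assumptions for the sake of example.
import pandas as pd


def check_loan_feed(df: pd.DataFrame) -> list:
    """Return a list of human-readable data quality issues found in the frame."""
    issues = []

    # Completeness: required columns exist and contain no nulls.
    for col in ("loan_id", "customer_id", "amount"):
        if col not in df.columns:
            issues.append(f"missing column: {col}")
        elif df[col].isna().any():
            issues.append(f"null values in column: {col}")

    # Uniqueness: loan_id is expected to be a primary key.
    if "loan_id" in df.columns and df["loan_id"].duplicated().any():
        issues.append("duplicate loan_id values")

    # Validity: loan amounts should be positive.
    if "amount" in df.columns and (df["amount"].dropna() <= 0).any():
        issues.append("non-positive loan amounts")

    return issues


if __name__ == "__main__":
    sample = pd.DataFrame(
        {"loan_id": [1, 2, 2], "customer_id": [10, 11, None], "amount": [5000, -1, 750]}
    )
    print(check_loan_feed(sample))  # lists the null, duplicate, and negative-amount issues
```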

Desirable

  • Understanding of data platform observability (e.g., data freshness, completeness, latency).
  • Experience with metadata management, data lineage tools, or documentation platforms.
  • Familiarity with infrastructure-as-code for data services (e.g., Terraform for Airflow deployment).
  • Awareness of data privacy frameworks.
  • Ability to contribute to the architecture of a scalable, reliable data platform in a fast-paced environment.

Join the Future of Credit!

  • Work at a 5-Star Employer of Choice 2023 - RDC was named one of HRD Australia’s “best companies to work for in Australia”.
  • Join a fast-growing global AI company - Grow your skills, capabilities and gain AI and global experience.
  • High performance team - Work alongside some of the best product teams, data scientists and credit experts in the industry.
  • Vibrant team culture - Join an innovative and agile team who celebrates wins and solves problems together.
  • Work-life balance - Highly flexible working arrangements - work in the way that’s right for you!
  • Financial inclusion - Be part of a company that is striving for global financial inclusion and driving the future of credit.

Top Skills

Apache Airflow
AWS
CloudWatch
DynamoDB
JavaScript
MySQL
Postgres
Power BI
Python
SQL
Tableau
HQ

RDC.AI North Sydney, New South Wales, AUS Office

146 Arthur St, Level 10, North Sydney, New South Wales, Australia, 2060


