Global Technology Solutions (GTS) at ResMed is a division dedicated to creating innovative, scalable, and secure platforms and services for patients, providers, and people across ResMed. The primary goal of GTS is to accelerate well-being and growth by transforming the core, enabling patient, people, and partner outcomes, and building future-ready operations.
The strategy of GTS focuses on aligning goals and promoting collaboration across all organizational areas. This includes fostering shared ownership, developing flexible platforms that can easily scale to meet global demands, and implementing global standards for key processes to ensure efficiency and consistency.
At ResMed, we are global leaders in connected devices and digital health, dedicated to helping millions of people sleep, breathe, and live better lives. We are seeking a dynamic and experienced Lead Engineer to join our innovative team and drive the future of healthcare technology.
Let’s Talk About the Team
The primary role of the Analytics Engineering function within the Global Data Platform, Global Technology Solutions team is to design and deliver new (or improved) data engineering solutions (data products) across the business. Data Engineers in this function work closely with stakeholders to understand and define customer needs, develop high-quality solutions, and monitor them closely through the release cycle.
Let’s Talk About the Role
As a Lead Engineer, you will play a pivotal role in serving the analytics needs of users across ResMed. Your responsibilities will include:
- Building large-scale data processing systems to serve the analytics needs of users across ResMed.
- Implementing data pipeline integrations and solutions, incorporating highly scalable cloud computing and large-scale data stores, including data lakes, data warehouses, and data marts.
- Collaborating closely with data architects to determine appropriate data management systems, and working with data scientists and analysts to identify the data needed for analysis.
- Managing GitHub repositories, including working with GitHub Actions.
- Managing the infrastructure of the team's analytics platform using tools such as Terraform.
- Developing analytics models using SQL and dbt on the Snowflake data platform.
- Developing data pipelines using Kafka and Flink.
- Orchestrating data pipeline operations using Dagster and Python.
- Frequently leading sub-functional teams or projects and mentoring junior team members.
Let’s Talk About You
Minimum Requirements:
- Bachelor’s degree in a STEM field.
- Minimum of 10 years of experience with SQL on a large analytics data platform.
- Minimum of 10 years of experience developing software requirements, writing software, and testing software.
- Minimum of 6 years of experience with Python, shared version control systems, and maintaining data pipelines.
- Minimum of 5 years of experience with GitHub, including using GitHub Actions to deploy cloud applications.
Preferred Qualifications:
- Experience with dbt, Dagster, and the Snowflake data platform.
- Experience with managing software integration and deployment on GitHub.
- Master’s degree in a STEM field.
- Minimum of 12 years of related experience.
Joining us is more than saying “yes” to making the world a healthier place. It’s discovering a career that’s challenging, supportive, and inspiring, where a culture driven by excellence helps you not only meet your goals but also create new ones. We focus on creating a diverse and inclusive culture, encourage individual expression in the workplace, and thrive on the innovative ideas this generates. If this sounds like the workplace for you, apply now! We commit to respond to every applicant.