
Vigil

Senior GCP Data Engineer

Posted 15 Days Ago
Remote
Senior level

SUMMARY:

As a Senior Data Engineer, you will be responsible for designing, building, and maintaining efficient and reliable data pipelines on the Google Cloud Platform (GCP). This role requires a strong background in GCP services and a proven track record of creating effective data solutions that align with business requirements.

We are looking for candidates who are as excited about pushing their own development as they are about advancing our technology stack. 

Our core developers are passionate about software engineering and enjoy developing their skills and abilities in a friendly and supportive environment of keen learners.

WHAT WILL YOU BE DOING:

You will join our engineering team as a valued member of a collaborative, autonomous, cross-functional team. You will help with the following:

  • Engineer robust data pipelines for extracting, transforming, and loading (ETL) data into GCP
  • Design and implement scalable and efficient data models on GCP
  • Develop and maintain data architecture, ensuring optimal performance and reliability
  • Utilize GCP's big data services such as BigQuery, Dataflow, and Dataprep for large-scale data processing
  • Implement and maintain data security measures following industry best practices
  • Collaborate with product management teams to understand their requirements
  • Manage and maintain changes to tracking specifications based on product and feature teams' requirements
  • Communicate your needs clearly and responsibly
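
The ETL and tracking-specification work above can be illustrated with a minimal, pure-Python transform step. The event schema and field names here are hypothetical, purely for illustration; real pipelines would follow the product teams' actual tracking specifications:

```python
from datetime import datetime, timezone

def transform_event(raw: dict) -> dict:
    """Shape a raw tracking event into a row ready for loading.

    `raw` uses a hypothetical schema ({"id", "ts", "payload"});
    the output field names are likewise illustrative only.
    """
    return {
        "event_id": str(raw["id"]),
        # Normalise epoch seconds to an ISO-8601 UTC timestamp,
        # a format BigQuery's TIMESTAMP type accepts on load.
        "event_ts": datetime.fromtimestamp(raw["ts"], tz=timezone.utc).isoformat(),
        # Tolerate events that omit the payload entirely.
        "payload": raw.get("payload", {}),
    }

rows = [transform_event(e) for e in [{"id": 1, "ts": 0, "payload": {"k": "v"}}]]
```

In practice a step like this would sit inside a Dataflow (Apache Beam) pipeline or a Cloud Function triggered from Pub/Sub, with the transformed rows streamed or batch-loaded into BigQuery.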

WHAT WE ARE LOOKING FOR:

  • Deep understanding of GCP as a platform, its capabilities and services
  • Experience with Dataflow, BigQuery, Pub/Sub, Cloud Functions, and Memorystore/Redis
  • Strong experience with database management and optimization, including SQL and NoSQL databases, with an in-depth understanding of data modelling, storage, and efficient querying techniques
  • Strong analytical and problem-solving skills to resolve complex technical issues
  • Strong experience with CI/CD pipelines
  • Experience with Java and Python
  • Experience working in Agile development environments, particularly in a Scrum framework
  • Strong English communication skills, both written and verbal
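
As a sketch of the data-modelling and efficient-querying techniques mentioned above: in BigQuery, partitioning and clustering a table keeps query scans (and cost) proportional to the data actually read. The table and column names below are hypothetical:

```python
# Hypothetical dataset/table and columns, for illustration only.
TABLE = "analytics.events"

# Partition by ingestion date and cluster by user so BigQuery can
# prune partitions and co-locate rows for common access patterns.
ddl = f"""
CREATE TABLE IF NOT EXISTS {TABLE} (
  event_id STRING NOT NULL,
  event_ts TIMESTAMP NOT NULL,
  user_id  STRING
)
PARTITION BY DATE(event_ts)
CLUSTER BY user_id
"""

# Queries then filter on the partition column, so only the relevant
# partitions are scanned rather than the whole table.
query = f"""
SELECT user_id, COUNT(*) AS events
FROM {TABLE}
WHERE DATE(event_ts) BETWEEN '2024-01-01' AND '2024-01-07'
GROUP BY user_id
"""
```

Statements like these would typically be submitted via the BigQuery console or a client library, with the DDL itself managed as code (e.g. under Terraform, listed below as a nice-to-have).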

AWESOME BUT NOT REQUIRED:

  • Terraform
  • Kafka

WHAT’S IN IT FOR YOU?

  • Be part of a collegial environment where responsibility and authority are shared equally amongst colleagues, and help shape our company culture
  • A culture in which we don’t criticise failure but ensure we learn from our mistakes
  • An Agile environment where your ideas are welcome
  • The possibility to grow and experience different projects
  • Ongoing Training & Mentoring
  • The possibility to travel

ATTENTION! THIS POSITION IS FOR CANDIDATES BASED IN PORTUGAL OR BRAZIL ONLY

Top Skills

Java
Python

