Full-time
Remote
2-5

Data Engineer

8/7/2025

The Data Engineer will build and maintain data pipelines and ETL/ELT processes on Google Cloud Platform to ensure reliable data flow. They will collaborate with cross-functional teams to implement solutions and monitor pipeline performance.

Working Hours

40 hours/week

Company Size

11-50 employees

Language

English

Visa Sponsorship

No

About The Company
From Technologist to Technologist. We are catalysts of digital evolution. Our core expertise lies in connecting top technologists with top companies through IT headhunting solutions, and our international experience helps businesses of all sizes innovate and succeed. Our clients include global companies with over 100k employees, and we serve 700+ clients in 50+ countries. Rooted in a deep grasp of technology, our IT headhunting services drive transformative growth for Fortune 500 corporations, and we maintain high standards of technological innovation to meet global demands.
About the Role

Our client represents the connected world, offering innovative and customer-centric information technology experiences, enabling Enterprises, Associates, and Society to Rise™.

They are a USD 6 billion company with 163,000+ professionals across 90 countries, helping 1,279 global customers, including Fortune 500 companies. They focus on leveraging next-generation technologies, including 5G, Blockchain, Metaverse, Quantum Computing, Cybersecurity, Artificial Intelligence, and more, to enable end-to-end digital transformation for global customers.

Our client is one of the fastest-growing brands and is among the top 7 IT service providers globally. They have consistently emerged as a leader in sustainability and were recognized among the “2021 Global 100 Most Sustainable Corporations in the World” by Corporate Knights.

We are currently searching for a Data Engineer:

Responsibilities:

  • Build and maintain data pipelines and ETL/ELT processes on Google Cloud Platform (GCP) to ensure reliable and efficient data flow.
  • Collaborate with Senior Data Engineers and cross-functional teams (Data Scientists, Product Managers) to gather requirements and implement solutions.
  • Implement data models, schemas, and transformations to support analytics and reporting.
  • Monitor, troubleshoot, and optimize pipelines to ensure data quality, integrity, and performance.
  • Ensure compliance with data governance, security, and regulatory standards within the GCP environment.
  • Document data workflows, tools, and best practices to support scalability and operational excellence.
  • Stay up to date on GCP services and trends to continuously improve infrastructure capabilities.
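As a purely illustrative sketch of the ETL work described above (the schema, data, and `run_pipeline` function are invented for this example, and SQLite stands in for a warehouse such as BigQuery to keep the snippet self-contained):

```python
import sqlite3

# Hypothetical raw event records, as they might arrive from an upstream source.
raw_events = [
    {"user_id": "a", "amount": "10.50", "country": "MX"},
    {"user_id": "b", "amount": "3.25", "country": "MX"},
    {"user_id": "a", "amount": "7.00", "country": "US"},
]

def run_pipeline(events):
    """Extract raw records, transform them, and load them into a queryable table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (user_id TEXT, amount REAL, country TEXT)")
    # Transform step: cast string amounts to floats during load.
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?, ?)",
        [(e["user_id"], float(e["amount"]), e["country"]) for e in events],
    )
    # Aggregate with SQL, as a downstream reporting query might.
    rows = conn.execute(
        "SELECT user_id, SUM(amount) FROM sales GROUP BY user_id ORDER BY user_id"
    ).fetchall()
    conn.close()
    return rows

print(run_pipeline(raw_events))  # [('a', 17.5), ('b', 3.25)]
```

On GCP, the same pattern would typically use Dataflow or Cloud Composer for orchestration and BigQuery for storage and aggregation.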

Requirements:

  • Bachelor’s degree in Computer Science, IT, Data Engineering, or a related field.
  • Minimum 3 years of experience in data engineering, including building pipelines on Google Cloud Platform (GCP).
  • Proficiency with GCP tools, such as BigQuery, Dataflow, Pub/Sub, or Cloud Composer.
  • Strong skills in Python or Java, and advanced SQL for data processing.
  • Experience in data modeling, schema design, and data warehousing.
  • Understanding of data governance and cloud security practices.
  • Familiarity with Git and basic CI/CD practices is a plus.
  • Strong problem-solving and communication skills for technical collaboration.

Languages

  • Advanced Oral English.
  • Native Spanish.

Note:

  • Hybrid: three days per week on-site at Scotiabank locations in Mexico City (CDMX).

If you meet these qualifications and are pursuing new challenges, start your application to join an award-winning employer. Explore all our job openings on the Sequoia Careers page: https://www.sequoia-connect.com/careers/.




Key Skills
Data Engineering, Google Cloud Platform, ETL, Data Pipelines, Data Modeling, Schema Design, Data Warehousing, Python, Java, SQL, Data Governance, Cloud Security, Git, CI/CD, Problem-Solving, Communication
Apply Now
