Data Engineer (GCP & Databricks)
MXN $45,000–50,000/month
Indeed
Full-time
Hybrid
Experience: not specified
Education: not specified
Perif. Blvd. Manuel Ávila Camacho 80, El Parque, 53398 Naucalpan de Juárez, Méx., Mexico
Description

**Job Summary:** We are seeking an experienced Data Engineer proficient in cloud data ecosystems, especially **Google Cloud Platform (GCP)** and **Databricks**, to design, build, and maintain scalable and reliable data pipelines.

**Key Highlights:**

1. Design and construction of scalable cloud-based data pipelines
2. Implementation of distributed processing with Databricks and Apache Spark
3. Collaboration with analytics, engineering, and business teams

**Join us and be part of the change!**

We are looking for a **Data Engineer** with strong experience in cloud data ecosystems, particularly **Google Cloud Platform (GCP)** and **Databricks**, to contribute to the design, construction, and maintenance of scalable and reliable data pipelines. The ideal candidate has hands-on experience with large data volumes, distributed processing, and modern data architectures, and will be responsible for ensuring the availability, quality, and efficiency of the data flows that support analytics, reporting, and data-driven solutions.

**Responsibilities:**

* Design, build, and maintain **scalable data pipelines** in cloud environments.
* Implement data ingestion, transformation, and loading processes using **Python, SQL, and Spark**.
* Work with **BigQuery** and **Cloud Storage** for large-scale data storage and processing.
* Develop and optimize advanced **analytical SQL queries**.
* Implement distributed processing workflows using **Databricks and Apache Spark**.
* Orchestrate data pipelines using **Airflow or Cloud Composer**.
* Monitor, operate, and maintain data infrastructure in **production environments**.
* Ensure data quality, reliability, and availability.
* Collaborate with analytics, engineering, and business teams to enable data-driven solutions.
* Document data processes, pipelines, and architecture.

**Requirements:**

* **3–5 years of experience** in data engineering.
* Practical experience working with **Google Cloud Platform (GCP)**.
* Proficiency in **BigQuery** and **Cloud Storage**.
* Experience working with **Databricks and Apache Spark**.
* Advanced **analytical SQL** skills.
* Experience using **Python** for data engineering tasks.
* Experience with **pipeline orchestration tools** such as **Airflow or Cloud Composer**.
* Experience operating data solutions in **production environments**.

***Desirable:***

* Knowledge of **Terraform** for infrastructure as code.
* Experience with **CI/CD** practices applied to data pipelines.
* Experience using **dbt** for analytical transformations.
* Ability to understand **technical documentation in English**.
* Experience with modern data architectures (Data Lake, Lakehouse, or Data Warehouse).

**Schedule:**

* Hybrid (2 days in office / 3 days remote)
* Monday to Friday, 7:30 AM – 3:00 PM

**Benefits**

At Rocket Code, we value our employees and offer a comprehensive benefits package to support their personal and professional growth:

* **Referral bonus:** Receive a reward for bringing talented individuals to our team.
* **Continuous training:** Access to resources and specialized training in AI and digital transformation methodologies.
* **Courses and certifications:** Stay at the forefront of your career with company-sponsored courses and certifications.

Employment type: Full-time, indefinite-term contract
Salary: $45,000.00 – $50,000.00 MXN per month
Work location: Hybrid remote work in Miguel Hidalgo, CDMX

Source: Indeed
Juan García
Indeed · HR

Company

Indeed

© 2025 Servanan International Pte. Ltd.