




**Job Summary**

Responsible for designing, developing, and maintaining robust and scalable data pipelines, ensuring correct integration, transformation, and loading of information into analytical environments such as BigQuery. This role is critical to ensuring the quality, availability, and efficiency of the data used by the organization.

**Key Highlights**

1. Design and development of scalable ETL/ELT data pipelines
2. Optimization of data flows into BigQuery
3. Focus on data quality and system performance

**Key Responsibilities**

* Design, build, and maintain scalable ETL/ELT data pipelines.
* Migrate and optimize data flows into BigQuery.
* Implement incremental loading strategies to ensure data integrity.
* Optimize the performance of ingestion and transformation processes without impacting source systems.
* Continuously monitor, debug, and improve data processes.
* Document data flows and development best practices.

**Requirements**

* Experience in ETL/ELT process development.
* Knowledge of data modeling and query optimization.
* Experience working with BigQuery or other cloud-based data platforms.
* Ability to implement incremental loading strategies and handle large volumes of data efficiently.
* Focus on data quality and system performance.

**Desired Skills**

* Analytical thinking and results orientation.
* Development and documentation best practices.
* Teamwork and effective communication skills.

Employment Type: Full-time, project-based or fixed-term contract

Contract Duration: 9 months

Salary: $35,000.00 - $40,000.00 per month

Benefits:

* Grocery vouchers

Application Question(s):

* Do you agree that this position is for a 6-month project?

Work Location: Remote/hybrid in 03230, Actipan, CDMX


