




DESCRIPTION

Hello! Are you a data expert eager to join an amazing team at Axity? This opportunity is for you!

We are looking for a **Sr Data Engineer Transy** with at least 5 years of experience to join our team. If you're passionate about turning data into valuable insights and love working with cutting-edge cloud technologies, we'd love to meet you!

At Axity, we are a group of technology enthusiasts constantly striving to innovate and deliver state-of-the-art solutions to our clients. We believe in the power of data to drive growth and strategic decision-making. As a Sr Data Engineer Transy, you will play a key role in designing, building, and optimizing our data solutions, working with the most advanced technologies in the market.

**What will you do on a day-to-day basis?**

* Design, build, and maintain robust and scalable data pipelines.
* Collaborate with cross-functional teams to understand their data needs and translate them into technical solutions.
* Develop and implement cloud-based data architecture solutions, with a special focus on Google Cloud Platform (GCP).
* Optimize database performance and ETL/ELT processes.
* Ensure data quality, integrity, and security.
* Explore and evaluate new technologies and tools to enhance our data capabilities.
* Mentor and guide junior data engineers.
* Actively participate in defining the company's data strategy.

**What are we looking for in you?**

* Solid experience (minimum 5 years) in data engineering.
* In-depth knowledge of data architecture principles and database design.
* Practical, demonstrable experience with Google Cloud Platform (GCP) and its related services (BigQuery, Dataflow, Dataproc, Cloud Storage, etc.).
* Proficiency in programming languages such as Python and/or Scala.
* Experience with workflow orchestration tools such as Airflow.
* Knowledge of relational and NoSQL databases.
* Experience building and optimizing ETL/ELT pipelines.
* Understanding of Data Warehouse and Data Lake concepts.
* Excellent problem-solving and analytical thinking skills.
* Ability to work independently and as part of a team.
* Good communication skills; we like it when you share your ideas!

**Nice to have (don't worry if you don't meet them all!)**

* Experience with other cloud platforms (AWS, Azure).
* Knowledge of Big Data tools such as Spark or Hadoop.
* Experience with data visualization tools (Tableau, Looker).
* GCP certifications.
* Knowledge of agile methodologies.

This is a hybrid position based at our office in Mexico City: Avenida Insurgentes Sur No. 1458, Edificio Torre Origami, Piso 11, Colonia Actipan, Benito Juárez, Ciudad de México, C.P. 03230.

If this description resonates with you and you're ready for a new challenge, apply now! We're excited to meet you and explore how you can fit into our Axity team. Let's build the future of data together!

REQUIREMENTS

**General:**

* Knowledge of DWH architecture
* Knowledge of Data Lake architectures
* Knowledge of the GCP ecosystem
* Knowledge of ETL/ELT tools and data modeling techniques

**Knowledge in development and configuration of:**

* BigQuery
* Dataflow
* Composer
* Cloud Storage
* Cloud Functions
* Cloud Endpoints
* Pub/Sub
* Airflow
* Code repositories

**Programming languages:**

* SQL (advanced)
* Java
* Python
* Terraform (IaC configuration)


