




**Job Description**

- Definition of the data architecture to meet new requirements.
- Analysis of requirements to assess technical feasibility and the integration impact of new sources on the existing solution.
- Design and implementation of scalable, efficient data solutions that facilitate data access and analysis according to user needs.
- Definition and execution of improvements to the GCP cloud data ecosystem (optimization of processing, monitoring, and costs).
- Execution of robust, scalable developments using GCP services according to user needs.
- Execution of tests.
- Definition and application of best practices.

**Skills**

| Skill | Level |
| --- | --- |
| Positive attitude | II |
| Effective communication | II |
| Customer focus | I |
| Innovative thinking | II |
| Teamwork | II |
| Change management | I |
| Knowledge management | II |
| Commitment | II |
| Results orientation | II |
| Analytical thinking | I |

**Qualifications**

General:

- Knowledge of DWH (data warehouse) architecture
- Knowledge of Data Lake architectures
- Knowledge of the GCP ecosystem
- Knowledge of ETL/ELT tools and data modeling techniques

Knowledge in development and configuration of:

- BigQuery
- Dataflow
- Composer
- Cloud Storage
- Cloud Functions
- Cloud Endpoints
- Pub/Sub
- Airflow
- Code repositories

Programming languages:

- SQL (advanced)
- Java
- Python
- Terraform (IaC configuration)

Languages:

- Intermediate English

**About Us**

We believe in Latin America's innovative potential and passionately live digital transformation and convergence; that is why we want to help you maximize your business capabilities. Now is the time to unify silos, converge toward a common purpose, and connect technologies to transform them into value. Axity, connections that transform.


