




Job Summary: Design and implement scalable, efficient data solutions, optimizing the data ecosystem on GCP and applying best practices.

Key Highlights:
1. Data architecture definition for new requirements
2. Design and implementation of scalable and efficient data solutions
3. Optimization of processing, monitoring, and costs on GCP

DESCRIPTION
* Define the data architecture to address new requirements.
* Analyze requirements to assess the technical feasibility and integration impact of new data sources on the existing solution.
* Design and implement scalable, efficient data solutions that facilitate data access and analysis according to user needs.
* Define and execute improvements to the GCP data ecosystem (optimization of processing, monitoring, and costs).
* Build robust, scalable developments on GCP services based on user requirements.
* Conduct testing.
* Define and apply best practices.

REQUIREMENTS
* Knowledge of DWH architecture
* Knowledge of data lake architectures
* Knowledge of the GCP ecosystem
* Knowledge of ETL/ELT tools and data modeling techniques
* Knowledge of development and configuration of:
  - BigQuery
  - Dataflow
  - Composer
  - Cloud Storage
  - Cloud Functions
  - Cloud Endpoints
  - Pub/Sub
  - Airflow
  - Code repositories
* Programming languages:
  - SQL (advanced)
  - Java
  - Python
  - Terraform (IaC configuration)


