Senior Data DevOps Engineer (GCP)
Indeed
Full-time
Onsite
No experience requirement
No degree requirement
Description

**Summary**

Join a growing team as an experienced Senior Data DevOps Engineer focused on managing cloud data infrastructure, automating workflows, and enhancing data operations.

**Highlights**

1. Design and implement cloud data infrastructure using GCP services
2. Automate provisioning and monitoring with Infrastructure as Code (Terraform)
3. Develop automated data pipelines using Python and configure cloud data tools

We are looking for an experienced Senior Data DevOps Engineer with expertise in Google Cloud Platform to strengthen our growing team. You will focus on managing cloud data infrastructure, automating workflows, and enhancing data operations. Join us and contribute to building efficient and scalable data solutions.

**Responsibilities**

* Design and implement cloud data infrastructure using GCP services such as Dataflow, BigQuery, and Cloud Composer
* Deploy and manage Infrastructure as Code with Terraform to automate provisioning and monitoring
* Collaborate with data engineers to develop automated data pipelines using Python
* Establish CI/CD pipelines using Jenkins, GitLab CI, or GitHub Actions for smooth deployments
* Work with cross-functional teams to optimize data platform performance and reliability
* Configure cloud data tools such as Apache Spark and Apache Airflow
* Troubleshoot and resolve scalability and reliability issues in cloud data systems

**Requirements**

* Minimum 3 years of working experience with GCP services, including BigQuery, Cloud Composer, and Dataproc
* Proficiency in Python programming and strong SQL skills for pipeline management
* Experience with Infrastructure as Code tools such as Terraform or CloudFormation
* Knowledge of CI/CD pipeline tools such as Jenkins, GitHub Actions, or GitLab CI
* Familiarity with Linux operating systems and shell scripting
* Understanding of networking protocols, including TCP/IP, DNS, and NAT
* Competence with data processing tools such as Apache Spark, Apache Airflow, or the ELK Stack

**Nice to have**

* Experience with AWS or Azure platforms, including ECS, S3, Data Lake, or Synapse
* Ability to work with additional IaC tools such as Ansible
* Experience with alternative data workflow automation technologies
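To illustrate the kind of pipeline-automation work described above, here is a minimal, stdlib-only Python sketch of a retryable extract-transform-load step. All function names, row shapes, and the retry policy are hypothetical stand-ins, not taken from the posting; in practice the extract and load stages would call services such as BigQuery.

```python
import time
from typing import Callable, Iterable


def with_retries(task: Callable[[], list], attempts: int = 3, backoff_s: float = 0.01) -> list:
    """Run a pipeline step, retrying on failure with linear backoff."""
    for attempt in range(1, attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == attempts:
                raise  # exhausted retries: surface the failure to the scheduler
            time.sleep(backoff_s * attempt)


def extract() -> list:
    # Placeholder for a source read (e.g. a BigQuery query); returns raw rows.
    return [{"user_id": 1, "events": 3}, {"user_id": 2, "events": 0}]


def transform(rows: Iterable[dict]) -> list:
    # Keep only rows with activity, as a stand-in for real business logic.
    return [r for r in rows if r["events"] > 0]


def load(rows: list) -> int:
    # Placeholder for a warehouse write; reports how many rows were loaded.
    return len(rows)


if __name__ == "__main__":
    raw = with_retries(extract)
    print(load(transform(raw)))  # prints 1
```

In an orchestrator such as Apache Airflow, each of these stages would typically become its own task with the scheduler handling retries, but the idempotent stage boundaries shown here carry over directly.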

Source: Indeed
Juan García
Indeed · HR

© 2025 Servanan International Pte. Ltd.