We are seeking an experienced **Senior Data DevOps Engineer (Azure)** to join our team and play a crucial role in building and optimizing data infrastructure and workflows. In this role, you will design, implement, and manage scalable data solutions in the Azure cloud environment. You will work closely with data engineering and cross-functional teams to ensure efficient data pipelines, system reliability, and performance optimization.

**Responsibilities**

* Design, deploy, and manage data infrastructure using Azure services, including Data Lake Storage (ADLS Gen2), Databricks, Synapse, and Data Factory
* Collaborate with the data engineering team to build and maintain efficient data workflows and pipelines
* Automate data processes using Python to improve efficiency and reliability (see the first sketch at the end of this posting)
* Set up and manage CI/CD pipelines using tools such as Jenkins, GitHub Actions, or similar platforms
* Work with cross-functional teams to enhance the performance, scalability, and reliability of data systems
* Install, configure, and maintain data tools such as Apache Spark and Apache Kafka in both cloud and on-premises environments
* Monitor data systems to proactively identify and resolve performance and scalability issues
* Troubleshoot and resolve complex issues across data platforms and pipelines

**Requirements**

* At least 3 years of experience in Data Engineering or related roles
* Strong expertise in Python programming and batch processing workflows (see the PySpark sketch at the end of this posting)
* Proficiency in SQL for managing and querying large datasets
* Advanced experience working with Azure cloud services for data infrastructure management
* Hands-on experience with Infrastructure as Code tools such as Ansible, Terraform, or CloudFormation
* Skilled in setting up and managing CI/CD pipelines using tools like Jenkins or GitHub Actions
* Practical experience with data tools such as Spark, Airflow, or R for data processing and workflow management
* Advanced knowledge of Linux operating systems, including scripting and system management
* Strong understanding of network protocols and mechanisms, including TCP, UDP, ICMP, DHCP, DNS, and NAT
* Fluent English communication skills, both written and spoken, at a B2+ level or higher

**Nice to have**

* Familiarity with additional cloud platforms such as AWS or GCP
* Experience with container orchestration tools like Kubernetes for managing data workflows
* Knowledge of monitoring and observability tools such as Prometheus, Grafana, or Azure Monitor
* Exposure to Big Data technologies and advanced data analytics workflows
* Hands-on experience with data governance and security best practices in the cloud
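
For a concrete sense of the Python automation described in the responsibilities, here is a minimal sketch that triggers an Azure Data Factory pipeline run and polls it to completion. It is an illustration under stated assumptions, not a definitive implementation: the subscription ID environment variable, resource group, factory, and pipeline names are hypothetical placeholders, and it assumes the `azure-identity` and `azure-mgmt-datafactory` packages are installed.

```python
# Minimal sketch: trigger an Azure Data Factory pipeline run and poll its status.
# The subscription, resource group, factory, and pipeline names below are
# hypothetical placeholders for your own environment.
import os
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = os.environ["AZURE_SUBSCRIPTION_ID"]  # placeholder env var
RESOURCE_GROUP = "rg-data-platform"                    # hypothetical name
FACTORY_NAME = "adf-prod"                              # hypothetical name
PIPELINE_NAME = "daily_ingest"                         # hypothetical name


def run_pipeline() -> str:
    """Kick off a pipeline run and block until it reaches a terminal state."""
    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    # Start the run; Data Factory returns a run ID we can poll.
    run = client.pipelines.create_run(
        RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME, parameters={}
    )

    # Poll until the run succeeds, fails, or is cancelled.
    while True:
        status = client.pipeline_runs.get(
            RESOURCE_GROUP, FACTORY_NAME, run.run_id
        ).status
        if status in ("Succeeded", "Failed", "Cancelled"):
            return status
        time.sleep(30)


if __name__ == "__main__":
    print(f"Pipeline finished with status: {run_pipeline()}")
```

In practice, a script like this would typically run from a CI/CD job (for example, Jenkins or GitHub Actions, as listed above) rather than by hand.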

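Likewise, the batch processing called out in the requirements often looks like the PySpark sketch below: read raw CSV files from ADLS Gen2, apply light cleanup, and write a partitioned Parquet dataset. The storage account, container, and column names are hypothetical, and authentication is assumed to be already configured on the Spark cluster (for example, on a Databricks cluster).

```python
# Minimal sketch: a PySpark batch job that reads raw CSV from ADLS Gen2 and
# writes a cleaned, partitioned Parquet dataset. Storage account, containers,
# and the event_id column are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

RAW_PATH = "abfss://raw@examplestorage.dfs.core.windows.net/events/"          # hypothetical
CURATED_PATH = "abfss://curated@examplestorage.dfs.core.windows.net/events/"  # hypothetical

spark = SparkSession.builder.appName("daily-batch-events").getOrCreate()

# Read the raw data, drop rows missing the key column, and stamp the load date.
events = (
    spark.read.option("header", "true").csv(RAW_PATH)
    .dropna(subset=["event_id"])            # assumes an event_id column exists
    .withColumn("load_date", F.current_date())
)

# Write partitioned Parquet so downstream queries can prune by date.
events.write.mode("overwrite").partitionBy("load_date").parquet(CURATED_PATH)
```

Partitioning by load date lets downstream consumers (Synapse, Databricks, or plain Spark jobs) prune partitions instead of scanning the full dataset.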

