




We are hiring a Senior Data DevOps Engineer with strong AWS expertise to design, optimize, and secure cloud-based data infrastructures. You will take charge of infrastructure and automation decisions, guide team members, and collaborate with various teams to implement effective data solutions. Apply now to be part of a dynamic environment that values your skills and leadership.

**Responsibilities**

* Create and enhance scalable data infrastructure using AWS services such as ECS, RDS, Athena, Glue, S3, EBS, CloudFormation, IAM, and Redshift
* Build and maintain automated workflows with Infrastructure as Code tools including Terraform and AWS CloudFormation
* Partner with data engineering and analytics teams to optimize high-performance data pipelines using Python and AWS Glue
* Set and uphold best practices for AWS-focused CI/CD pipelines using tools like GitHub Actions, AWS CodePipeline, Jenkins, or others
* Boost the scalability, reliability, performance, and cost-effectiveness of cloud data platforms
* Integrate advanced data tools like Apache Airflow and Apache Spark on AWS EMR to enhance workflows
* Analyze and resolve complex systemic issues across AWS-hosted systems with thorough root cause analysis
* Develop and enforce robust IAM policies, security configurations, encryption standards, and AWS best practices
* Plan capacity, optimize resources, and manage costs for essential data platform components
* Support and mentor team members, fostering adoption of best practices

**Requirements**

* Over 3 years of professional experience in Data Engineering, DevOps, or related fields with an emphasis on cloud infrastructure and automation
* Strong Python programming skills for scripting, automation, and workflow efficiency
* Expertise in SQL for managing large datasets
* Proven experience managing AWS data infrastructure services including Redshift, Glue, EMR, and Athena
* In-depth knowledge of Infrastructure as Code tools such as Terraform, AWS CloudFormation, or Ansible
* Proven skills in building and maintaining CI/CD pipelines with Jenkins, GitHub Actions, AWS CodePipeline, or equivalents
* Experience with distributed data processing and orchestration tools like Apache Spark and Apache Airflow
* Advanced Linux administration and performance tuning capabilities in cloud environments
* Strong understanding of networking protocols (TCP, UDP, ICMP, DNS, NAT) in cloud settings
* Experience deploying and automating infrastructure at scale using Terraform or similar tools
* Proficiency in installing and operating data pipeline platforms like Apache Kafka and NiFi
* Ability to lead complex technical projects and mentor junior and mid-level engineers
* English proficiency at a professional level (B2 or above)

**Nice to have**

* Knowledge of additional cloud providers such as Azure or GCP and hybrid cloud designs
* Experience with statistical tools like R and data visualization platforms such as Tableau or Power BI
* Familiarity with advanced CI/CD platforms like Bamboo
* Experience implementing hybrid cloud and on-premises data solutions with secure, scalable flows
* Understanding of container orchestration using Kubernetes, Docker, or AWS ECS/EKS


