




**Summary:** Join our innovative team as a motivated Python Developer, contributing to designing and implementing solutions throughout the Software Development Life Cycle (SDLC) and tackling technical challenges.

**Highlights**

1. Contribute to designing and implementing solutions throughout the SDLC
2. Develop and maintain data pipelines using Big Data technologies
3. Engage in Agile Scrum ceremonies and collaborate with development teams

We are looking for a skilled and motivated **Python Developer** to join our innovative team. In this role, you will contribute to designing and implementing solutions throughout the Software Development Life Cycle (SDLC). We are seeking an adaptable professional who is eager to learn and tackle technical challenges.

**Responsibilities**

* Create and manage cloud resources in AWS
* Support data ingestion from various sources, including RDBMS, REST HTTP APIs, and flat files
* Assist in developing data pipelines using Big Data technologies
* Collaborate on transforming data with Spark and cloud-based services while applying essential business logic
* Contribute to automated data quality checks that ensure processing accuracy
* Help build infrastructure for collecting, transforming, and distributing customer data
* Suggest enhancements to processes for data collection, analytics, and visualization
* Help maintain the scalability, reliability, and flexibility of data pipelines
* Support the analysis of data patterns and trends when required
* Assist in creating visualizations that provide useful insights for stakeholders
* Engage in Agile Scrum ceremonies within development teams
* Support writing reports and queries to convey findings effectively

**Requirements**

* 2+ years of experience in data engineering or a related field, such as consumer finance or similar industries
* Educational background in math, statistics, computer science, or a related discipline
* Knowledge of Python or Snowflake
* Foundation in tools such as Hive, Spark, Kafka, Airflow/Oozie, AWS, and Docker/Kubernetes
* Understanding of programming languages (e.g., SAS, SQL, R, Python), database technologies (e.g., PostgreSQL, Snowflake, Redshift), and data visualization tools (e.g., Tableau, Looker)
* Willingness to learn new technologies and tools quickly
* Organized and able to multitask with attention to deadlines
* Familiarity with analytical techniques and methodologies
* Good communication skills in English (minimum B1+ level), enabling clear interaction with non-technical stakeholders

**Nice to have**

* AWS certification
* Basic understanding of Spark Streaming
* Exposure to ELK Stack tools
* Knowledge of Cassandra or MongoDB
* Experience with CI/CD tools such as Jenkins or GitLab, and collaboration tools such as Jira/Confluence


