




Job Summary: We are seeking a professional to design, develop, and optimize data pipelines; build and maintain integration processes; model data; and ensure information quality.

Key Highlights:
1. Professional growth opportunities
2. Continuous training
3. Hybrid or 100% remote work

**Requirements**

Academic Qualifications:
* Bachelor’s degree or engineering degree in Computer Science, Information Technology, or related field.

Experience:
* Solid experience in **Python**
* Experience developing ETL processes
* Knowledge of relational (SQL) and non-relational databases
* Experience with orchestration tools or frameworks (e.g., Airflow)
* Proficiency in version control (Git)
* Experience working with APIs and handling data in JSON/CSV formats
* Knowledge of cloud environments (AWS, Azure, or GCP) is desirable

Responsibilities:
* Design, develop, and optimize data pipelines (ETL/ELT)
* Build and maintain data integration and transformation processes
* Model and structure data for consumption by BI and analytics tools
* Ensure data quality, integrity, and availability
* Optimize database queries and performance
* Implement best practices for versioning, documentation, and testing
* Collaborate with cross-functional teams to understand data requirements

We Offer:
* Competitive salary commensurate with experience
* Benefits exceeding statutory requirements
* Hybrid or 100% remote work arrangement
* Professional growth opportunities
* Continuous training

Employment Type: Indefinite-term position

Salary: $28,000.00 - $40,000.00 per month

Work Location: Remote


