




**SUMMARY:**

Join AstraZeneca's R&D IT as a passionate Data Engineer who will leverage best practices to improve delivery performance and data engineering capabilities in the D&A space.

**HIGHLIGHTS:**

* Be part of an Agile and DevOps team supporting data engineering across R&D IT
* Deliver cost-effective data engineering solutions
* Ensure business data assets are available across the AZ enterprise

AstraZeneca is a global, innovation-driven biopharmaceutical business that focuses on the discovery, development and commercialization of prescription medicines for some of the world's most serious diseases. R&D IT is AZ's global IT capability function supporting key science-related business areas, operating out of sites across the US, UK, Sweden, India and Mexico. We have a varied range of data engineering projects which use the latest cloud and architectural best practices to deliver a broad range of data solutions for the business; examples include monitoring Patient Safety aspects of the COVID-19 Vaccine and providing operational and financial business intelligence on the latest drugs in the development pipeline.

We are looking for a passionate Data Engineer who will leverage tools and technology best practices to improve delivery performance and data engineering capabilities in the D&A space.
Hybrid role: 3 days in the office, 2 days remote.

**ROLES & RESPONSIBILITIES:**

* You will be part of a truly Agile and DevOps team which provides data engineering (ETL, Data Products, Reports) support across the R&D IT portfolio
* Deliver cost-effective solutions to support data engineering activities, for example data ETL workflows
* Test and quality-assess new D&A solutions to ensure they are fit for release: code assurance, Unit and System Integration Testing, data testing, release management control and support of UAT processes
* Ensure that business data and information assets are made available as data services and artefacts for consumption by the wider AZ enterprise

**MANDATORY SKILLS:**

* SQL proficiency for data analysis and transformation
* Database experience, preferably with Snowflake
* Experience working with a range of data analytics architectures, which may include modern data warehouses (Snowflake, Redshift), distributed computing and visualization analytics
* Familiarity with version control (branching, merging, etc.), ideally Git
* Experience and familiarity with data models and artefacts
* Ability to interpret and process data, analyze results and provide ongoing support of productionized applications
* Strong analytical skills and the ability to resolve production issues
* Understanding of the business area / process in scope
* Willingness to work in a cross-cultural environment
* Ability to work effectively independently or as part of a team to achieve objectives
* Eagerness to learn and develop new tech skills as required

**DESIRED SKILLS:**

* Experience with and knowledge of Python, particularly in conjunction with serverless technology such as AWS Lambda and AWS Glue
* Cloud engineering experience deploying Infrastructure as Code, particularly with Terraform
* Experience with CI/CD workflows
* DBT fundamentals: experience with modeling (staging, intermediate, marts), materializations (table, view, incremental, ephemeral) and project configuration


