




**SUMMARY**

The Programmer Analyst will enable marketing analytics by building and maintaining our Google Cloud data foundation. They will transform vendor and internal sources into dependable BigQuery datasets and views, run scheduled jobs, and keep documentation, monitoring, and alerting current. Working with Marketing and IT, they will turn business questions into technical plans, support Looker models and dashboards, and make data easy to find, trust, and reuse across the organization. They will also help consolidate customer identifiers and events from multiple systems, reconciling keys, defining update rules, and maintaining fresh profiles that support activation and measurement. Core activities include writing SQL in BigQuery, designing tables and views for performance and usability, implementing data quality checks, and applying appropriate access controls for sensitive data. The role values clear communication, ownership of outcomes, and a consistent focus on reliability and usability. Familiarity with modern AI tools for development and general use is strongly preferred.

**RESPONSIBILITIES**

* Build and maintain ELT pipelines in Google Cloud that land, transform, and publish data to BigQuery from internal systems and third-party sources, using Cloud Storage, scheduled queries, and lightweight serverless jobs with Cloud Scheduler and Cloud Functions or Cloud Run.
* Design and document datasets, tables, and views that support marketing analytics use cases, with clear naming, partitioning and clustering, and attention to performance (see the sketch after this list).
* Contribute to Looker by adding or refining LookML models, dimensions, measures, and Explores, and by maintaining dashboards with sensible refresh and performance settings.
* Implement data quality checks and basic alerting for pipeline health and data completeness; maintain runbooks and respond to incidents with timely fixes and clear communication.
* Partner with Marketing and IT to translate requirements into technical tasks, propose simple solution designs, estimate effort, and track delivery.
* Integrate identifiers and events from multiple systems, aligning schemas and keys, defining update and merge rules, and keeping profiles current for activation and measurement needs.
* Apply security and stewardship practices, including least-privilege IAM, careful handling of PII, and clear lineage and assumptions in documentation.
* Use modern AI assistants where appropriate to accelerate routine tasks such as drafting SQL, documenting changes, or generating tests.
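To make the table-design expectations above concrete, here is a minimal BigQuery sketch; the `marketing_analytics` dataset and all table and column names are hypothetical illustrations, not an actual schema:

```sql
-- Hypothetical example: dataset, table, and column names are illustrative only.
-- Partitioning by event date limits query scans to the dates actually touched;
-- clustering by customer_id and event_name co-locates rows for common filters.
CREATE TABLE IF NOT EXISTS marketing_analytics.events (
  event_id      STRING    NOT NULL,
  customer_id   STRING,
  event_name    STRING,
  event_ts      TIMESTAMP NOT NULL,
  source_system STRING
)
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id, event_name;
```

This kind of design choice keeps scheduled queries cheap (date partitions bound the scan) while matching the filter columns most marketing queries actually use.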
**REQUIREMENTS**

* **Degree:** BA or BS in Computer Science, Information Systems, Engineering, Statistics, or equivalent practical experience through internships, capstone projects, or personal projects.
* **Years of experience:** 0 to 2 years in data engineering, analytics engineering, or business intelligence.
* **Required technical skills:**
  - SQL in BigQuery, including joins, window functions, common table expressions, and performance basics (illustrated in the sketch at the end of this posting).
  - Google Cloud fundamentals for data work, including BigQuery, Cloud Storage, IAM basics, and scheduling with Cloud Scheduler.
  - Version control with Git and GitHub, including branching and pull requests.
  - Data modeling concepts that support analytics, including star or snowflake patterns and pragmatic table design.
* **Preferred skills:**
  - Looker and LookML for semantic modeling and dashboards.
  - Dataform or dbt for ELT orchestration and testing.
  - Familiarity with Airflow or Cloud Composer.
  - Python or a similar language for utilities, API integrations, or data quality checks.
  - Working with REST and JSON for vendor data feeds.
  - Experience using AI assistants for coding and documentation, and interest in Vertex AI.
* **Other:** Proficient in English, written and spoken.
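As a gauge of the required SQL level, the sketch below combines a common table expression with a window function to keep the most recent profile row per customer, a pattern that recurs in the identity and profile work described above. The `customer_profiles` table and its columns are hypothetical:

```sql
-- Hypothetical example: table and column names are illustrative, not a real schema.
-- The CTE ranks each customer's profile rows by recency; the outer query
-- keeps only the latest row per customer_id.
WITH ranked AS (
  SELECT
    customer_id,
    email,
    updated_at,
    ROW_NUMBER() OVER (
      PARTITION BY customer_id
      ORDER BY updated_at DESC
    ) AS rn
  FROM marketing_analytics.customer_profiles
)
SELECT customer_id, email, updated_at
FROM ranked
WHERE rn = 1;
```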


