Data Architect
MXN 65,000-75,000/month
Indeed
Full-time
Hybrid
Juan González 11, Casco Urbano, 66200 San Pedro Garza García, N.L., Mexico
Description

**Location:** San Pedro Garza García, Nuevo León, Mexico (Hybrid)
**Position:** Permanent
**Required Education:** Bachelor's degree in Computer Science or a related field

**Requirements:**

* 5 years designing and developing data architectures, data integration, data transformation, and data modeling (conceptual/logical/physical).
* 5 years of advanced SQL and database design: design and creation of database objects, development of stored procedures, and query and process optimization.
* 4 years of experience with ETL/ELT tools and data warehouse architecture focused on BI, reporting, financial analytics, accounting, and human resources.
* 3 years of experience with Azure Data Lake and/or Lakehouse, lineage design, data traceability, and log analysis.
* At least 3 years developing pipelines in Azure Data Factory, APIs, and integration connectors (OData, REST, Power Platform).
* 2 years of experience with Azure Synapse and data architecture using ArchiMate or LeanIX.
* 1 year of experience (preferred) with other data integration tools such as Informatica Cloud, AWS Glue, GCP Dataflow, DataStage, or similar, plus knowledge of Databricks.

**Responsibilities:**

* Design, implement, and evolve the corporate data architecture, ensuring data integration, quality, security, and governance.
* Guide technology decisions on platforms, data models, integration flows, and best practices.
* Develop processes and architecture to document data interoperability, e.g., an interoperability matrix (sources vs. destinations).
* Design data integration and transformation models, define technical integration specifications, and establish integration patterns.
* Define and implement data architecture metrics, such as database latency and data processing time.
* Design and build data warehouses (DWH), data storage maps, and ETL/ELT pipelines, applying advanced knowledge of SQL Server.
* Collaborate with other data specialists to understand data requirements, provide required data, and define and optimize current data ingestion, extraction, and transformation processes.
* Produce technical documentation, including logical and physical database diagrams.

**Required Technologies:**

* SQL Server, Azure Data Factory, Azure Data Lake, Azure Synapse, DWH, SAP HANA.

**Profile:**

* Self-sufficient, self-taught, and creative in carrying out tasks.
* Analytical and problem-solving skills.
* Communication skills to present proposals to managers and directors.
* Ability to establish best practices for data ingestion, transformation, and storage development.
* Focused on collaborating with various data specialists.
* Proactive in proposing improvements to data extraction processes.
* Independent in producing technical documentation and development testing.

**We Offer:**

* Competitive salary (mixed scheme) + statutory benefits
* A culture of innovation and collaboration
* Leadership of transformative, high-impact projects

**Job type:** Full-time, indefinite duration
**Salary:** $65,000.00 - $75,000.00 MXN per month
**Work location:** Hybrid remote in 66278 San Pedro Garza García, N.L.

Source: Indeed