Job Title:
Senior Data Architect/Analyst - Bilingual
Job Description
We're Concentrix. The intelligent transformation partner. Solution-focused. Tech-powered. Intelligence-fueled.
We are looking for a senior-level Data Architect/Analyst with demonstrated problem-solving expertise and practical experience designing, analyzing, and supporting both OLTP and OLAP systems.
Qualifications
Experience
Data Analyst Experience (required):
Recent hands‑on experience in a Data Analyst (DA) role (current or immediately prior) performing data modeling for OLTP and OLAP/EDW, source‑to‑target data mappings, and writing complex transformation rules.
Core DA project delivery responsibilities.
Multi‑Role Capability (required):
Data Analyst (DA) primary role: Data modeling, mapping, profiling, and documentation.
Business Analyst (BA) mindset: Gathering, clarifying, and documenting business and technical requirements directly with stakeholders, as needed.
Project Manager mindset: Planning, organizing, and tracking their own technical work when no formal PM is assigned.
Look for examples of independently owned deliverables and timelines.
Availability & Work Expectations (non‑negotiable):
Time Zone Availability - Central Time.
Contractor Expectations (required):
This role requires hands‑on delivery, not advisory/consulting work.
Comfortable owning outcomes end‑to‑end.
Strong Communication & Problem‑Solving Skills (required):
Clear, professional written and verbal communication in English.
Ability to explain complex technical concepts to both technical and non‑technical stakeholders.
Screen for examples of documentation, stakeholder interaction, or cross‑team communication.
Ability to analyze ambiguous or complex problems.
Identify and document root cause.
Propose and execute viable, practical solutions.
Look for real troubleshooting or issue‑resolution examples.
Data Modeling & Architecture (required)
Experience designing and maintaining physical data models by translating business requirements into scalable solutions.
Familiarity with applying enterprise modeling best practices across EDW and OLTP, including:
Kimball dimensional modeling.
Medallion architecture.
Normalized (3NF) transactional models.
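For context on the modeling styles listed above, a minimal Kimball-style star schema (one dimension, one fact table at a declared grain) can be sketched in SQLite; all table and column names here are hypothetical examples, not part of the role's actual environment:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per customer, surrogate key as primary key.
cur.execute("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT NOT NULL,
        region        TEXT NOT NULL
    )
""")

# Fact table: grain = one row per order line, foreign key to the dimension.
cur.execute("""
    CREATE TABLE fact_sales (
        order_line_id INTEGER PRIMARY KEY,
        customer_key  INTEGER NOT NULL REFERENCES dim_customer(customer_key),
        order_date    TEXT NOT NULL,
        amount        REAL NOT NULL
    )
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "NA"), (2, "Globex", "LATAM")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(10, 1, "2024-01-05", 100.0),
                 (11, 2, "2024-01-06", 250.0),
                 (12, 2, "2024-01-07", 50.0)])

# Typical dimensional query: aggregate the fact by a dimension attribute.
rows = cur.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer d ON d.customer_key = f.customer_key
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
print(rows)  # [('LATAM', 300.0), ('NA', 100.0)]
```

A 3NF transactional model would instead split repeating attributes into separate normalized tables; the dimensional form trades that normalization for simpler, faster analytical joins.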
Solid knowledge of OLTP modeling, including SQL Server temporal tables.
Working understanding of XML and XSD, including:
Differences between XSD and XML.
Canonical data models and schema design.
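The XML-versus-XSD distinction above can be sketched briefly: the XML instance carries the data, while the XSD describes the permitted structure and types. In this hypothetical example the Python standard library parses the instance; note that stdlib `xml.etree` cannot validate against an XSD (third-party packages such as `lxml` or `xmlschema` would be needed), so the schema is shown only for contrast:

```python
import xml.etree.ElementTree as ET

# Hypothetical instance document: XML carries the actual data values.
xml_doc = '<order id="42"><sku>ABC</sku><qty>3</qty></order>'

# Corresponding XSD (shown as text only): it defines which elements,
# attributes, and types an <order> document is allowed to contain.
xsd_doc = """<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="order">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="sku" type="xs:string"/>
        <xs:element name="qty" type="xs:integer"/>
      </xs:sequence>
      <xs:attribute name="id" type="xs:integer"/>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

root = ET.fromstring(xml_doc)
print(root.tag, root.attrib["id"], root.find("qty").text)
```

A canonical data model applies the same idea at enterprise scope: one agreed schema (often expressed as XSD) that all systems map their local formats to and from.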
Fail‑Fast Execution (required): comfortable moving forward with initial solutions, testing early, identifying issues quickly, and pivoting without delay rather than waiting for perfect requirements or designs.
Data Engineering & Platform Experience
Hands‑on experience with cloud and enterprise data platforms, including:
Azure, Microsoft Fabric (preferred), AWS (acceptable).
SQL Server, Oracle.
Azure Synapse, Lakehouse architectures.
Experience implementing data distribution and partitioning strategies, including:
Azure Synapse data distribution.
Delta Lake partitioning.
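Both strategies above rest on the same idea: route each row to a bucket by hashing its distribution key, so rows sharing a key co-locate and joins or aggregations on that key avoid data movement. A minimal sketch, using a simplified 8-bucket layout (Synapse dedicated SQL pools actually use 60 distributions) and a hypothetical `distribution_for` helper:

```python
import hashlib

NUM_DISTRIBUTIONS = 8  # simplified; Synapse uses 60

def distribution_for(key: str) -> int:
    """Deterministically map a distribution key to a bucket."""
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % NUM_DISTRIBUTIONS

# Rows keyed by customer; same key always lands in the same bucket.
rows = [("cust-1", 100), ("cust-2", 250), ("cust-1", 50)]
buckets: dict[int, list] = {}
for key, amount in rows:
    buckets.setdefault(distribution_for(key), []).append((key, amount))

print({b: sorted(v) for b, v in buckets.items()})
```

Choosing a high-cardinality, evenly spread key matters in practice: a skewed key concentrates rows in a few buckets and recreates the hotspot the strategy was meant to avoid.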
Experience performing data profiling to assess quality, structure, and analytics readiness.
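The profiling work described here typically reduces to per-column statistics such as null counts, distinct counts, and fill rates. A minimal sketch over an in-memory dataset; the column names are hypothetical:

```python
rows = [
    {"customer_id": 1, "email": "a@x.com", "region": "NA"},
    {"customer_id": 2, "email": None,      "region": "NA"},
    {"customer_id": 3, "email": "c@x.com", "region": None},
]

def profile(rows: list[dict]) -> dict:
    """Return {column: {nulls, distinct, fill_rate}} for a list of dicts."""
    report = {}
    for col in rows[0]:
        values = [r[col] for r in rows]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
            "fill_rate": len(non_null) / len(values),
        }
    return report

print(profile(rows))
```

The same checks scale up as SQL aggregates (`COUNT(*) - COUNT(col)`, `COUNT(DISTINCT col)`) when profiling source tables directly.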
Senior‑level SQL expertise, including:
T‑SQL and PL/SQL.
DDL/DML creation.
Ability to create and troubleshoot stored procedures, views, and functions.
Proven experience in:
Query Performance Tuning.
Data Validation.
Ensuring accuracy, completeness, and reliability of datasets.
Data Governance Responsibilities (required)
Experience ensuring analytics and reporting data adheres to data governance standards, including:
Data quality.
Metadata and documentation.
Privacy and security considerations.
Experience acting as a technical data steward for assigned domains by:
Defining and maintaining trusted data definitions.
Partnering with business and technical teams on metrics and business rules.
CI/CD & Version Control (preferred)
Experience using GitHub for version control of SQL and other code, with an understanding of:
Branching and merging strategies.
CI/CD deployment approaches for database objects across environments.
Tools & Technologies (Hands‑On preferred)
Data modeling tools:
Erwin (preferred).
ER/Studio or equivalent accepted.
Development tools (preferred):
Visual Studio Code.
XMLSpy.
Source‑to‑target mapping:
Erwin DI Suite / Erwin Mapping Manager (preferred).
Excel‑based mappings (acceptable).
Programming:
Python (working knowledge required).
SDLC & Delivery Methodologies
Experience working in:
Waterfall.
Agile / SCRUM.
Sprint‑based delivery models.
AI & Modern Analytics Practices (Nice to See, Strong Plus)
Experience actively leveraging AI tools in daily work to:
Generate or optimize SQL/Python.
Accelerate analysis.
Improve documentation.
Support data quality and governance tasks.
Look for practical, productivity‑focused usage (not experimentation only).
Development Background (preferred)
Hands‑on development experience in cloud data platforms, including:
Pipelines.
PySpark notebooks.
Azure Data Factory.
#LI-Remote #Latam #DataArchitect #DataAnalyst #DataModeling
Location:
MEX Work-at-Home
Language Requirements:
Time Type:
Full time