Overview:
We are looking for an experienced Data Modeler at the Architect level with deep expertise in Data Vault architecture, data modeling best practices, and hands-on implementation of hubs, links, satellites, bridge tables, and VaultSpeed concepts. The successful candidate will act as the Lead Data Architect, guiding and mentoring offshore data engineers while providing architectural leadership to client stakeholders.
Key Responsibilities:
- Lead the design and development of enterprise-grade Data Vault data models, including hubs, links, satellites, PIT tables, and bridge tables.
- Provide architectural direction and define data modeling standards, guidelines, and best practices.
- Act as a technical lead for offshore data engineering teams: guiding work, reviewing deliverables, and ensuring high quality.
- Collaborate with business and technical stakeholders to understand requirements and convert them into scalable data models.
- Advise and consult client management on architecture decisions, modernization strategy, and Data Vault best practices.
- Drive data governance, metadata management, and data quality improvements.
- Ensure optimal performance, scalability, and maintainability across data platforms.
- Work closely with ETL/data engineering teams to ensure successful implementation of data models.
Required Skills & Experience:
- 8-12 years of overall experience in data architecture, data modeling, and data engineering.
- Strong expertise in Data Vault 2.0 methodology and hands-on experience building:
  - Hubs
  - Links
  - Satellites
  - Bridge tables
  - PIT and reference tables
- Experience with VaultSpeed or similar Data Vault automation tools is an advantage.
- Solid understanding of relational modeling, dimensional modeling, and enterprise data warehousing concepts.
- Experience working in cloud data platforms (Azure, AWS, or GCP).
- Strong communication skills with the ability to advise senior client stakeholders.
- Proven leadership experience managing offshore or distributed data teams.
- Ability to drive solutioning, provide technical guidance, and ensure architectural alignment.
Nice-to-Have:
- Exposure to Snowflake, Databricks, BigQuery, Redshift, or similar cloud DW platforms.
- Knowledge of ELT/ETL tools and pipeline orchestration.
- Experience in the banking or financial services domain is a plus.