
centience

Data Architect (Enterprise Data & AI Infrastructure)

10-12 Years
  • Posted 6 days ago

Job Description

You will architect and evolve the company's enterprise-grade data foundation, overseeing petabyte-scale datasets and mission-critical analytics workloads that power analytics, machine learning, and AI transformation. Partnering with the Head of Data Science & Analytics, you'll design scalable, governed, and intelligent data ecosystems that deliver reliable, timely insights and keep the organization ready to innovate.

Responsibilities

1. Enterprise Data Architecture

  • Design and implement multi-layer data warehouse and lakehouse architectures for structured, semi-structured, and unstructured data.
  • Define and enforce data modelling standards (3NF, dimensional, semantic).
  • Build and maintain data flow diagrams, ER models, and lineage maps for end-to-end data flow transparency and traceability.

  • Enable both real-time and batch data ingestion with low-latency, scalable design.
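The lineage maps called for above can be sketched as a simple dependency graph. A minimal illustration in Python, using hypothetical table names (not from the posting):

```python
from collections import deque

# Hypothetical lineage: each dataset maps to its direct upstream sources.
LINEAGE = {
    "raw_orders": [],
    "raw_customers": [],
    "stg_orders": ["raw_orders"],
    "dim_customer": ["raw_customers"],
    "fct_sales": ["stg_orders", "dim_customer"],
}

def upstream(dataset: str) -> set[str]:
    """Return every dataset that feeds `dataset`, directly or transitively."""
    seen, queue = set(), deque(LINEAGE.get(dataset, []))
    while queue:
        node = queue.popleft()
        if node not in seen:
            seen.add(node)
            queue.extend(LINEAGE.get(node, []))
    return seen
```

In practice this graph would be generated automatically (e.g. from dbt manifests or a catalogue tool) rather than hand-maintained, but the traversal is the same: impact analysis is a walk over the edges.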

2. Data Infrastructure & Platform Engineering

  • Lead evaluation and integration of enterprise data and AI platforms (Databricks, Snowflake, AWS Redshift, BigQuery, Informatica, Talend, dbt, Apache NiFi, Airflow, SageMaker, MLflow, Hugging Face, LangChain).
  • Design cost-efficient, high-performing data infrastructures supporting analytics, ML, and LLM applications.
  • Ensure seamless data availability for querying, reporting, model deployment, and data app development.
  • Implement pipeline monitoring, error alerting, and system health dashboards for reliability and uptime.

3. Governance, Security & Compliance

  • Establish enterprise-wide governance frameworks ensuring data discoverability, quality, and compliance.
  • Enforce masking, encryption, and least-privilege access aligned with PDPA/GDPR standards.
  • Maintain metadata catalogues and automated lineage tracking to enhance auditability.
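One common masking pattern behind the PDPA/GDPR bullet is pseudonymization: hash the identifying part of a value while keeping the analytically useful part. A minimal sketch (the salt and truncation length are illustrative choices, not a prescribed standard):

```python
import hashlib

def mask_email(value: str, salt: str = "rotate-me") -> str:
    """Pseudonymize an email: hash the local part, keep the domain for analytics."""
    local, _, domain = value.partition("@")
    digest = hashlib.sha256((salt + local).encode()).hexdigest()[:10]
    return f"{digest}@{domain}"
```

Because the same input and salt always yield the same token, masked values still support joins and counts; rotating the salt severs that linkability when required.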

4. Collaboration & Enablement

  • Partner with DS&A teams to design AI-ready data products.
  • Work with Product, Engineering, and DevOps to improve upstream data quality and standardization.
  • Enable self-service analytics through well-documented, high-quality data models.
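The well-documented models that enable self-service analytics are often expressed as lightweight data contracts. One way to sketch that in Python, with a hypothetical sales model, is a dataclass whose annotations double as a catalogue-ready schema:

```python
from dataclasses import dataclass, fields

@dataclass(frozen=True)
class SalesFact:
    """Hypothetical data product: one row per completed order line."""
    order_id: str      # natural key from the order system
    customer_key: int  # surrogate key into the customer dimension
    order_ts_utc: str  # ISO-8601 timestamp, always UTC
    net_amount: float  # line revenue after discounts, account currency

def schema(model) -> dict[str, str]:
    """Emit a column -> type listing for a data catalogue entry."""
    return {f.name: f.type if isinstance(f.type, str) else f.type.__name__
            for f in fields(model)}
```

Keeping the field comments next to the types gives downstream analysts one authoritative place to read column semantics, which is most of what "self-service" requires.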

5. Vision & Continuous Improvement

  • Introduce modern paradigms (data mesh, data fabric, lakehouse) as the company scales.
  • Lead proof-of-concepts for emerging tools and automation frameworks.
  • Drive documentation, reliability, and collaboration to strengthen the data culture.

Qualifications

  • Bachelor's/Master's in Computer Science, Information Systems, or related field.
  • 10+ years in data architecture, warehousing, or large-scale infrastructure.
  • Proven success in building enterprise DW/Lakehouse in high-volume industries (ecommerce, fintech, internet).
  • Deep expertise in AWS stack (Redshift, Glue, S3, EMR, Lake Formation, IAM, CloudWatch, SageMaker).
  • Strong in SQL, Python, and big data frameworks (Spark, Hadoop, dbt, Airflow).
  • Knowledge of real-time processing, ETL/ELT automation, and MLOps pipelines.
  • Hands-on governance and compliance experience (PDPA, GDPR).
  • AWS Certified Solutions Architect or equivalent preferred.

Success Indicators

  • 99%+ data platform uptime and data availability for analytics and ML.
  • Reduction of data issue escalations by 80% through proactive monitoring and lineage visibility.
  • Analytics and DS teams spending >70% of time on insights, not troubleshooting.
  • Recognized improvement in enterprise data maturity (e.g., via DAMA or internal assessment framework).
  • Robust, documented, and scalable architecture supporting analytics and AI initiatives seamlessly.


Job ID: 147080615