- Regional scale SaaS Platform
- Work within a regional team across multiple locations in Asia
- Led end-to-end development of scalable Azure-based data platforms
Summary
We are partnering with a prominent healthcare SaaS platform to support their expansion, specifically the recruitment of a Senior Data Architect. Reporting to the Head of Platform, the role will primarily be responsible for designing and delivering robust, scalable data architectures within Azure Cloud. The ideal candidate combines strategic vision with hands-on technical skills (particularly cloud-native data platforms, Infrastructure as Code and modern data architecture) to drive innovation and efficiency through collaboration.
This is a great opportunity to gain hands-on experience with large-scale regional projects and collaborate with teammates across multiple locations in Asia. Compensation for this position is highly competitive, exceeding market norms.
Responsibilities
- Architect and deploy scalable data platforms on Azure with Databricks, ensuring high performance, reliability and security.
- Design and administer cloud data infrastructure leveraging Infrastructure as Code (IaC) practices.
- Partner with engineers, architects, and business stakeholders to deliver resilient and effective data solutions.
- Establish and enforce governance and security standards through access controls and data management frameworks.
- Provide technical leadership while mentoring engineering teams to build expertise and maintain quality.
- Align data platform designs with enterprise strategies and overarching business objectives.
- Optimize cloud resources to enhance efficiency, control costs and support scalable data workloads.
Candidate profile
- Degree holder in Computer Science, Information Technology or a relevant discipline
- Minimum of 8 years' experience in cloud data platform architecture, with deep expertise in Microsoft Azure.
- Proven track record in designing and deploying Databricks-based solutions, including Spark, Delta Lake, MLflow and Unity Catalog.
- Strong knowledge of data engineering, networking, cloud architecture and modern data practices (ETL/ELT, streaming, data modeling).
- Hands-on proficiency with Infrastructure as Code (Terraform, ARM templates) and DevOps tools (GitHub, CI/CD).
- Skilled in scripting and automation using Python, PowerShell, or Bash, with a focus on efficiency and scalability.
- Experienced in data governance, access management, metadata management and compliance frameworks (PDPA, GDPR, HIPAA).
- Familiar with observability and monitoring tools (Azure Monitor, Log Analytics, Grafana) to ensure reliability and performance.
- Excellent communication, documentation and stakeholder management skills, with experience in Agile/Scrum environments.
Interested in exploring this role further? Reach out to Kelvin Lau for a confidential conversation.