At Dymon Asia Capital, a leading Alternative Investment Manager, we are committed to delivering superior risk-adjusted returns to our investors within a collegial environment that fosters a culture of Integrity, Teamwork, Respect, Excellence and Entrepreneurism.
We are seeking an experienced Database Engineer with strong cloud infrastructure expertise to design and scale our next-generation data platform. This individual will play a pivotal role in shaping the architecture that powers our multi-asset investment strategies, quantitative research, risk management, and AI-driven workflows.
Responsibilities:
- Design and implement a scalable, cloud-native data architecture leveraging Azure, Databricks, and modern data lakehouse technologies.
- Define and enforce standards for data storage, transformation, and consumption using open table formats (Iceberg, Delta) and efficient file formats (Parquet).
- Architect and optimise distributed compute and orchestration environments (Kubernetes, Databricks clusters) for performance, cost efficiency, and resilience.
- Collaborate with data engineers, DevOps, and researchers to design robust pipelines, including support for real-time and batch data.
- Implement infrastructure-as-code (Terraform, ARM templates, etc.) to automate the deployment, management, and optimisation of data environments.
- Actively contribute to data governance efforts, focusing on lineage, metadata, and security models, to ensure compliance and reliability across the data ecosystem.
- Evaluate emerging technologies and frameworks, guiding the firm's strategy for scalable and future-proof data infrastructure.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Data Architect, Senior Data Engineer, or similar role in a data-intensive environment.
- Deep expertise with Azure cloud services, Databricks, and Kubernetes.
- Strong knowledge of data lakehouse architectures, open table formats (Iceberg, Delta Lake), and efficient file formats such as Parquet.
- Hands-on experience with infrastructure-as-code (Terraform, Ansible, or similar).
- Strong understanding of data modelling, distributed systems, and scalable architecture patterns.
- Familiarity with streaming frameworks (Kafka, Spark Structured Streaming) and batch ETL/ELT pipelines.
- Excellent communication and stakeholder management skills, with the ability to translate business needs into technical architecture.