We're seeking a hands-on Resident Solution Architect (Databricks) with deep technical expertise in building and optimizing Lakehouse-based data and AI solutions. This is a contract role based in Singapore, offering the flexibility to work on-site with clients or remotely within the region.
In this role, you'll design, develop, and operationalize Delta Lakehouse architectures using Databricks, driving real-world outcomes for enterprise customers. You'll take ownership of implementation tasks, lead technical delivery, and mentor engineering teams on best practices across data engineering, governance, and AI.
Key Responsibilities
Design and implement scalable data pipelines using Delta Live Tables (DLT), Spark SQL, Python, or Scala
Optimize ETL, streaming, and ML workloads for performance, cost efficiency, and reliability
Administer and configure Databricks Workspaces, Unity Catalog, and cluster policies for secure, governed environments
Automate infrastructure and deployments using Terraform, Git, and CI/CD pipelines
Implement observability, cost optimization, and monitoring frameworks using tools like Splunk, Prometheus, or CloudWatch
Collaborate with customers to build AI and LLM solutions leveraging MLflow, DBRX, and Mosaic AI
Required Skills & Experience
Strong hands-on experience with Databricks, including workspace setup, notebooks, clusters, and job orchestration
Expertise in Delta Lake, DLT, Unity Catalog, and SQL Warehouses
Proficiency in Python or Scala for data engineering and ML workflows
Strong understanding of AWS, Azure, or GCP cloud ecosystems
Experience with Terraform automation, DevOps, and MLOps practices
Familiarity with monitoring and governance frameworks for large-scale data platforms
Nice to Have
Experience developing AI/LLM pipelines and RAG architectures on Databricks
Exposure to integrations with Amazon Bedrock, OpenAI, or Hugging Face
Databricks certifications (Data Engineer, Machine Learning, or Solutions Architect)