Requirements (Mandatory)
- 10+ years in the data industry performing solution design, architecture, and engineering tasks.
- 3+ years of Databricks solution implementation experience.
- Proven consulting/services experience and ability to present solutions to senior client stakeholders.
- Hands-on experience leading large data platform implementations.
- Deep expertise in Databricks, including lakehouse architectures, MLflow, and Delta Lake.
- Strong knowledge of cloud concepts and architecture (AWS).
- Proficiency in data modelling, Spark, SQL, ETL/ELT, and orchestration tools.
- Understanding of data governance with Unity Catalog, security, and modern data platform design.
Requirements (Nice to have)
- Knowledge of AI/ML requirements and the ability to work with AI/ML experts.
- Presales experience, including use case definition, POC creation, and RFP responses.
- Knowledge of MLOps, data observability, DevOps/CI-CD, and BI tools.
- Databricks or AWS certifications.
- Excellent communication, client-facing abilities, and solution articulation.
- Hands-on, consultative, and able to lead teams and delivery.
Job Responsibilities
- Lead end-to-end data platform implementations on Databricks, ensuring scalable and high-quality delivery.
- Align data architecture with AI/ML initiatives and guide clients on data readiness for AI.
- Mentor and develop data engineering and architecture teams.
- Contribute best practices, reference architectures, and reusable assets to the internal Solution Review Board.