Data Engineer (Databricks & L3 Operations)

8-10 Years
SGD 7,800 - 8,200 per month

Job Description

Role Summary

We are seeking an experienced Databricks Operations & Implementation Engineer to design, implement, and manage high-performance data pipelines and operational processes within the Databricks environment. The ideal candidate will combine deep technical expertise in Databricks, Apache Spark, and AWS Cloud with strong operational discipline, ensuring platform stability, governance, and continuous optimization.

Key Responsibilities

Implementation

  • Design, build, and optimize ETL/ELT pipelines leveraging Databricks native capabilities to process large-scale structured and unstructured datasets (an illustrative sketch follows this list).
  • Implement data quality frameworks and monitoring solutions using Databricks built-in features to ensure data reliability and consistency.
  • Establish governance, security, and compliance best practices across Databricks environments and integrate with enterprise systems.
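
For illustration only, a minimal PySpark sketch of the ETL/ELT pipeline work described in the first bullet above; the landing path and table name are hypothetical placeholders, not part of this role's actual environment.

    # Illustrative ETL/ELT sketch; the S3 path and table name are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # provided as `spark` in Databricks notebooks

    # Extract: read raw files from a landing zone.
    raw = spark.read.json("s3://example-bucket/landing/orders/")

    # Transform: de-duplicate, type the timestamp, and drop rows without a key.
    clean = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .filter(F.col("order_id").isNotNull())
    )

    # Load: persist as a Delta table registered in the metastore / Unity Catalog.
    clean.write.format("delta").mode("overwrite").saveAsTable("analytics.orders_clean")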

Operational Management

  • Monitor and maintain production data pipelines to ensure 99.9% uptime and optimal performance.
  • Implement logging, alerting, and monitoring solutions using Databricks and enterprise tools.
  • Perform cluster health checks, resource utilization reviews, and performance tuning to prevent bottlenecks.
  • Manage incident response for Databricks pipeline failures, including root cause analysis and resolution.
  • Develop and maintain disaster recovery and backup strategies for critical data assets.
  • Conduct cost and performance optimization of Spark jobs and Databricks clusters.
  • Implement automated testing frameworks (unit, integration, and data validation tests) for Databricks pipelines (an illustrative check follows this list).
  • Maintain detailed runbooks, operational documentation, and troubleshooting guides.
  • Coordinate system upgrades and maintenance windows with minimal business disruption.
  • Manage user access, workspace configuration, and security controls within Databricks.
  • Oversee data lineage and metadata using Databricks Unity Catalog for transparency and compliance.
  • Conduct capacity planning and cost forecasting for Databricks infrastructure and workloads.
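
For illustration only, a minimal sketch of the kind of automated data-validation check referenced above, written so that a failed check fails the Databricks job task and surfaces through the job's alerting; the table name and checks are hypothetical.

    # Illustrative data-quality check; table name and thresholds are hypothetical.
    # Intended to run as a Databricks job task so a failed assertion fails the task and triggers alerts.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    df = spark.table("analytics.orders_clean")

    row_count = df.count()
    null_keys = df.filter(F.col("order_id").isNull()).count()

    assert row_count > 0, "orders_clean is empty"
    assert null_keys == 0, f"{null_keys} rows have a NULL order_id"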

Collaboration & Leadership

  • Provide technical mentorship to team members on Databricks best practices and data engineering techniques.
  • Participate in on-call rotations for production systems and ensure platform stability.
  • Lead operational reviews and contribute to continuous improvement initiatives for platform reliability.
  • Collaborate with infrastructure and security teams on cluster provisioning, networking, and access controls.

Requirements / Qualifications

Education & Experience

  • Bachelor's Degree in Computer Science, Computer Engineering, or equivalent field.
  • 8-10 years of experience in system operations, data platform management, or cloud operations.
  • Hands-on project experience with the Databricks platform (primary requirement).
  • Proven experience in cloud operations or architecture (AWS preferred).
  • AWS Cloud Certification required; Databricks Certification highly preferred.

Core Technical Skills

  • Expert proficiency in Databricks platform administration, workspace management, cluster configuration, and job orchestration.
  • Deep expertise in Apache Spark (Spark SQL, DataFrames, RDDs) within Databricks.
  • Strong experience with Delta Lake (ACID transactions, versioning, time travel); an illustrative sketch follows this list.
  • Hands-on experience with Databricks Unity Catalog for metadata management and data governance.
  • Comprehensive understanding of data warehousing, data profiling, validation, and analytics concepts.
  • Strong knowledge of monitoring, incident management, and cloud cost optimization.
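
For illustration only, a short sketch of the Delta Lake versioning and time-travel features listed above; the table name is hypothetical.

    # Illustrative Delta Lake versioning / time-travel queries; the table name is hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Inspect the table's commit history (one row per Delta transaction).
    spark.sql("DESCRIBE HISTORY analytics.orders_clean").show(truncate=False)

    # Time travel: query the table as it was at an earlier version.
    spark.sql("SELECT * FROM analytics.orders_clean VERSION AS OF 0").show(5)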

Technology Stack Exposure

  • Databricks (core platform expertise).
  • AWS Cloud Services & Architecture.
  • Informatica Data Management Cloud (IDMC).
  • Tableau for reporting and visualization.
  • Oracle Database administration.
  • ML Ops practices within Databricks (advantageous).
  • Familiarity with STATA, Amazon SageMaker, and DataRobot integrations (nice-to-have).

Morgan McKinley Pte Ltd

EA License No: 11C5502

EAP No. R23111942

EAP: Feng Ye

Job ID: 128865787
