
Senior Data Engineer (Databricks)

4-7 Years
SGD 6,500 - 8,500 per month
  • Posted 4 days ago

Job Description

Position: Senior Data Engineer/Data Lead - Databricks

Location: Singapore

Type of Employment: 2-year contract (subject to extension or conversion to permanent employment upon completion)

Purpose of the Position: The Senior Data Engineer / Data Lead will be responsible for designing, developing, and optimizing scalable cloud-based data platforms using Databricks or Snowflake. The role involves leading data engineering initiatives, building high-performance data pipelines, and enabling advanced analytics and AI use cases through robust cloud data architectures.

Key Result Areas and Activities:

  • Design and implement end-to-end data pipelines using Databricks or Snowflake on cloud platforms.
  • Develop scalable data processing workflows using Python and PySpark for large-scale structured and semi-structured datasets.
  • Design and implement optimized data warehouse schemas, including star schema, snowflake schema, and dimensional modeling to support enterprise reporting and analytics.
  • Architect and optimize data lake / lakehouse / warehouse solutions leveraging cloud services.
  • Implement best practices for data modeling, partitioning, performance tuning, and cost optimization.
  • Build batch and real-time ingestion pipelines integrating data from various enterprise source systems.
  • Ensure data quality, governance, and security compliance within cloud environments.
  • Collaborate with data scientists, analysts, and business stakeholders to translate requirements into scalable technical solutions.
  • Lead technical discussions and mentor junior data engineers.
  • Troubleshoot production data issues and ensure high availability of data platforms.

Essential Skills:

  • Strong hands-on experience with Databricks or Snowflake.
  • Proficiency in Python and PySpark for distributed data processing.
  • Experience designing data lakes, lakehouses, or cloud data warehouses.
  • Strong knowledge of SQL and data modeling concepts.
  • Experience with performance tuning and optimization in distributed systems.

Desirable Skills:

  • Experience with Delta Lake or Snowflake advanced optimization techniques.
  • Knowledge of data security, encryption, and compliance frameworks.
  • AWS certification (e.g., Solutions Architect, Data Analytics Specialty).

Qualities:

  • Strong architectural thinking with attention to scalability and performance.
  • Able to communicate complex technical concepts clearly to stakeholders.
  • Strong ownership mindset with leadership capability.
  • Comfortable working in Agile and cross-functional environments.
  • Able to manage multiple initiatives in fast-paced delivery settings.

Job ID: 143654275
