
COMBUILDER PTE LTD

Data Engineer

5-7 Years
SGD 7,500 - 9,900 per month
  • Posted 8 hours ago

Job Description

Responsibilities:

  • Design, develop, and maintain scalable data pipelines to process large volumes of structured and unstructured data.
  • Develop and optimize ETL and data transformation processes using Python, Java, Apache Spark, Databricks and SQL.
  • Implement data lake and data warehouse architectures using modern cloud technologies.
  • Build and maintain data ingestion frameworks for batch and streaming data pipelines.
  • Implement data quality checks, validation, and monitoring to ensure data accuracy and reliability.
  • Develop and manage Databricks-based data processing workflows.
  • Work with Azure or AWS cloud platforms to deploy and maintain data engineering solutions.
  • Implement a bronze, silver, and gold (medallion) data architecture using Delta Lake.
  • Optimize job performance through data partitioning, compaction, and query optimization techniques.
  • Develop and manage workflow orchestration pipelines using tools such as Apache Airflow or similar technologies.
  • Collaborate with DevOps teams to implement CI/CD pipelines for data platform deployments.
  • Work closely with data analysts and business teams to translate business requirements into technical data solutions.
  • Prepare technical documentation and maintain data architecture diagrams and pipeline documentation.
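The data quality and medallion-architecture duties above can be illustrated with a minimal sketch. In practice this logic would run on Spark/Delta Lake; here plain Python dicts stand in for rows, and all names (`promote_to_silver`, the field names) are illustrative, not from the posting.

```python
# Sketch of a "validate then promote" step in a bronze -> silver pipeline:
# rows that pass the null checks move to the silver layer; the rest are
# quarantined for inspection. Hypothetical names; real pipelines would use
# Spark DataFrames and Delta Lake tables instead of Python lists.

def promote_to_silver(bronze_rows, required_fields):
    """Split raw rows into validated (silver) and quarantined sets."""
    silver, quarantine = [], []
    for row in bronze_rows:
        if all(row.get(field) is not None for field in required_fields):
            silver.append(row)
        else:
            quarantine.append(row)
    return silver, quarantine

bronze = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},  # fails the required-field check
]
silver, quarantine = promote_to_silver(bronze, ["id", "amount"])
```

The same split-and-quarantine pattern underlies most data quality frameworks; only the execution engine changes at scale.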

Requirements

  • Bachelor's degree in Computer Science, Information Technology, Data Engineering, or a related field.
  • Minimum 5 to 7 years of experience in data engineering or big data development, including building data processing frameworks and ETL pipelines with Python, Java, Apache Spark, SQL, and Databricks on cloud platforms, following modern data engineering practices.
  • Proven experience building enterprise-scale data pipelines and data platforms, with hands-on project experience in Databricks, Delta Lake, data lake architectures, big data platforms, and workflow orchestration tools such as Apache Airflow.
  • Experience working in Agile development environments.
  • Experience with cloud-based data infrastructure, such as Microsoft Azure or Amazon Web Services (AWS).
  • Preferred certifications: Azure Data Engineer and Azure Fundamentals.
  • Experience with modern DevOps tools such as Docker, Kubernetes, Jenkins, Git / GitHub / Bitbucket, Maven, SonarQube, and Jira.
  • Experience working with large-scale financial or enterprise data platforms is advantageous.
  • Knowledge of data governance, data quality frameworks, and performance optimization techniques.

More Info


Job ID: 144127709
