
Cloud Data Engineer

3-5 Years
SGD 5,000 - 7,200 per month

This job is no longer accepting applications

Posted a month ago

Job Description

Role Summary:

We are looking for a skilled Data Engineer to design, build, and maintain scalable, cloud-native data platforms. The ideal candidate has strong programming and SQL skills, hands-on experience with AWS or Azure, and a solid understanding of data modeling, ETL pipelines, and big data processing frameworks.

Key Responsibilities:

  • Design, develop, and maintain scalable data pipelines and data platforms.
  • Build and optimize ETL/ELT workflows using PySpark, Pandas, and SQL.
  • Work with large datasets using distributed processing frameworks such as Apache Spark.
  • Design and implement data models to support analytics and reporting use cases.
  • Develop and manage data solutions on cloud platforms (AWS or Azure).
  • Orchestrate data workflows using tools such as Apache Airflow or Azure Data Factory.
  • Collaborate with cross-functional teams to deliver reliable, secure, and high-quality data solutions.
  • Implement best practices for data quality, performance, security, and governance.
  • Support CI/CD pipelines and automate deployments using DevOps best practices.

Requirements:

  • 3+ years of professional experience in Data Engineering or related roles.
  • At least 2 years of hands-on experience with cloud-native data services.
  • Strong proficiency in at least one programming language: Python, Java, or C# (.NET).
  • Advanced SQL skills with experience writing complex and optimized queries.
  • Strong understanding of data modeling concepts (e.g., dimensional modeling).
  • Experience with relational databases (PostgreSQL, MySQL, Azure SQL).
  • Experience with NoSQL databases (DynamoDB, Azure Cosmos DB) and data lakes.
  • Hands-on experience with distributed data processing frameworks (Apache Spark / PySpark).
  • Experience with workflow orchestration tools such as Airflow or Azure Data Factory.
  • Familiarity with CI/CD pipelines and Infrastructure as Code tools.
  • Experience using Git, Docker, and Terraform is a plus.

Preferred Skills:

  • Experience designing end-to-end data architectures in AWS or Azure.
  • Strong understanding of system design, scalability, and security best practices.
  • Good communication skills and ability to work in cross-functional teams.

Job ID: 139477839
