
Zenika Singapore

Data Engineer

5-7 Years
  • Posted a day ago

Job Description

Is there a Zenika in you?

Let's talk skills and passion first.

You're a data enthusiast who thrives on transforming complexity into clarity. With deep technical expertise in AWS and Databricks, you design robust, scalable, and high-performance data pipelines that power intelligent decisions. Curious by nature, you're constantly exploring new tools, automation techniques, and architectures to deliver meaningful data solutions that scale.

Your Role as a Zenika Consultant

As a Data Engineer Consultant, you'll play a key role in designing and implementing data platforms for our clients — particularly in public sector and enterprise environments. You'll work hands-on with technologies like AWS, Databricks, and PySpark, and collaborate with cross-functional teams to deliver scalable, production-ready data solutions.

You'll work on projects that will allow you to:

  • Design and build enterprise-scale data architectures — including data lakes, warehouses, and real-time streaming pipelines
  • Develop and maintain high-performance ETL/ELT pipelines that process large volumes of structured and unstructured data
  • Implement and optimise data transformations in Databricks using PySpark, Python, and SQL, ensuring quality, scalability, and cost-efficiency
  • Collaborate and mentor — guide junior engineers, review code, and drive adoption of best practices in data engineering and DevOps
  • Integrate and automate data flows using APIs, AWS services (Glue, Redshift, S3, EMR, Lambda), and real-time tools like Kafka or Kinesis
  • Monitor and troubleshoot performance bottlenecks, ensuring reliability, consistency, and security across all data operations
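To give a flavour of the ETL work described above, here is a minimal, self-contained sketch in plain Python — the records, field names, and validation rules are purely illustrative, and on a real engagement this would typically be a PySpark job running on Databricks:

```python
# Minimal ETL sketch: extract raw records, clean and validate them,
# and keep only the rows that pass quality checks.
# All data and column names here are hypothetical.
from datetime import date

raw_records = [
    {"id": "1", "amount": "120.50", "day": "2024-03-01"},
    {"id": "2", "amount": "oops",   "day": "2024-03-02"},  # bad amount -> rejected
    {"id": "3", "amount": "75.00",  "day": "2024-03-03"},
]

def transform(record):
    """Parse and validate one raw record; return None if it fails checks."""
    try:
        return {
            "id": int(record["id"]),
            "amount": float(record["amount"]),
            "day": date.fromisoformat(record["day"]),
        }
    except (KeyError, ValueError):
        return None

clean = [r for r in (transform(rec) for rec in raw_records) if r is not None]
total = sum(r["amount"] for r in clean)
print(len(clean), total)  # 2 valid rows, 195.5 total
```

The same extract–validate–aggregate shape scales up naturally: in PySpark the list comprehension becomes a DataFrame transformation distributed across a cluster.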

What You Bring

  • 5–7 years of experience in data engineering, ideally on AWS-native platforms.
  • Proven experience building ETL/ELT pipelines, migrating data solutions, and managing real-time streaming architectures.
  • Hands-on expertise in Databricks, PySpark, Python, and SQL for large-scale data transformation.
  • Strong understanding of data warehouse design (Redshift, Snowflake) and data governance principles.
  • Familiarity with DevOps concepts, including CI/CD workflows and version control (GitLab, GitHub).
  • Experience with serverless compute (AWS Lambda, Azure Functions) and automation scripting.
  • Solid grasp of data security, performance optimisation, and cost management in cloud environments.
  • Excellent communication and collaboration skills — you translate complex technical details into clear business language.
  • Bonus: Experience with public sector or HR analytics projects, as well as mentoring and leading data teams.
  • Bachelor's degree in Computer Science, Data Engineering, or a related field (Master's preferred)
  • Certifications such as: AWS Certified Data Engineer – Associate or AWS Certified Solutions Architect, Databricks Certified Data Engineer (Associate/Professional), Snowflake or Redshift certifications (a plus), Agile/Scrum Master certification (preferred)

Why Join Zenika

  • Work with a global client base across 11 locations, benefiting from over 28,000 Zenika-led training sessions
  • Partner with tech giants like Google Cloud and Scrum.org, and engage in research, open-source contributions, and conferences outside client projects
  • Connect and grow with fellow experts through our annual TechnoZaures, sharing skills and know-how

Ready to code your story with us? Apply now!

More Info

Job ID: 147144893
