- Design and implement scalable data architecture using AWS services such as S3, Glue, Redshift, Lambda, and Step Functions.
- Build and maintain robust ETL/ELT pipelines to transform raw data into clean, structured datasets.
- Ensure data solutions are reliable, high-performing, and production-ready.
- Ensure strong data governance, security, and quality control across all pipelines and datasets.
- Write clean, well-documented code and support testing, automation, and version control practices.
- Collaborate closely with analysts, business stakeholders, and IT teams to understand data needs and deliver user-friendly data solutions.
- Contribute to a knowledge-sharing culture and help uplift the technical standard of the team.
Requirements:
- 4-5 years of experience in data engineering, including at least 2 years in cloud-based environments (preferably AWS).
- Experience in building and managing data models and data pipelines for large-scale data environments.
- Proficiency with AWS services: S3, Glue, Redshift, Lambda, and Step Functions.
- Strong problem-solving skills and ability to work independently within a cross-functional team.
- Strong SQL and Python skills for data transformation and automation.
- Hands-on experience working with various data sources, including structured and unstructured data.
- Solid grasp of data warehousing concepts, data modeling, and modern ELT/ETL practices.
- Experience building a data warehouse for a large enterprise.
- Experience with Power BI, Snowflake on AWS, or Databricks.
- Understanding of AWS-based data security and governance best practices.
- Familiarity with DevOps practices.
Interested candidates, please click Apply.
Please note that only shortlisted candidates will be notified.
EA Registration No: R1655133 | Links HR Singapore Pte Ltd | EA License No: 09C5322