
About the Company
Headquartered in Singapore, this high-growth global tech firm is scaling at pace. You will join a fast-moving yet stable environment and drive innovation across its international data footprint.
About the Role
You will own the design and optimization of our global data ecosystem, ensuring high-performance pipelines and robust governance.
Architect scalable ETL/ELT pipelines using the full AWS stack (Glue, Redshift, Lambda).
Drive CI/CD automation, CDC, and performance partitioning for high-speed data delivery.
Manage regional data lakes (S3) and build complex dimensional models.
Ensure elite data quality and strict GDPR/PDPA compliance.
Minimum 3 years in Data Engineering.
Mandarin proficiency required, as you will liaise with counterparts in China who speak and write only in Mandarin.
Expert in Python and SQL, hands-on with AWS (Glue, Redshift, S3, Athena, EMR).
Databases: skilled in relational (PostgreSQL) and NoSQL (DynamoDB/MongoDB).
To apply online, please use the apply function; alternatively, you may contact Evangeline. (EA: 94C3609/ R24124002)
Job ID: 146388885