Key Responsibilities:
- Design, develop, test, and maintain scalable ETL pipelines to meet business, technical, and user requirements.
- Collect, refine, and integrate new datasets. Maintain comprehensive documentation and data mappings across multiple systems.
- Create optimized and scalable data models that align with organizational data architecture standards and best practices.
- Drive continuous improvement in data quality through optimization, testing, and solution design reviews.
- Ensure all solutions conform to big data architecture guidelines and long-term roadmap.
- Implement robust monitoring, logging, and alerting systems to ensure pipeline reliability and data accuracy.
- Apply best practices in data engineering to design and build reliable data marts within the Hadoop ecosystem for planning, reporting, and analytics.
- Maintain and optimize data pipelines to ensure data accuracy, integrity, and timeliness.
- Manage code in a centralized repository with clear branching strategies and well-documented commit messages.
- Coordinate with stakeholders to ensure smooth production deployment and adherence to data governance policies.
- Proactively identify and implement improvements to data engineering processes and workflows.
- Develop end-to-end solutions for data modeling in the data warehouse, including data acquisition, contextualization, and integration with business processes.
- Ensure adherence to development standards and perform periodic reviews to maintain pipeline performance and sustainability.
- Coordinate and conduct testing with stakeholders to ensure effective deployment of data pipelines and dashboards.
- Serve as the primary data engineering contact at the stakeholder location, ensuring clear communication and alignment on priorities.
- Monitor data pipelines continuously and collaborate with stakeholders to troubleshoot and optimize performance.
- Leverage domain knowledge in insurance to design data models and pipelines that support business processes and analytics.
- Work closely with business stakeholders to understand requirements and translate them into scalable data engineering solutions.
Requirements:
- A degree with at least 10 years of working experience, preferably in the banking or insurance industry.
- Proven experience in data engineering, ETL development, and big data technologies.
- A strong team player who is meticulous, detail-oriented, and capable of performing under pressure.
- Proficiency in tools and platforms such as Hadoop, Spark, Hive, and cloud data services (e.g., AWS, Azure, GCP).
- Strong problem-solving and interpersonal skills.
- Committed, dependable, and adaptable, with the flexibility to support peak periods and tight deadlines.
Interested applicants, please email and look for
Jensen Fang Lifa
Recruit Express Pte Ltd
EA License No. 99C4599
EA Personnel Registration Number: R2197080
We regret that only shortlisted candidates will be contacted.