What You'll Do:
- Design and deploy data models, tables, views, and data marts across data warehouses, lakes, and operational stores. Build APIs and backend structures to support applications and analytics.
- Clean, transform, and integrate data from multiple sources to ensure accuracy, reliability, and consistency. Automate data workflows, including web scraping when needed.
- Develop and maintain large-scale batch and real-time pipelines using ETL/ELT frameworks and modern tools such as AWS Glue, Python, Lambda, Spring, Spark, or similar technologies.
- Partner with Product Managers, Data Architects, Analysts, and Developers to deliver end-to-end data-driven solutions. Participate in Agile ceremonies, code reviews, and CI/CD workflows.
- Build cloud-native solutions on AWS, Azure, or GCP. Work with big data tools like Hadoop, Spark, Kafka, and RabbitMQ to enable scalable and resilient data flows.
- Apply best practices in data governance, secure storage, access control, and compliance. Ensure proper handling of sensitive and critical data assets.
What We're Looking For:
- Strong programming skills in SQL, Python (pandas), R, or equivalent for data processing and transformation.
- Hands-on experience with ETL/ELT pipeline development and workflow orchestration.
- Proficiency in relational and non-relational databases (PostgreSQL, MySQL, SQL Server, MongoDB, Cassandra, Athena, S3, etc.).
- Experience with data warehouses, data lakes, data marts, and virtualization concepts.
- Knowledge of distributed systems and big data frameworks such as Hadoop, Spark, Kafka, and streaming/microbatch pipelines.
- Comfortable working with APIs, system integration, and backend development.
- Familiarity with web scraping tools (BeautifulSoup, Selenium, Node.js) and automation practices.
- Understanding of data security, access control, encryption, and audit logging.
- Experience in cloud environments and DevOps-oriented CI/CD workflows.
Soft Skills & Attributes:
- Analytical thinker with strong problem-solving skills.
- Excellent communication skills, able to bridge technical and non-technical teams.
- Collaborative mindset and self-motivated in fast-paced, Agile environments.
- Detail-oriented and proactive, with a focus on delivering reliable data solutions.
Qualifications:
- Degree or Diploma in Computer Science, Information Technology, Data Science, or related fields.
- Proven experience in data engineering, big data processing, cloud data infrastructure, or related areas.
Interested candidates are encouraged to submit their resumes outlining their relevant experience and achievements to apply88(@)talentvis.com or click apply!
We regret to inform that only shortlisted candidates will be notified.
EA License No: 04C3537
EA Personnel No: R22106683
EA Personnel Name: Yang Hui Shan, Sherri