
Data Platform Engineer (CAT 1)

2-4 Years
SGD 4,500 - 7,000 per month

Job Description

Data Pipeline Development:

  • Design, develop, and maintain end-to-end data pipelines for ingesting, transforming, and delivering data from multiple source systems (databases, files, APIs, streaming platforms).
  • Build and optimize ETL / ELT workflows using SQL, Python, and enterprise data integration tools.
  • Ensure data pipelines are scalable, resilient, and performant to meet operational and analytical requirements.
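As a minimal, illustrative sketch of the ingest-transform-load pattern described above (the CSV schema, `orders` table, and use of SQLite in place of an enterprise RDBMS are all hypothetical, chosen only to keep the example self-contained and runnable):

```python
import csv
import io
import sqlite3

def run_pipeline(csv_text: str, conn: sqlite3.Connection) -> int:
    """Ingest CSV rows, transform them, and load into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders ("
        "order_id TEXT PRIMARY KEY, customer TEXT, amount REAL)"
    )
    rows = []
    for rec in csv.DictReader(io.StringIO(csv_text)):
        # Transform step: trim whitespace, normalise names, cast amounts.
        rows.append((rec["order_id"].strip(),
                     rec["customer"].strip().title(),
                     float(rec["amount"])))
    # Load step: an idempotent upsert, so re-running the pipeline
    # after a failure does not create duplicate rows.
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

source = "order_id,customer,amount\nA1, alice ,10.50\nA2, bob ,3.25\n"
conn = sqlite3.connect(":memory:")
loaded = run_pipeline(source, conn)
```

The idempotent load is one common way to make such a pipeline resilient: a failed batch can simply be replayed end-to-end.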

Database & Data Platform Management:

  • Work hands-on with RDBMS platforms such as Oracle, DB2, SQL Server, or PostgreSQL for data extraction, transformation, and performance tuning.
  • Develop and optimize SQL queries, views, and stored procedures to support reporting and analytics use cases.
  • Support data modelling activities (logical and physical) for analytics and reporting layers.
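A hedged illustration of the view-building work described above: a reporting view that pre-aggregates a hypothetical `sales` table for downstream BI use (SQLite is used here purely to make the SQL runnable; the same view definition applies on the RDBMS platforms listed):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, amount REAL);
INSERT INTO sales VALUES ('APAC', 100.0), ('APAC', 50.0), ('EMEA', 75.0);

-- Reporting view: per-region totals, so dashboards query the
-- aggregate directly instead of scanning the raw fact table.
CREATE VIEW region_totals AS
SELECT region, SUM(amount) AS total, COUNT(*) AS n_orders
FROM sales
GROUP BY region;
""")

totals = dict(conn.execute("SELECT region, total FROM region_totals"))
```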

Data Quality, Governance & Operations:

  • Implement data validation, reconciliation, and monitoring to ensure data accuracy, completeness, and consistency.
  • Support operational data activities, including incident investigation, root cause analysis, and remediation.
  • Maintain clear documentation for data pipelines, schemas, and operational processes to support audits and knowledge transfer.
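The validation and reconciliation checks above can be sketched as a small rule set over a batch of records (the field names `id` and `amount`, and the specific rules, are illustrative assumptions, not a prescribed framework):

```python
def validate(rows: list[dict]) -> list[str]:
    """Run completeness and consistency checks; return a list of issues."""
    issues = []
    seen_ids = set()
    for i, row in enumerate(rows):
        if row.get("id") is None:
            issues.append(f"row {i}: missing id")            # completeness
        elif row["id"] in seen_ids:
            issues.append(f"row {i}: duplicate id {row['id']}")  # consistency
        else:
            seen_ids.add(row["id"])
        if row.get("amount") is not None and row["amount"] < 0:
            issues.append(f"row {i}: negative amount")       # accuracy / range
    return issues

batch = [
    {"id": 1, "amount": 5.0},
    {"id": 1, "amount": -2.0},   # duplicate id AND negative amount
    {"id": None, "amount": 3.0}, # missing id
]
issues = validate(batch)
```

In production such checks typically run after each pipeline load, with the issue list feeding monitoring and the incident workflow mentioned above.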

Collaboration & Stakeholder Engagement:

  • Collaborate with business users, product owners, and downstream teams to gather requirements and translate them into technical solutions.
  • Work closely with Data Analysts, BI developers, and Data Scientists to enable dashboards, reports, and advanced analytics.
  • Participate in Agile ceremonies and contribute to sprint planning, estimation, and delivery.

Requirements:

  • Bachelor's degree in Computer Science, Engineering, Information Systems, or equivalent practical experience.
  • Strong hands-on experience with SQL and relational databases (Oracle, DB2, SQL Server, PostgreSQL).
  • Experience building and supporting ETL / data pipelines in enterprise environments.
  • Solid understanding of data modelling, data quality, and data lifecycle management.
  • Ability to troubleshoot data issues and work in production / operational environments.
  • Experience with Python for data processing or automation.
  • Experience with data streaming technologies (e.g. Kafka, Spark, NiFi).
  • Experience with BI and visualization tools such as Tableau, Qlik, or Power BI.
  • Knowledge of cloud or hybrid data platforms and orchestration tools.
  • Knowledge of Agile / DevOps practices and CI/CD for data pipelines.
  • Minimum 2 years of working experience.
  • Three must-have skills: SQL, Python, Tableau.

More Info

Job ID: 145826271