
Data Engineer

5-12 Years

Job Description

The Data Engineer is responsible for designing, building, and maintaining robust data infrastructure and pipelines that enable Business Intelligence and Analytics capabilities for the Pacific Operations organization. The Engineer architects scalable data solutions by combining technical expertise, data engineering best practices, and a deep understanding of data systems and architecture. She/he must be able to deliver reliable, performant data solutions on time and collaborate with data analysts, business users, and technical stakeholders.

The Engineer will design and implement ETL/ELT pipelines, data models, and automation frameworks. She/he will provide technical expertise in data integration, optimize data workflows, and ensure data quality and reliability across systems.

The Data Engineer should possess and maintain deep knowledge of data architecture patterns, modern data stack technologies, and DataOps practices, and should be able to lead and participate in cross-functional teams to address technical requirements and system design challenges. She/he must be skilled at documenting technical specifications, designing scalable solutions, and implementing complex data engineering processes.

Key Qualifications

  • 5-7 years' experience in data engineering or a related technical role
  • Extensive experience in ETL/ELT pipeline development, data modeling, data warehousing, and workflow orchestration
  • Strong proficiency in Python, SQL, Airflow, and dbt
  • Experience with modern cloud data platforms (Snowflake, AWS/GCP data services)
  • Advanced SQL skills including query optimization, performance tuning, and complex data transformations
  • Experience designing and implementing data orchestration frameworks using Airflow, Dataiku, or similar tools (see the orchestration sketch after this list)
  • Strong understanding of data modeling techniques (dimensional modeling, data vault, etc.)
  • Experience with streaming data technologies (Kafka, Kinesis) is a plus
  • Familiarity with Supply Chain/Operations data domains preferred
  • Proficiency in version control (Git), CI/CD practices, and infrastructure-as-code
  • Experience with data quality frameworks and testing methodologies
  • Strong understanding of data governance, security, and compliance requirements
  • Excellent attention to detail and ability to ensure data accuracy and reliability at scale
  • Excellent communication skills with ability to translate technical concepts to non-technical stakeholders
  • Ability to operate in a fast-paced, rapidly changing environment
  • Business acumen and ability to understand complex operational processes and translate them into data requirements
  • Excellent problem-solving skills: ability to analyze and resolve complex technical problems systematically
  • Experience with containerization (Docker) and orchestration tools
  • Strong understanding of database internals, indexing strategies, and partitioning schemes
  • Self-motivated individual able to function effectively when working independently or in a team
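
As a reference point for the stack named above, here is a minimal sketch of a daily ELT DAG wiring a raw-load step into dbt with Airflow. It assumes Airflow 2.4+ and dbt installed on the worker; the DAG id, script path, and project directory are hypothetical placeholders, not this team's actual pipeline.

# Minimal daily ELT orchestration sketch: load raw data, then run and
# test dbt models. All names and paths below are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_ops_reporting",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",              # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
    default_args=default_args,
) as dag:
    load_raw = BashOperator(
        task_id="load_raw",
        bash_command="python /opt/pipelines/load_raw.py",  # hypothetical extract/load script
    )
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/project",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/project",
    )

    # Load first, then transform, then verify the transformed models.
    load_raw >> dbt_run >> dbt_test

Retries with a delay and a post-run dbt test step reflect the pipeline-reliability expectations listed above; the same structure extends to the weekly, monthly, or quarterly cadences mentioned below by changing the schedule argument.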

Description

  • Design, build, and maintain scalable data pipelines supporting critical business processes (Executive Reporting, Business Review, Special Events, Capital Projects)
  • Develop and optimize ETL/ELT workflows for daily/weekly/monthly/quarterly data processing and integrations
  • Build reusable data transformation frameworks and libraries to support analytics and reporting needs
  • Implement data quality checks, monitoring, and alerting systems to ensure pipeline reliability (a sketch follows this list)
  • Design and implement dimensional data models and data warehouse schemas in Snowflake
  • Optimize query performance and data pipeline efficiency to support SLA requirements
  • Develop data integration solutions connecting disparate source systems
  • Implement DataOps best practices including version control, testing, and CI/CD for data pipelines
  • Troubleshoot and resolve data pipeline failures and performance issues
  • Proactively identify opportunities to improve data infrastructure, architecture, and engineering practices
  • Document technical specifications, data lineage, and architecture decisions
  • Develop and maintain data dictionaries, schema documentation, and runbooks
  • Participate in data platform initiatives and propose technical enhancements (Data Architecture, Infrastructure, Automation, Observability)
  • Collaborate with data analysts and business stakeholders to understand requirements and design appropriate data solutions
  • Implement data security and access control policies
  • Support capacity planning and scalability assessments for data systems
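
To make the data quality bullet above concrete, here is a minimal sketch of a row-count freshness check against Snowflake using snowflake-connector-python. The warehouse name, table name, and load_date column are hypothetical illustrations; in practice a check like this would run as a pipeline task and feed the alerting layer.

# Minimal row-count freshness check against Snowflake. The warehouse,
# table, and load_date column below are hypothetical.
import os

import snowflake.connector


def check_daily_load(table: str, min_rows: int) -> None:
    """Raise if today's load of `table` has fewer than min_rows rows."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="REPORTING_WH",  # hypothetical warehouse name
    )
    try:
        cur = conn.cursor()
        # Assumes the table carries a load_date column stamped at ingest time.
        cur.execute(
            f"SELECT COUNT(*) FROM {table} WHERE load_date = CURRENT_DATE()"
        )
        (row_count,) = cur.fetchone()
        if row_count < min_rows:
            raise ValueError(
                f"{table}: expected at least {min_rows} rows today, found {row_count}"
            )
    finally:
        conn.close()


if __name__ == "__main__":
    check_daily_load("ANALYTICS.FACT_SHIPMENTS", min_rows=1_000)  # hypothetical table

Failing the task on a threshold breach, rather than silently logging, is what lets the orchestration layer's alerting surface the problem.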

Education

Degree in Computer Science, Information Technology, Data Engineering, or related technical field preferred

Bachelor's or Master's/Postgraduate degree

More Info

Open to candidates from: Singaporean

Job ID: 138863965
