Cygnify

Senior Principal Data Engineering Lead

8-12 Years

Job Description

Role: Senior Principal Data Engineering Lead

Location: Singapore

Lead and scale the Data Engineering, DataOps, and Data Stewardship functions within the Data organization. This role ensures end-to-end delivery excellence for the cloud-native data platform, spanning data ingestion, transformation, modeling, and operations, to enable reliable, high-quality, self-service analytics across business domains.

Responsibilities

  • Team Leadership: Recruit, mentor, and lead a hybrid team of data engineers and stewards across Singapore, Malaysia and India, establishing in-house technical leadership and delivery ownership.
  • Data Engineering Delivery: Oversee design, development, and optimization of ELT/ETL pipelines and data models, ensuring scalable, reusable, and cost-efficient workflows.
  • Data Quality & Stewardship: Institutionalize stewardship processes by defining ownership models, implementing data quality (DQ) monitoring, and driving remediation workflows with cross-functional data users.
  • Operational Excellence: Manage daily pipeline operations, SLA compliance, and production issue resolution with strong root-cause analysis and continuous improvement.
  • Technical Governance: Set engineering standards for observability, RBAC, cost tagging, and CI/CD practices.
  • Collaboration & Enablement: Enable self-service analytics by curating trusted datasets and modeled views, working with BI and business teams.
Requirements

  • 8-12 years of experience in cloud-native data engineering, with strong architecture and delivery experience on AWS.
  • Proven leadership of cross-functional and hybrid engineering teams, including vendor-augmented resources.
  • Experience partnering with BI and business teams to design modeled datasets and enable self-service analytics.
  • Deep hands-on technical expertise, including:
      ◦ Snowflake: schema design, Streams/Tasks, stored procedures, UDFs, RBAC, performance tuning, Cortex AI, Streamlit, and cost monitoring.
      ◦ Airflow or similar orchestration tools: scheduling, dependency management, and observability.
      ◦ Python and SQL: pipeline scripting, transformation logic, and data validation.
      ◦ ELT/ETL frameworks: Airbyte, Fivetran, and custom connector development.
      ◦ AWS services: S3 (data lake structures and archival), Lambda, KMS, Transfer Family, CloudWatch, and SageMaker.
  • Demonstrated success delivering medallion architecture (Bronze/Silver/Gold) and enabling self-service data use cases.
  • Experience building data quality frameworks, stewardship policies, and data lineage tracking across enterprise datasets.
  • Familiarity with machine learning integration using platforms like AWS SageMaker.
  • Proven ability to troubleshoot complex data issues, lead root-cause analysis, and ensure production stability.
  • Track record of transitioning delivery ownership from vendors to internal teams while maintaining quality and velocity.

Job ID: 136921713