
RAPSYS TECHNOLOGIES PTE. LTD.

Data Engineer (Python + Airflow + Snowflake)

5-7 Years
SGD 8,000 - 10,000 per month

Job Description

Role: Data Engineer (Python + Airflow + Snowflake)


Responsibilities

  • Design, develop, and maintain complex data pipelines in Python for efficient data processing and orchestration.
  • Collaborate with cross-functional teams to understand data requirements and architect robust solutions within the AWS environment.
  • Implement data integration and transformation processes to ensure optimal performance and reliability of data pipelines.
  • Optimize and fine-tune existing Airflow data pipelines to improve efficiency, scalability, and maintainability.
  • Troubleshoot and resolve data pipeline issues, ensuring smooth operation and minimal downtime.
  • Work closely with AWS services such as S3, Glue, EMR, Redshift, and related technologies to design and optimize data infrastructure.
  • Develop and maintain documentation for data pipelines, processes, and system architecture.
  • Stay up to date with industry trends and best practices in data engineering and AWS services.

Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • Proficiency in Python, PySpark, and SQL for data processing and manipulation.
  • Minimum of 5 years of experience in data engineering, specifically with Apache Airflow and AWS technologies.
  • Strong knowledge of AWS services, particularly S3, Glue, EMR, Redshift, and AWS Lambda.
  • Understanding of Snowflake as a data lake is preferred.
  • Experience optimizing and scaling data pipelines for performance and efficiency.
  • Good understanding of data modeling, ETL processes, and data warehousing concepts.
  • Excellent problem-solving skills and the ability to work in a fast-paced, collaborative environment.
  • Effective communication skills and the ability to articulate technical concepts to non-technical stakeholders.

Preferred Qualifications

  • AWS certification(s) related to data engineering or big data.
  • Experience with big data technologies such as Snowflake, Spark, Hadoop, or related frameworks.
  • Familiarity with other data orchestration tools in addition to Apache Airflow.
  • Knowledge of version control systems such as Git and Bitbucket.

More Info

Industry: Other

Function: Data Engineering

Job Type: Permanent Job

Date Posted: 04/09/2025

Job ID: 125483677

Last Updated: 28-09-2025 07:56:30 PM