Mandatory Skills:
Very Strong Proficiency:
Python: Extensive experience in Python for data manipulation, scripting, and building data applications.
PySpark: Deep expertise in developing and optimizing large-scale data transformations using PySpark.
SQL: Advanced SQL skills, including complex query writing, performance tuning, and database design.
AWS: Hands-on experience designing, deploying, and managing data solutions using AWS services such as S3, EMR, Glue, and Lambda.
Solid understanding of data warehousing concepts, ETL/ELT principles, and data pipeline best practices.
Excellent problem-solving, analytical, and communication skills.
Ability to work independently and as part of a collaborative team.
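To illustrate the kind of Python and SQL work the role calls for, here is a minimal sketch using Python's built-in sqlite3 module, with a hypothetical `orders` table and column names chosen purely for the example:

```python
import sqlite3

# In-memory database with a hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("east", 100.0), ("east", 50.0), ("west", 75.0)],
)

# Aggregate revenue per region, highest first -- a small example
# of the query writing and data manipulation described above.
rows = conn.execute(
    "SELECT region, SUM(amount) AS revenue "
    "FROM orders GROUP BY region ORDER BY revenue DESC"
).fetchall()
print(rows)  # -> [('east', 150.0), ('west', 75.0)]
```

In practice the same pattern scales up: the production work described here would run comparable aggregations over large datasets with PySpark on AWS rather than an in-memory SQLite database.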
Desired Skills:
Airflow: Experience with Airflow for orchestrating and managing data workflows.
Snowflake: Familiarity with Snowflake for cloud data warehousing and analytical processing.
Bitbucket (or another Git host): Proficient in using version control systems for collaborative development.
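Airflow's core idea is expressing a pipeline as tasks with explicit dependencies (a DAG). As a toy illustration of that orchestration concept, using only the standard library rather than the Airflow API, and with hypothetical step names:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline steps mapped to their upstream
# dependencies -- the dependency graph an Airflow DAG makes explicit.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# Resolve a valid execution order; Airflow's scheduler does the
# equivalent (plus retries, scheduling, and monitoring).
order = list(TopologicalSorter(deps).static_order())
print(order)  # -> ['extract', 'transform', 'load', 'report']
```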
Domain:
Data Engineering
Mode of Interview: Telephonic, Teams, or Face-to-Face
Date Posted: 25/07/2025
Job ID: 122462543