Possess a degree in Computer Science/Information Technology or related fields.
Minimum of 8 years of relevant working experience in software or data engineering, with proficiency in Python.
Strong experience in unit and integration testing.
Familiarity with DevOps practices and Agile methodologies.
Strong software engineering, analytical and problem-solving skills.
Good team player with strong communication skills.
CAT-1 clearance or eligibility is required for this role.
Experience with AWS and Kubernetes (K8s).
Familiarity with data platforms such as Snowflake, Apache Spark, or Apache Hive, as well as orchestration tools like Apache Airflow, Dagster, or Prefect.
Familiarity with GitHub workflows and Datadog.
Develop and maintain data pipelines and ETL/ELT processes, ensuring maintainability through unit and integration testing.
Collaborate with data teams to understand requirements and automate deployment and monitoring.
Optimize data storage and troubleshoot issues to enhance performance.