Minimum 8 years of relevant experience in software or data engineering, with strong proficiency in Python.
Proven experience in unit and integration testing.
Familiarity with DevOps practices and Agile methodologies.
Strong software engineering fundamentals, along with strong analytical and problem-solving skills.
Excellent teamwork and communication abilities.
Good-to-Have Skills
Experience with AWS and Kubernetes (K8s).
Familiarity with data platforms such as Snowflake, Apache Spark, or Apache Hive, and workflow orchestration tools like Apache Airflow, Dagster, or Prefect.
Working knowledge of GitHub workflows and Datadog.
Expertise in dbt (data build tool) is an added advantage.