Role and Responsibilities
- Solve the hardest problems with data and AI to deliver social and economic value
- Develop and maintain relationships with a broad range of clients, colleagues, and partners across a variety of contexts and formats
- Build, lead and mentor a world-class team of data engineers
- Maintain a culture of excellence and lead with confidence, charisma, context, and humility, working effectively at all levels of the organisation
- Create and deliver technical blogs & thought leadership on data engineering
- Invest continuously in building and extending your knowledge and skills
Knowledge and Skills Requirements
- Bachelor's or master's degree in science, technology, engineering, or mathematics
- Exceptional expertise in designing, building, and maintaining enterprise-scale data platforms and pipelines
- Practical hands-on experience with at least three of the following major data platforms: Snowflake, BigQuery, MongoDB, Cassandra, PostgreSQL, MySQL, Oracle, Neo4j
- Proficiency with modern data processing frameworks: Spark, dbt, Airflow, or similar technologies
- Advanced knowledge of data streaming solutions: Kafka, Spark Streaming, Flink, or equivalent
- Expert-level coding ability in at least one language (Python, Scala, Java) and working knowledge of others
- Strong experience with cloud data services (AWS, Azure, GCP) including migration strategies and optimization
- Practical experience implementing data governance, security, and compliance frameworks
- 10 years of experience delivering and managing data solutions from incubation and proofs-of-concept through to deployment and commercialisation
- Strong ability to develop and maintain relationships with clients, colleagues, and partners
- Ability to develop and deliver client proposals and build consensus, supported by detailed analysis, deep expertise, and effective communication
- Demonstrated ability to guide, develop and mentor data engineers
- Demonstrated ability to create technical blogs & thought leadership on data engineering and management
- Active engagement in the data community: taking and giving courses, following podcasts, reading books, attending meetups, and developing projects