Acting as the primary owner of translating business needs and complex SQL queries into scalable, reusable data models using DBT
Understanding complex business requirements and translating them into effective, insightful dashboards that support decision-making
Building secure and scalable ELT data pipelines in Airflow and GCP
Creating and maintaining comprehensive documentation for both core and business-specific data assets to ensure clarity and consistency across the organisation
Ensuring data models are accurate, complete, and reliable by managing the integrity and quality of all data assets
Continuously improving the performance of DBT models by applying best practices in query design and data transformations
Collaborating with the scrum master, product managers, business stakeholders and peers to develop iteratively in sprints, following agile processes
Requirements
At least 2 years of experience in the data space as an analytics engineer or data analyst
Advanced proficiency in SQL and DBT; BigQuery knowledge is a plus
Basic knowledge of Python programming
Demonstrated experience in building ETL (or ELT) pipelines; Airflow knowledge is a plus
Demonstrated ability to understand and translate business requirements into dashboards; Domo or Power BI knowledge is a plus
Ability to connect the dots between technical implementation work and business context