What you will do
Data Architecture & Data Modelling
- Design and implement scalable data storage solutions using Snowflake architecture
- Support structured and unstructured data across staging, transformation, and serving layers
- Implement data modelling patterns including slowly changing dimensions and surrogate key strategies
- Define and implement data quality checks, validation rules, and data masking using Snowflake data metric functions (DMFs) and masking policies
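To illustrate the modelling and masking responsibilities above, the following is a minimal Snowflake SQL sketch of a Type 2 slowly changing dimension with a surrogate key, plus a column masking policy. All table, column, and role names (dim_customer, stg_customer, ANALYST_PII, etc.) are hypothetical placeholders, not artifacts of any specific system.

```sql
-- Hypothetical dimension with a surrogate key and SCD Type 2 tracking columns
CREATE TABLE IF NOT EXISTS dim_customer (
    customer_sk  NUMBER IDENTITY,   -- surrogate key
    customer_id  VARCHAR,           -- natural (business) key
    email        VARCHAR,
    valid_from   TIMESTAMP_NTZ,
    valid_to     TIMESTAMP_NTZ,
    is_current   BOOLEAN DEFAULT TRUE
);

-- Step 1: close out current rows whose tracked attributes changed
MERGE INTO dim_customer d
USING stg_customer s
    ON d.customer_id = s.customer_id AND d.is_current
WHEN MATCHED AND d.email <> s.email THEN UPDATE
    SET d.valid_to = CURRENT_TIMESTAMP(), d.is_current = FALSE;

-- Step 2: insert new versions for changed rows and brand-new keys
-- (run both steps inside one transaction in practice)
INSERT INTO dim_customer (customer_id, email, valid_from, valid_to, is_current)
SELECT s.customer_id, s.email, CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
    ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL;

-- Column-level masking: only a privileged role sees the raw value
CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val VARCHAR) RETURNS VARCHAR ->
    CASE WHEN CURRENT_ROLE() IN ('ANALYST_PII') THEN val ELSE '***MASKED***' END;

ALTER TABLE dim_customer MODIFY COLUMN email SET MASKING POLICY email_mask;
```

A data quality check of the kind mentioned above could then attach a built-in data metric function to the same table, e.g. scheduling SNOWFLAKE.CORE.NULL_COUNT against the email column.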
ETL / ELT & Data Pipeline Engineering
- Build and maintain end-to-end ETL pipelines within the Snowflake environment
- Set up CI/CD pipelines and deployment automation between AWS CodeCommit and Snowflake, including environment promotion
- Ensure reliable, maintainable, and well-documented data pipelines
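Environment promotion between Git-driven CI and Snowflake is usually handled by a migration tool, but the Snowflake-side primitives can be sketched as below. The database, stage, and script names are illustrative assumptions.

```sql
-- Zero-copy clone: spin up an isolated environment to validate a release
CREATE OR REPLACE DATABASE analytics_test CLONE analytics_prod;

-- Run a versioned migration script that the CI pipeline has pushed
-- to an internal stage (e.g. from the repository's migrations folder)
EXECUTE IMMEDIATE FROM @deploy_stage/migrations/V001__add_orders.sql;
```

Cloning is metadata-only, so the test environment is cheap to create and drop on every deployment.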
Cloud Integration & Security
- Configure and optimize secure integrations between Snowflake and AWS services, including S3
- Ensure data access, storage, and movement comply with security and governance requirements
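A common pattern for the secure S3 integration described above is a storage integration that delegates authentication to an IAM role, so no credentials are stored in Snowflake. The bucket, role ARN, and object names below are placeholders.

```sql
-- Storage integration: Snowflake assumes an IAM role instead of holding keys
CREATE STORAGE INTEGRATION IF NOT EXISTS s3_int
    TYPE = EXTERNAL_STAGE
    STORAGE_PROVIDER = 'S3'
    ENABLED = TRUE
    STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-access'
    STORAGE_ALLOWED_LOCATIONS = ('s3://example-bucket/raw/');

-- External stage bound to the integration
CREATE STAGE IF NOT EXISTS raw_stage
    URL = 's3://example-bucket/raw/'
    STORAGE_INTEGRATION = s3_int;

-- Load staged files into a staging table
COPY INTO stg_events
FROM @raw_stage/events/
FILE_FORMAT = (TYPE = 'JSON');
```

Restricting STORAGE_ALLOWED_LOCATIONS to specific prefixes keeps data movement within the governed paths.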
SQL Performance & Optimization
- Write, optimize, and troubleshoot complex SQL queries
- Tune performance to meet scalability, reliability, and business performance requirements
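Typical Snowflake tuning levers for the work above include reading the query plan, improving micro-partition pruning with a clustering key, and sizing compute independently of storage. Table and warehouse names are hypothetical.

```sql
-- Inspect the plan to find full scans and expensive joins
EXPLAIN SELECT * FROM fact_sales WHERE sale_date >= '2024-01-01';

-- Clustering key to improve pruning on a large, frequently filtered table
ALTER TABLE fact_sales CLUSTER BY (sale_date);

-- Check how well the table is clustered on that key
SELECT SYSTEM$CLUSTERING_INFORMATION('fact_sales', '(sale_date)');

-- Scale compute for the workload without touching storage
ALTER WAREHOUSE transform_wh SET WAREHOUSE_SIZE = 'MEDIUM';
```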
Delivery & Stakeholder Collaboration
- Work closely with functional leads, developers, partner vendors, and management to gather requirements and provide technical guidance
- Take a proactive and assertive role in chasing dependencies, mitigating risks, and driving projects to on-time completion
Qualifications
The ideal candidate should possess:
- Strong proficiency in SQL, including complex query development and optimization
- At least 5 years of proven experience in ETL and data pipeline implementation on cloud data warehouse platforms
- Hands-on experience in data modelling and performance tuning
- Strong communication and collaboration skills across technical and non-technical stakeholders
- A results-oriented, proactive, and assertive mindset with a strong drive for delivery
Preferred qualifications:
- SnowPro Associate certification