Qualifications & Experience
Required:
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
- 8+ years of hands-on experience in data integration, data warehousing, or data platform development.
- 3+ years of experience in a lead role, successfully delivering large-scale, complex data projects, particularly involving data migration and system integration.
- Expertise in Data Mapping: Proven track record of creating detailed source-to-target data mapping documents based on complex business requirements.
- Cloud Platform Proficiency: Deep knowledge and practical experience with at least one major cloud platform: AWS (e.g., S3, Glue, Redshift), Azure (e.g., ADLS, Data Factory, Synapse), or GCP (e.g., Cloud Storage, Dataflow, BigQuery).
- Modern Data Stack: Strong understanding of cloud data warehousing technologies such as Snowflake, BigQuery, or Redshift.
- ETL/ELT Experience: Hands-on experience with data pipeline and orchestration tools such as dbt, Airflow, Talend, or Informatica.
- Vendor Management: Demonstrated experience managing relationships, contracts, and performance of third-party technology vendors.
- Communication Skills: Exceptional ability to communicate complex technical concepts to both technical and non-technical audiences.
- Advanced proficiency in SQL.
Preferred:
- Master's degree in a relevant field.
- Professional cloud certifications (e.g., AWS Certified Data Analytics, Google Professional Data Engineer).
- Programming experience with Python or Scala for data processing.
- Knowledge of data governance frameworks (e.g., DAMA-DMBOK).
- Experience working in an Agile/Scrum environment; Scrum Master experience is a plus.
- Experience in [e.g., Financial Services, Retail, Manufacturing] or other relevant industries.