Rhino Partners

Data Engineer (ETL / Snowflake / AWS)

3-5 Years
  • Posted 3 days ago

Job Description

Overview

Rhino Partners is seeking a Data Engineer to support the design, development, and optimisation of scalable data platforms and pipelines. The role focuses on building reliable data ingestion and transformation workflows that enable data-driven products, analytics, and AI initiatives.

The successful candidate will work closely with cross-functional teams including Data Architects, Business Analysts, Frontend Engineers, and Product stakeholders to integrate disparate data sources and deliver production-grade data solutions on modern cloud platforms.

Key Responsibilities

Data Pipeline Engineering

  • Design, develop, and maintain scalable data pipelines that extract data from multiple sources, transform it according to business requirements, and load it into downstream systems.
  • Implement ETL/ELT workflows to support operational systems, analytics platforms, and data products.
  • Build and maintain large-scale batch and real-time data processing pipelines using modern data processing frameworks.
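The extract-transform-load workflow described above can be illustrated with a minimal Python sketch. The table name, field names, and sample records below are purely illustrative assumptions, not details taken from the role:

```python
import sqlite3

def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    for row in rows:
        yield row

def transform(records):
    """Transform: normalise names and drop records with invalid amounts."""
    for rec in records:
        if rec["amount"] >= 0:
            yield {"name": rec["name"].strip().title(), "amount": rec["amount"]}

def load(records, conn):
    """Load: write transformed records into a downstream table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales (name, amount) VALUES (:name, :amount)", list(records)
    )
    conn.commit()

# Illustrative source data with one messy and one invalid record.
source = [
    {"name": "  alice ", "amount": 120.0},
    {"name": "bob", "amount": -5.0},   # invalid amount, filtered in transform
    {"name": "carol", "amount": 80.0},
]

conn = sqlite3.connect(":memory:")
load(transform(extract(source)), conn)
loaded = conn.execute("SELECT name, amount FROM sales ORDER BY name").fetchall()
print(loaded)  # [('Alice', 120.0), ('Carol', 80.0)]
```

In production the same three stages would typically read from source systems, apply business rules, and write to a warehouse such as Snowflake rather than SQLite.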

Data Integration & Platform Development

  • Integrate and consolidate data from multiple systems and data silos into scalable, governed data platforms.
  • Design and optimise data flows for performance, reliability, and maintainability.
  • Support development of data lakes, data warehouses, and data marts to enable efficient storage and retrieval.

Collaboration & Delivery

  • Work closely with Project Managers, Data Architects, Business Analysts, and Developers to deliver scalable data-driven applications.
  • Participate in Agile development processes, including sprint planning, code reviews, and continuous delivery.
  • Contribute to pair programming, code quality practices, and engineering standards across the team.

Data Governance & Security

  • Ensure pipelines comply with data governance policies, security standards, and access control practices.
  • Implement secure handling of data across ingestion, transformation, and storage layers.

Skills & Experience

Core Technical Skills

  • Bachelor's degree in Computer Science, Software Engineering, or a related field.
  • 3-5 years of experience in data engineering, ETL, or data integration projects.
  • Strong proficiency in SQL and Python for data extraction, transformation, and processing.
  • Experience designing and building production-grade ETL pipelines.

Data Engineering Tools & Platforms

Hands-on experience with:

Data Integration / ETL

  • SQL Server Integration Services (SSIS)
  • Python-based data processing pipelines
  • Snowflake data platform

Cloud Platforms

  • Experience working with Government Commercial Cloud (GCC / GCC+) environments built on commercial cloud platforms such as AWS or Azure.

AWS Data Ecosystem (Preferred)

  • AWS Lambda
  • ECS Container Tasks
  • EventBridge
  • AWS Glue
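In a stack like the one above, EventBridge typically triggers a Lambda function that processes each event. A minimal handler sketch is shown below; it uses the standard EventBridge event shape (`source`, `detail-type`, `detail`), while the event source name and detail fields are illustrative assumptions:

```python
import json

def handler(event, context=None):
    """Minimal AWS Lambda-style handler for an EventBridge event.

    EventBridge delivers JSON events carrying 'source', 'detail-type',
    and 'detail' keys; this sketch routes on detail-type and returns a
    small summary for the matched case.
    """
    detail = event.get("detail", {})
    if event.get("detail-type") == "file-arrived":
        # In a real pipeline this branch would start an ingestion task,
        # e.g. an ECS container task or a Glue job.
        return {"status": "processed", "key": detail.get("key")}
    return {"status": "ignored"}

# Simulated EventBridge event, shaped as Lambda would receive it
# (event source and object key are made up for the example):
event = {
    "source": "demo.ingest",
    "detail-type": "file-arrived",
    "detail": {"key": "raw/2024/01/orders.csv"},
}
print(json.dumps(handler(event)))
```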

Databases & Storage

Experience working with:

  • AWS S3, RDS, SQL-based databases
  • Additional experience with PostgreSQL, Athena, MongoDB, MySQL, Cassandra, SQLite, or VoltDB is advantageous.

DevOps & Infrastructure

  • Experience working with CI/CD pipelines (e.g. GitLab).
  • Familiarity with Infrastructure as Code / automation tools such as Terraform, Ansible, Puppet, or Vagrant.

Data Architecture & Modelling

  • Familiarity with data platform concepts such as Data Lakes, Data Warehouses, Data Marts, and Data Virtualisation.

Additional Knowledge

  • Familiarity with REST APIs and web protocols.
  • Understanding of data governance, access control, and security best practices.
  • Knowledge of system design, data structures, and algorithms.
  • Exposure to AI/ML data pipelines, including concepts such as RAG (Retrieval-Augmented Generation) and Model Context Protocol (MCP), is advantageous.

Working Environment

  • Comfortable working in both Linux and Windows development environments.
  • Interest in bridging data engineering and analytics teams to deliver impactful data products.

Job ID: 144196097