Job Description
GetGo is Singapore's largest and fastest-growing carsharing platform, giving everyone the freedom to drive without the burden of ownership. Our vision is to be APAC's #1 carsharing platform as we seek to create a mobility ecosystem that is shared and sustainable for all.
We are looking for an experienced, technically talented, and highly motivated team player with a passion for all things data.
Our ideal candidate can build systems that deliver relevant insights to GetGo, helping us fulfil our mission of enabling the #FreedomToDrive.
What You Will Be Doing:
As a Data Engineer II at GetGo, you will play a pivotal role in developing and optimizing our modern data infrastructure on Databricks. You will be instrumental in ensuring the reliability, quality, and accessibility of the data that drives critical business decisions and insights.
Your responsibilities will include:
- Designing, building, and maintaining a robust and scalable lakehouse architecture, data acquisition processes, and data pipelines leveraging the Databricks platform.
- Developing and implementing comprehensive data quality frameworks and initiatives to proactively identify and resolve data anomalies, ensuring data integrity and accuracy.
- Creating and deploying sophisticated monitoring and observability tools to track the performance and health of data pipelines and critical data quality metrics.
- Optimizing data flows and data models to enhance consistency, quality, accessibility, and security, while improving database efficiency to meet diverse company needs.
- Collaborating closely with cross-functional teams, including Business Intelligence, Marketing, and Finance, to understand their data requirements and translate them into well-designed data marts that support insightful analytics and contribute to achieving company OKRs.
- Applying a strong understanding of distributed relational and tabular data stores, message queueing systems, stream processing technologies, and other scalable big data platform components to implement effective data solutions.
What Makes You A Great Fit
- 3 years of data engineering experience in a relevant industry.
- Experience with cloud tools and big data technologies such as Spark, and strong proficiency in Python and SQL.
- Experience with Databricks preferred.
- Good to have: knowledge of and skills in JavaScript, QGIS, PySpark, AWS, geospatial analysis, and data visualisation.
- Excellent understanding of, and curiosity about, data science, data engineering, and machine learning frameworks, tools, and algorithms.
- Strong technical capabilities and proven expertise in developing and implementing a full range of analytical techniques to address commercial challenges.
- Strong personal alignment with our GetGo Values:
  - Driven by Purpose
  - Stay Curious & Humble
  - Collaborate with Empathy
  - Make It Better
  - Get It Going