Job Description
- Build the company's big data warehouse system, including batch and streaming data pipeline construction
- Develop an in-depth understanding of business systems and project customers' needs; design and implement big data systems that meet those needs and ensure smooth project acceptance
- Responsible for data integration and ETL architecture design and development
- Research cutting-edge big data technologies and continuously optimize the big data platform
Qualifications
- Bachelor's degree or above in computer science, databases, machine learning, or a related major, with more than 3 years of data development experience
- Keen understanding of business data, with the ability to analyze it quickly; this ensures the solutions built are aligned with real-world use cases and deliver actionable insights
- Proficient in SQL and familiar with MySQL; very familiar with Shell, Java (or Scala), and Python, which ensures flexibility and efficiency in managing data pipelines and integrating systems
- Familiar with common ETL technologies and principles; proficient in data warehouse design specifications and hands-on operations; rich experience in Spark and MapReduce task tuning, which ensures optimized resource usage and faster data processing
- Familiar with Hadoop, Hive, HBase, Spark, Flink, Kafka, and other big data components, which ensures the candidate can contribute immediately to existing platforms and support high-throughput, distributed data operations
- Proficiency with mainstream analytical data systems such as ClickHouse, Doris, and TiDB, along with tuning experience, is preferred; this ensures performance at scale and reduces system costs
Han Tze Hui (Yunne)
Adecco Personnel Pte Ltd | EA Licence No. 91C2918 | Personnel Registration No: R24120833