Reporting to the Head of Platforms and Data Engineering, the Platforms and Data Engineer will work closely with Data Scientists, Threat/Malware Researchers, Project Managers, and Infrastructure Engineers to develop and manage high-performance analytics solutions. The incumbent will be accountable for the design, development, deployment, and maintenance of AI analytics platforms and their data processing workflows.
Key Responsibilities:
- Become familiar with Ensign's business domain and objectives to develop and deploy data solutions that meet internal business requirements and the needs of partners and customers
- Design, develop, test, and deploy operational data transformation processes.
- Design, develop, and manage data lake and data warehouse architectures as well as relational databases.
- Provide monitoring, maintenance, and support for data operations as required in client projects.
- Embrace the challenge of dealing with terabytes of data on a daily basis.
- Manage development, staging, and production environments to ensure overall system functionality, health, scalability, resiliency, and security.
- Implement and maintain complex AI analytics projects, focusing on large data sets to turn information into insights across multiple platforms.
- Deliver detailed documentation and ensure quality throughout the project lifecycle.
Qualifications/Requirements:
- Bachelor's degree in Computer Science/Computer Engineering or equivalent
- Programming experience (Spark, Python, Go, Bash) for data engineering purposes
- Understanding of modern software engineering tools such as Git, GitLab
- Familiarity with Docker, Kubernetes and cloud services (AWS)
- Familiarity with network concepts.
- Comfort and experience working in Linux environments
- Aptitude for automation and software profiling
- Proven ability to handle multiple customer projects concurrently
- Detail-oriented, solution-focused problem solver
Preferred Skills (Good to have):
- Minimum 3 years of experience developing data engineering pipelines
- Knowledge of different data platforms (e.g. Kafka, MongoDB, Postgres, Elasticsearch), associated tools and cloud-based technologies (e.g. Lambda, Glue).
- Strong knowledge of DevSecOps practices to design, develop, test, and deploy applications for customer projects.
- Experience in the cyber security or telco industry is an advantage
- Knowledge of Agile and CI/CD is desirable
- Familiarity with virtualization platforms (Proxmox, VMware, KVM)
- Familiarity with Hadoop ecosystem and MPP databases
- Ability to demonstrate programming skills and knowledge without over-reliance on AI tools.
*Singaporeans only, as the candidate may be subject to security clearance depending on projects