Job Description
You will help drive the adoption of state-of-the-art (SOTA) Machine Learning (ML) and AI models, especially Large Language Models (LLMs) and frameworks such as LangChain, to strengthen our nation's cyber defense.
Key Responsibilities
- Develop and maintain data pipelines on cloud platforms to ingest, transform, and integrate data from multiple sources for cybersecurity applications.
- Develop and maintain interactive dashboards (e.g., using Tableau or equivalent visualisation tools) for monitoring and reporting on security analytics.
- Integrate LLM APIs via LangChain and leverage Cloud AI services to enhance AI-driven agent capabilities within security workflows.
- Design, deploy, and manage vector databases (e.g., Pinecone, PGVector, FAISS) to support search and retrieval tasks in AI solutions.
- Build agentic workflows using LangChain and open-source tools (such as Flowise) to automate complex cyber operations.
- Implement human-in-the-loop approval processes to ensure critical decisions are reviewed and authorized appropriately.
- Orchestrate agent-to-agent workflows, enabling autonomous collaboration between AI agents within the security ecosystem.
- Apply prompt engineering best practices to optimize LLM performance for cybersecurity applications.
- Implement and maintain CI/CD pipelines using Git for continuous integration, deployment, and testing of AI solutions.
- Drive end-to-end solutions, from experimentation to production, for SOC automation, detection, and human-in-the-loop processes.
- Collaborate closely with cross-functional teams to deliver secure, scalable, and impactful AI solutions.
Requirements
- Bachelor's Degree (or higher) in Computer Science, Data Science, Machine Learning, or related field.
- 2-3 years of hands-on experience with Python, LangChain, NLP, and Computer Vision.
- Experience in building and deploying AI/LLM-powered applications and agentic workflows.
- Experience with vector database technologies (e.g., Pinecone, PGVector, FAISS).
- Knowledge of CI/CD best practices and tools (preferably Git and related workflows).
- Hands-on experience with cloud platforms (AWS, Azure, or GCP) is required, along with knowledge of containerisation (Docker, Kubernetes).
- Experience using Cloud AI services, preferably AWS SageMaker/Bedrock or Google Vertex AI.
- Familiarity with data warehousing technologies (such as Amazon Redshift or Google BigQuery) for managing large data sets will be an advantage.
- Experience in open-source automation frameworks (such as Flowise) is a plus.
Required Technical Skills (any of the following):
- Python programming
- LangChain
- Natural Language Processing (NLP)
- Computer Vision
- Vector databases (Pinecone, PGVector, FAISS)
- CI/CD tools with Git
- Cloud platforms (AWS, Azure or GCP)
- Containerisation (Docker, Kubernetes)
- Cloud AI services (AWS SageMaker/Bedrock or Google Vertex AI)
- Data warehousing (Amazon Redshift or BigQuery)
- Data pipeline development
- Dashboard development (Tableau or equivalent)
- Open-source automation frameworks (Flowise)
This is a 1-year contract position under People Advantage (Certis Group). We appreciate your application and regret that only shortlisted candidates will be notified.
By submitting your resume, you consent to the handling of your personal data in accordance with the Certis Group Privacy Policy (www.certisgroup.com/privacy-policy).
EA Personnel Name: Siti Khatijah
EA Personnel No: R22111204
EA License No: 11C3955