Our client, a highly regarded enterprise organisation, is searching for an experienced technologist to take ownership of their secure data infrastructure. In this critical position, your primary mandate will be to engineer highly resilient cloud environments that safeguard massive, complex datasets. You will serve as the bridge between advanced data architecture and stringent cyber protection standards, ensuring that all information assets remain locked down against modern digital threats.
Your Impact:
- Spearhead the end-to-end architecture of hardened data transit workflows across major public clouds such as AWS, Google Cloud, and Azure.
- Construct high-volume, highly efficient data integration and extraction pipelines whilst embedding deep security protocols at every technical tier.
- Enforce rigorous data protection standards through advanced cryptographic methods, tokenisation, and strict identity and access management.
- Establish proactive observability and telemetry frameworks to instantly detect and mitigate infrastructure vulnerabilities or data-related security events.
- Act as a leading technical authority during internal compliance reviews and threat assessments, working in tandem with senior leadership to shape security policies.
- Champion a culture of secure data handling by mentoring internal teams and advising external partners on best-in-class security protocols.
What You Bring:
- A solid academic foundation with a Bachelor's degree in Computing, Information Technology, or an equivalent technical discipline.
- A wealth of professional experience, ideally around a decade, focused heavily on building secure data ecosystems (though highly capable mid-level professionals are also warmly encouraged to apply).
- Demonstrated mastery of native cloud security configurations, such as managing VPCs, firewalls, CloudTrail, and GuardDuty.
- Deep familiarity with modern data storage and warehousing ecosystems, including S3, Redshift, and DynamoDB, alongside relational databases such as PostgreSQL and MySQL.
- Strong coding capabilities in Python or Java, paired with hands-on expertise in orchestrating complex workflows via tools like Airflow, NiFi, AWS Glue, or Fluent Bit.
- Practical experience managing distributed big data frameworks (such as Hadoop and Spark) and deploying workloads using containerisation solutions like Kubernetes and Docker.
- A deeply analytical mindset, combined with the communication skills necessary to articulate complex security metrics to non-technical stakeholders.