We are seeking a skilled Confluent Integration Engineer to design, build, and operate scalable real-time data streaming solutions within a Private Cloud AI environment. This role focuses on integrating the Confluent Platform with enterprise systems to support event-driven architectures, AI/ML pipelines, and high-performance data platforms.
Key Responsibilities
Solution Design & Implementation
- Design and implement scalable, real-time event streaming solutions using Apache Kafka, Confluent Platform, and Confluent Cloud.
- Support AI and analytics workloads by enabling reliable, low-latency data pipelines.
System Integration & Development
- Integrate Kafka with databases, APIs, microservices, and enterprise applications.
- Develop producers, consumers, connectors, and stream processing applications.
Automation & Tooling
- Build automation frameworks and validation tools to improve the testing, deployment, and operational efficiency of streaming platforms.
- Enhance reliability and repeatability through CI/CD pipelines and infrastructure-as-code.
Architecture & Deployment
- Deploy, manage, and optimize Kafka clusters in private cloud and hybrid cloud environments.
- Use containerization and orchestration technologies such as Docker and Kubernetes.
- Ensure high availability, scalability, and performance of streaming infrastructure.
Collaboration & Consulting
- Work closely with data engineers, platform architects, DevOps, and AI teams.
- Engage stakeholders to understand business and technical requirements, and provide best-practice guidance on event-driven architecture and Confluent usage.
Governance, Security & Monitoring
- Implement data governance, schema management, access control, and lineage tracking.
- Set up monitoring and observability using Confluent Control Center, Prometheus, OpenTelemetry, and related tools.
- Ensure compliance with enterprise security and operational standards.
Required Skills & Qualifications
Technical Skills
- Strong programming experience in Java, Python, or Scala.
- Hands-on expertise with Apache Kafka and the Confluent ecosystem (Connect, Streams, Schema Registry, ksqlDB).
Cloud & Platform Experience
- Experience with private cloud, hybrid cloud, or major public cloud platforms (AWS, GCP, Azure).
- Proficiency with Docker, Kubernetes, and CI/CD pipelines.
Architecture & Problem Solving
- Solid understanding of event-driven architecture, distributed systems, and streaming data patterns.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex systems.