
. Architect and implement the core routing and query execution engine for the data fabric platform
. Develop robust connection management infrastructure spanning heterogeneous data systems (Kafka, MySQL, Flink, Delta/Iceberg)
. Engineer end-to-end request pipelines optimized for P99 latency targets and high-throughput requirements
. Design and implement fault-tolerant systems including error handling, retry mechanisms, and circuit breaker patterns
. Establish a comprehensive observability framework incorporating structured logging, distributed tracing, and metrics collection
. Drive API design decisions and define service contracts in collaboration with cross-functional stakeholders
. Minimum 3 years of professional experience developing low-latency, high-throughput services using C++, Go, or Rust in production environments
. Demonstrated expertise in concurrency patterns, asynchronous I/O, connection pool management, and backpressure handling mechanisms
. Production-level experience with Kafka, MySQL, Apache Flink, and Delta Lake/Iceberg
. Proven track record optimizing systems for low-latency requirements (P99 < 100ms) and high throughput, including profiling and systematic bottleneck resolution
. Extensive experience designing and implementing services using gRPC, Protocol Buffers, or equivalent RPC frameworks at scale
. Demonstrated experience deploying and operating services managing 100K+ concurrent connections with comprehensive observability infrastructure
. Experience with query parsing, optimization, and abstract syntax tree (AST) manipulation
. Implementation experience with adaptive rate limiting or circuit breaker patterns
. Knowledge of zero-copy techniques, memory-mapped I/O, or other advanced performance optimization strategies
. Background in stream processing frameworks and real-time data pipeline architectures
Job ID: 139968797