We are looking for a skilled Kafka Engineer to design, implement, and maintain scalable Kafka-based streaming solutions. You will leverage your expertise in Kafka, Kafka Streams, and Kafka Connect to build high-performance, real-time data pipelines and integrations, working closely with cross-functional teams to ensure the reliability and scalability of our streaming architecture.
Key Responsibilities
- Design and implement scalable, fault-tolerant Kafka solutions, including brokers, producers, consumers, and Kafka Streams applications.
- Build and optimize Kafka Streams pipelines and write KSQL queries for real-time data processing.
- Develop and maintain Kafka connectors for external systems (e.g., JDBC, SFTP, NoSQL databases) and write custom connectors as needed.
- Monitor Kafka performance using tools like Confluent Control Center, troubleshoot issues, and optimize system performance.
- Work with data formats like JSON, Avro, XML, and manage schemas with Schema Registry.
- Collaborate with development and DevOps teams to integrate Kafka into existing applications and infrastructure.
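To give candidates a concrete sense of the Kafka Connect work above, a minimal standalone configuration for the Confluent JDBC source connector might look like the sketch below. The connector name, database host, table column, credentials, and topic prefix are all placeholders, not actual project values:

```properties
# Illustrative JDBC source connector config (standalone mode).
# All names, hosts, and credentials below are placeholders.
name=orders-jdbc-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.url=jdbc:postgresql://db-host:5432/orders
connection.user=connect_user
connection.password=********
# Poll for new rows using an auto-incrementing ID column.
mode=incrementing
incrementing.column.name=id
# Each captured table is published to a topic named pg-<table>.
topic.prefix=pg-
```

Custom connectors follow the same model: a `SourceConnector` or `SinkConnector` class plus task implementations, configured with properties like these.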
Required Skills & Qualifications
- Hands-on experience with Kafka, including brokers, producers/consumers, and Kafka Streams.
- Strong experience with Kafka Connect and connectors (JDBC, SFTP, NoSQL).
- Proficiency in Kafka Streams and KSQL for stream processing.
- Solid understanding of distributed systems, microservices, and event-driven architectures.
- Proficiency in Java, Python, or similar languages.
- Experience with monitoring tools like Confluent Control Center and Prometheus.
- Familiarity with JSON, Avro, CSV, and Schema Registry.
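For context on the schema-management skills above, records flowing through Schema Registry are typically described by an Avro schema such as the following (the record name, namespace, and fields are illustrative only):

```json
{
  "type": "record",
  "name": "PageView",
  "namespace": "com.example.events",
  "fields": [
    {"name": "user_id", "type": "string"},
    {"name": "url", "type": "string"},
    {"name": "viewed_at", "type": {"type": "long", "logicalType": "timestamp-millis"}}
  ]
}
```

Registering schemas like this lets producers and consumers evolve record formats under Schema Registry's compatibility checks rather than coordinating changes by hand.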