Job Specifications
Developer - Senior Kafka Engineer (AWS)
Required Skills
• 5–8 years of experience designing, developing, and managing real-time data streaming solutions
• Strong expertise in Apache Kafka core components: brokers, topics, producers, consumers, partitions
• Experience with Kafka Streams, Kafka Connect, and Schema Registry (Avro, JSON, Protobuf)
• Skilled in Kafka performance tuning, monitoring, and cluster management
• Familiarity with Confluent Platform or Amazon MSK (Managed Streaming for Apache Kafka)
• Understanding of exactly-once semantics, consumer groups, and offset management
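The exactly-once semantics, consumer groups, and offset management expectations above can be illustrated with a minimal client-configuration sketch. The keys are standard Kafka client settings; the broker address, transactional id, and group id are hypothetical placeholders.

```python
# Producer settings commonly paired with exactly-once delivery.
producer_config = {
    "bootstrap.servers": "broker1:9092",  # hypothetical broker address
    "enable.idempotence": "true",         # broker de-duplicates retried sends
    "acks": "all",                        # wait for all in-sync replicas
    "transactional.id": "orders-tx-1",    # hypothetical transactional id
}

# Consumer settings that complete the exactly-once picture.
consumer_config = {
    "group.id": "orders-consumers",       # consumer group for offset tracking
    "isolation.level": "read_committed",  # skip records from aborted transactions
    "enable.auto.commit": "false",        # commit offsets explicitly after processing
}

print(producer_config["acks"])             # all
print(consumer_config["isolation.level"])  # read_committed
```

The producer side makes writes idempotent and transactional; the consumer side reads only committed records and commits offsets manually, so a record is processed and its offset advanced as one unit.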
Responsibilities
• Design, develop, and maintain Kafka-based streaming architectures for real-time data ingestion and processing
• Implement Kafka topics, partitions, producers, consumers, and connectors for various data sources and sinks
• Optimize Kafka clusters for performance, scalability, and reliability
• Integrate Kafka with AWS services such as EC2, MSK, S3, Lambda, and Kinesis
• Monitor, troubleshoot, and fine-tune Kafka clusters using tools like Confluent Control Center, Grafana, or Prometheus
• Collaborate with application teams to ensure seamless data streaming pipelines and robust message delivery guarantees
• Implement security, data governance, and schema management using Schema Registry and role-based access controls
• Participate in system architecture discussions, capacity planning, and disaster recovery setup
• Automate infrastructure setup and configuration using Terraform, CloudFormation, or similar IaC tools
• Provide technical mentorship to junior engineers and support teams in Kafka-related problem-solving
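The infrastructure-automation responsibility above can be sketched as a minimal Terraform definition for an MSK cluster. This is a sketch only: the arguments follow the AWS provider's `aws_msk_cluster` resource schema, and all names, subnet ids, and sizes are hypothetical placeholders.

```hcl
resource "aws_msk_cluster" "streaming" {
  cluster_name           = "orders-streaming"  # hypothetical cluster name
  kafka_version          = "3.6.0"
  number_of_broker_nodes = 3                   # one broker per availability zone

  broker_node_group_info {
    instance_type   = "kafka.m5.large"
    client_subnets  = ["subnet-aaa", "subnet-bbb", "subnet-ccc"]  # placeholders
    security_groups = ["sg-kafka"]                                # placeholder

    storage_info {
      ebs_storage_info {
        volume_size = 100  # GiB of EBS storage per broker
      }
    }
  }
}
```

Encryption, authentication, and monitoring settings are omitted here for brevity; in practice those blocks carry the security and observability requirements listed elsewhere in this specification.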
Additional Requirements
• Working experience with AWS cloud services: EC2, S3, Lambda, CloudWatch, IAM, MSK, Glue
• Familiarity with infrastructure automation tools: Terraform, CloudFormation, Ansible
• Experience with CI/CD pipelines: Jenkins, GitHub Actions, AWS CodePipeline
• Understanding of networking and security concepts in cloud environments: VPC, IAM, SSL/TLS