Experience: 3 - 8 Years
Job Location: Other - United Arab Emirates (UAE)
Education: Bachelor of Science (Computers), Master of Technology/Engineering (Computers)
Nationality: Any Nationality
Gender: Any
Vacancy: 1 Vacancy
Job Description
Roles & Responsibilities
- Design and implement robust Kafka solutions for high-volume data ingestion and real-time data processing, ensuring scalability and fault tolerance.
- Develop and maintain Kafka Connectors for seamless integration with various data sources and sinks, optimizing data flow efficiency.
- Optimize Kafka cluster performance, including brokers, producers, and consumers, to achieve low latency and high throughput.
- Troubleshoot and resolve Kafka-related issues, including performance bottlenecks, data inconsistencies, and connectivity problems.
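The connector work described above typically starts from a Kafka Connect configuration submitted to the Connect REST API. As a rough illustration only (the connector name, topic, and database connection details are placeholders, not part of this role), a JDBC sink connector config might look like:

```json
{
  "name": "orders-jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "orders",
    "connection.url": "jdbc:postgresql://db:5432/analytics",
    "connection.user": "kafka_connect",
    "connection.password": "********",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "auto.create": "true",
    "tasks.max": "2"
  }
}
```

Settings such as `insert.mode`, `pk.mode`, and `tasks.max` are the kind of knobs involved in "optimizing data flow efficiency": upsert semantics avoid duplicate rows on retries, and `tasks.max` controls sink parallelism.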
Desired Candidate Profile
- 3+ years of hands-on experience with Apache Kafka and Confluent Kafka in a production environment, including Confluent Control Center, ksqlDB, Kafka Streams, and Kafka Connect.
- Proven experience in Kafka development, including the Producer and Consumer APIs, stream processing, and connector development.
- Experience with Kafka cluster management, including setup, configuration, monitoring, and troubleshooting.
- Familiarity with distributed systems, microservices architecture, and event-driven design patterns.
- Experience with cloud platforms (e.g., AWS, Azure) and containerization (Kubernetes) is a plus.
Technical Skills:
- Proficiency in programming languages such as Java, Python, or Scala.
- Strong knowledge of Kafka internals, including brokers, ZooKeeper, topics, partitions, and offsets.
- Experience with monitoring tools (e.g., Prometheus, Grafana) and logging frameworks (e.g., Log4j, ELK Stack).
- Proficiency in using Confluent Control Center for monitoring, managing, and optimizing Kafka clusters.
- Expertise in Kafka Streams for building scalable, fault-tolerant stream processing applications.
- Experience with ksqlDB for real-time processing and analytics on Kafka topics.
- Strong understanding of Kafka Connect for integrating Kafka with external data sources and sinks.
- Understanding of networking, security, and compliance aspects of running Kafka.
- Familiarity with CI/CD pipelines and automation tools (e.g., Jenkins, GitLab CI).
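The ksqlDB and stream-processing skills listed above amount to writing declarative queries over Kafka topics. A minimal sketch, with illustrative topic and column names (not taken from the posting):

```sql
-- Declare a stream over an existing Kafka topic
CREATE STREAM payments (
    payment_id VARCHAR KEY,
    amount DOUBLE,
    currency VARCHAR
) WITH (KAFKA_TOPIC = 'payments', VALUE_FORMAT = 'JSON');

-- Persistent query: per-currency totals over a 1-minute tumbling window,
-- continuously updated as new events arrive
CREATE TABLE payment_totals AS
    SELECT currency,
           SUM(amount) AS total_amount
    FROM payments
    WINDOW TUMBLING (SIZE 1 MINUTE)
    GROUP BY currency
    EMIT CHANGES;
```

The same aggregation could equally be expressed with the Kafka Streams DSL in Java; ksqlDB is simply the SQL front end over that runtime.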
Employment Type
- Full Time
Company Industry
- Recruitment
- Placement Firm
- Executive Search
Department / Functional Area
- Software Development
- Application Development (IT Software)
Keywords
- Streaming Data Engineer
- Data Streaming Architect
- Data Pipeline Engineer
- Kafka
- Data Processing
- Real-time Data Streaming
- Streaming Platform Engineer
- Microservices
Dicetek LLC
Dicetek is a global IT Solutions and Services company established in 2006, with its corporate headquarters in Singapore. We continue to expand our global network while providing value-added, cost-effective consulting services to our clients. Dicetek has operational offices in India, the UAE, Singapore, and the USA. As a world-class company with a regional focus, we concentrate on providing Information Technology solutions and professional consulting services across verticals such as Banking & Financial Services, Telecom, Government, Oil & Gas, Logistics, Supply Chain, Real Estate, and Manufacturing. We have a solid reputation in the technology industry for providing excellent services to our clients. Our values are represented by our integrity, thought leadership, and commitment to maintaining a high level of excellence in the constantly evolving world of Information Technology.
Rizwana Ashfaq - Manager - Talent Acquisition
Office No. 307, 3rd Floor, New Century Tower, Port Saeed Road, Opp. Deira City Centre, Dubai, United Arab Emirates (UAE)