Lead Confluent Kafka Engineer / Architect

Karachi, Sindh
In-Office
Expert/Leader
Artificial Intelligence • Information Technology • Software • Analytics
Job Title: Lead Confluent Kafka Engineer/Architect
Location: Remote (Pak) / Hybrid (Riyadh, KSA)
Employment Type: Full-Time / Contract-to-Hire
Experience Level: 10+ years (Senior/Lead)
Role Summary
As a Lead Confluent Kafka Engineer, you'll architect, design, install, implement, and optimize high-throughput data streaming solutions using Confluent Platform and Apache Kafka. You'll lead a team of engineers in delivering production-grade pipelines, ensuring scalability, reliability, and security. This role involves hands-on development, mentoring, and collaboration with data architects, DevOps, and stakeholders to implement event-driven architectures. You'll champion best practices in real-time data processing, from proof-of-concepts to enterprise deployments, including full lifecycle management from installation to optimization.
Key Responsibilities
  • Architecture & Design: Lead the design of scalable Kafka clusters and Confluent-based ecosystems (e.g., Kafka Streams, ksqlDB, Schema Registry, Connect) for on-prem, hybrid, and multi-cloud (including GCP) environments.
  • Implementation & Development: Build and maintain real-time data pipelines, integrations, and microservices using Kafka producers/consumers (a minimal producer/consumer sketch follows this list); integrate with tools like Flink, Spark, or ML frameworks for advanced analytics.
  • Installation & Setup: Oversee the end-to-end installation and initial configuration of Confluent Platform and Apache Kafka clusters (an AdminClient-based setup sketch also follows this list), including:
    • Deploying Confluent Enterprise/Community editions on Kubernetes (via Helm/Operator), bare-metal servers, or managed cloud services (e.g., Confluent Cloud, GCP).
    • Configuring brokers, ZooKeeper/KRaft mode, topics, partitions, replication factors, and security settings (e.g., SSL/TLS, SASL, ACLs) using Ansible, Terraform, or Confluent CLI.
    • Setting up auxiliary components like Schema Registry, Kafka Connect clusters, and monitoring agents (e.g., JMX exporters) with automated scripts for reproducible environments.
    • Performing initial health checks, load testing (e.g., with Kafka's performance tools), and integration with existing infrastructure (e.g., VPC peering, load balancers).
  • Operations & Maintenance: Oversee monitoring, troubleshooting, performance tuning, and lifecycle management (upgrades, patching) of Kafka/Confluent instances; implement DevSecOps practices for CI/CD pipelines.
  • Team Leadership: Mentor junior engineers, conduct code reviews, and drive technical proofs-of-concept (POCs); gather requirements and define standards for Kafka as a managed service (e.g., access controls, documentation).
  • Optimization & Innovation: Ensure high availability (>99.99%), fault tolerance, and cost-efficiency; explore emerging features like Kafka Tiered Storage or Confluent Cloud integrations for AI workloads.
  • Collaboration & Delivery: Partner with cross-functional teams (data engineers, architects, product owners) to align streaming solutions with business goals; provide thought leadership on event-driven patterns.
  • Security & Compliance: Implement RBAC, encryption, and auditing; conduct root-cause analysis for incidents and ensure GDPR/HIPAA compliance in data flows.
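To make the pipeline work under Implementation & Development concrete, here is a minimal producer/consumer sketch using the standard kafka-clients API. The bootstrap address, topic name, consumer group, and payload are placeholder assumptions chosen for illustration, not details from this role.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class PipelineSketch {
        // Placeholder values; a real deployment would read these from config.
        private static final String BOOTSTRAP = "localhost:9092";
        private static final String TOPIC = "events";

        public static void main(String[] args) {
            // Producer: publish a single event to the topic.
            Properties producerProps = new Properties();
            producerProps.put("bootstrap.servers", BOOTSTRAP);
            producerProps.put("key.serializer", StringSerializer.class.getName());
            producerProps.put("value.serializer", StringSerializer.class.getName());
            producerProps.put("acks", "all"); // wait for full ISR acknowledgement
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
                producer.send(new ProducerRecord<>(TOPIC, "order-123", "{\"status\":\"created\"}"));
            }

            // Consumer: poll the same topic and print what arrives.
            Properties consumerProps = new Properties();
            consumerProps.put("bootstrap.servers", BOOTSTRAP);
            consumerProps.put("group.id", "pipeline-sketch");
            consumerProps.put("auto.offset.reset", "earliest");
            consumerProps.put("key.deserializer", StringDeserializer.class.getName());
            consumerProps.put("value.deserializer", StringDeserializer.class.getName());
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
                consumer.subscribe(Collections.singletonList(TOPIC));
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                records.forEach(r -> System.out.printf("%s -> %s%n", r.key(), r.value()));
            }
        }
    }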
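The installation and setup duties above typically end with programmatic verification that the cluster is reachable and that topics are configured as intended. The sketch below uses Kafka's AdminClient to create a topic with an explicit partition count and replication factor and to list the live brokers; the bootstrap address, topic name, and sizing values are placeholder assumptions, and a hardened deployment would add the SSL/TLS, SASL, and ACL settings mentioned above.

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;
    import org.apache.kafka.common.Node;

    public class ClusterSetupSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder bootstrap address; a secured cluster would also carry
            // security.protocol, sasl.jaas.config, and truststore settings here.
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // Create a topic with explicit partitions and replication factor.
                NewTopic ordersTopic = new NewTopic("orders", 6, (short) 3);
                admin.createTopics(Collections.singletonList(ordersTopic)).all().get();

                // Basic health check: list the brokers currently in the cluster.
                for (Node node : admin.describeCluster().nodes().get()) {
                    System.out.printf("Broker %d at %s:%d%n", node.id(), node.host(), node.port());
                }
            }
        }
    }

In practice, checks like this would be baked into the automated, reproducible environment scripts called out above (Ansible, Terraform, or Confluent CLI) rather than run by hand.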
Required Qualifications & Skills
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field; certifications such as Confluent Certified Developer/Administrator are a plus.
  • 10+ years in software engineering; 5+ years hands-on with Apache Kafka & Confluent Platform (Cloud/Enterprise editions).
  • Proficiency in Java (8/11+), Scala, or Python; Kafka Streams, Connect, and ksqlDB; Schema Registry; REST/gRPC APIs (a Kafka Streams sketch follows this list).
  • Event-driven/microservices design; data pipeline optimization; handling high-volume streams (TB/day scale).
  • Expertise in containerization (Docker/Kubernetes); CI/CD (Jenkins/GitHub Actions); Terraform/Ansible for IaC.
  • Multi-cloud experience (AWS, GCP, Azure); monitoring tools (Prometheus, Grafana, Confluent Control Center).
  • Experience with streaming integrations (e.g., Flink, Spark Streaming for CDC).
  • Contributions to open-source Kafka projects or publications on streaming architectures.
  • Knowledge of AI/ML data pipelines (e.g., Kafka + TensorFlow/PyTorch).
  • Familiarity with observability tools and security (OAuth, Kerberos).
  • Strong problem-solving, communication, and leadership; experience leading POCs and cross-team projects.
  • Agile/Scrum leadership in fast-paced environments.
  • Experience in client-facing roles and leading teams.
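As a concrete reference point for the Kafka Streams skills listed above, the sketch below builds a minimal topology that filters error events from one topic, re-keys them, and writes them to a downstream topic. The application id, bootstrap address, topic names, and JSON field names are assumptions chosen for illustration only.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.Produced;

    public class StreamsTopologySketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streams-topology-sketch"); // placeholder
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");       // placeholder
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> events = builder.stream("raw-events");

            // Keep only error events, re-key them by a field pulled from the value,
            // then write the result to a downstream topic.
            events.filter((key, value) -> value != null && value.contains("\"level\":\"ERROR\""))
                  .selectKey((key, value) -> extractService(value))
                  .to("error-events", Produced.with(Serdes.String(), Serdes.String()));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }

        // Naive JSON field extraction to keep the sketch dependency-free.
        private static String extractService(String value) {
            int i = value.indexOf("\"service\":\"");
            if (i < 0) return "unknown";
            int start = i + "\"service\":\"".length();
            int end = value.indexOf('"', start);
            return end > start ? value.substring(start, end) : "unknown";
        }
    }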

Top Skills

Ansible
Apache Kafka
AWS
Azure
Confluent Platform
Docker
GCP
Grafana
gRPC
Java
Kafka Streams
ksqlDB
Kubernetes
Prometheus
Python
REST
Scala
Terraform

The Company
Dubai
65 Employees
Year Founded: 2016

What We Do

Datamatics Technologies (DMT) was established in Dubai. We specialize in providing onsite and offshore professional services, covering the full spectrum of Data Analytics and Data Science domains.

Our experience working with diverse industry sectors such as Telecoms, Finance, Government, and Manufacturing across multiple regions enables us to engage and deliver for our clients with confidence.

We offer our full portfolio of services through resource augmentation or managed services, under either T&M or fixed-price financial arrangements. Through our end-to-end managed services offering, we enable our clients to cut costs, increase profitability, and focus on adding value to their core business activities.
Our project and delivery management teams are certified in Agile, PMI, and ITIL to ensure that planning and execution follow industry best practices.
We work with clients across the Middle East and Africa region.
