Empowering Africa’s tomorrow, together…one story at a time.
With over 100 years of rich history and strongly positioned as a local bank with regional and international expertise, a career with our family offers the opportunity to be part of this exciting growth journey, to reset our future and shape our destiny as a proudly African group.
Job Summary
We are looking for an engineer to join our team to help build and maintain Kafka-based streaming applications and support the Kafka platform across on-prem and Confluent Cloud environments. The role blends development, platform, and observability responsibilities, providing a unique opportunity to work on distributed systems at scale.
Job Description
Core Responsibilities:
- Develop, maintain, and optimize Kafka-based applications and event streaming pipelines using Java (Spring/Spring Boot), Python, or .NET.
- Work with distributed systems concepts: partitions, replication, fault-tolerance, scaling, and event-driven architectures.
- Contribute to provisioning, managing, and securing Kafka clusters both on-prem and in Confluent Cloud.
- Implement and maintain security mechanisms for authentication and authorization, including ACLs, Kerberos, SSL, and OAuth for Confluent Cloud.
- Automate infrastructure deployment and configuration using Terraform, Ansible, CloudFormation, Docker, or Kubernetes.
- Configure, monitor, and maintain observability for Kafka clusters, including metrics, alerts, and dashboards (e.g., Prometheus, Grafana, Confluent Control Center, Elasticsearch).
- Assist in troubleshooting production issues and perform root cause analysis.
- Collaborate closely with developers, DevOps/SRE teams, and other stakeholders to ensure reliable and performant streaming systems.
- Contribute to best practices for connector configuration, high availability, disaster recovery, and performance tuning across streaming applications and pipelines built with Kafka Streams, ksqlDB, Apache Flink, and TableFlow.
Required Skills:
- Strong programming experience in Java (Spring/Spring Boot), Python, or .NET, with the ability to write clean, maintainable, and performant code.
- Solid understanding of distributed systems principles and event-driven architectures.
- Hands-on experience with Kafka in production, or a strong ability to learn quickly.
- Knowledge of Kafka ecosystem components (Kafka Connect, Schema Registry, ksqlDB, MirrorMaker, Control Center, Kafka Streams, Apache Flink, TableFlow) is a plus.
- Familiarity with security best practices for Kafka, including ACLs, Kerberos, SSL, and OAuth.
- Experience with infrastructure as code and containerized environments.
- Experience with monitoring and observability tools for distributed systems.
Desirable Skills / Bonus Points:
- Experience with Confluent Cloud or other managed Kafka platforms.
- Experience with AWS.
- Experience building streaming pipelines across multiple systems and environments.
- Familiarity with CI/CD pipelines and automated deployments.
Behavioral / Soft Skills:
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and prioritize across multiple BAU and project tasks.
- Product-minded approach, focusing on delivering value and scalable solutions.
Education
Bachelor's Degree: Information Technology
Absa Bank Limited is an equal opportunity, affirmative action employer. In compliance with the Employment Equity Act 55 of 1998, preference will be given to suitable candidates from designated groups whose appointments will contribute towards achievement of equitable demographic representation of our workforce profile and add to the diversity of the Bank.
Absa Bank Limited reserves the right not to make an appointment to the post as advertised.