ABSA

Specialist Platform Engineer

Absa Headquarters (KE), Full time

Empowering Africa’s tomorrow, together…one story at a time.

With over 100 years of rich history and strongly positioned as a local bank with regional and international expertise, a career with our family offers the opportunity to be part of this exciting growth journey, to reset our future and shape our destiny as a proudly African group.

My Career Development Portal: Wherever you are in your career, we are here for you. Design your future. Discover leading-edge guidance, tools and support to unlock your potential. You are Absa. You are possibility.

Job Summary

We are looking for an engineer to join our team to help build and maintain Kafka-based streaming applications and support the Kafka platform across on-prem and Confluent Cloud environments. The role is a hybrid of development, platform responsibilities, and observability, providing a unique opportunity to work on distributed systems at scale.

Job Description

Core Responsibilities:

• Develop, maintain, and optimize Kafka-based applications and event streaming pipelines using Java (Spring / Spring Boot), Python, or .NET.

• Work with distributed systems concepts: partitions, replication, fault-tolerance, scaling, and event-driven architectures.

• Contribute to provisioning, managing, and securing Kafka clusters both on-prem and in Confluent Cloud.

• Implement and maintain security and authorization mechanisms, including ACLs, Kerberos, SSL, and OAuth for Confluent Cloud.

• Automate infrastructure deployment and configuration using Terraform, Ansible, CloudFormation, Docker, or Kubernetes.

• Configure, monitor, and maintain observability for Kafka clusters, including metrics, alerts, and dashboards (e.g., Prometheus, Grafana, Confluent Control Center, Elasticsearch).

• Assist in troubleshooting production issues and perform root cause analysis.

• Collaborate closely with developers, DevOps/SRE teams, and other stakeholders to ensure reliable and performant streaming systems.

• Contribute to best practices for connector configuration, high availability, disaster recovery, and performance tuning, including streaming applications and pipelines built with Kafka Streams, ksqlDB, Apache Flink, and TableFlow.

Required Skills:

• Strong programming experience in Java (Spring / Spring Boot), Python, or .NET, with the ability to write clean, maintainable, and performant code.

• Solid understanding of distributed systems principles and event-driven architectures.

• Hands-on experience with Kafka in production, or a strong ability to learn quickly.

• Knowledge of Kafka ecosystem components (Connect, Schema Registry, ksqlDB, MirrorMaker, Control Center, Kafka Streams, Apache Flink, TableFlow) is a plus.

• Familiarity with security best practices for Kafka, including ACLs, Kerberos, SSL, and OAuth.

• Experience with infrastructure as code and containerized environments.

• Experience with monitoring and observability tools for distributed systems.

Desirable Skills / Bonus Points:

• Experience with Confluent Cloud or other managed Kafka platforms.

• Experience with AWS.

• Experience building streaming pipelines across multiple systems and environments.

• Familiarity with CI/CD pipelines and automated deployments.

Behavioural / Soft Skills:

• Strong problem-solving and analytical skills.

• Excellent communication and interpersonal skills.

• Ability to work independently and prioritize across multiple BAU and project tasks.

• Product-minded approach, focusing on delivering value and scalable solutions.

Education

Bachelor's Degree: Information Technology