We are pleased to announce the vacancy for RAFM Integration Engineer within Information and Technology at Safaricom Telecommunication Ethiopia PLC. Reporting to the Senior Specialist – RAFM, the role holder will work closely with all stakeholders to ensure all data ingestion pipelines are optimally built, deployed, and operationally supported. In keeping with our current business needs, we are looking for a person who meets the criteria indicated below.
1. Responsibilities
Create and maintain optimal data pipeline architectures using Apache Kafka, Apache Spark, Apache NiFi, etc.
Assemble large, complex data sets that meet functional / non-functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and other technologies.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
Keep our data separated and secure across national boundaries through multiple data centers for disaster recovery (DR).
Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
Work with data and analytics experts to strive for greater functionality in our data systems.
Understanding of microservices architectures.
Experience with Java technologies and frameworks, mainly Spring and Hibernate.
Experience with containerization platforms such as Kubernetes, Docker Swarm, or Red Hat OpenShift.
Experience with event-based and message-driven distributed systems such as Apache Kafka, ActiveMQ, RabbitMQ, or TIBCO EMS.
2. Qualifications
BS or MS degree in Computer Science (or related fields like Electronic Engineering, Physics or Mathematics).
3+ years of software design, development, implementation & deployment experience with backend data services.
4+ years of hands-on experience in object-oriented programming and scripting languages such as C++, C#, Java, or Python.
Hands-on experience with private cloud computing (Docker, Kubernetes/K8s).
Knowledge of data structures, algorithms, and algorithm optimization.
Advanced working knowledge of SQL/NoSQL and experience with relational and non-relational databases.
Experience building and optimizing RAFM data pipelines, architectures, and data sets.
3. How to Apply
If you feel that you are up to the challenge and possess the necessary qualifications and experience, kindly proceed to update your candidate profile on the career portal and then click the Apply button. Remember to attach your resume.
The closing date for receiving applications is Tuesday, 21st January 2025, 5:30 pm.