The National Bank of Rwanda (NBR) was established in 1964 to issue the Rwandan currency, the Franc Rwandais (Frw). Over the years, the role of the NBR has evolved. The current Law N°48/2017 of 23/09/2017, as amended to date, confers on the NBR a clear mandate: ensuring price stability and a sound financial system. Price stability is achieved by conducting appropriate monetary policy in the interest of a stable macroeconomic environment, while financial stability is achieved by regulating and supervising the financial system.
Job Description
The job holder is responsible for designing, developing, and maintaining the systems that collect, manage, and convert raw data into usable formats for data analysis. The job holder ensures that data pipelines, storage solutions, and data processing systems are robust, scalable, and optimized for high-quality data analysis and decision-making.
Job Responsibilities
Build and maintain scalable, efficient data pipelines to collect, process, and store large volumes of data from various sources
Implement Extract, Transform, Load (ETL) processes to clean, structure, and transform data for downstream analysis (a minimal ETL sketch follows this list)
Design and manage databases (SQL and NoSQL) to store and retrieve data efficiently and ensure databases are optimized for performance, security, and scalability
Utilize big data tools and technologies (e.g., Hadoop, Apache Spark, Kafka) to handle massive datasets and keep processing reliable at scale (a Spark sketch follows this list)
Work closely with data scientists, data analysts, and business intelligence teams to understand data requirements and deliver the necessary infrastructure
Integrate data from various sources, such as APIs, third-party systems, or cloud platforms, into the organization’s data ecosystem (an API ingestion sketch follows this list)
Implement and enforce best practices for data quality, consistency, and integrity across all data systems
Leverage cloud platforms (e.g., AWS, Azure, Google Cloud) to store and manage data efficiently and cost-effectively
Ensure data is stored and accessed securely by implementing encryption, access control, and other security best practices
Continuously monitor, tune, and optimize data systems and processes to enhance performance and minimize downtime
Document data engineering processes, pipelines, and solutions to ensure that all workflows are understood, reproducible, and maintainable
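
To illustrate the ETL responsibility above, here is a minimal sketch in Python using only the standard library; the source file, column names, and staging table are hypothetical examples, not a description of the Bank's actual stack.

```python
# A minimal ETL sketch using only the Python standard library.
# The source file, column names, and staging table are hypothetical.
import csv
import sqlite3

def extract(path):
    """Extract: stream raw rows from a CSV source."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Transform: drop records missing an amount, normalize the
    currency code, and cast the amount to a numeric type."""
    for row in rows:
        if not row.get("amount"):
            continue
        yield (row["id"], row["currency"].upper(), float(row["amount"]))

def load(records, db_path="warehouse.db"):
    """Load: write cleaned records into a SQLite staging table."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS transactions (id TEXT, currency TEXT, amount REAL)"
    )
    con.executemany("INSERT INTO transactions VALUES (?, ?, ?)", records)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("raw_transactions.csv")))
```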
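
For the big data item, a minimal PySpark batch job might look like the following; it assumes PySpark is installed and a hypothetical transactions CSV exists, and the daily aggregation is purely illustrative.

```python
# A minimal PySpark batch sketch; the input path, columns, and the
# daily aggregation are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pipeline-sketch").getOrCreate()

# Extract: read the raw CSV, inferring the schema for brevity.
raw = spark.read.csv("data/transactions.csv", header=True, inferSchema=True)

# Transform: deduplicate, drop malformed rows, aggregate per day.
daily = (
    raw.dropDuplicates()
       .filter(F.col("amount").isNotNull())
       .groupBy(F.to_date("timestamp").alias("day"))
       .agg(F.sum("amount").alias("total_amount"))
)

# Load: write the result as Parquet for downstream analysis.
daily.write.mode("overwrite").parquet("output/daily_totals")

spark.stop()
```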
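
And for integrating external sources, here is a sketch of pulling paginated records from a REST API into a landing file, assuming the third-party requests library; the endpoint, pagination scheme, and field names are invented for illustration.

```python
# A sketch of API ingestion, assuming the `requests` library.
# The endpoint URL and pagination scheme are hypothetical.
import json
import os
import requests

API_URL = "https://api.example.com/v1/exchange-rates"  # hypothetical

def fetch_all(url, page_size=100):
    """Page through the API until an empty page is returned."""
    page = 1
    while True:
        resp = requests.get(url, params={"page": page, "size": page_size}, timeout=30)
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            break
        yield from batch
        page += 1

if __name__ == "__main__":
    # Land raw records as JSON Lines for downstream ETL.
    os.makedirs("landing", exist_ok=True)
    with open("landing/exchange_rates.jsonl", "w") as f:
        for record in fetch_all(API_URL):
            f.write(json.dumps(record) + "\n")
```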
Job Requirements
Master’s degree in computer science, data engineering, software development, information technology, or a related field.
5 years of progressive experience in software development, DevOps, data engineering, or a related field.
Professional certifications in workflow orchestration, cloud platforms, big data technologies, or database management (e.g., Apache Airflow Certification, AWS Certified Data Analytics, Google Professional Data Engineer, or similar) are an added advantage.