Data Engineer job at Raising The Village (RTV) | Apply Now
Are you looking for Computer Science jobs in Uganda in 2025? Then you might be interested in the Data Engineer job at Raising The Village (RTV).
Mbarara, Uganda
Full Time
About the Organisation
Raising The Village is a purpose-driven international non-profit organization committed to ending ultra-poverty in remote, rural communities by providing holistic, data-driven, and scalable solutions that address core needs such as income generation, healthcare, water, education, and infrastructure. With a strong reputation for impactful programming and sustainable development, Raising The Village is recognized for its innovative "last-mile" service delivery model that accelerates community development through partnerships with local governments and grassroots leadership. Since its inception in 2012, the organization has expanded significantly, working with over 800 villages and reaching more than one million people across Uganda.
The organization fosters a collaborative and values-driven work culture that prioritizes inclusivity, compassion, transparency, and excellence, offering meaningful job opportunities with competitive benefits, professional development, and flexible work arrangements for both local and international talent. By leveraging technology, continuous learning, and rigorous monitoring and evaluation, Raising The Village ensures its interventions create measurable impact and long-term resilience for underserved populations. Its core values—equity, impact, innovation, integrity, and community—are embedded in every project and partnership. As part of its commitment to corporate social responsibility, the organization prioritizes sustainability, capacity building, and community ownership to ensure lasting change. For more information, visit www.raisingthevillage.org.
Job Title
Data Engineer job at Raising The Village (RTV)
Raising The Village (RTV)
Job Description
The Data Engineer will play a crucial role in the VENN department by designing, building, and maintaining scalable data pipelines, ensuring efficient data ingestion, storage, transformation, and retrieval. The role involves working with large-scale structured and unstructured data, optimizing workflows, and supporting analytics and decision-making.
The ideal candidate will have deep expertise in data pipeline orchestration, data modeling, data warehousing, and batch/stream processing. They will work closely with cross-functional teams to ensure data quality, governance, and security while enabling advanced analytics and AI-driven insights to support Raising The Village’s mission to eradicate ultra-poverty.
Duties, Roles and Responsibilities
Data Pipeline Development & Orchestration
Design, develop, and maintain scalable ETL/ELT pipelines for efficient data movement and transformation.
Develop and maintain workflow orchestration for automated data ingestion and transformation.
Implement real-time and batch data processing solutions using appropriate frameworks and technologies.
Monitor, troubleshoot, and optimize pipelines for performance and reliability.
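The pipeline duties above follow the standard extract-transform-load pattern. As a hedged illustration only (the posting names orchestrators like Airflow but prescribes no specific tools), a minimal batch ETL step can be sketched in plain Python with hypothetical data and function names:

```python
# Minimal batch ETL sketch. The record fields and function names are
# hypothetical; a real pipeline would read from source systems and write
# to a warehouse rather than in-memory lists.

def extract():
    """Simulate pulling raw records from a source system."""
    return [
        {"village": "A", "households": "120"},
        {"village": "B", "households": "95"},
        {"village": "C", "households": None},  # malformed record
    ]

def transform(rows):
    """Cast types and drop records that fail basic validation."""
    clean = []
    for row in rows:
        if row["households"] is None:
            continue  # a real pipeline would log or quarantine this row
        clean.append({"village": row["village"],
                      "households": int(row["households"])})
    return clean

def load(rows, store):
    """Append transformed rows to the target store; return the row count."""
    store.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

In an orchestrated deployment, each of these functions would typically become a separate task or node so that failures can be retried and monitored independently.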
Data Architecture & Storage
Build and optimize data architectures, warehouses, and lakes to support analytics and reporting.
Work with both cloud and on-prem environments to leverage appropriate storage and compute resources.
Implement and maintain scalable and flexible data models that support business needs.
Data Quality, Security & Governance
Ensure data integrity, quality, security, and compliance with internal standards and industry best practices.
Support data governance activities, including metadata management and documentation to enhance usability and discoverability.
Collaborate on data access policies and enforcement across the organization.
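The data-quality responsibilities above usually translate into rule-based checks run before data is published. A small sketch, assuming nothing about RTV's actual stack (the rules, column names, and records below are hypothetical):

```python
# Hedged sketch of rule-based data quality checks. Each check returns the
# indices of violating rows so they can be reported or quarantined.

def check_not_null(rows, column):
    """Indices of rows where the given column is missing or null."""
    return [i for i, r in enumerate(rows) if r.get(column) is None]

def check_unique(rows, column):
    """Indices of rows whose value in the column duplicates an earlier row."""
    seen, dups = set(), []
    for i, r in enumerate(rows):
        value = r.get(column)
        if value in seen:
            dups.append(i)
        seen.add(value)
    return dups

rows = [
    {"id": 1, "district": "Mbarara"},
    {"id": 2, "district": None},
    {"id": 2, "district": "Kasese"},
]
violations = {
    "district_not_null": check_not_null(rows, "district"),
    "id_unique": check_unique(rows, "id"),
}
```

Dedicated frameworks exist for this kind of validation, but the underlying idea is the same: declarative rules evaluated against each batch, with violations surfaced rather than silently dropped.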
Cross-functional Collaboration & Solutioning
Work closely with cross-functional teams (analytics, product, programs) to understand data needs and translate them into technical solutions.
Support analytics and AI teams by providing clean, accessible, and well-structured data.
Innovation & Continuous Improvement
Research emerging tools, frameworks, and data technologies that align with RTV’s innovation goals.
Contribute to DevOps workflows, including CI/CD pipeline management for data infrastructure.
Qualifications, Education and Competencies
Education: Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field. (Master’s is a plus!)
Experience: 4+ years of hands-on work in data engineering and building data pipelines.
Programming: Strong in SQL and Python—you can clean, process, and move data like a pro.
Data Tools: Experience using workflow tools like Airflow, Prefect, or Kestra.
Data Transformation: Comfortable working with tools like DBT, Dataform, or similar.
Data Systems: Hands-on with data lakes and data warehouses—you’ve worked with tools like BigQuery, Snowflake, Redshift, or S3.
APIs: Able to build and work with APIs (e.g., REST, GraphQL) to share and access data.
Processing: Know your way around batch processing tools like Apache Spark and real-time tools like Kafka or Flink.
Data Design: Good understanding of data modeling, organization, and indexing to keep things fast and efficient.
Databases: Familiar with both relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., MongoDB) databases.
Cloud: Experience with major cloud platforms like AWS, Google Cloud, or Azure.
DevOps: Know your way around Docker, Terraform, Git, and CI/CD tools for smooth deployments and testing.
Skills & Abilities:
Strong ability to design, implement, and optimize scalable data pipelines.
Experience with data governance, security, and privacy best practices.
Ability to work collaboratively and engage with diverse stakeholders.
Strong problem-solving and troubleshooting skills.
Ability to effectively manage conflicting priorities in a fast-paced environment.
Strong documentation skills for technical reports and process documentation.
How to Apply
All qualified and interested candidates should apply online using the APPLY button below.