They discuss the notion of “digital colonialism”: how foreign tech platforms and Big Tech investment patterns can undermine local innovation, extract value, and limit data sovereignty for African countries.
The episode argues that in order for civic‑tech to succeed on the continent, builders must prioritise context over code — designing tools that serve local needs, protect data rights, and empower communities rather than replicate foreign models.
Finally, the conversation highlights challenges such as the very small share of global AI and tech investment going to Africa, structural biases in funding, and the necessity of sustainable, revenue‑driven civic tech (rather than grant‑dependent models) to ensure long‑term impact.
The episode explores the balance between craft and community, and how immersive worlds can be designed to invite participation rather than passive consumption. Rick argues for platforms that allow creators to retain ownership and build sustainable communities around their work.
The conversation also probes practical questions: funding models for immersive art, technical capacity-building, and ways to make distribution fairer for African creators. Rick shares lessons learned from building projects that span art, education and social practice.
Listen and watch
Listen to the episode on Spotify or view related videos on our YouTube channel.
Scott Smith — host and contributor at The Angle Africa
He reflects on programming choices that centre African modes of storytelling and ancestral knowledge while also adopting contemporary immersive techniques. This balance, Alby suggests, helps avoid the trap of mimicking Western festival models and instead builds something rooted in African creative practice.
Practical challenges are candidly discussed: logistics for touring XR works, finding technical partners, and making sure local communities benefit directly from festival activity. For sustainability, Alby highlights the need for diversified revenue — earned income, commissions, partnerships and grant funding.
The episode closes on a hopeful note: the festival aims to build a continental network and repository of African immersive works, giving creators visibility and a platform to thrive beyond a single show.
Listen and watch
Listen to the recorded conversation on Spotify or watch related videos on our YouTube channel.
Stanley Moloto — host and contributor at The Angle Africa
"Guilt-free screen time is possible when content is designed around learning outcomes and local relevance."
Sedibe also unpacks WonderBooks' business model: a mix of direct school sales, partnerships with NGOs and governments, and a focus on impact metrics that matter to funders. He stresses the need for an iterative workflow — test small, measure, adapt — rather than grand launches that fail to consider classroom realities.
The episode closes with a broader call: rethinking education systems by centring creativity, local storytelling and pragmatic technology that respects teachers' time and pupils' contexts. Sedibe's work shows that combining craft, culture and engineering can reclaim attention and make learning joyful and relevant for African kids.
Ultimately, the episode argues for a hybrid impact model — one that balances product design, local partnerships and sustainable funding to ensure that AR-enhanced lessons move beyond novelty to become dependable classroom tools.
Scott Smith — host and contributor at The Angle Africa
Practically, Penuel proposes pushing for stronger protections for whistleblowers, expanding civic education in schools, and supporting community oversight bodies that publish accessible reports. He stresses that reformers should experiment locally and document what works so reforms are evidence-driven.
Penuel also reflects on the role of movements like the Black Pens, who combine cultural critique with policy demands. Their strength, he argues, is in holding a mirror to power while proposing concrete institutional changes.
Listen and watch
Below is the recorded conversation — you can play it here or watch the full video on YouTube.
"Much like one of author Timothy Morton's hyperobjects, AI seems to most people incredibly complex and hard to comprehend, even though we interact with it every day."
These questions breed uncertainty, spiralling into still more unanswered questions. We are excited by AI's magnificence, yet frightened by its depths. Everyone wants to use it; no one wants to be left behind. Yet even as we willingly step into its open space, the concept itself is unsettling. AI's complexity is hidden inside the feel-good experiences that keep drawing us back to it.
Some scholars describe this kind of technology as pervasive: as it sheds its physicality, it becomes invisible, covertly weaving itself into our daily lives. Think of the cellphone, or the cloud, in contemporary society.
Data Capitalism
In the modern age, Africa has been locked into importing technology from the Global North. Predictably, this has not delivered the development that local adoption was supposed to herald. Instead, the dominant political economy has ensured that Africa continues to lag technologically. In the era of data-driven technologies and economies, AI could well produce worse consequences than previous technological eras did.
AI technology, as it currently stands, is an expression of extraction: of data capitalism, colonialism and knowledge imperialism in Africa. Recent events on the continent involving the AI industry bring these issues to light. The industry's massive and unjustified appetite for data was highlighted in Fourcade and Healy's Moral Views of Market Society. It is visible in the extraction of cheap labour to power AI (workers who annotate, moderate and shape data) in the Global South, as in the case of Meta in Kenya (2023), and in the recent unauthorised, unregulated collection of data for AI in public places, as in the case of Worldcoin in Kenya (2023).
Today's invisible digital surveillance, and the recent rhetoric of digital inclusion used to justify the unlimited amassing of data from unsuspecting populations and governments, reflect how powerful the opaque AI agendas operating in Africa have become.
This is coupled with growing funding targeted at developing AI technologies for Africa that mirror the ideologies of their parent companies. The insidious consequences of these practices are masked by techno-optimistic messaging about AI.
Aided by badly formulated government policy, large amounts of funding are directed towards AI development, while far less is invested in its governance, a paradox explored in McLennan's paper Techno-optimism or Information Imperialism: Paradoxes in Online Networking, Social Media and Development.
Agendas of Power
The superficial narrative about digital inclusion and algorithmic bias, in which "the excluded" (or underrepresented) must simply be adequately represented in AI/LLM training data, needs to be treated critically and cautiously. History has shown all too well how science and technology can be deployed through dominant power systems to exploit black bodies or bodies of the South.
From eugenics to the current data-driven classification and commodification, it has been shown that the agendas of power and technologists can be coupled to classify, subjugate and destroy those at the power margins.
"The question must then be posed: When we volunteer our data and ourselves in the name of digital inclusion, where are we being included? Whose agendas dominate in the technology being developed?"
Answering these questions is not feasible while AI features remain silently hidden, invisible to users. This opacity directly affects how users assess the functionality and capabilities of AI, the possibilities of incorporating it into their lives, and its impact on wider society and the environment, as shown in Diefenbach's Technology Invisibility and Transparency.
The hyped "inclusion" rhetoric is a campaign for amassing data from "unincluded" and unsuspecting groups, while African governments and local organisations are lured by the mythical call of the AI "hyperobject" without questioning the underlying power structures and agendas.
It should be noted that while invisibility may ostensibly make technology feel sophisticated, this opacity conceals the corporate and political interests, and the agencies, that shape it, keeping AI users ignorant, as highlighted in Ndaka's Sustainable AI Techno-futures.
Until we empower ourselves with a clearer awareness of, and control over, these silences and hidden agendas, we should consider whether we are better off not rushing blindly into surrendering our data to AI in pursuit of inclusion.
Stanley Moloto is a contributor to The Angle Africa focusing on technology and digital rights
