Wonder Needs a Workflow | Nkululeko Sedibe on Building WonderBooks, a South African AR Classroom

From ‘guilt-free screen time’ to a hybrid impact model, the iNKULU founder unpacks what it really takes to turn phones into portals for learning.

Scott Smith
December 3, 2025
In this episode, Nkululeko Sedibe explains how his startup, iNKULU Creative, created WonderBooks — a digital learning platform that transforms standard school content into animated, gamified, and story-driven lessons. He describes the work, trade-offs and persistence required to build edtech that actually works for African classrooms.

WonderBooks emerged from frustration with dry textbooks and disengaged students. Sedibe wanted learning to feel like play: lessons that use animation, interactivity and local stories so children engage naturally without realising they’re studying.

The conversation moves through product design, content partnerships and the realities of scaling in environments with patchy connectivity and limited budgets. Sedibe explains how a pragmatic, hybrid model — where offline-first apps sync when possible and local facilitators blend digital and human-led activities — can create meaningful classroom impact.
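Sedibe doesn't detail WonderBooks' internal architecture, but the offline-first pattern he describes is commonly built as a local "outbox": every write lands in on-device storage first, and queued events are flushed to a server whenever connectivity returns. The sketch below illustrates that pattern only; the names (`OfflineFirstQueue`, `upload`) are hypothetical and are not WonderBooks code.

```python
import json
import sqlite3


class OfflineFirstQueue:
    """Minimal offline-first sketch: writes always succeed locally,
    and sync to a server is attempted opportunistically."""

    def __init__(self, db_path=":memory:"):
        self.db = sqlite3.connect(db_path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS outbox ("
            "id INTEGER PRIMARY KEY, payload TEXT, synced INTEGER DEFAULT 0)"
        )

    def record(self, event: dict) -> None:
        # A local write never depends on the network.
        self.db.execute(
            "INSERT INTO outbox (payload) VALUES (?)", (json.dumps(event),)
        )
        self.db.commit()

    def pending(self) -> list:
        rows = self.db.execute(
            "SELECT id, payload FROM outbox WHERE synced = 0 ORDER BY id"
        )
        return [(row_id, json.loads(payload)) for row_id, payload in rows]

    def flush(self, upload) -> int:
        # `upload` stands in for whatever network call is available;
        # an event is marked synced only after upload reports success,
        # so a dropped connection just leaves it queued for next time.
        sent = 0
        for row_id, event in self.pending():
            if upload(event):
                self.db.execute(
                    "UPDATE outbox SET synced = 1 WHERE id = ?", (row_id,)
                )
                sent += 1
        self.db.commit()
        return sent
```

In use, progress recorded during an offline lesson simply accumulates, and a later flush with a working connection drains the queue:

```python
q = OfflineFirstQueue()
q.record({"pupil": "thandi", "lesson": 3, "score": 0.8})
q.flush(lambda e: False)  # offline: nothing leaves the device
q.flush(lambda e: True)   # back online: queued events sync
```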

He talks candidly about market resistance, the difficulty of selling into cash-constrained school systems, and the importance of building teacher workflows that make adoption simple rather than more work. "If teachers don't have a clear, quick path to use the product, it won't last," he says.

"Guilt-free screen time is possible when content is designed around learning outcomes and local relevance."

Sedibe also unpacks WonderBooks' business model: a mix of direct school sales, partnerships with NGOs and governments, and a focus on impact metrics that matter to funders. He stresses the need for an iterative workflow — test small, measure, adapt — rather than grand launches that fail to consider classroom realities.

WonderBooks demo interface.

The episode closes with a broader call: rethinking education systems by centring creativity, local storytelling and pragmatic technology that respects teachers' time and pupils' contexts. Sedibe's work shows that combining craft, culture and engineering can reclaim attention and make learning joyful and relevant for African kids.

Ultimately, the episode argues for a hybrid impact model — one that balances product design, local partnerships and sustainable funding to ensure that AR-enhanced lessons move beyond novelty to become dependable classroom tools.

Scott Smith — host and contributor at The Angle Africa

"A hard reset can't be purely top-down or purely cultural — it must be both. Agency without institutions risks short-lived victories; institutions without agency become ossified."

Practically, Penuel proposes pushing for stronger protections for whistleblowers, expanding civic education in schools, and supporting community oversight bodies that publish accessible reports. He stresses that reformers should experiment locally and document what works so reforms are evidence-driven.

Photo: Penuel speaking at a public forum.

Penuel also reflects on the role of movements like the Black Pens, who combine cultural critique with policy demands. Their strength, he argues, is in holding a mirror to power while proposing concrete institutional changes.

Listen and watch

Below is the recorded conversation — you can play it here or watch the full video on YouTube.


"Similar to author Timothy Morton's hyperobjects description, AI to most people seems incredibly complex and hard to comprehend even though we interact with it every day."

These questions create uncertainty as we spiral towards even more unanswered questions. We are excited by AI's magnificence, yet we are scared of its depth. We all want to use it; no one wants to be left behind. Yet even as we wilfully fall into its free space, the concept itself is frightening. AI's complexity is hidden in the feel-good experiences that keep attracting us to it.

Some scholars refer to this kind of technology as pervasive; as it loses its physicality, it disappears into invisibility and covertly weaves itself into our daily lives. Think of the cellphone or the cloud in our current society.

Data Capitalism

In the modern age, Africa has been locked into using technology imported from the Global North. Predictably, this has not produced the development that proponents of local adoption promised. Instead, the dominant political economy has ensured that Africa continues to lag technologically. In the era of data-driven technologies and economies, AI could produce worse consequences than previous technological eras did.

AI technology, as it currently exists, is an expression of extraction and data capitalism, colonialism and knowledge imperialism in Africa. Recent events on the continent relating to the AI industry bring these issues to light. For instance, the industry's massive and unjustified data collection and appetite for data were highlighted in Fourcade and Healy's Moral Views of Market Society.

These events also expose the extraction of cheap labour to power AI (annotating, moderating and shaping data) in the Global South (the case of Meta in Kenya, 2023), and the recent unauthorised and unregulated collection of data for AI in public places in Kenya (the case of Worldcoin, 2023).

The current invisible digital surveillance, and the recent rhetoric about digital inclusion that justifies the unlimited amassing of data from unsuspecting populations and governments, reflect how powerful the opaque AI agendas in Africa have become.

This is coupled with increased funding targeting the development of AI technologies for Africa that mirror the ideologies of their parent companies. The insidious consequences of these practices are masked by techno-optimistic messaging about AI.

With the aid of badly formulated government policy, large amounts of funding are directed towards AI development, while far less is invested in its governance, as argued in McLennan's paper Techno-optimism or Information Imperialism: Paradoxes in Online Networking, Social Media and Development.

Agendas of Power

The superficial narrative about digital inclusion and algorithmic bias where "the excluded" (or underrepresented) in AI/LLM training data must simply be adequately represented needs to be addressed critically and cautiously. History has shown all too well how science and technology can be deployed through dominant power systems to exploit black bodies or bodies of the South.

From eugenics to the current data-driven classification and commodification, it has been shown that the agendas of power and technologists can be coupled to classify, subjugate and destroy those at the power margins.

"The question must then be posed: When we volunteer our data and ourselves in the name of digital inclusion, where are we being included? Whose agendas dominate in the technology being developed?"

Answering these questions is not feasible when AI features remain silently hidden, invisible to users. This opacity directly affects how users assess the functionality, capability and possibilities of incorporating AI in their lives, and the impact it has on wider society and the environment, as shown in Diefenbach's Technology Invisibility and Transparency.

The hyped "inclusion" rhetoric is a campaign for amassing data from the "unincluded" and unsuspecting groups, while African governments and local organisations are lured into the mythical call to the AI "hyperobject", without questioning the underlying power structures and agendas.

It should be noted that while the ostensible goal of invisibility is to make technology feel sophisticated, this opacity hides corporate and political interests, and the agencies that shape them, keeping AI users ignorant, as highlighted in Ndaka's Sustainable AI Techno-futures.

Until we can empower ourselves by gaining a better sense and control of these silences and hidden agendas, we might just consider whether we are better off not rushing blindly into submitting our data to AI in pursuit of inclusion.

Stanley Moloto is a contributor to The Angle Africa focusing on technology and digital rights