Penuel begins by tracing how public debates and social movements create the conditions for institutional change. He suggests that when citizens invest in shared narratives, norms and practices — when they "own their minds" — they create the social energy that institutions need to be held accountable and to transform.
But ideas alone are not enough. "Owning the system," Penuel says, requires building institutions that are both representative and capable. That means legal reform, transparent oversight, independent media and community-led mechanisms that connect grassroots accountability with formal power.
Drawing on examples from recent civic campaigns, Penuel maps a three-part strategy: 1) nurture public imagination through education, arts and local conversation; 2) strengthen watchdog institutions and legal protections; 3) link local experiments to national policy so promising practices can scale.
Practically, Penuel proposes pushing for stronger protections for whistleblowers, expanding civic education in schools, and supporting community oversight bodies that publish accessible reports. He stresses that reformers should experiment locally and document what works so reforms are evidence-driven.
Penuel also reflects on the role of movements like the Black Pens, who combine cultural critique with policy demands. Their strength, he argues, is in holding a mirror to power while proposing concrete institutional changes.
"Similar to author Timothy Morton's hyperobjects description, AI to most people seems incredibly complex and hard to comprehend even though we interact with it every day."
These questions create uncertainty as we spiral towards even more unanswered questions. We are excited by AI's magnificence, yet we are scared of its depth. We all want to use it; no one wants to be left behind. Yet even as we wilfully fall into its orbit, the concept itself is frightening. AI's complexity is hidden behind the feel-good experiences that keep attracting us to it.
Some scholars refer to this kind of technology as pervasive; as it loses its physicality, it disappears into invisibility and covertly weaves itself into our daily lives. Think of the cellphone or the cloud in our current society.
Data Capitalism
In the modern age, Africa has been locked into using imported technology from the Global North. Predictably, this has not produced the expected development heralded by local adoption. Instead, the dominant political economy has ensured that Africa continues to lag technologically. In the era of data-driven technologies and economies, AI will potentially produce worse consequences than the previous technological eras.
AI technology, as it currently stands, is an expression of extraction, data capitalism, colonialism and knowledge imperialism in Africa. Recent events on the continent bring these issues to light. The AI industry's massive and unjustified appetite for data was highlighted in Fourcade and Healy's Moral Views of Market Society.
Two cases expose the pattern: the extraction of cheap labour in the Global South to power AI — to annotate, moderate and shape data — as in the case of Meta in Kenya in 2023, and the unauthorised, unregulated collection of data for AI in public places, as in the case of Worldcoin in Kenya in 2023.
The invisible digital surveillance now under way, together with the recent rhetoric of digital inclusion that justifies the unlimited amassing of data from unsuspecting populations and governments, reflects how powerful the opaque AI agendas in Africa have become.
This is coupled with increased funding targeting the development of AI technologies for Africa that mirror the ideologies of their parent companies. The insidious consequences of these practices are masked by techno-optimistic messaging about AI.
Abetted by poorly formulated government policy, large amounts of funding are directed towards AI development while far less is invested in its governance, as argued in McLennan's paper Techno-optimism or Information Imperialism: Paradoxes in Online Networking, Social Media and Development.
Agendas of Power
The superficial narrative about digital inclusion and algorithmic bias, in which "the excluded" (or underrepresented) in AI and LLM training data must simply be better represented, needs to be examined critically and cautiously. History has shown all too well how science and technology can be deployed through dominant power systems to exploit black bodies and bodies of the South.
From eugenics to the current data-driven classification and commodification, it has been shown that the agendas of power and technologists can be coupled to classify, subjugate and destroy those at the power margins.
"The question must then be posed: When we volunteer our data and ourselves in the name of digital inclusion, where are we being included? Whose agendas dominate in the technology being developed?"
Answering these questions is not feasible while AI's workings remain silently hidden, invisible to users. This opacity directly affects how users assess the functionality, capabilities and possibilities of incorporating AI into their lives, and the impact it has on wider society and the environment, as shown in Diefenbach's Technology Invisibility and Transparency.
The hyped "inclusion" rhetoric is a campaign for amassing data from the "unincluded" and unsuspecting groups, while African governments and local organisations are lured into the mythical call to the AI "hyperobject", without questioning the underlying power structures and agendas.
It should be noted that while the ostensible goal of invisibility might be to make technology feel seamless and sophisticated, this opacity hides the corporate and political interests, and the agencies, that shape AI, keeping its users ignorant, as highlighted in Ndaka's Sustainable AI Techno-futures.
Until we can empower ourselves by gaining a better sense of, and control over, these silences and hidden agendas, we should consider whether we are better off not rushing blindly into surrendering our data to AI in pursuit of inclusion.
Stanley Moloto is a contributor to The Angle Africa focusing on technology and digital rights
