As artificial intelligence reshapes global journalism, African newsrooms are grappling with its promises and perils. 

A recent webinar hosted by the Wits Centre for Journalism and African Women in Media brought together Dr. Grace Githaiga, CEO of KICTANet; Sara El-Khalili, senior media development manager at the Thomson Reuters Foundation; and Emaediong Akpan, a lawyer and digital rights advocate. In conversation with WCJ director Dr. Dinesh Balliah, the panel unpacked the ethical frameworks needed to guide AI adoption across the continent’s media landscape.

From Sci-Fi to Newsrooms: The Urgency of Ethical Guardrails

“AI was once the stuff of sci-fi,” noted moderator Dr. Dinesh Balliah, “but it’s now embedded in our everyday workflows. Even before this webinar began, AI notetakers were already active.”

Dr. Grace Githaiga emphasised the duality of AI’s impact: “Generative AI can boost newsroom productivity, but it also deepens misinformation and overwhelms fact-checkers.” 

She cited recent cases where fabricated legal references appeared in court judgments and professional reports, underscoring the stakes of unchecked AI hallucinations.

“Hallucination is information that sounds plausible but is factually incorrect. It is fabricated. It is unverifiable,” Githaiga explained. “Disinformation, on the other hand, is deliberately created to deceive.”

Five Ethical Fault Lines in AI-Powered Journalism

Dr. Githaiga outlined five challenges:

  • Hallucination vs. Disinformation
    AI-generated content often blurs the line between plausible fiction and deliberate falsehoods, complicating editorial verification.

  • Deepening Misinformation
    Hyperrealistic AI outputs—text, images, and video—are flooding digital spaces, making it harder for audiences to distinguish fact from fabrication.

  • Credibility as a Competitive Advantage
    “Generative AI sometimes cites non-existent references,” she warned. Newsrooms must double down on verification to preserve trust.

  • Platform Vulnerability
    Social media algorithms amplify sensational AI-generated content. Tools like X’s Grok mine unvetted data, reinforcing disinformation loops.

  • Trust Deficit
    “Trust is scarce,” Githaiga said. “Audiences are confused, yet they still look to journalists for credible information.”

Building Ethical AI Frameworks: Co-Creation and Transparency

Githaiga’s takeaways were clear:

  • Newsrooms must invest in verification tools and revise editorial codes to include AI protocols.
  • Transparency in AI deployment should be non-negotiable.
  • Ethical frameworks must be co-created with journalists, technologists, civil society, and platforms.
  • Public education is essential to help audiences spot synthetic content.

“AI should supplement editorial judgment, not replace it.”

AI Use Is Widespread, Policies Are Not

Sara El-Khalili of Thomson Reuters Foundation shared findings from a 2025 global survey of journalists across 70 countries:

  • 81% of respondents use AI in their work.
  • Only 13% have newsroom policies governing its use.
  • Top concerns include loss of creativity, erosion of critical thinking, misinformation, and legal liability.

“We can’t be using something that comes with so much risk without having any guardrails in place,” El-Khalili stressed.

She introduced TRF’s three-step guide to an AI-ready newsroom:

  1. Identify the tool and its risks.
  2. Implement editorial safeguards.
  3. Monitor and revise policies regularly.

Gendered Labour and the Invisible Burden of AI Correction

Emaediong Akpan, a Nigerian lawyer and gender researcher, spotlighted the gendered dimensions of AI in African newsrooms. Her research revealed that women journalists often perform “corrective labour”—unpaid work to fix AI-generated errors and biases.

“AI is not a neutral tool,” Akpan said. “It mirrors existing inequalities and forces journalists to fight misrepresentation of African realities.”

She called for:

  • Gender-balanced ethics committees.
  • Human touch checklists to ensure oversight at every stage of AI use.
  • Gender impact assessments for new tools.
  • A push to decolonize AI and amplify authentic African narratives.

Reimagining AI for African Storytelling

Across the board, speakers agreed: AI must be adapted—not adopted wholesale. African newsrooms need context-driven, inclusive, and accountable frameworks that reflect local realities and uphold journalistic integrity.

As Dr. Githaiga concluded, “Journalists remain central. AI must support—not supplant—their editorial judgment.”