
Posts

Silenced Networks, Silent Losses: The Real Impact of Uganda’s Internet Shutdown

From January 13 to January 26, 2026, Uganda experienced a government-ordered restriction on internet services surrounding its general election. Initially imposed two days before voting, the shutdown affected nearly all public internet access, including social media, messaging apps, web browsing, and critical online tools. The restriction was gradually lifted over the following days, with some platforms still limited as of January 26 despite the restoration of internet services. (Anadolu Ajansı)

During this period, the internet was not just an optional convenience; it was a core part of Uganda’s economic infrastructure. Millions of Ugandans rely on mobile money for daily transactions, from paying for transport to buying food and receiving wages, and on the internet for business communication, logistics, e-commerce, and service delivery. When connectivity was suspended, these digital lifelines were abruptly broken. (Human Rights Watch)

Economic Costs: Who Paid and How Much

The financ...
Recent posts

Silence as Complicity: How Media Omission During 2026 Elections Undermines Justice and Accountability in Uganda

In any democratic society, the media plays a central role in documenting events, informing the public, and supporting access to justice. When credible media institutions fail to report on human rights violations, especially during elections, this omission is not neutral. It actively weakens accountability, distorts public memory, and limits victims’ pathways to justice. In contexts like Uganda’s recent elections, media silence has become a structural barrier to human rights protection. Research on media freedom under authoritarian and semi-authoritarian governments shows a clear pattern. States rarely rely only on outright censorship. Instead, they use regulatory pressure, licensing threats, advertising control, intimidation of journalists, and selective access to information to direct narratives. The result is not always loud propaganda, but quiet omission. Violations happen, but they are not recorded by institutions that are considered credible, authoritative, or admissible in lega...

Uganda’s 2026 Internet Shutdown from a Human Rights View

In the early evening of January 13, 2026, as millions of Ugandans were preparing to finalize their thoughts and engage in one of the most consequential elections in recent history, the government quietly flipped a switch that plunged our digital world into silence. At 6:00 pm, the Uganda Communications Commission (UCC) ordered a nationwide internet shutdown, cutting off public access to the web and silencing the online voices of citizens across the country, just two days before the January 15 general election. (TheStar)

This was not a technical glitch. This was an intentional blackout. See the exact shutdown directives at the end of the blog.

A Country Cut Off at a Critical Moment

The shutdown was not partial or isolated. It blocked mobile and fixed-line services, including social media platforms, messaging apps, and VPN access that many citizens rely on to communicate, stay informed, and share eyewitness accounts. (TheStar) For a nat...

When “More of the Same” Becomes Dangerous: How Algorithmic Repetition Fuels Radicalization

Written by Rebecca Nanono

Introduction

Across today’s digital platforms, algorithms promise personalization, relevance, and convenience. However, beneath this promise lies a growing risk: when algorithms repeatedly serve users more of the same content, they can intensify polarization, amplify harmful ideologies, and accelerate pathways to radicalization. For digital rights advocates, feminists, and social justice actors, this is not just a technical flaw. It is a structural governance problem with deeply gendered and political consequences.

How Algorithmic Repetition Works

Most social media and content platforms rely on engagement-optimizing algorithms. These systems learn from users’ digital footprints, such as clicks, likes, shares, watch time, and comments, then prioritize content that maximizes attention. Over time, this creates the following:

- Feedback loops, where users are repeatedly exposed to similar views
- Echo chambers, limiting exposure to altern...
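The feedback-loop dynamic described above can be illustrated with a minimal sketch. This is a toy simulation, not a real platform's algorithm: the category names, the greedy "serve the most-engaged category" rule, and the assumption that the user always engages with what is shown are all invented for the example. It shows how a recommender that only optimizes past engagement locks a user into a single stream of content after the very first interaction.

```python
def simulate_feed(steps=50):
    """Toy engagement-optimizing recommender (illustrative assumption,
    not any real platform's algorithm)."""
    categories = ["news", "sports", "outrage", "music"]  # hypothetical labels
    clicks = {c: 1 for c in categories}  # uniform starting engagement counts
    shown = []
    for _ in range(steps):
        # Greedy engagement optimization: always serve the most-clicked
        # category so far.
        top = max(clicks, key=clicks.get)
        shown.append(top)
        # Assume the user engages with whatever is shown; the shown
        # category's count grows, so it is served again -- a feedback loop.
        clicks[top] += 1
    return shown

feed = simulate_feed()
# After the first tie-break, one category dominates the entire feed.
print(set(feed))
```

Under these assumptions the feed collapses to a single category immediately; real systems add exploration and diversity signals, but the underlying pull toward "more of the same" is the same mechanism.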

Why African Data Powers Modern AI - Even When Africa Is Not at the Table

A look at AI filters, bias, exploitation, and what Africans can do about it

By Rebecca Nanono, Contributor

Introduction

Artificial intelligence systems, from generative text to face filters on apps, are only as smart as the data they are trained on. That means the billions of images, videos, recordings, messages, and traces of internet activity that exist online influence how AI understands the world. And increasingly, African digital content, especially from young, creative users, is being pulled into global AI models. This is not just a technical issue but a digital rights and power issue. In this blog, we explore the following:

- How African data fuels AI
- Why African women’s images and voices often appear in AI systems
- The risks of this dynamic
- What communities and policymakers can do

The Data Behind the AI Curtain

Modern AI systems, like those powering TikTok’s face filters or global large language models (LLMs), rely on large datasets drawn fr...

The Biggest Missteps of 2025: Putting an End to Data and AI Disasters

2025 was supposed to be the year artificial intelligence and data-driven systems finally delivered on their promise: efficiency, inclusion, and innovation. Instead, it became a year of hard lessons. Across governments, corporations, and platforms, repeated data and AI failures exposed a familiar truth: technology is only as ethical as the systems of power that shape it. For women, marginalized communities, and digital rights defenders, these missteps were not abstract “tech problems.” They had real consequences: surveillance without consent, automated exclusion, silencing of voices, and deepened inequalities. As we move forward, ending data and AI disasters must start with naming what went wrong.

1. Treating Data as a Resource, Not a Right

One of the biggest missteps of 2025 was the continued framing of personal data as a commodity rather than a human rights issue. Governments and companies rushed to collect, share, and monetize data without meaningful consent, transparenc...

Populism and Misinformation in the Digital Age: Who Really Pays the Price?

In today’s digital age, populism no longer relies solely on rallies, posters, or radio speeches. It thrives online, on social media timelines, encrypted messaging apps, livestreams, and viral videos. The internet has become a powerful political arena, one where emotions often travel faster than facts and where misinformation can shape public opinion long before the truth catches up. At its core, populism claims to speak for “the people” against a corrupt elite. While this framing can sometimes highlight real social grievances, in the digital era it is increasingly fueled by misinformation, disinformation, and simplified narratives that reduce complex realities into shareable slogans. The result is a digital ecosystem where fear, anger, and resentment are easily weaponized, and where women, girls, and marginalized communities often bear the greatest harm.

The Digital Amplification of Populism

Social media platforms were designed to maximize engagement, not accuracy. Algorithms rewa...