Introduction
“Everyone is on social media.”
A phrase you have likely heard in conversation. Social media no longer feels optional. For anyone with internet access, it has become an integral part of daily life, woven into social interaction, self-expression, professional networking, and political engagement. These social media platforms (hereinafter referred to as platforms) shape how we communicate, organise, and even how we imagine ourselves in public spaces. Research by Global WebIndex in February 2025 found that 63.9% of the world’s population uses social media, with average daily usage of 2 hours and 21 minutes.
The Many Faces of Online Harm
Alongside this widespread use, social media platforms have unfortunately become a conduit through which harms are perpetrated. These harms include, but are not limited to, cyberbullying and sustained online harassment; technology-facilitated gender-based violence such as sextortion, the non-consensual sharing of intimate images, and online stalking; privacy and data breaches; misinformation and disinformation; online fraud and financial scams; and child-related online abuse. Examples abound: during Uganda’s 2021 elections, women politicians faced widespread online harassment; Nigeria’s #EndSARS protests highlighted how platform–state tensions impacted women activists; and South Africa saw such a surge of revenge porn, cyberbullying, and sextortion that it had to be specifically outlawed.
Notably, social media enables harms that disproportionately target women, and women in Africa are even more likely to experience them because their online experiences are shaped by deeper structural inequalities that already disadvantage them offline. This has led to calls for platforms to be held more accountable for the harms that occur on their services. One might ask: why should social media platforms be accountable for these harms? The answer is simple: because they have power. The popularity, geographical spread, and financial capacity of these platforms have given them what contemporary authors have called “state-like power.”
Platform Power
According to the Oxford English Dictionary, a state is “a nation or territory considered as an organised political community under one government.” A traditional state has defined borders, a permanent population, sovereign authority, and formal legitimacy. Social media platforms do not meet this classical definition. They do not possess territory or constitutional authority, nor do they gain legitimacy through democratic consent.
Yet, in practice, platforms increasingly function like quasi-states. They make and enforce rules, which Meta calls community standards and TikTok calls community guidelines. They determine who gets access, arbitrate disputes, shape public discourse, and influence public safety and elections. They negotiate with governments and deploy moderation bodies that play a quasi-judicial, court-like role, such as the Meta Oversight Board. They also operate transnationally with budgets larger than those of some governments; all this power, yet without the democratic obligation to be “of the people and by the people.”
While platforms were, in principle, created for people, they rarely feel owned by or answerable to those people when it comes to platform resources and redress. Unlike courts, which derive their authority from constitutions and are mandated to deliver justice in the public interest, platforms make decisions based on private rules designed primarily to manage risk and protect business interests. Their processes are opaque, largely automated, and controlled internally, with remedies limited to takedowns, suspensions, or reinstatements. This creates a system in which users experience punishment or relief without due process, explanation, or meaningful redress. While courts are accountable through appeals and judicial oversight, platforms answer mostly to public pressure and profit incentives. As a result, platform governance often looks like justice but lacks its depth, independence, and enforceability, leaving victims, especially of harms like technology-facilitated gender-based violence (TFGBV), with decisions that resolve content issues without truly resolving harm.
Platform Accountability
A concern often raised at convenings of digital rights practitioners, and evident during a panel at DataFest 2025, is the question of to whom platforms owe responsibility and who enforces it.
Today, responsibility sits inside the platforms themselves, with internal teams such as Trust & Safety and Public Policy units. Accountability is therefore self-administered and incentivised by engagement and profit rather than by public interest or safety.
Africa particularly suffers from this internally hosted responsibility. Escalation pathways often sit outside the continent, meaning urgent harms, particularly TFGBV, may go unanswered, be resolved only after irreversible damage has been done, or even be dismissed as not violating community standards, standards that may not align with the values of different African communities.
Globally, the approach to platform accountability is shifting. For years, platforms have been self-regulated; today, several jurisdictions are moving toward enforceable legal obligations. The European Union (EU) Digital Services Act now requires risk assessments, algorithmic transparency, independent audits, and penalties for failure to protect users, including women in politics. The United Nations Entity for Gender Equality and the Empowerment of Women (UN Women), the Office of the United Nations High Commissioner for Human Rights (OHCHR), and the Inter-Parliamentary Union (IPU) recognise TFGBV as a threat to democratic participation, not merely a personal injury.
Across Africa, however, regulatory accountability remains uneven. Platforms often answer to a patchwork of data protection authorities, telecommunications regulators, and consumer agencies, where these exist at all. This fragmentation creates loopholes and inconsistent enforcement, and some interventions are driven more by revenue motives than by human rights obligations.
Can platform accountability look different in Africa?
Yes, it can, where incentives across design, policy, and enforcement align.
Design can embed safety safeguards and feminist principles into products from the outset. Because most platforms already exist, this may seem like a lost cause; however, platforms continue to deploy new products, such as AI features, and some of the AI-related harms we see could have been reduced if safeguards had been considered before deployment.
Policies can remove financial rewards for accounts and networks that drive TFGBV and disinformation. One example is monetisation on X, which has pushed many users to publish sensational content or outright falsehoods for impressions. Platforms can introduce verified political accounts, crisis-mode moderation during elections, friction to slow harmful engagement, and reduced amplification of repeat offenders.
Enforcement is where ambition must scale. Election periods show what is possible when governments align priorities across regulators and demand accountability from platforms. That coordination should not only exist during elections; it should be the norm.
Regulators can mandate time-bound responses, conduct risk audits, and impose tiered penalties under data protection and consumer frameworks. Political parties and advertisers can refuse to support platforms or influencers that profit from abuse.
For Africa, the vision is not to copy and paste European frameworks but to build context-responsive models that protect civic spaces and enable women to participate without violence, silencing, or exile from digital public life. Resolutions of the African Commission on Human and Peoples’ Rights (ACHPR) can ease this work. The harmony needed might be made possible by the adoption, during the Commission’s 82nd Ordinary Session, of the Resolution on Working Towards the Assessment of Public Interest Content in this Digital Era and Developing Guidelines to Ensure a Public Interest Element for All Platforms Operating in Africa – ACHPR/Res.631 (LXXXII) 2025 and the Resolution on Developing Guidelines to Assist States Monitor Technology Companies in Respect of Their Duty to Maintain Information Integrity Through Independent Fact-Checking – ACHPR/Res.630 (LXXXII) 2025.
We are at a turning point where platforms act like states but are not yet governed like them. If democracy is truly “for the people and by the people,” then digital governance should be as well, and African voices, especially women’s voices, must lead that future.
Africa must move from being a consumer of platform rules to a co-author of them. Building African frameworks for safety, justice, and inclusion online is not optional; it is the next frontier of democratic governance.
Author: Khadijah El-Usman


