The Signpost

File:Wikipedia scale of justice.svg (Olmec and Mrmw, CC BY-SA 3.0)
News from Diff

Strengthening Wikipedia’s neutral point of view

By Wikimedia Foundation
This article originally appeared on Diff on March 27, 2025. Licensed CC-BY-SA 4.0. See related articles in this issue at Op-ed, In focus, and Opinion.

The Wikimedia Foundation recently highlighted the extensive research effort that goes into identifying global trends critical to our priorities as a movement, a planning exercise we have undertaken annually since 2022. We examine rapid global shifts to ask a deeper question: “What does the world need from Wikimedia now?”

These trends then drive meaningful conversations on-wiki and in many community spaces to help prioritize actions and investments.

This year, one trend that has generated significant conversation within and beyond the Wikimedia movement relates to neutrality: trust in information online is declining, and consensus about what information is true is fragmenting. Some believe that the world has become more complex and that people are more divided than ever. As threats to neutrality appear to be on the rise globally, Wikipedia’s neutral point of view (NPOV) policy is needed now more than ever. This core principle has served Wikipedia extremely well, and it has grown even stronger over nearly a quarter century of volunteer contributions.

How this principle of neutrality translates into NPOV policy varies across the Wikimedia projects, highlighting an opportunity for communities to learn from each other and to explore whether common global standards for neutrality can better protect the projects (and volunteers) in an environment of expanded threats and growing regulation.

To support the Wikimedia communities and reaffirm our commitment to neutrality, the Wikimedia Foundation will convene a working group of active editors, Trustees, researchers, and advisors to explore recommendations for common standards for NPOV policies that can protect Wikipedia, increase the integrity of the projects, and better support the volunteers trusted to administer these policies.

These conversations will be grounded in the foundational principles underlying NPOV, which is designed to present a fair, neutral description of the facts without compromising the exploration of ideas, concepts, and perspectives. Reaffirming Wikipedia’s neutrality in response to what we are seeing in the world makes this highly trusted resource even more resilient in its mission to serve accurate, reliable information.

Read on for more information about Wikipedia’s NPOV policies and how to contribute.

A brief history of Wikipedia’s Neutral Point of View (NPOV)

From the beginning of the project, maintaining a neutral point of view has been a core principle of Wikipedia. The Meta-Wiki page, started in 2003, notes that Wikipedia and the Wikimedia projects are "best served not by advancing or detracting from particular points of view on any given subject, but by trying to present a fair, neutral description of the facts – including that various interpretations and points of view exist… This policy exists on all languages of projects that have adopted it, but the details of the policy vary significantly between projects and between different languages in those projects."

Wikipedians have understood that this non-negotiable principle requires collaboration in how it is applied in practice: "While NPOV is an ultimate goal in writing an article, it is difficult to achieve immediately as a single writer. It is thus sometimes regarded as an iterative process (as is wiki writing in general), by which opposing viewpoints compromise on language and presentation to produce a neutral description acceptable to all, according to consensus decision-making."

Like all policies, NPOV has been refined over Wikipedia’s 24 years with thousands of inputs across hundreds of languages. Throughout this evolutionary process of community discussion, the neutral point of view principle and its application remain fundamental to the Wikipedia model.

Common global standards for NPOV policies

A cursory review of NPOV policies across different language versions of Wikipedia at a recent community workshop revealed variations, inconsistencies, and many opportunities for the different Wikimedia projects to learn from each other. Editors with extended rights, those trusted by their communities to administer NPOV policies, described the challenges they face when these policies are unclear or underdeveloped in some languages.

As public trust in news sources declines, Wikipedians face evolving challenges in representing views from reliable sources fairly, proportionately, and without editorial bias. This application is often tested when editors cover fast-moving and contentious topics. In recent years, these have ranged from geopolitical conflicts such as Israel-Palestine and Russia-Ukraine to issues demanding special attention: the unbalanced representation of women on the Wikimedia projects, increasing content about the Global South from those who live there, representing medical information about pandemics, and how Wikipedia enforces its policies when there are concerns of antisemitism. We have seen in some of these cases that reasonable concerns take time and research to address.

As volunteers know well, Wikipedia’s integrity – especially on evolving and contentious topics – is safeguarded by robust community governance processes (e.g., discussion protocols, oversight mechanisms, dispute resolution channels) that prioritize a fair and balanced approach. These well-documented standards are specifically designed to prevent undue influence and preserve an independent, nonprofit model that exists nowhere else at this scale. Time and again, volunteers have demonstrated a strong track record of managing neutrality on contentious subjects.

Those who sustain the Wikimedia projects remain humble and clear-eyed: they must constantly adapt and improve their systems as digital and media platforms around the world struggle with bias and disinformation. Stronger community-led content moderation policies, enforced primarily by local communities, protect Wikipedia and Wikipedians alike.

To support these essential self-governance mechanisms, the working group will ask how more common standards for NPOV policies across projects can protect and improve Wikipedia and support volunteers, building on the careful deliberation that has shaped NPOV policies over the past two decades. The intent of the working group will be to facilitate knowledge sharing about community-led policies that have made Wikipedia a highly trusted resource around the world. This will incorporate feedback from work already underway to establish policies that reflect values shared by the global Wikimedia community.

An initial set of recommendations will be presented to the Wikimedia Foundation’s Board of Trustees at their June 2025 meeting, alongside approval of the Foundation’s plan and budget.

Contributions are welcome from all who care about these topics, within and beyond the Wikimedia movement, on the dedicated Meta-Wiki page or as part of the Foundation’s annual planning conversations.

The world’s reliance on Wikipedia continues to grow – from AI chatbots to search engines to voice assistants to content reusers across the internet. At a time when we are used (and needed!) more than ever, we must reinforce our core principles and sustain our shared values. Strengthening Wikipedia’s neutrality will make it even more trusted to deliver the content that billions of people rely on around the world.

