"How Wikipedia Became a Battleground for Racial Justice", by Stephen Harrison in Slate, examines the inner workings of knowledge production on English Wikipedia, using his usual combination of analysis of talk page debates, policies, the evolution of articles, and editor interviews.
He describes the production process as "part collaboration and part combat" and focuses on deletion debates as an expression of practices and policies that discourage positive change. He closely examines the deletion nominations of the Black Lives Matter Meta-Wiki page and the article Anna Gifty Opoku-Agyeman, including interventions by individual administrators who may have misapplied policy to prevent changes in controversial areas. He writes:
[Wikipedia's] rules-based view of neutrality may not be as neutral as it seems. "Wikipedia contributors have begun operationalizing a definition of neutrality in order to silence perspectives outside the community accepted point-of-view," [social scientist Jackie] Koerner said in an email. Take the case of Black Birders Week, a series of online events to celebrate black naturalists and birders. This initiative was conceived in response to last month’s racially charged incident in Central Park. In connection with Black Birders Week, Wikipedia editors created new articles about black bird-watching leaders like Anna Gifty Opoku-Agyeman, the Ghanaian-born American activist who co-founded the movement. After Opoku-Agyeman's Wikipedia page went up, so-called deletionist editors moved to have it removed. The deletionist editors argued that even though Opoku-Agyeman had been written up in places like The New York Times and The Wall Street Journal, she was only given passing mention by those sources and therefore was not sufficiently “notable” to merit her own Wikipedia page. At the end of the day, more editors voted to keep the article for Opoku-Agyeman, whose page remains online. But the incident itself shows how the notion of neutrality can be weaponized by some factions to keep certain knowledge off of the encyclopedia.
The Signpost notes that Wikipedia:Articles for deletion/Anna Gifty Opoku-Agyeman was plagued by sockpuppets, starting with the nominator. Discounting the socks, the outcome of the deletion debate was 32 keep versus 4 delete !votes; one editor preferred redirection. –B
This podcast episode is the third of three discussing the influence of disinformation on what the hosts describe as an often-unreliable internet. Insisting, somewhat myopically, that they "are a disinformation podcast", they suggest that Wikipedia is vulnerable to disinformation. Introduced as Chief of Staff in the WMF Office of the Executive Director, Ryan Merkley handles himself well, explaining that volunteers, not WMF employees, control content on Wikipedia, and that a combination of pending changes, bots, and human editors prevents untruths from spreading on Wikipedia. Merkley adroitly points out that the nature of the wiki, with the openness of its talk pages and history tabs, helps insulate Wikipedia.
He states that Wikipedia's appearance of reliability is "trust that community has earned", but points out that Wikipedia looks good in part because so much other media looks bad: "attacks on traditional journalism that have reduced trust in media, behaviors of media not in line with some of the traditional values of what media should do, like fact-checking" have eroded once-reliable sources. This has second-order effects for Wikipedia, which depends on reliable sources, and third-order effects because outlets like Google republish information drawn from Wikipedia. Merkley admits that against a state-sanctioned disinformation campaign, Wikipedia, like any other social media outlet, would be helpless. The interviewer asks, with regard to the "homogeneity in the community of editors", how the WMF's upcoming universal code of conduct will tamp down harassment. Merkley acknowledges that some in the audience may have heard that Wikipedia is unfriendly to women and people of color, conceding that "parts of our communities have been hostile", but says he is "really proud" of the Board of Trustees' work on the code of conduct, which is meant to create a baseline of enforcement upon which communities can improve. –CT
Still in the testing phase, Facebook's new search product lets readers stay on its platform while reading basic information from Wikipedia and other sources, similar to Google's Knowledge Panel. TechCrunch is not impressed, calling the product "fairly hit or miss". Its examples include a "hit" – searching for "joker" calls up information on the movie – but searching for "parasite" does not return information on the Oscar-winning Best Picture of 2019. In another example, searching for "Donald Trump" calls up useful information, but searching for members of his cabinet does not always work.
Social Media Today broke the story with Facebook Adds Wikipedia Knowledge Boxes in Search Results.