The Signpost

Recent research

Feminist critique of Wikipedia's epistemology, Black Americans vastly underrepresented among editors, Wiki Workshop report

By Markworthen and Tilman Bayer

A monthly overview of recent academic research about Wikipedia and other Wikimedia projects, also published as the Wikimedia Research Newsletter.

Feminist critique of Wikipedia's epistemology

Reviewed by Markworthen

This paper[1] by Menking and Rosenberg, published in the journal Science, Technology, & Human Values, is a recondite article. Its depth is both a strength (diligent study of the article will likely enhance Wikipedians' understanding of potential problems, such as our assumptions about what constitutes a reliable source or our epistemological presumptions), and a weakness (most Wikipedians will not read it because it is so dense).

I tried (several times) but cannot improve on the authors' summary. Here, then, is an excerpt from the article abstract:

A repository for established facts, Wikipedia is also a social space in which the facts themselves are decided. As a community, Wikipedia is guided by the five pillars—principles that inform and undergird the prevailing epistemic and social norms and practices for Wikipedia participation and contributions. We contend these pillars lend structural support to and help entrench Wikipedia’s gender gap as well as its lack of diversity in both participation and content. In upholding these pillars, Wikipedians may unknowingly undermine otherwise reasonable calls for inclusivity, subsequently reproducing systemic biases. We propose an alternative set of pillars developed through the lens of feminist epistemology ... Our aim is not only to reduce bias, but also to make Wikipedia a more robust, reliable, and transparent site for knowledge production.

Context

Background reading that will enhance understanding of this Menking & Rosenberg (2021) article:

Talk page discussions about the article

The article has generated some engaging discussions on Wikipedia talk pages, for example:


African Americans are vastly underrepresented among US Wikimedians, but contribute motivated by "black altruism"

Reviewed by Tilman Bayer

Last month, the Wikimedia Foundation published the results of its annual "Community Insights", a global survey of 2,500 Wikimedians (including active editors and program leaders) conducted in September/October 2020.[2]

For the first time, the survey asked about race and ethnicity, confined to two countries where such categories are widely used and accepted: the US (195 responses) and the UK (67). Among US contributors, the findings show striking gaps for Black/African American editors (0.5% of contributors, compared to 13% of the general population) and American Indian/Alaska Native editors (0.1% vs. 0.9%). Hispanic/Latino/a/x editors show a smaller but still large gap (5.2% vs. 18%). White/Caucasian editors (89% vs. 72%) and especially Asian American editors (8.8% vs. 5.7% of the general population) are over-represented among contributors in the US.

Survey findings about the race of contributors in the US, compared to the overall population (categories are overlapping and thus sum to more than 100%)

In the UK, the survey similarly found "significant underrepresentation" of Black or Black British editors (0.0% vs. 3.0% in the general population), whereas the percentage of white editors was close to the general population.

A racial or ethnic gap among Wikipedians in the US has long been observed anecdotally or conjectured (see e.g. this 2010 thread, which also contained some informed speculation about possible reasons), but this marks the first time it has been backed by empirical survey data. The delay is related to the fact that the Wikimedia Foundation's annual surveys are global in nature, and there are no internationally accepted definitions of race and ethnicity (indeed, survey questions of this nature would be considered offensive in many countries).[1][2]

Illustration from the 2018 "Pipeline of Online Participation Inequalities" paper

Correspondingly, there has been little research on possible reasons for such gaps. An exception is the 2018 paper "The Pipeline of Online Participation Inequalities: The Case of Wikipedia Editing"[supp 1], which we previously reviewed here with a more general focus; it also contains some insights into why African Americans contribute at a lower rate.

While this study did not find a significant racial disparity in the earlier parts of the pipeline (measuring whether survey respondents had heard of Wikipedia or had visited Wikipedia), when it comes to "know[ing] that Wikipedia can be edited [...] age, gender, and several racial/ethnic identity categories (Black, Hispanic, Other) emerge as salient explanatory factors where they did not before. Income no longer explains the outcome. Education level associates strongly with knowing Wikipedia can be edited." However, racial and ethnic background factors "do not associate with who contributes content" (i.e. the last part of the pipeline). This points to raising awareness of Wikipedia's editability as a potential strategy for reducing these gaps, although it would not address "the importance of education and Internet skills" that the authors highlight in their overall conclusions about closing knowledge gaps.

Conversely, a 2020 paper answered the question "What drives Black contributions to Wikipedia?"[3][4] with the following conclusions, based on a survey of 318 Black Wikipedia editors in the US:

First, black altruism [measured via the survey question "Writing/editing Wikipedia is a way for me to help the Black community," "Sharing my knowledge through Wikipedia improves content on the Black community," and "It is important that Blacks write/edit Wikipedia"] indirectly influences Wikipedians’ content contribution through their perception of information quality regarding Black content in the Wikipedia universe. In other words, the higher their altruistic tendencies, the stronger their perceptions of information quality. [...] We argue that black altruism mediated through perceptions of information quality [...] helps extend black digital culture in the online encyclopedia. Therefore, we suggest one of the gratifications of Black Wikipedia contribution is likely the production and dissemination of Black cultural information content. [...]

Secondly, our findings suggest that Black Wikipedians’ perceived social presence, is also a significant driver of content contribution. That is to say, Wikipedia contribution uplifts self-esteem and feelings of self-enhancement among Black authors and editors. [...]

Lastly our findings, demonstrated that entertainment is not a factor that significantly motivates Black Wikipedians’ content contribution in the current study. We suggest this may be because Wikipedia content contribution, like long-form blogging, requires “more sustained time, editing, maintenance” (Steele, 2018 p. 116), intellectual endeavor, and a higher degree of technological proficiency than other types of social media activities. Nevertheless, our results are a sharp contrast to previous studies which suggest that amusement is an influential factor in Wikipedia contributions [...]"

Survey respondents were recruited in 2017 via Qualtrics "based on predefined characteristics such as individuals who identified as Black/African American, resided in the United States, and had made at least one edit/contribution to Wikipedia's English edition over the last three years". Interestingly, the resulting sample of 318 Black Wikipedia contributors was much larger than that of the WMF Community Insights survey, which (barring some extreme downward adjustments during the weighting process) appears to have consisted of a single Black respondent in the sample, considering the stated percentage of 0.5% among 195 US-based respondents.


Wiki Workshop 2021

Report by Tilman Bayer

The annual Wiki Workshop, part of The Web Conference, took place as an online event on April 14, 2021, featuring the papers listed below. The organizers reported that 78% of attendees were non-native English speakers, 66% were attending Wiki Workshop for the first time, 53% were academic researchers, and 34% were students.

"References in Wikipedia: The Editors' Perspective"

From the abstract:[5]

"we explore the creation and collection of references for new Wikipedia articles from an editor’s perspective. We map out the workflow of editors when creating a new article, emphasising on how they select references."

"Do I Trust this Stranger? Generalized Trust and the Governance of Online Communities"

From the abstract:[6]

" we hypothesize that administrators’ community governance policy might be influenced by general trust attitudes acquired mostly out of the Wikipedia context. We use a decontextualized online experiment to elicit levels of trust in strangers in a sample of 58 English Wikipedia administrators. We show that low-trusting admins exercise their policing rights significantly more (e.g., block about 81% more users than high trusting types on average). We conclude that efficiency gains might be reaped from the further development of tools aimed at inferring users’ intentions from digital trace data."

"Negative Knowledge for Open-world Wikidata"

From the abstract:[7]

"Like most major KBs [knowledge bases, Wikidata is] incomplete and therefore operates under the open-world assumption (OWA) – statements not contained in Wikidata should be assumed to have an unknown truth. The OWA ignores however, that a significant part of interesting knowledge is negative, which cannot be readily expressed in this data model. In this paper, we review the challenges arising from the OWA, as well as some specific attempts Wikidata has made to overcome them. We review a statistical inference method for negative statements, called peer-based inference, and present Wikinegata, a platform that implements this inference over Wikidata. ... Wikinegata is available at https://d5demos.mpi-inf.mpg.de/negation ."

"A Brief Analysis of Bengali Wikipedia's Journey to 100,000 Articles"

From the abstract:[8]

"This paper analyzes the various associating factors throughout this journey including the number of active editors, number of content pages, pageview, etc., along with the connection to outreach activities with these parameters."

"WikiShark: An Online Tool for Analyzing Wikipedia Traffic and Trends"

From the abstract:[9]

"This paper introduces WikiShark (www.wikishark.com) – an online tool that allows researchers to analyze Wikipedia traffic and trends quickly and effectively, by (1) instantly querying pageview traffic data; (2) comparing traffic across articles; (3) surfacing and analyzing trending topics; and (4) easily leveraging findings for use in their own research."

"Tracing the Factoids: the Anatomy of Information Re-organization in Wikipedia Articles"

From the abstract:[10]

"... we investigate the impact of gradual edits on the re-positioning and organization of the factual information in Wikipedia articles [...] we show that in a Wikipedia article, the crowd is capable of placing the factual information to its correct position, eventually reducing the knowledge gaps. We also show that the majority of information re-arrangement occurs in the initial stages of the article development and gradually decreases in the later stages."

"Wikidata Logical Rules and Where to Find Them"

From the paper[11] (an extended abstract):

"We are interested in soft (approximate) constraints expressed as dependencies (or logical rules), such as the constraint that “a person cannot be born after one of her children”. Such rules have proven to be useful for error detection [4], adding missing facts [3], executing queries faster, and reasoning [1]. Not only these rules are not stated in Wikidata, but, to the best of our understanding, a way to express them as constraints is still to be defined in the repository ... there are very few rules that are exact, i.e., true for each and every case. As an example, consider a rule stating that “a country has always one capital”. This is true for most countries, but there are 15 countries that have two or more capitals. Therefore, the rule has a very high confidence, but it is not exact ... The goal of our work is to create a large collection of rules for Wikidata with their confidence measure. In this abstract, we report on two directions we have been exploring to obtain such rules, our results, and how we believe the Wikimedia community could benefit from this effort."

"Simple Wikidata Analysis for Tracking and Improving Biographies in Catalan Wikipedia"

From the abstract:[12]

"we highlight the possibilities of taking advantage of structured data from Wikidata for evaluating new biographical articles, so facilitating users to get engaged into diversity challenges or track potential vandalism and errors"

Related code: https://github.com/toniher/wikidata-pylisting
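
The linked repository builds its listings from Wikidata; a minimal way to reproduce the idea is to query the Wikidata SPARQL endpoint for humans with a Catalan Wikipedia article. The sketch below is one possible query written for this review, not the repository's own code, and the property choices (P31 human, P21 gender) are assumptions about what a biography listing might track:

```python
import requests

# Minimal sketch: list humans (P31=Q5) with a Catalan Wikipedia sitelink and their gender (P21).
# Illustrative only; wikidata-pylisting may structure its queries differently.
QUERY = """
SELECT ?person ?personLabel ?genderLabel WHERE {
  ?article schema:about ?person ;
           schema:isPartOf <https://ca.wikipedia.org/> .
  ?person wdt:P31 wd:Q5 .
  OPTIONAL { ?person wdt:P21 ?gender . }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "ca,en". }
}
LIMIT 20
"""

resp = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "research-newsletter-example/0.1"},
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    name = row["personLabel"]["value"]
    gender = row.get("genderLabel", {}).get("value", "unknown")
    print(name, "-", gender)
```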

"Structural Analysis of Wikigraph to Investigate Quality Grades of Wikipedia Articles"

From the abstract:[13]

"we present a novel approach based on the structural analysis of Wikigraph to automate the estimation of the quality of Wikipedia articles. We examine the network built using the complete set of English Wikipedia articles and identify the variation of network signatures of the articles with respect to their quality. Our study shows that these signatures are useful for estimating the quality grades of un-assessed articles with an accuracy surpassing the existing approaches in this direction."

"Towards Open-domain Vision and Language Understanding with Wikimedia"

From the abstract:[14]

"This work [i.e. research proposal] describes a project towards achieving the next generation of models, that can deal with open-domain media, and learn visio-linguistic representations that reflect data’s context, by jointly reasoning over media, a domain knowledge-graph and temporal context. This ambition will be leveraged by a Wikimedia data framework, comprised by comprehensive and high-quality data, covering a wide range of social, cultural, political and other type of events"

"Language-agnostic Topic Classification for Wikipedia"

From the abstract:[15]

"we propose a language-agnostic approach based on the links in an article for classifying articles into a taxonomy of topics that can be easily applied to (almost) any language and article on Wikipedia. We show that it matches the performance of a language-dependent approach while being simpler and having much greater coverage."

See also: online demo, data dumps, model details; a toy sketch of the general idea follows below.
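
The core idea is to represent an article only by its outgoing links, which become language-independent once mapped to Wikidata items (QIDs). That idea can be mimicked with a simple bag-of-links classifier; the following toy sketch (illustrative QIDs and topic labels, scikit-learn as assumed tooling) is not the authors' actual model:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy sketch of link-based topic classification: each article is a "document" whose tokens are
# the Wikidata QIDs of the articles it links to. QIDs and topic labels here are illustrative.
train_articles = [
    "Q11379 Q35875 Q944",      # links typical of a physics-flavoured article
    "Q413 Q11379 Q25343",      # another physics-flavoured article
    "Q7725634 Q36180 Q482",    # links typical of a literature-flavoured article
    "Q36180 Q482 Q8242",       # another literature-flavoured article
]
train_topics = ["STEM.Physics", "STEM.Physics", "Culture.Literature", "Culture.Literature"]

model = make_pipeline(
    CountVectorizer(lowercase=False, token_pattern=r"Q\d+"),  # keep QIDs intact as tokens
    LogisticRegression(),
)
model.fit(train_articles, train_topics)

print(model.predict(["Q11379 Q944 Q413"]))  # expected: ['STEM.Physics']
```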

"Fast Linking of Mathematical Wikidata Entities in Wikipedia Articles Using Annotation Recommendation"

From the abstract:[16]

"We evaluate the quality and time-savings of AI-generated formula and identifier annotation recommendations on a test selection of Wikipedia articles from the physics domain. Moreover, we evaluate the community acceptance of Wikipedia formula entity links and Wikidata item creation and population to ground the formula semantics. Our evaluation shows that the AI guidance was able to significantly speed up the annotation process by a factor of 1.4 for formulae and 2.4 for identifiers. Our contributions were accepted in 88% of the edited Wikipedia articles and 67% of the Wikidata items. The >>AnnoMathTeX<< annotation recommender system is hosted by Wikimedia at annomathtex.wmflabs.org. In the future, our data refinement pipeline will be integrated seamlessly into the Wikimedia user interfaces."

"ShExStatements: Simplifying Shape Expressions for Wikidata"

From the abstract:[17]

"Wikidata recently supported entity schemas based on shape expressions (ShEx). They play an important role in the validation of items belonging to a multitude of domains on Wikidata. [...] In this article, ShExStatements is presented with the goal of simplifying writing the shape expressions for Wikidata."

"Inferring Sociodemographic Attributes of Wikipedia Editors: State-of-the-art and Implications for Editor Privacy"

From the abstract:[18]

"we investigate the state-of-the-art of machine learning models to infer sociodemographic attributes of Wikipedia editors based on their public profile pages and corresponding implications for editor privacy. [...] In comparative evaluations of different machine learning models, we show that the highest prediction accuracy can be obtained for the attribute gender, with precision values of 82% to 91% for women and men respectively, as well as an averaged F1-score of 0.78. For other attributes like age group, education, and religion, the utilized classifiers exhibit F1-scores in the range of 0.32 to 0.74, depending on the model class."

"The Language of Liberty: A preliminary study"

From the paper:[19]

"We managed to align more than 37,000 articles across Wikipedia and Conservapedia; of these, about 28,000 pages share an identical title, while the remaining ones are aligned based on redirect pages. In total, the whole corpus contains 106 million tokens and 558,000 unique words. [...] We can notice marked differences in word usage in the two resources: Wikipedia authors tend to use more objective/neutral words (affordable care, american politician), in addition to many non-political terms. In Conservapedia [there] prevail derogatory terms such as rino, which stands for “Republican In Name Only", and Democrat Party, but also topics of high concern to the conservative community such as the homosexual agenda, communist manifesto, and fetal tissue."

"Information flow on COVID-19 over Wikipedia: A case study of 11 languages"

From the abstract:[20]

"we study the content editor and viewer patterns on the COVID-19 related documents on Wikipedia using a near-complete dataset gathered of 11 languages over 238 days in 2020. Based on the analysis of the daily access and edit logs on the identified Wikipedia pages, we discuss how the regional and cultural closeness factors affect information demand and supply."

"Towards Ongoing Detection of Linguistic Bias on Wikipedia"

From the abstract:[21]

"As part of our research vision to develop resilient bias detection models that can self-adapt over time, we present in this paper our initial investigation of the potential of a cross-domain transfer learning approach to improve Wikipedia bias detection. The ultimate goal is to future-proof Wikipedia in the face of dynamic, evolving kinds of linguistic bias and adversarial manipulations intended to evade NPOV issues."

Analysis of two million AfD (Articles for Deletion) discussions

From the abstract:[22]

"[Wikipedia article deletion] decisions (which are known as “Article for Deletion”, or AfD) are taken by groups of editors in a deliberative fashion, and are known for displaying a number of common biases associated to group decision making. Here, we present an analysis of 1,967,768 AfD discussions between 2005 and 2018. We perform a signed network analysis to capture the dynamics of agreement and disagreement among editors. We measure the preference of each editor for voting toward either inclusion or deletion. We further describe the evolution of individual editors and their voting preferences over time, finding four major opinion groups. Finally, we develop a predictive model of discussion outcomes based on latent factors."

Among the findings are that "Editors who joined before 2007 tend to overwhelmingly belong to the more central parts of the network" and that "user preferences [for keep or delete] are relatively stable over time for ... more central editors. However, despite the overall stability of trajectories, we also observe a substantial narrowing of opinions in the early period of an AfD reviewer tenure. ... Strong deletionists exhibit the least amount of change, suggesting the possibility of lower susceptibility, or higher resistance, to opinion change in this group." Overall though, the authors conclude that "differences between inclusionists and deletionists are more nuanced than previously thought."

"Assessing the quality of health-related Wikipedia articles with generic and specific metrics"

From the abstract:[23]

" we use general and health-specific features from Wikipedia articles to propose health-specific metrics. We evaluate these metrics using a set of Wikipedia articles previously assessed by WikiProject Medicine. We conclude that it is possible to combine generic and specific metrics to determine health-related content’s information quality. These metrics are computed automatically and can be used by curators to identify quality issues."

"Wikipedia Editor Drop-Off: A Framework to Characterize Editors' Inactivity"

A figure from the paper, showing "The different states of drop-off related to activity and their possible transitions"

From the abstract:[24]

"... we present an approach to characterize Wikipedia’s editor drop-off as the transitional states from activity to inactivity. Our approach is based on the data that can be collected or inferred about editors’ activity within the project, namely their contributions to encyclopedic articles, discussions with other editors, and overall participation. Along with the characterization, we want to advance three main hypotheses, derived from the state of the art in the literature and the documentation produced by the community, to understand which interaction patterns may anticipate editors leaving Wikipedia: 1) abrupt interactions or conflict with other editors, 2) excess in the number and spread of interactions, and 3) a lack of interactions with editors with similar characteristics."

The paper is part of an ongoing research project funded by a €83,400 project grant from the Wikimedia Foundation. Some related code can be found at https://github.com/WikiCommunityHealth/ .

Wikimedia Foundation Research Award of the Year

Besides presentations about the papers listed above, the Wiki Workshop event also saw the announcement of the first "Wikimedia Foundation Research Award of the Year" ("WMF-RAY", cf. call for nominations), with the following two awardees:

"Content Growth and Attention Contagion in Information Networks: Addressing Information Poverty on Wikipedia"[25] (also presented at last year's Wikiworkshop), a paper which according to the laudators

"demonstrates causal evidence of the relationship between increases in content quality in English Wikipedia articles and subsequent increases in attention. The researchers conduct a natural experiment using edits done on English Wikipedia via the Wiki Education Foundation program. The paper shows that English Wikipedia articles that were improved by students in the program gained more viewers than a group of otherwise similar articles. It also found that this effect spills over into a range of articles linked to from the improved articles."

"Participatory Research for Low-resourced Machine Translation: A Case Study in African Languages"[26] and Masakhane (which describes itself as "A grassroots NLP community for Africa, by Africans"). The paper

"describes a novel approach for participatory research around machine translation for African languages. The authors show how this approach can overcome the challenges these languages face to join the Web and some of the technologies other languages benefit from today."

While the research does not seem to have concerned Wikipedia directly, the laudators find it an "inspiring example of work towards Knowledge Equity, one of the two main pillars of the 2030 Wikimedia Movement Strategy" and expect the project's success

"will directly support a range of Wikimedia Foundation and Wikimedia Movement goals including the newly-announced Abstract Wikipedia which will rely heavily on machine translation too."

Consistent with its title, the paper features an impressive list of no fewer than 48 authors (the cited eprint was submitted to arXiv by Julia Kreutzer of Google Research).

Briefly

References

  1. ^ Menking, Amanda; Rosenberg, Jon (2021-05-01). "WP:NOT, WP:NPOV, and Other Stories Wikipedia Tells Us: A Feminist Critique of Wikipedia's Epistemology". Science, Technology, & Human Values. 46 (3): 455–479. doi:10.1177/0162243920924783. ISSN 0162-2439. (closed access)
  2. ^ Maung, Rebecca (Wikimedia Foundation) (May 2021). "2021 Community Insights Report". Meta-wiki.
  3. ^ Stewart, Brenton; Ju, Boryung (2019-02-01). "What drives Black contributions to Wikipedia?". Proceedings of the Association for Information Science and Technology. doi:10.1002/pra2.2018.14505501168. (closed access)
  4. ^ Stewart, Brenton; Ju, Boryung (May 2020). "On Black Wikipedians: Motivations behind content contribution". Information Processing and Management: an International Journal. 57 (3). doi:10.1016/j.ipm.2019.102134. (closed access)
  5. ^ Kaffee, Lucie-Aimée; Elsahar, Hady (2021-04-19). "References in Wikipedia: The Editors' Perspective" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 535–538. doi:10.1145/3442442.3452337. ISBN 9781450383134.
  6. ^ Hergueux, Jérôme; Algan, Yann; Benkler, Yochai; Fuster-Morell, Mayo (2021-04-19). "Do I Trust this Stranger? Generalized Trust and the Governance of Online Communities" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 539–543. doi:10.1145/3442442.3452338. ISBN 9781450383134.
  7. ^ Arnaout, Hiba; Razniewski, Simon; Weikum, Gerhard; Pan, Jeff Z. (2021-04-19). "Negative Knowledge for Open-world Wikidata" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 544–551. doi:10.1145/3442442.3452339. ISBN 9781450383134.
  8. ^ Dastider, Ankan Ghosh (2021-04-19). "A Brief Analysis of Bengali Wikipedia's Journey to 100,000 Articles" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 552–557. doi:10.1145/3442442.3452340. ISBN 9781450383134.
  9. ^ Vardi, Elad; Muchnik, Lev; Conway, Alex; Breakstone, Micha (2021-04-19). "WikiShark: An Online Tool for Analyzing Wikipedia Traffic and Trends" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 558–571. doi:10.1145/3442442.3452341. ISBN 9781450383134.
  10. ^ Verma, Amit Arjun; Dubey, Neeru; Iyengar, S.R.S.; Setia, Simran (2021-04-19). "Tracing the Factoids: the Anatomy of Information Re-organization in Wikipedia Articles" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 572–579. doi:10.1145/3442442.3452342. ISBN 9781450383134.
  11. ^ Ahmadi, Naser; Papotti, Paolo (2021-04-19). "Wikidata Logical Rules and Where to Find Them" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 580–581. doi:10.1145/3442442.3452343. ISBN 9781450383134.
  12. ^ Hermoso Pulido, Toni (2021-04-19). "Simple Wikidata Analysis for Tracking and Improving Biographies in Catalan Wikipedia" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 582–583. doi:10.1145/3442442.3452344. ISBN 9781450383134.
  13. ^ Chhabra, Anamika; Srivastava, Shubham; S. Iyengar, S. R.; Saini, Poonam (2021-04-19). "Structural Analysis of Wikigraph to Investigate Quality Grades of Wikipedia Articles" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 584–590. doi:10.1145/3442442.3452345. ISBN 9781450383134.
  14. ^ Semedo, David (2021-04-19). "Towards Open-domain Vision and Language Understanding with Wikimedia" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 591–593. doi:10.1145/3442442.3452346. ISBN 9781450383134.
  15. ^ Johnson, Isaac; Gerlach, Martin; Sáez-Trumper, Diego (2021-04-19). "Language-agnostic Topic Classification for Wikipedia" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 594–601. doi:10.1145/3442442.3452347. ISBN 9781450383134.
  16. ^ Scharpf, Philipp; Schubotz, Moritz; Gipp, Bela (2021-04-19). "Fast Linking of Mathematical Wikidata Entities in Wikipedia Articles Using Annotation Recommendation" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 602–609. doi:10.1145/3442442.3452348. ISBN 9781450383134.
  17. ^ Samuel, John (2021-04-19). "ShExStatements: Simplifying Shape Expressions for Wikidata" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 610–615. doi:10.1145/3442442.3452349. ISBN 9781450383134.
  18. ^ Brückner, Sebastian; Lemmerich, Florian; Strohmaier, Markus (2021-04-19). "Inferring Sociodemographic Attributes of Wikipedia Editors: State-of-the-art and Implications for Editor Privacy" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 616–622. doi:10.1145/3442442.3452350. ISBN 9781450383134.
  19. ^ Araque, Oscar; Gatti, Lorenzo; Kalimeri, Kyriaki (2021-04-19). "The Language of Liberty: A preliminary study" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 623–626. doi:10.1145/3442442.3452351. ISBN 9781450383134.
  20. ^ Jung, Changwook; Hong, Inho; Sáez-Trumper, Diego; Lee, Damin; Myung, Jaehyeon; Kim, Danu; Yun, Jinhyuk; Jung, Woo-Sung; Cha, Meeyoung (2021-04-19). "Information flow on COVID-19 over Wikipedia: A case study of 11 languages" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 627–628. doi:10.1145/3442442.3452352. ISBN 9781450383134.
  21. ^ Madanagopal, Karthic; Caverlee, James (2021-04-19). "Towards Ongoing Detection of Linguistic Bias on Wikipedia" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 629–631. doi:10.1145/3442442.3452353. ISBN 9781450383134.
  22. ^ Tasnim Huq, Khandaker; Ciampaglia, Giovanni Luca (2021-04-19). "Characterizing Opinion Dynamics and Group Decision Making in Wikipedia Content Discussions" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 632–639. doi:10.1145/3442442.3452354. ISBN 9781450383134.
  23. ^ Couto, Luís; Lopes, Carla Teixeira (2021-04-19). "Assessing the quality of health-related Wikipedia articles with generic and specific metrics" (PDF). Companion Proceedings of the Web Conference 2021. WWW '21. New York, NY, USA: Association for Computing Machinery. pp. 640–647. doi:10.1145/3442442.3452355. ISBN 9781450383134.
  24. ^ Miquel-Ribé, Marc; Consonni, Cristian; Laniado, David (2021). "Wikipedia Editor Drop-Off: A Framework to Characterize Editors' Inactivity". wikiworkshop.org.
  25. ^ Zhu, Kai; Walker, Dylan; Muchnik, Lev (2018-06-05). "Content Growth and Attention Contagion in Information Networks: Addressing Information Poverty on Wikipedia". SSRN 3191128.
  26. ^ Nekoto, Wilhelmina; Marivate, Vukosi; Matsila, Tshinondiwa; Fasubaa, Timi; Kolawole, Tajudeen; Fagbohungbe, Taiwo; Akinola, Solomon Oluwole; Muhammad, Shamsuddeen Hassan; Kabongo, Salomon; Osei, Salomey; Freshia, Sackey; Niyongabo, Rubungo Andre; Macharm, Ricky; Ogayo, Perez; Ahia, Orevaoghene; Meressa, Musie; Adeyemi, Mofe; Mokgesi-Selinga, Masabata; Okegbemi, Lawrence; Martinus, Laura Jane; Tajudeen, Kolawole; Degila, Kevin; Ogueji, Kelechi; Siminyu, Kathleen; Kreutzer, Julia; Webster, Jason; Ali, Jamiil Toure; Abbott, Jade; Orife, Iroro; Ezeani, Ignatius; Dangana, Idris Abdulkabir; Kamper, Herman; Elsahar, Hady; Duru, Goodness; Kioko, Ghollah; Murhabazi, Espoir; van Biljon, Elan; Whitenack, Daniel; Onyefuluchi, Christopher; Emezue, Chris; Dossou, Bonaventure; Sibanda, Blessing; Bassey, Blessing Itoro; Olabiyi, Ayodele; Ramkilowan, Arshath; Öktem, Alp; Akinfaderin, Adewale; Bashir, Abdallah (2020-11-06). "Participatory Research for Low-resourced Machine Translation: A Case Study in African Languages". arXiv:2010.02353 [cs].
Supplementary references and notes:
  1. ^ Shaw, Aaron; Hargittai, Eszter (2018-02-01). "The Pipeline of Online Participation Inequalities: The Case of Wikipedia Editing". Journal of Communication. 68 (1): 143–168. doi:10.1093/joc/jqx003. ISSN 0021-9916. (closed access, but still available via archive.org)



Discuss this story

These comments are automatically transcluded from this article's talk page. To follow comments, add the page to your watchlist. If your comment has not appeared here, you can try purging the cache.

Initial discussion

These papers would be more convincing if they were written in something resembling the English language. "An alternative set of pillars developed through the lens of feminist epistemology" is about as meaningful as "Colorless green ideas sleep furiously". The Blade of the Northern Lights (話して下さい) 21:25, 27 June 2021 (UTC)[reply]

  • So I decided to actually find a copy of the original article on the five pillars. It's worse than the excerpts posted here. For instance, the first point on changing Wikipedia to a process is justified by the claim that the existing first pillar excludes certain people from the encyclopedia building process. The specific people that the authors believe should be included are "individuals who are knowledgeable about and want to contribute information related to complementary and alternative medical practices such as acupuncture, meditation, or Ayurveda." The article details how bad it is that Western medical practitioners "block" the participation of alternative medicine providers (not in the WP:BLOCK sense necessarily) and "leave little room for other kinds of expertise, including patient expertise". The amount of fucks I could give about pseudoscience practitioners being blocked from contributing pseudoscience to Wikipedia is close to zero and I would hope MEDRS never allows the vaguely defined "patient expertise" to be used as a source if that entails patients contributing their experiences directly to Wikipedia.
  • The proposed second pillar envisions a fundamental shift from Wikipedia as a collection of knowledge being neutral to Wikipedia editors being neutral. This is worded in a deliberately obscurantist fashion because at its heart it advocates a system where we actually debate content and points of view and then collectively take a position on the topics that the encyclopedia covers and abandon any pretenses of being objective. The article explicitly says this, by outlining how Wikipedia should have structures for "communal inquiry" and describes as an "upshot" that "whatever we end up endorsing as the product of inquiry, is in no way neutral or objective itself". This misunderstands how Wikipedia should function. We are not scientists engaged in "communal inquiry" nor a source of knowledge unto ourselves. We summarize the existing knowledge and don't take sides.
  • The proposed third pillar misunderstands Wikipedia and focuses on openness of "participation". It misunderstands what "anyone can edit" actually means. It's not that anyone can go on Wikipedia and type in whatever they want on this website. It's the idea that anyone can take the content here and use it for their purposes, so long as they attribute and preserve the freedom of reuse. If someone wants to go ahead and take the Ayurveda article we have and put it on their website, but edited to talk about how good Ayurveda is, that's OK. So long as they attribute us (but don't say we wrote their edits) as well as preserve the CC-BY-SA, it's fine. It's the freedom of content, not the freedom of community, and while the article understands this is the case, the authors fail to understand that the focus on the former was an intentional choice (calling it an "inappropriate focus on content"). While I don't have a problem with adding a new pillar based on valuing participation, removing the third pillar is a non-starter for me.
  • The proposed fourth pillar believes in killing civility and replacing it with a statement about "epistemic and discursive responsibility". The authors justify this by claiming that "what counts as civil or respectful can vary from person to person and context to context" and that individuals are incapable of having a neutral point of view. These are both true (although I don't see how the second conflicts with the fourth pillar), but the proposed replacement that "Editors Should be Epistemically and Discursively Responsible" has the same issues yet is far more difficult to understand. The author elaborates that this responsibility is "to create a thriving and objective epistemic community", yet the ideas of what this might mean also differ from person to person or context to context. It's the same as the civility policy, except that the civility policy actually says you can't justify bad behaviour if it's for a good reason. Saying that editors should be "discursively responsible" entails that it's OK to say things like personal attacks if it's in the interests of the community.
  • The fifth pillar is the only one I actually completely agree with. I can't actually see the difference between it and the existing fifth pillar though so that might be why.
  • The 5 pillars article proposes foundational changes to Wikipedia so that pseudoscientists can POV-push alternative medicine and unironically believes we should repeal WP:NOTFORUM.
Chess (talk) (please use {{reply to|Chess}} on reply) 22:50, 27 June 2021 (UTC)[reply]
A cogent, incisive analysis Chess. Thank you. Mark D Worthen PsyD (talk) [he/his/him] 00:52, 29 June 2021 (UTC)[reply]
Thank you for the compliment despite our obvious disagreement on the article. Chess (talk) (please use {{reply to|Chess}} on reply) 00:59, 29 June 2021 (UTC)[reply]
@Chess: One reason I finally posted a review was to elicit thoughtful discussion to help me (and others) better understand the authors' arguments, and to learn from other Wikipedians' confutations. My opinion of the Menking & Rosenberg (2021) article continues to evolve. Mark D Worthen PsyD (talk) [he/his/him] 14:04, 29 June 2021 (UTC)[reply]
The entire sentence is worth quoting: Take, for example, individuals who are knowledgeable about and want to contribute information related to complementary and alternative medical practices such as acupuncture, meditation, or Ayurveda. Health information on the English-language Wikipedia has become increasingly influential (Laurent and Vickers 2009) and, consequently, it is often closely guarded by a community of editors who are also trained as Western medical practitioners (Shafee et al. 2017). If the latter considers the former to be “dangerous,” then they may block their participation. (p. 15) It's very strange—and, frankly, somewhat "Western supremacist"—to assert in 2021 that medicine is a field of knowledge that is distinctly Western. If you have a heart attack in China, they will treat you with "Western" medicine. If you have cancer in Uganda, they will treat you with "Western" medicine. Researchers at the Universidad de Chile medical school research "Western" medicine. Only people who are obsessed with the idea that individuals are determined by their ethnic background will fail to see that medicine—the academic kind—is a global field of knowledge with practitioners and contributors in literally every country in the world. JBchrch talk 22:30, 30 June 2021 (UTC)[reply]
"Editors should be epistemically and discursively responsible" is a cruel and unusual sentence. DuncanHill (talk) 22:53, 27 June 2021 (UTC)[reply]

I'm not a huge fan of the conclusions in the paper about why black people participate in Wikipedia less. Considering that they participate at, what, 1/20th of the rate of white people, we could say that we're only engaging 1/20th of the black people who would potentially be interested in editing. Of the ones we do engage, they're interested in black altruism - but what about the ones we don't? I think that's the more important question. The fact that the ones we do engage tend to not cite "entertainment" as a reason is somewhat interesting I think - perhaps we should look at that among non-editors? Since I'd assume there is a large base of black people who would potentially find Wikipedia fun to edit, yet for some reason have avoided it. I do appreciate the effort to look more into demographics, I'm only concerned about a form of survivorship bias interfering with getting useful information. Elli (talk | contribs) 22:37, 27 June 2021 (UTC)[reply]

These are different papers. Stewart and Ju looked at existing African American contributors and their motivations ("black altruism" etc.) but did not try to answer the question why black people in general contribute less. Hargittai and Shaw's "pipeline" paper on the other hand was based on survey data that did include non-contributors. Regards, HaeB (talk) 23:26, 27 June 2021 (UTC)[reply]
Fair, I was addressing how things were presented in Signpost. Elli (talk | contribs) 00:05, 28 June 2021 (UTC)[reply]
I'm interested in where any of us find "amusement" (as the paper put it) in Wikipedia. By making on-wiki friends? By joining the peanut gallery at ANI? By reverting amusing vandalism? But I can see why black editors might struggle to find Wikipedia entertaining just from the amount of overt and extreme racism that unregistered/NOTHERE people spew: it's easy for me (a white person) to brush this off as not reflective of the Wikipedia community as a whole, but maybe not so funny when you know the comments are maliciously targeted at you. And then you do get long-term editors who express consciously racist views. Often it takes just a single really aggressive editor to dishearten you or trigger an enraged retirement. So editing can quickly become unamusing and stressful, but if your purpose is altruism, then this is less likely to take away your motivation. But all of this analysis should really apply to most of us (replacing "racism" with other issues), circling back to my confusion about who is here for amusement. — Bilorv (talk) 09:09, 28 June 2021 (UTC)[reply]
I'm not sure that amusement is my primary motivation, but I do think it's fun to add content to an article. For me, the more obscure the subject, the more fun it's likely to be. WhatamIdoing (talk) 16:52, 29 June 2021 (UTC)[reply]

Yeah this reeks of bad conclusions based on poor evidence. Maybe consider systemic and socioeconomic factors before torpedoing the 5 Pillars and how wiki has functioned decently for a long time. ~Gwennie🐈💬 📋22:41, 27 June 2021 (UTC)[reply]

As everyone knows, epistemology is one of the four main branches of philosophy, along with ethics, logic and metaphysics. And its aim is to study the nature, origin and scope of knowledge, the rationality of beliefs and various related issues. The exercise is as follows: (1) Examine the origin of the two given assertions, e.g. find historical events that could have been used to illustrate such claims. (2) Find subsequent historical events that came across these assertions. (3) Examine the rationales that were used to cover up the failure of the prediction. (4) And, obviously, perform all of these tasks in a way that illustrates how the new form of epistemology works at least slightly better than the biased epistemology of the past. Pldx1 (talk) 09:24, 29 June 2021 (UTC)[reply]
We need an article about ways of knowing and/or forms of knowledge. They keep getting tossed around in discussions, but I'm not sure that we have a shared understanding of what we're talking about. WhatamIdoing (talk) 22:12, 1 July 2021 (UTC)[reply]
Having now had an opportunity to read the Menking & Rosenberg article in full, I have to say it's definitely not worth £29. There is a huge amount of "filler" - it could easily be cut by a third with no loss of worthwhile content. It is also written in appallingly dense jargon and so hedged about with mays and mights that it is a real struggle to see what, if anything, the authors are trying to say, except when they throw in the occasional bit of the bleeding obvious such as "judgments about what counts as civil or respectful can vary from person to person and context to context". DuncanHill (talk) 22:31, 1 July 2021 (UTC)[reply]

WhatamIdoing's discernment

In the talk page discussion I referenced in my review, WP:5P sidetrack (part II), WhatamIdoing wrote (on 21 Apr 2021 @ 03:23 UTC):

"The integrity of Wikipedia is a function of the size and breadth of its community" means "biased people create biased content". I believe that "epistemically and discursively responsible" means that they want editors to have epistemic responsibility (do good research, including actively seeking out information and views that have been overlooked in the past) and to intentionally make space for voices that are being excluded. ... I do think [Menking & Rosenberg (2021) are] correct about the English Wikipedia being norm-driven; there are things that we do because we always do that even though the rules technically discourage them, and things you can't do because we don't do that, even if the rules permit them. We can't really be rule-governed when IAR is one of the rules, or when some of the rules contradict other rules.

The discussion continues after this, with additional insightful posts by other smart Wikipedians. Mark D Worthen PsyD (talk) [he/his/him] 14:46, 29 June 2021 (UTC)[reply]

What if they were right?

I'm inclined to reject the argument, but I think it is good intellectual discipline to seriously consider the possibility that they are basically right about the 5P being responsible for the diversity gap in the editor population. I think, though, even if they were right, it would be a mistake to attempt a radical reengineering of core principles: to use nautical terms, the Wikipedia community, at least for en.WP, is the analog of an oil tanker with a turning circle that takes hours to execute. I doubt the enterprise could survive such an effort and remain fruitful. If we did want to remake WP on new principles, I think it could only work in the context of a new project. — Charles Stewart (talk) 17:24, 1 July 2021 (UTC)[reply]

I have been trying to do this exercise for the last 15 min, but I am having trouble moving forward. 1) I fail to understand how the concept of a "neutral encyclopedia" can be construed as oppressive (even after reading the article multiple times). 2) I do not understand how we could build a neutral encyclopedia out of the principles proposed by the authors. Aren't they basically arguing for a sort of ethic-weighted and class-weighted summary of what people think is true? 3) I cannot get past the part of the analysis where one considers that the inequalities in society in terms of higher education and socio-economic conditions are the causes of the lack of diversity: I don't see any serious objection to this idea from the authors' material. JBchrch talk 18:19, 1 July 2021 (UTC)[reply]
Further to this (rambling) comment and after skimming the article once again, I see a quote that sums up the issues I have: Acknowledging the ways that knowers are situated leads us to abandon the possibility of having an unbiased position isolatable from an individual’s background beliefs and values, and even their affective or emotional state. Without the possibility of individual or community neutrality, no amount of civility or respect alone is sufficient to allow “the truth” to rise to the surface of discourse. If one holds this view, then one is automatically opposed to any sort of encyclopedic endeavor. JBchrch talk 23:09, 1 July 2021 (UTC)[reply]
I've been thinking about this, and while I don't know that I can answer your question, here's what I've come up with.
When we look at a subject – I'm going to use hand washing – we (and all good encyclopedias) tend to boil it down to the summary statements: "Washing hands with soap and water before eating improves health". Right?
But this isn't always true. There are other valid perspectives:
  • If you're stuck in a desert/don't have much water, you probably should save the water for drinking.
  • If you're allergic to the available soap, then your health might be best if you wash in plain water, or do not wash at all.
  • If you have aquagenic urticaria, you should avoid using water for washing.
  • If you're in space, hand washing is not an efficient use of water.
  • According to the hygiene hypothesis, it's possible that washing your hands before eating will improve your short-term health at the expense of your long-term health.
  • And if the water's dirtier than your hands, you might be better off not washing in it.
You'd go back to the person who said that washing hands is healthful, and you'd hear something like "C'mon, guys, you know I was speaking about the general situation in which there is plenty of clean water and no other contraindications. This is an encyclopedia, not a detailed description of every possible situation that might affect some tiny fraction of people in non-standard circumstances."
Generalizing and summarizing is a valid (and IMO encyclopedic) approach to knowledge. But that doesn't mean that other approaches are wrong. An approach that looks at what's best for (in this example) the health of an individual person in a given set of specific circumstances, rather than a statement that applies in most circumstances, also counts as "knowledge".
I think the difference is that an encyclopedia aims to be the sum of human knowledge, not all of human knowledge. We are not perfect, but even if we reached perfection, there would still be things that would get omitted or glossed over as being Wikipedia:UNDUE for a general summary. If you want to represent the full knowledge of our world, you don't want an encyclopedia. WhatamIdoing (talk) 00:55, 2 July 2021 (UTC)[reply]
@WhatamIdoing: thank you very much for the illuminating explanation and example, they are very helpful. Here are just two thoughts in response: 1) Would it ever be possible for an encyclopedia to be entirely inclusive and diverse? I would argue that, by nature, an encyclopedia (i.e. a bunch of summaries) discriminates in that it rejects some information, even though the rejected information may be relevant to some people. The case made by the authors, then, is an easy one: yes, summaries are not inclusive. But if you take inclusivity to be the most important value (as the authors seem to do? [In our critique and reimagining of the five pillars, we are concerned with reliability as it relates to the processes by which knowledge is produced on the site and who is excluded from these processes]), an encyclopedia will always be unsatisfactory. 2) I still think that the issues that are discussed boil down to a garbage in-garbage out problem, one that encompasses society as a whole, and not just Wikipedia: if access to education and advanced literacy is not provided equally to all groups of society, this will be reflected in the composition of the people who contribute to Wikipedia, leading to content that excludes and perpetuates bias. In that case, the 5P are still sound: what is more urgent is general societal progress. JBchrch talk 12:54, 2 July 2021 (UTC)[reply]
@JBchrch, I think that the answer to (1) is "no": it is not possible for an encyclopedia to be entirely inclusive and diverse.
I'm not sure about (2). I think that GIGO is a significant problem, but even if it were entirely solved, there would still be a problem. Consider an article like China–United States relations. It needs to include both Chinese and US viewpoints, right? But maybe it shouldn't be limited to that. Okay, we'll add something about the viewpoints of their neighbors. Maybe that means we add POVs about this relationship by India, Russia, Japan, Canada, and Mexico. Oh, wait – what about more distant groups? Okay, we add the European Union. And the UK and Australia, because they speak English? But now we're over-representing wealthy countries. There wasn't a single poor country on the list, and if the US would pay more, and China wouldn't have such cheap prices, maybe poor countries could boost their manufacturing. And Cuba and Vietnam and Laos are also communist countries, so what do they think? And… and… and… and… – and we don't have an encyclopedia article any longer. We have a book instead. WhatamIdoing (talk) 02:29, 3 July 2021 (UTC)[reply]
It's not a problem of how the 5 pillars are phrased or how they're implemented or whatever. On a fundamental level the authors of the paper disagree with Wikipedia's goals. The proposed changes to 5P aren't mere rephrasings (except the fifth) but subtle and massive alterations to the purpose of Wikipedia. Trying to reconcile this with the idea of a neutral encyclopedia is impossible because the authors do not believe that neutrality exists or that Wikipedia should attempt to attain it, and they advocate that the Wikipedia community spend its time discussing what point of view it wishes to adopt. The authors don't want to change the oil tanker's direction; they want to blow it up entirely. Chess (talk) (please use {{reply to|Chess}} on reply) 01:11, 2 July 2021 (UTC)[reply]
Yes, Chess, and here is the problem I see with the arguments of academics who deride notions of objectivity and say that we need specifically ideological or identity-based ways of knowing to be added on to science or mainstream scholarship. Yes, it's true that all humans are biased by various things, including our social position, or our "positionality" in activist-academic jargon. However, the whole design of science, institutionally and philosophically, is to cancel out scientists' personal biases no matter where they come from. So, yes, scientists being diverse (not just in terms of race or gender but also culture, background academic training, life experiences, etc.) is a good thing. But that institutional and philosophical design is at least as important. Even if perfect objectivity is not possible, we (as a species) need to aim for it, and we'll get close. If one rejects objectivity as the aim, on the grounds that one's social identity strongly influences people's views so badly that science is actually just Western Male Science, one is instead only left with competing identity-based claims with no basis to judge between them except which identities of those making the claims one wishes to favor. What an intellectual and political mess and dead-end that would be. For Wikipedia, replace "objectivity" with "neutrally represent reliable sources with due weight", and the same point applies. We need to aim for that even if it is never completely and fully reached. Crossroads -talk- 04:28, 2 July 2021 (UTC)[reply]
  • I'm personally skeptical about objectivity: for the sake of argument, I could say objectivity exists in the formal sciences, but if you can't define the topic in first-order logic, then you can't formulate a perfectly objective criterion for knowledge claims in your topic. I do think neutrality as we work with it here is usable and useful: Jimbo used to say that NPOV is the standard that you expect in good newspapers, and actually, I think we've achieved a higher level of neutrality in large swathes of articles than that. Note that while Menking, at least, is critical of the notion of neutrality, she does generally think WP has overall been a good thing; she's hoping that radically different pillars would make the encyclopedia better, which I'm pretty sure they would not.
We have plenty of policies that require obviously controversial judgement calls to be made; the two most important, in my opinion, are: What is a reliable source? and When does an article have WP:DUE issues? Menking and Rosenberg also singled out WP:NOT, which is a policy that tends to get applied in a Procrustean fashion at AfD. — Charles Stewart (talk) 12:11, 2 July 2021 (UTC)[reply]
@Chess: I wouldn't say "blow the oil tanker up completely" -- it seems to me like they'd also be open to dumping out all the oil and filling it back up with other substances. jp×g 11:44, 17 July 2021 (UTC)[reply]

So what if knowers are situated? Why can't knowers reflect on what they are doing and put their personal beliefs aside when writing an encyclopedia? Is it so hard to just describe the debates rather than engaging in the debates themselves? Is it really unrealistic to expect that editors can look after each other's edits on controversial articles and make sure that only descriptions of the debates are being done? We aren't trying to write using a view from nowhere (as Thomas Nagel would put it). What we are doing is presenting every angle proportionally to the weight it has in the composition of views held by expert researchers. Also, feminist epistemology is originally concerned with researchers, not with the summarizers (i.e. encyclopedia writers) of the findings of those researchers. So the concepts in that area aren't automatically applicable to Wikipedia. So why are the authors referring to 'the truth' when Wikipedia doesn't lead, it only follows?

  • This statement assumes that, despite how interconnected the world has been since ancient times, there is really such a thing as 'the West'.
  • At Wikipedia, we don't make exceptions for "Western" practices such as bloodletting, humorism, Western astrology, homeopathy, and trepanning. Modern medicine as practiced today just happened to mature in Europe, that's all. Also, variolation, the precursor to vaccination, had a non-European origin.
  • There is no such thing as "Western" mathematics, "Western" science, or "Western" philosophy. Consider Archimedes (Greek), Avicenna (Persian), Aryabhata (Indian), and Yang Hui (Chinese), just to name a few.

VarunSoon (talk) 03:39, 2 July 2021 (UTC)[reply]

We could perhaps expand the "view from nowhere" to say "from anybody to everybody". The Wikipedias that are more closely tied to a single country/culture have a more obvious audience. Haiti's view of China–United States relations is "undue" for us, but of obvious interest in a version of Wikipedia written in Haitian Creole, by Haitian people, for other Haitian people. That sort of situation makes it easier to determine which voices to include or exclude. I imagine that something similar would happen if you had a "Wikipedia for <identity group>": you'd know that you needed more about how the subject relates to that particular group than outsiders would think reasonable. "Wikipedia for Teens" would have more information about youth rights or how certain "adult" illnesses affect younger people. "Wikipedia for Autistics" would have more information about which jobs are better or worse suited for people with different characteristics of autism (e.g., working in a glass recycling center is fun for some but a nightmare for people who dislike the noise). "Wikipedia for Christians" would have more information about whether subjects (e.g., Hair coloring, Luxury cars, Plastic surgery, Dancing) are moral. These would all implicitly exclude other groups, but it'd be easier for that group to decide whether they were representing the subject reasonably through a specific lens. That's easier than figuring out whether you represented the subject reasonably on a global scale – fairly balancing views held by people at all income levels, of all health statuses, at all education levels, of all genders, of all races, in all countries, from all cultures, of all religions, of all ages, etc. And since we are all the protagonists of our own stories, anything that fairly represents the viewpoints that I happen to hold will feel like it underrepresents my view. Because we are each only one of billions, but we all believe that our view is the right, reasonable, and rational one. WhatamIdoing (talk) 03:01, 3 July 2021 (UTC)[reply]
However, even Wikipedias that are more closely tied to a particular culture still have to uphold the five pillars, including and especially NPOV. Consider the Croatian Wikipedia, which for many years had been dominated by nationalist and far-right POV pushers. There's another article about it in this very issue of the Signpost. That's not okay regardless of how many Croatians agreed with that POV. There may indeed be a natural tendency for other language Wikipedias to reflect certain POVs more than ours, but this should be kept within limits.
As for other identity and ideological groups - I am glad that there is only one Wikipedia for them all. We all share one reality, and now more than ever, as misinformation and misleading material abound across social media and people divide themselves into impenetrable echo chambers, people need to see what the mainstream views are and why, and what others' ideologies and points of view are and why (in the appropriate articles and with due weight). The latter can and should be described with in-text attribution. People are free to set up other wikis if they want to expound particular points of view, and many have done so. But this should never be part of what Wikipedia or the WMF does. Crossroads -talk- 21:34, 3 July 2021 (UTC)[reply]
The Five Pillars aren't universal. They're an English Wikipedia thing, created as an expansion of the older Wikipedia:Trifecta. The m:Founding principles have a different way of stating the ideas.
It's generally agreed that NPOV is necessary for Wikipedias (but not for other projects), and even there, we get different ideas about what "neutral" means. Here's an example: Go to ht:New York City, New York or ht:Miami. One of the section headings translates to "Relationship to Haiti". Is it "neutral" to call out the relationship between a major US city and a small country? We wouldn't think so at this Wikipedia, which is more Anglocentric and global in focus, but if you are using Haitian Creole sources to write articles at the Haitian Creole Wikipedia (which is not unreasonable), then you will get a very different picture of what "all the reliable sources" are saying. WhatamIdoing (talk) 01:59, 10 July 2021 (UTC)[reply]
@WhatamIdoing: Is it neutral that so many of our articles on any given topic have a "United States" subsection at some point? Is it "neutral" that the education section in our article on Port-au-Prince barely mentions indigenous education but devotes multiple sentences to American-style international schools? That two sentences in the sparse "culture" section talk about how streets in Haiti are named after American abolitionists (maybe)? That it's a coincidence that 3/4 of the sister cities mentioned are American while one is Canadian?
Give me a break; we do the same shit they do, they're just willing to actually admit it. Chess (talk) (please use {{reply to|Chess}} on reply) 23:37, 16 July 2021 (UTC)[reply]
I'm not sure that it's actually non-neutral for an article written for speakers of Haitian Creole to provide more information about Haiti's connection. The balance of information that we call "neutral" might not be universal. It might be that "global" languages (English, French, Arabic) need a different balance compared to highly local languages. WhatamIdoing (talk) 00:24, 17 July 2021 (UTC)[reply]
Talking about the Haitian Creole wiki's Haiti-centric viewpoint when so many of our articles are US-centric is like the pot calling the kettle black. We do it too, and any discussion of geographical bias would be incomplete if we don't acknowledge our own bias. Chess (talk) (please use {{reply to|Chess}} on reply) 00:52, 17 July 2021 (UTC)[reply]

Survey percentages

I don't understand how, if there were 195 respondents reporting their race/ethnicity in the US, First Nation people can make up 0.1%. Even if there was just one such person in the sample, that would be 0.5%. What am I missing? --Andreas JN466 09:00, 29 June 2021 (UTC)[reply]

Presumably something to do with the weighting - see the report endnotes. DuncanHill (talk) 09:08, 29 June 2021 (UTC)[reply]
Still isn't really super enlightening. Maybe @RMaung (WMF): can add context. GMGtalk 15:15, 1 July 2021 (UTC)[reply]
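(For readers puzzled by the arithmetic: the report's endnotes describe weighting the responses, and weighted percentages need not be multiples of 1/195. The sketch below uses purely hypothetical weights, not figures from the Community Insights report, simply to illustrate how a single respondent can account for less than 0.5% of a weighted total.)

  # Hypothetical illustration of how survey weighting can yield a share below 1/n.
  # None of these weights come from the Community Insights report.
  weights = [1.0] * 194 + [0.2]           # 194 respondents at weight 1.0, one down-weighted to 0.2
  weighted_share = weights[-1] / sum(weights)
  print(f"{weighted_share:.1%}")          # prints "0.1%", even though 1/195 is about 0.5%

Such down-weighting could occur, for example, if that respondent happened to fall into a group (by project or activity level) that was over-sampled relative to the target population of contributors.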

Publicly available version of Menking and Rosenberg paper

The article links to a paywalled version of the paper, which is available open access at https://journals.sagepub.com/doi/pdf/10.1177/0162243920924783?casa_token=EfdSjisfZf8AAAAA:EB-0LLFClccB0CVNc8io5W46u4DoBWAx9gX-bBDf3PHbsRq3xDMbs1Fh_uePmIJ4RpxXh1WGZg9j

The link should be updated. — Charles Stewart (talk) 10:22, 1 July 2021 (UTC)[reply]

Thanks, we always try to link open access versions - however, your link is still paywalled for me. Regards, HaeB (talk) 18:22, 1 July 2021 (UTC)[reply]
If it is available for you, that is because you are already logged in to Sagepub in some way. Both that link and the DOI end up at a (the same) non-free location for me. Izno (talk) 18:23, 1 July 2021 (UTC)[reply]
Hmm? The only institutional access I've ever had from the computer I accessed the Sagepub article from is JSTOR access, which is available to active WP editors via our Wikipedia:Library program. — Charles Stewart (talk) 12:28, 2 July 2021 (UTC)[reply]
Which does not make it free. Izno (talk) 22:24, 2 July 2021 (UTC)[reply]
But you say that the current link (https://doi.org/10.1177%2F0162243920924783 --> https://journals.sagepub.com/doi/10.1177/0162243920924783 ) is paywalled for you even though the one you gave above (https://journals.sagepub.com/doi/pdf/10.1177/0162243920924783?casa_token=EfdSjisfZf8AAAAA:EB-0LLFClccB0CVNc8io5W46u4DoBWAx9gX-bBDf3PHbsRq3xDMbs1Fh_uePmIJ4RpxXh1WGZg9j ) isn't? Then perhaps you encountered a bug in the Wikipedia Library access mechanism and should consider alerting its maintainers about it. Regards, HaeB (talk) 23:43, 2 July 2021 (UTC)[reply]

What inevitably mathematically dominated the study

With our sports SNG's "did it for a living for one day" criterion to bypass GNG, we have an immense number of articles (many permastubs) in this numerically male-dominated (and even more so collectively over history) field, which heavily influences overall numbers in such studies. I hit "random article" a few hundred times, and 43% of ALL of the articles about men were about sports figures. This mathematically dwarfs any other category, with politicians a distant second at 11%. So sports figures would have mathematically dominated that study. North8000 (talk) 00:05, 16 July 2021 (UTC)[reply]

"Sports" in general have dominated human culture though. Athletes have had fame far beyond their relative proportion society for millennia. You can look at the gladiators in ancient Rome or the Mesoamerican ballgame or Go players in ancient China. This has lasted well into the modern era. Maybe athletes are considered by society to be more important than they actually are but notability guidelines are meant to reflect what society considers important. Like it or not but athletes get a lot of coverage in reliable sources, now and historically. Chess (talk) (please use {{reply to|Chess}} on reply) 01:20, 17 July 2021 (UTC)[reply]

This is premature, but I wanted to post something. I did a more careful sample (so far 200 articles). Of the articles about individual people (59), I divided them into recent (active in the last 15 years) and not recent. Here was the breakdown of articles on individual people:

North8000 (talk) 21:12, 16 July 2021 (UTC)[reply]
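(For anyone who wants to repeat this kind of informal count, here is a rough sketch of how a random sample of mainspace titles could be pulled programmatically via the standard MediaWiki API, roughly equivalent to hitting Special:Random repeatedly. The sorting into sportspeople, politicians, and so on remains a manual step, and the sample size of 200 is just an example, not the method North8000 actually used.)

  # Rough sketch: draw random mainspace article titles from the English Wikipedia
  # for manual categorization. Illustrative only; not North8000's exact method.
  import requests

  API = "https://en.wikipedia.org/w/api.php"

  def random_titles(n=200):
      titles = []
      while len(titles) < n:
          resp = requests.get(API, params={
              "action": "query",
              "list": "random",          # the API's random-page list
              "rnnamespace": 0,          # mainspace (articles) only
              "rnlimit": min(50, n - len(titles)),
              "format": "json",
          }, headers={"User-Agent": "signpost-sampling-sketch/0.1"})
          titles += [page["title"] for page in resp.json()["query"]["random"]]
      return titles

  for title in random_titles(200):
      print(title)  # tally by hand: sportsperson, politician, other, etc.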

The Signpost · written by many · served by Sinepost V0.9 · 🄯 CC-BY-SA 4.0