The Signpost

Recent research

Gender, race and notability in deletion discussions

By XOR'easter and Tilman Bayer

A monthly overview of recent academic research about Wikipedia and other Wikimedia projects, also published as the Wikimedia Research Newsletter.

"How gender and race cloud notability considerations on Wikipedia" – or do they?

Reviewed by XOR'easter

The March 2023 paper "'Too Soon' to count? How gender and race cloud notability considerations on Wikipedia", by Lemieux, Zhang, and Tripodi[1] claims to have unearthed quantitative evidence for gender and race biases in English Wikipedia's article deletion processes:

Applying a combination of web-scraping, deep learning, natural language processing, and qualitative analysis to pages of academics nominated for deletion on Wikipedia, we demonstrate how Wikipedia’s notability guidelines are unequally applied across race and gender.

Specifically, the authors

"[...] explored how metrics used to assess notability on Wikipedia (WP:Search Engine Test; “Too Soon”) are applied across biographies of academics. To do so, we first web-scraped biographies of academics nominated for deletion from 2017 to 2020 (n = 843). Next, we created a numerical proxy for each subject's online presence score. This value is meant to emulate Wikipedia's “Search Engine Test,” (WP:Search Engine Test) a convenient and common way editors can determine probable notability before nominating a biography for deletion. [...] We also conducted a qualitative analysis of the discussions surrounding deleted biographies labeled “Too Soon,” (WP:Too soon). Doing so allowed our research team to assess if gender and/or racial discrepancies existed in deciding whether a biography was considered notable enough for Wikipedia. We find that both metrics are implemented idiosyncratically."

However, this rests on a manifestly and indefensibly incorrect claim about how Wikipedia editors judge topics for notability, and the paper attempts to back it up with misleading quotations and numbers that are dubious on multiple levels.


On English Wikipedia, the status of being significant enough to warrant an article is, in the community lingo, notability. This is related to, but more specialized than, the everyday idea of "noteworthiness". Notability is evaluated with the aid of various guidelines that codify community norms and experience, prominent among which is the General Notability Guideline (GNG). Also of relevance for the matter at hand is the specialized guideline applicable to scholars, academics, and educators, known by various abbreviations like WP:PROF. This guideline lays out eight criteria, meeting any one of which is sufficient qualification for notability.

Notability is not based on counting raw search-engine results. As the documentation page for "Arguments to avoid in deletion discussions" succinctly puts it:

Although using a search engine like Google can be useful in determining how common or well-known a particular topic is, a large number of hits on a search engine is no guarantee that the subject is suitable for inclusion in Wikipedia. Similarly, a lack of search engine hits may only indicate that the topic is highly specialized or not generally sourceable via the internet. WP:BIO, for instance, specifically states, Avoid criteria based on search engine statistics (e.g., Google hits or Alexa ranking). One would not expect to find thousands of hits on an ancient Estonian god.

Misrepresentation of Wikipedia policies, guidelines, and essays

The problems begin early in the paper. Referring to the article Katie Bouman, Lemieux et al. observe that it was put up for deletion a mere two days after its creation. They write, "Eventually, her page received a “snow keep” decision, indicating that her notability might be questionable but that deleting her page would be too much of an uphill battle to pursue (WP:SNOW)." This is not the meaning of the snowball clause. Instead, invoking WP:SNOW is a statement that the conclusion of a discussion is already obvious: the chance of its changing direction is the proverbial snowball's chance in hell, and letting it run for a deletion debate's typical period of a full week would be a waste of everyone's time. The conclusion is not that Bouman's notability was "questionable", but rather that it was obvious. The deletion debate for Bouman's biography did not end "eventually". It ended one hour and seventeen minutes after it began. Multiple !voters found the nomination sexist, either in intent ("Jealous bros should not cry each time a woman is part of an achievement") or in effect ("Shit like this is exactly why en.wp is such a sausage party").

In their literature review, Lemieux et al. state incorrectly that "Of the more than 1.5 million biographies about notable writers, inventors, and academics on English-language Wikipedia, less than 20% are about women." There are over 1.5 million biographies total; most of them are not about writers, inventors, or academics.[note 1] They provide two sources for this claim. The first, a primary source, states that the figure is for the total number. The second, a 2021 paper by Tripodi,[2] makes the incorrect claim in its abstract and provides no citation for it.[note 2]

Policies, guidelines, and essays are, in Wikipedia jargon, different types of behind-the-scenes documents. The terms denote a descending sequence:

Policies have wide acceptance among editors and describe standards all users should normally follow. [...] Guidelines are sets of best practices supported by consensus. Editors should attempt to follow guidelines, though they are best treated with common sense, and occasional exceptions may apply. [...] Essays are the opinion or advice of an editor or group of editors for which widespread consensus has not been established. They do not speak for the entire community and may be created and written without approval.

Lemieux et al. describe both WP:Search engine test and WP:Too soon as "metrics used to assess notability on Wikipedia". Neither page is such a thing.

The "Search Engine Test"

Lemieux et al. write that the "Search Engine Test" (WP:Search engine test) is "a convenient and common way editors can determine probable notability before nominating a biography for deletion." Despite its supposed convenience and commonality, they provide no examples where it is actually invoked by name. Any attempt to find such examples, or even a cursory inspection of the WP:Search engine test page itself, would reveal the truth: their description of it is completely erroneous. Lemieux et al. claim to emulate the WP:Search engine test by computing "a numerical proxy for each subject's online presence score". The page contains no such procedure. As observed above, none exists, or ever should exist. WP:Search engine test merely explains some uses of search engines and some reasons why their results must be treated with caution. It explicitly states, "A raw hit count should never be relied upon to prove notability", and it provides multiple reasons why. Checking the discussions in which it is invoked is illuminating. For example, in one debate, a !voter admonished, "[A]rticles are not assessed based on the number of search results that come up after Googling them. For obvious reasons, this is a poor criteri[on] as you may get tens of thousands of results for inconsequential searches, and some noteworthy topics may not receive a particularly large amount of search results. See Wikipedia:Search_engine_test#Notability. Instead, articles should be assessed based on the criteria such as WP:GNG." In other words, pointing to WP:Search engine test means exactly the opposite of what Lemieux et al. make it out to mean. This misrepresentation of WP:Search engine test also occurs in Tripodi's earlier paper, but it is more prominent in this one.[note 3]

As to its commonality, the numbers are stark enough to be indicative. The page WP:Search engine test was viewed 1,190 times over the year prior to this writing.[supp 5] The page WP:Notability, which includes the actual General Notability Guideline, was viewed 143,574 times over the same period.[supp 6] Even the specialized guideline WP:PROF was viewed 12,675 times, an order of magnitude more than the page that Lemieux et al. say is commonly used.[supp 7] As of this writing, 8,297 other pages link to the WP:PROF guideline, while only 1,090 link to the search engine test page,[supp 8][supp 9] though according to Lemieux et al. the latter is what applies to any subject, not just the niche domain of academic biographies. If the goal is to determine whether or not Wikipedia's self-proclaimed standards are being applied equitably, then the evaluation should consider the standards that are actually being proclaimed.

A central assertion of Lemieux et al. is that they quantified whether the "Search engine test" is being equitably applied using a metric they call the "Primer Index", after the software employed to calculate it. This metric is based upon the number of times that an individual is mentioned in a news article and in which context, which evidently is at best an indirect perspective upon any of the WP:PROF criteria. They continue: "To confirm the validity of the “Primer Index,” we also created a 'Google Index' to approximate the total number of hits that appear when an academic's full name and occupation are searched on Google. Using a custom Google Sheets code, we extracted an academic's full name and occupation from Wikidata and automatically searched Google for every instance of “full name + occupation” for each academic in our dataset on the same day." At this juncture, see again the cautionary words of "Search engine test" itself, and the more popular link for making the same point, the "Arguments to avoid" page quoted above. Lemieux et al. nominally validated their "Primer Index" with an argument that is literally one of the arguments we tell everyone to avoid. This supposed confirmation rests on no foundation at all. Having made this claim, Lemieux et al. write that "our data indicate that BIPOC biographies who meet Wikipedia's criteria (i.e., above the White Male Keep median Primer Index of 12.00) were among those deleted." Because the "Primer Index" and "Google Index" are completely unmoored from Wikipedia practice, there is no reason to equate passing some threshold of them with meeting "Wikipedia's criteria".

The evident limitations of the "Google Index" are, in fact, an excellent indication of why a guideline like WP:PROF is necessary. Any search of the form "full name + occupation" will omit, or at best yield a paucity of, hits from within books, the text of paywalled journal articles, and bibliographies including citations to the person in question. Furthermore, it will penalize those who are written about in languages other than English. The various criteria of WP:PROF, which start with citation databases but decidedly do not end there, provide a far less superficial understanding of article-worthiness. It is also the case that the consensus-building in AfD's reliant upon WP:PROF allows for greater discernment and flexibility where differences between specializations are concerned. (Lemieux et al. do not differentiate between disciplines, despite the strong likelihood that some fields are more heavily advertised and more charismatic than others. Think of pop psychology versus pure mathematics, for example.)

Even if we set aside the concerns about the basic premises, a problem with the analysis remains. Suppose, for the sake of the argument, that we grant that the "Primer Index" measures something of interest. Lemieux et al. write that "white males whose biographies were kept rated significantly higher on the Primer Index than those whose biographies were deleted." They compare the median Primer Indices of these two groups using the Kruskal–Wallis test and report that the test gives a small p-value. In contrast, the p-values for white women, for BIPOC men, and for BIPOC women were all large. Lemieux et al. conclude that "There was no statistically significant difference in the median Primer Index between kept and deleted pages" for white women or for BIPOC academics, and thus that the Primer Index "is not an accurate predictor of Wikipedia persistence for female and BIPOC academics." But a large p-value on the Kruskal–Wallis test is not itself sufficient to conclude that the distributions being compared are the same. (The documentation for the software the authors use says as much.[supp 10]) Indeed, per their figure 2, the difference in sample medians between kept and deleted biographies for BIPOC women was even larger than that for white men. The apparent differences in the statistics across these groups may be due, in whole or in part, to the large variations in sample sizes: 419 white men, but only 185 white women, 171 BIPOC men, and 69 BIPOC women.[supp 11]
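The sample-size point is easy to demonstrate. In the sketch below (the scores are invented for illustration; they are not the paper's data), the very same median gap between "kept" and "deleted" groups produces a non-significant Kruskal–Wallis result when each group is tiny, and an overwhelmingly significant one when the identical pattern is observed many times over:

```python
# Illustrative sketch: a large Kruskal-Wallis p-value can reflect low
# statistical power rather than sameness of the two distributions.
from scipy.stats import kruskal

kept_small, deleted_small = [12, 14, 15], [9, 11, 13]
_, p_small = kruskal(kept_small, deleted_small)

# The identical pattern of scores, repeated 100 times over:
_, p_big = kruskal(kept_small * 100, deleted_small * 100)

print(f"n=3 per group:   p = {p_small:.3f}")   # well above 0.05
print(f"n=300 per group: p = {p_big:.2e}")     # vanishingly small
```

Both comparisons have a median gap of 3 points; only the number of observations differs, so "no statistically significant difference" in the small groups cannot by itself establish that the groups behave alike.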

The Titular "Too Soon"

The page Wikipedia:Too soon is an essay, not a guideline and certainly not a policy. It is largely a collection of pointers to other documentation pages with some commentary. The gist is given in the first section: "Sometimes, a topic may appear obviously notable to you, but there may not be enough independent coverage of it to confirm that. In such cases, it may simply be too soon to create the article." Lemieux et al. emphasize this page from their title onward, but most of what they have to say about it is misguided.

Lemieux et al. assert that “Too Soon” is "a technical label developed by Wikipedians indicating that a subject lacks sufficient coverage in independent, high-quality news sources to have a page." It is hardly "technical": the meaning is an example of the everyday sense of the phrase. Moreover, the language of the essay is not "news sources" but "independent secondary reliable sources", with a clarifying link to the Wikipedia:Reliable sources guideline. The latter phrasing is more general than the former. A mathematics textbook or a medical review article is most likely independent, secondary, and reliable, but neither is news. Not only do Lemieux et al. drastically oversell the importance of WP:Too soon, they fail to summarize its contents accurately.[note 4] It is puzzling how anyone would come to believe that "too soon" means anything other than what it says on the tin. How, one wonders, would Wikipedia editors then describe topics that look like they will never be notable?

The initialism "AfD" is short for "Articles for deletion", the area in which deletion debates about articles occur. Lemieux et al. follow this usage, and so shall this comment. They write: "First, let us illustrate the usage of WP:Too soon per Wikipedia guidelines. The following excerpt is from the AfD for the biography of a white, male, assistant professor who was nominated for deletion under the tag WP:Too soon: “Most of the newspaper articles cited in the main article are not directly related to the subject, and apart from this brief article in the Dainik Jagran that borders on being a hagiography of the subject, there's no real coverage for WP:GNG. WP:Too soon perhaps.”" The biography in question was that of Shivendu Ranjan. Moreover, the tag WP:TOOSOON was not used in the nomination. Instead, the nominator stated that Ranjan failed the GNG. The !voter being quoted began their rationale by saying that there is no evidence that the subject meets WP:ACADEMIC (another pointer to the same guideline as WP:PROF). Most of the !vote is a point-by-point explanation of why the editor believes that the WP:PROF criteria are not met. The plain reading of the "WP:Too soon perhaps" at the end is that the !voter believed Ranjan's situation could change in the future. Further examples of misleading quotation will be noted below.

They go on: "Despite the academic being an assistant professor, the moderator [sic] focused on media coverage, not the career stage, of the subject which is in accordance with the Wikipedia guidelines of the tag WP:Too soon." The essay WP:TOOSOON says nothing specifically about academic career stages. Most of it is about movies. The only profession that is specifically discussed is acting.

Lemieux et al. state that they collected the career stages of each individual designated WP:Too soon, using a pair of research assistants to identify these stages manually: "Since perceived notability among academics is highly contingent on their rank (Adams et al., 2019; WP:Notability (academics)), academic careers were scored based on stage, with assistant professors being scored as 1, associate professors with 2, and so on." This methodology is flawed from the outset. It confuses correlation with contingency: one cannot neglect citation metrics and the other success indicators described in WP:PROF when talking about how academic-biography AfD's evaluate career status. The only time that "career stage" as Lemieux et al. think of it factors into a WP:PROF judgment is if the subject has attained the named chair/Distinguished Professor level, because these indicate high levels of accomplishment. Lemieux et al. confuse both the status of WP:PROF versus WP:TOOSOON (guideline versus essay) and the logical roles those pages play in the very !votes they quote. WP:TOOSOON is not the means by which notability is evaluated; instead, invoking it is a means of speculating about why the actual notability guideline is not met and noting that the situation may change.[note 5]

The claim that Wikipedia does not count trainees, research scientists, and/or government workers as “academics” is flatly untrue. Plenty of IEEE Fellows work in industry and are notable per WP:PROF#C3, for example. And when articles on students appear at AfD, the community files them with the other AfD's on academics and educators. Students and trainees are academics and are evaluated as such. They often fall short of notability, not because of the label "student", but because students infrequently stand out by the criteria that are actually relied upon. Because Lemieux et al. erroneously believe that Wikipedia regards all these people as "non-academics", they score all such biographies with a 0 on their career-stage scale, regardless of the subject's actual career stage. For example, a graduate student at a university would be scored the same as a senior research scientist at a major corporation. Upon these meaningless numbers, Lemieux et al. then do bad arithmetic, computing an "average career stage [...] by calculating the sum of the career stage scores and dividing this by the total number of entries." Career stage is a qualitative or categorical variable, so taking the numerical mean is not likely to be indicative. What job is 0.76 of the way between assistant and associate professor?[supp 16]
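To make the arithmetic objection concrete, here is a minimal sketch (the stage codes follow the paper's description; the sample values are invented): averaging ordinal codes, with a catch-all 0 for everyone deemed a "non-academic", produces a number that corresponds to no actual job, whereas a simple frequency count preserves the categories intact.

```python
from collections import Counter
from statistics import mean

# Hypothetical codes per Lemieux et al.'s scheme: 0 = "non-academic"
# (lumping students together with senior research scientists),
# 1 = assistant, 2 = associate, 3 = full professor.
stages = [0, 0, 1, 1, 1, 2, 0, 3, 1, 2]

print(mean(stages))     # 1.1 -- what job is 0.1 past "assistant professor"?
print(Counter(stages))  # the categorical summary that discards nothing
```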

David Eppstein, a computer-science professor, is a longtime participant in academic-bio AfD's and a member of the Women in Red project, which aims to improve Wikipedia's biographical coverage of women. He writes,

[S]ince the article quotes me as invoking TOOSOON, perhaps I should explain what I generally mean by it. It is never the actual reason for a delete opinion, at least from me. In the cases under discussion, my choices are always grounded in notability guidelines and policies, not essays. When I use TOOSOON, it is not intended to strengthen the case for deletion. Maybe it is the opposite: it is a ray of hope in an otherwise negative opinion. If I think someone is not likely to ever be notable, I am probably just going to say delete, and explain why. If I think an academic does not currently meet our notability standards, but is on a trajectory on which they might well eventually do so, years later, I will say TOOSOON. We often see re-creations of the same articles, years apart, and including this in an opinion is a suggestion that if we discuss the same case again sometime we should check their accomplishments again more carefully instead of relying on past opinions.[supp 17]

In fairness, this was written in response to Lemieux et al., but it is also a natural implication of the everyday meaning of the phrase "too soon": not now, but maybe later. For an example of this involving David Eppstein and others, see Wikipedia:Articles for deletion/Charles Steinhardt.

Quote mining

The quotes given after "These examples from AfD discussions all failed to mention the presence or depth of media coverage" are misleadingly presented. Two of the three are from Wikipedia:Articles for deletion/Kate Killick. One of the !votes quoted actually began, "Sadly, fails WP:NPROF." This is a significant omission, since it removes the actual reason the editor gave for believing the article should be deleted. Moreover, that AfD did discuss the presence or depth of media coverage, insofar as editors noted that there wasn't any. ("The Irish Times reference is an opinion piece of which Killick's name is only mentioned once among many, many other names. I cannot locate any significant coverage from reliable sources that indicate notability.") The third quote is also truncated; the original concluded "Tiny citations on GS do not pass WP:Prof and lack of independent in-depth sources fails WP:GNG."[note 6]

Another !vote they quote also began with a rationale that Lemieux et al. do not reproduce: "The only form of notability claimed in the article is academic, but our standards for academic notability explicitly exclude student awards. Merely having written a few review papers is inadequate for notability; the papers need to be heavily cited, and here they appear not to be."

Misrepresentation of career status and other AfD aspects

The sentence immediately after the blockquote that includes snippets from the K. Killick AfD is "Our dataset revealed that men at similar early career stages were present on Wikipedia." Their example is Colin G. DeYoung, who at the time of his AfD had an h-index of 44. That is not "similar" to a postdoc who coauthored a respectable but unremarkable number of papers (no more than 10, according to Web of Science). At the times of their respective AfD's, Killick had possessed a PhD for roughly six years, and DeYoung had possessed his for roughly thirteen. Without making any suppositions about the worth of their research on some notional absolute scale of intellectual achievement, it is safe to say that these AfD's were not for individuals at "similar" stages of their careers.

Lemieux et al. write: "For example, Tonya Foster, a professor of creative writing and Black feminist scholar at San Francisco State University had a high Primer Index of 41 yet her Wikipedia page was deleted." It is worth mentioning that the article Tonya Foster does exist today. At the time of the AfD in 2017, the consensus was that WP:AUTHOR was not met, but the article was recreated in July 2020 without complaint. Foster did not join San Francisco State University until 2020, at which point she was named one of the George and Judy Marcus Endowed Chairs,[supp 21] so the argument for her passing WP:PROF got significantly stronger. The AfD process can hardly be faulted for failing to consider evidence that would not exist for another three years.

They continue: "Another example is the late Sudha Shenoy, an economist and professor of economic history at the University of Newcastle, Australia, who had a high Primer Index of 198 yet her page was also deleted." According to her profile at the Mises Institute, she was a "lecturer" at Newcastle,[supp 22] and "lecturer" in Australia is a lower academic rank than professor. In any case, that AfD looked at WP:PROF and WP:GNG and found that neither was met; citation counts were low, no other academic notability criteria could be argued for, and the sources about her were unreliably published. This data point appears to be more an indictment of the "Primer Index" than anything else.

Lemieux et al. mention the drama surrounding the article about nuclear chemist Clarice Phelps. They state that her biography was "deleted three times in the span of one week." While its existence was indeed contentious, that specific claim is not in the cited source,[supp 23] or in the source upon which that website relied.[supp 24] Examining the deletion log for the article[supp 25] and the "article milestones" list at Talk:Clarice Phelps indicates that there is no one-week span that could match. Instead, it was deleted once in February 2019 and twice in April, before being incubated as a draft page and then restored for good in February 2020.[supp 26][supp 27][supp 28] The cause of diversity on Wikipedia is a marathon, not a sprint; the inaccurate timeline and the omission of the eventual success make for a misleading portrayal of the challenge, one that is of no help in resolving it.


Whatever revolutionary claims have been made on its behalf, Wikipedia has a fundamentally institutionalist character. It layers on top of existing academic and journalistic systems of legitimacy. The same rhetorical fences that keep Wikipedia from being a toxic waste dump of advertising and conspiracy theories also mean that it is a bad place for social change to begin. The encyclopedia can only be as non-sexist as the least sexist institution. The question of which articles should exist and what they should say is a question of content moderation at scale, a task that is "impossible to do well".[supp 29] The failure modes of Wikipedia's written rules and subcultural practices deserve study. But ill-conceived studies can lead to ill-conceived advocacy that makes real problems no easier to solve.

Lemieux et al. misrepresent Wikipedia policies, guidelines, and essays; the content of deletion debates; news reporting; and the prior academic literature. How these errors could have transpired is, in a word, baffling. How they passed through peer review is likewise a puzzle (and a discouraging sign), but pass through they did.[note 7] The problems are too pervasive to be addressed by an erratum or an expression of concern. The literature on the important subject of systemic bias in Wikipedia would be best served by a retraction and a careful re-examination of the editorial process.


Other recent publications

Other recent publications that could not be covered in time for this issue include the items listed below. Contributions, whether reviewing or summarizing newly published research, are always welcome.

Compiled by Tilman Bayer

"Towards a Digital Reflexive Sociology: Using Wikipedia's Biographical Repository as a Reflexive Tool"

From the abstract:[3]

"[...] we employ Wikipedia as a ‘reflexive tool’, i.e., an external artefact of self-observation that can help sociologists to notice conventions, biases, and blind spots within their discipline. We analyse the collective patterns of the 500 most notable sociologists on Wikipedia, performing structural, network, and text analyses of their biographies. Our exploration reveals patterns in their historical frequency, gender composition, geographical concentration, birth-death mobility, centrality degree, biographical clustering, and proximity between countries, also stressing institutions, events, places, and relevant dates from a biographical point of view. Linking these patterns in a diachronic way, we distinguish five generations of sociologists recorded on Wikipedia and emphasise the high historical concentration of the discipline in geographical areas, gender, and schools of thought."

"How can the social sciences benefit from knowledge graphs? A case study on using Wikidata and Wikipedia to examine the world’s billionaires"

From the abstract:[4]

"This study examines the potentials of Wikidata and Wikipedia as knowledge graphs for the social sciences. The study demonstrates how social science research may benefit from these knowledge bases by examining what we can learn from Wikidata and Wikipedia about global billionaires (2010-2022). [...] We show that the English Wikipedia and, to a lesser extent, Wikidata exhibit gender and nationality biased [sic] in the coverage and information about global billionaires. Using the genealogical information that Wikidata provides, we examine the family webs of billionaires and show that at least 15% of all billionaires have a family member also being a billionaire."

"Can you trust Wikidata?"

From the abstract:[5]

"The present work aims to assess how well Wikidata (WD) supports the trust decision process implied when using its data. WD provides several mechanisms that can support this trust decision, and our KG [Knowledge Graph] Profiling, based on WD claims and schema, elaborates an analysis of how multiple points of view, controversies, and potentially incomplete or incongruent content are presented and represented."

"A Study of Concept Similarity in Wikidata"

From the abstract:[6]

"In light of the adoption of Wikidata for increasingly complex tasks that rely on similarity, and its unique size, breadth, and crowdsourcing nature, we propose that conceptual similarity should be revisited for the case of Wikidata. In this paper, we study a wide range of representative similarity methods for Wikidata, organized into three categories, and leverage background information for knowledge injection via retrofitting. We measure the impact of retrofitting with different weighted subsets from Wikidata and ProBase. Experiments on three benchmarks show that the best performance is achieved by pairing language models with rich information, whereas the impact of injecting knowledge is most positive on methods that originally do not consider comprehensive information. The performance of retrofitting is conditioned on the selection of high-quality similarity knowledge. A key limitation of this study, similar to prior work lies in the limited size and scope of the similarity benchmarks. While Wikidata provides an unprecedented possibility for a representative evaluation of concept similarity, effectively doing so remains a key challenge."

Matching non-notable Wikidata "orphans" to Wikipedia sections

From the abstract and paper:[7]

"We present a transformer-based model, ParaGraph, which, given a Wikidata entity as input, retrieves its corresponding Wikipedia section. To perform this task, ParaGraph first generates an entity summary and compares it to sections to select an initial set of candidates. The candidates are then ranked using additional information from the entity’s textual description and contextual information. Our experimental results show that ParaGraph achieves 87% Hits@10 when ranking Wikipedia sections given a Wikidata entity as input. [...]

This mapping between Wikipedia and Wikidata is beneficial for both projects. On the one hand, it facilitates information extraction and standardization of Wikipedia articles across languages, which can benefit from the standard structure and values of their Wikidata counterpart, e.g., for populating infoboxes. On the other hand, Wikipedia articles are routinely updated, which in turn keeps Wikidata fresh and useful for online applications. However, the Wikipedia editorial guidelines require that an entity be notable or worthy of notice to be added to the encyclopedia, which does not hold for all Wikidata entities. [...] We refer to the remaining entities, which do not have an article in [any language] Wikipedia, as orphans. In the absence of a textual counterpart, orphans often suffer from incompleteness and lack of maintenance. Our present effort stems from the observation that a substantial number of orphan entities are indeed represented in Wikipedia, but not at the page level; orphan entities are often described within existing Wikipedia articles in the form of sections, subsections, and paragraphs of a more generic concept or fact. For example, the English Wikipedia does not have a dedicated page about “Tennis racket”, it is instead embedded in the “Racket” page as a section, whereas it can be found as a standalone (orphan) entity on Wikidata (“Q153362”)."


  1. ^ Lemieux, Mackenzie; Zhang, Rebecca; Tripodi, Francesca (March 29, 2023). ""Too Soon" to count? How gender and race cloud notability considerations on Wikipedia". Big Data & Society. 10. doi:10.1177/20539517231165490. S2CID 257861139.
  2. ^ a b Tripodi, Francesca (2021-06-27). "Ms. Categorized: Gender, notability, and inequality on Wikipedia". New Media & Society. doi:10.1177/14614448211023772. S2CID 237883867.
  3. ^ Beytía, Pablo; Müller, Hans-Peter (2022-09-15). "Towards a Digital Reflexive Sociology: Using Wikipedia's Biographical Repository as a Reflexive Tool". Poetics: 101732. doi:10.1016/j.poetic.2022.101732. ISSN 0304-422X.
  4. ^ Daria Tisch, Franziska Pradel: How can the social sciences benefit from knowledge graphs? A case study on using Wikidata and Wikipedia to examine the world’s billionaires (submission to Semantic Web – Interoperability, Usability, Applicability, under review)
  5. ^ Veronica Santos, Daniel Schwabe and Sérgio Lifschitz: Can you trust Wikidata? (submission to Semantic Web – Interoperability, Usability, Applicability, under review)
  6. ^ Filip Ilievski, Kartik Shenoy, Hans Chalupsky, Nicholas Klein and Pedro Szekely: A Study of Concept Similarity in Wikidata (submission to Semantic Web – Interoperability, Usability, Applicability, under review), Code
  7. ^ Natalia Ostapuk, Djellel Difallah, and Philippe Cudré-Mauroux: “ParaGraph: Mapping Wikidata Tail Entities to Wikipedia Paragraphs.” In: 2022 IEEE International Conference on Big Data (BigData), 2022. Slides; dataset: Ostapuk, Natalia; Difallah, Djellel; Cudré-Mauroux, Philippe (2022-11-25), Wikidata dump extension (enwiki section links), Zenodo.

Supplementary references

  1. ^ Matei, S. A.; Dobrescu, C. (2011). "Wikipedia's "Neutral Point of View": Settling Conflict through Ambiguity". The Information Society. 27 (1): 40–51. doi:10.1080/01972243.2011.534368. S2CID 27479715.
  2. ^ Gauthier, Maude; Sawchuk, Kim (2017). "Not notable enough: feminism and expertise in Wikipedia". Communication and Critical/Cultural Studies. 14 (4): 385–402. doi:10.1080/14791420.2017.1386321. S2CID 149229953.
  3. ^ Luo, Wei; Adams, Julia; Brueckner, Hannah (2018-08-30). "The Ladies Vanish? American Sociology and the Genealogy of its Missing Women on Wikipedia". Comparative Sociology. 17 (5): 519–556. doi:10.1163/15691330-12341471.
  4. ^ "Lois K. Alexander Lane, version of 19 March 2016".
  5. ^ "Wikipedia:Search engine test". Retrieved 2023-04-12.
  6. ^ "Wikipedia:Notability". Retrieved 2023-04-12.
  7. ^ "Wikipedia:Notability (academics)". Retrieved 2023-04-12.
  8. ^ "Wikipedia:Notability (academics)". Retrieved 2023-04-12.
  9. ^ "Wikipedia:Search engine test". Retrieved 2023-04-12.
  10. ^ "Interpreting results: Kruskal–Wallis test". GraphPad Prism. Retrieved 2023-05-07.
  11. ^ "Wikipedia talk:WikiProject Women in Red, version of 7 May 2023".
  12. ^ "Wikipedia:Too soon". Retrieved 2023-04-13.
  13. ^ "Wikipedia:Notability". Retrieved 2023-04-13.
  14. ^ "Wikipedia:Too soon". Retrieved 2023-04-13.
  15. ^ Adams, Julia; Brückner, Hannah; Naslund, Cambria (2019). "Who Counts as a Notable Sociologist on Wikipedia? Gender, Race, and the "Professor Test"". Socius. 5: 1–14. doi:10.1177/2378023118823946. S2CID 149857577.
  16. ^ For the meaninglessness of applying averages to ordinal variables, see, e.g., Wilson, Thomas P. (March 1971). "Critique of ordinal variables". Social Forces. 49 (3): 432–444. doi:10.2307/3005735. JSTOR 3005735.
  17. ^ "Wikipedia talk:WikiProject Women in Red, version of 13 April 2023".
  18. ^ Roberts, Justin (2020-07-02). Slavery & Abolition. 41 (3): 686–688. doi:10.1080/0144039X.2020.1790769. ISSN 0144-039X. S2CID 221178536.
  19. ^ Sklansky, Jeffrey (Spring 2021). Journal of Social History. 54 (3): 973–975. doi:10.1093/jsh/shz115.
  20. ^ Rhode, Paul (March 2020). The Journal of Economic History. 80 (1): 293–294. doi:10.1017/S0022050720000029. S2CID 214003587.
  21. ^ "Creative Writing Department announces new George and Judy Marcus Endowed Chairs". SF State News. 2020-04-10. Retrieved 2023-04-12.
  22. ^ "Sudha R. Shenoy". Mises Institute. 20 June 2014. Retrieved 2023-04-11.
  23. ^ Sadeque, Samira (2019-04-29). "Wikipedia just won't let this Black female scientist's page stay". The Daily Dot.
  24. ^ Jarvis, Claire L. (2019-04-26). "Wikipedia's Refusal to Profile a Black Female Scientist Shows Its Diversity Problem". Slate.
  25. ^ "All public logs: Clarice Phelps". Retrieved 2023-04-12.
  26. ^ "Clarice Phelps: version of 7 February 2020".
  27. ^ Page, Sidney (2022-10-17). "She's made 1,750 Wikipedia bios for women scientists who haven't gotten their due". The Washington Post. Retrieved 2023-04-12.
  28. ^ Khan, Arman (2022-11-18). "I've Made More Than 1,700 Wikipedia Entries on Women Scientists and I'm Not Yet Done: British scientist Jessica Wade has made one Wikipedia entry every day since 2017". Vice. Retrieved 2023-04-12.
  29. ^ Masnick, Mike (2019-11-20). "Masnick's Impossibility Theorem: Content Moderation At Scale Is Impossible To Do Well". Techdirt. Retrieved 2023-04-15.


  1. ^ A manual survey of 100 random biographies found only 25 that could meet those criteria, even though those criteria cast a wider net than the remit of WP:PROF.
  2. ^ Later in the review, they write, "For academic biographies on Wikipedia, notability is achieved through the significant impact of one's scholarly work on society, the winning of prestigious academic awards, or the holding of important leadership positions at an academic institution or academic journal board." This conflates multiple criteria that WP:PROF lists separately, in a way that may obscure how they operate. WP:PROF#C1 concerns "significant impact in their scholarly discipline" (emphasis added). WP:PROF#C4 asks for "a significant impact in the area of higher education, affecting a substantial number of academic institutions". And influence within society at large, "outside academia in their academic capacity", is WP:PROF#C7. These are each evaluated in different ways, as the guideline details. Of the sources cited for this point, Matei and Dobrescu[supp 1] do not discuss any notability guideline at all. Gauthier and Sawchuk[supp 2] and Luo et al.[supp 3] discuss the page WP:Notability but do not mention the existence of a guideline specialized to scholars and academics, despite its relevance to their subject matter.
  3. ^ Tripodi's description of a case where a woman's purported significance is easily verifiable using the search engine test[2] is factually inaccurate. The biography of Lois K. Alexander Lane was not "pushed out" of the main space; it was a draft article not yet in the main space.[supp 4] This draft did not contain, as Tripodi writes, "links to seven credible sources independent of the subject, including The Washington Post and the Smithsonian." It contained two sources, one a Washington Post item and the other a webpage at the Smithsonian, and it linked to those two sources a total of seven times. The article Lois K. Alexander Lane has, contrary to Tripodi's statement, never been nominated for deletion.
  4. ^ To the point about importance, note that WP:Too soon received a median of 106 views per month over the year prior to this writing,[supp 12] versus 11,924 for WP:Notability. 1,664,013 pages link to the latter, and 1,393 link to the former.[supp 13][supp 14]
  5. ^ The Adams et al. paper[supp 15] only studies sociologists, and so whether its conclusions generalize further across academia is an open question. They report (Table 3) a correlation between career stage and the probability of having a Wikipedia page, but they do not disentangle career stage from citation metrics or other indicators. Emeritus professors are likely to have accomplished more than assistant professors have. Adams et al.'s data is from October 2016 and may be outdated in various aspects, which it is beyond the scope of this comment to determine. Perhaps worth noting in this context, however, is their tentative conclusion that "pages about women were not more likely to be deleted than pages about men" and "the main story is that women are less likely to appear in the first place".
  6. ^ In the time since that AfD, the book mentioned in it has accumulated additional reviews.[supp 18][supp 19][supp 20] In the present circumstances, the biography of the author might be refactored into a page about the book, rather than deleted. Compare, e.g., Wikipedia:Articles for deletion/Daisy Deomampo, Wikipedia:Articles for deletion/Aaron Fox (musicologist), and Wikipedia:Articles for deletion/Alam Saleh.
  7. ^ For related commentary, see Tilman Bayer's 25 July 2021 "Recent research" column in the Wikipedia Signpost. The concerns about confounding factors raised there are echoed here. For example, newly created articles might be scrutinized more closely, and articles begun by well-meaning novices might be more likely to be nominated for deletion in good faith, even if they turn out to be salvageable.

In this issue
+ Add a comment

Discuss this story

It's always interesting to read about Wikipedia policies. I myself was unfamiliar with TOOSOON, but it made me chuckle. Just to check, I ran a Wikidata query for the human genders male and female with sitelinks on English Wikipedia who were born after 01-01-2000. Of a total of 9049 humans, women represent 30%. Either it's been too soon for their articles to get Wikidata items that include their gender, or it's too soon for them to have an article, but women up to age 23 are still underrepresented on English Wikipedia by a large margin. Jane (talk) 05:43, 8 May 2023 (UTC)[reply]

From what I recall from research I did on this a few years back, the more recent the time range considered, the less underrepresented women are. --Piotr Konieczny aka Prokonsul Piotrus| reply here 08:27, 8 May 2023 (UTC)[reply]
Jane023, I am surprised that women represented 30% of your sample; I would have expected it to be lower, given the undue weight that Wikipedia authors and the results of our notability guidelines give to sports figures, who are predominantly male. I would be interested to see the same percentage breakdown for news coverage or other significant coverage in independent, reliable, secondary sources. In a world dominated by patriarchy, would you expect 51% of coverage to be about women? I would not; our societies have more progress to make in ensuring that the potential contributions of all people are valued and available. – Jonesey95 (talk) 17:38, 8 May 2023 (UTC)[reply]
That's funny! I myself was surprised it was as high as 30% and can only conclude this is thanks to the w:WikiProject Women in Red. Don't forget there are several big issues with women that men don't have that make data collection (and thus findability for Wikipedians wishing to create articles) very complex and difficult. Obviously the biggest issue is systemic bias by language (no info in English? article ain't happening). Women tend to travel less than men, due to family issues, making their chances of crossing language lines in news articles much less on the whole as a group. Secondly, women get married and change their name, clouding findability by genealogy methods. Thirdly, women hide their birth dates due to ageism in various fields, which again clouds findability for potential Wikipedians. The main issue that could contribute to a higher degree of article deletions for women and minorities is thus a lack of sources, but this does not address "ghost edits" or articles that were started but never happened. Collecting data on articles by Wikipedia project is definitely useful, but it gets interesting only when you drill down into things like occupation, notable works, and birth country, among other datapoints. Jane (talk) 07:04, 9 May 2023 (UTC)[reply]
I revisited that query I ran because I doubted the 30% and indeed the number is less than 30%. It occurred to me that my earlier query ran rather quickly and this was because I queried those born on 2000-01-01 and not after 2000-01-01. When I correct it I get timeouts, but this query (toggle manually for gender) for births between 2000 and 2010 will return 9705 females and 26672 males so 9705/(9705+26672)=27%, not 30%. This query runs towards the time limit for each gender, which is about right for English Wikipedia. Jane (talk) 09:43, 9 May 2023 (UTC)[reply]
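The percentage in the comment above can be checked with a quick calculation (the counts are copied from the comment, not re-queried; the 27% figure is a rounding of the exact share):

```python
# Counts reported in the comment above, from a Wikidata query of English
# Wikipedia sitelinks for people born between 2000 and 2010 (not re-run here).
females = 9705
males = 26672

share = females / (females + males)
print(f"{share:.1%}")  # prints "26.7%", which rounds to the 27% cited
```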
Thanks for contributing some data! This paper did something very similar, just without the restriction to Wikipedia sitelinks (i.e. examined gender gaps on Wikidata instead of Wikipedia). They observed that the proportion of Wikidata items about women ranges between 0.2 and 0.25 for birth years 1950 to 1990 and has increased steadily since then, reaching 0.4178 for the 2000 birth year. (Also, they found that Wikidata editors "are likely to over sample male-dominated professions such as American football and baseball, thus contributing to the general predominance of items representing men over items representing women. Our analysis that focused on a set of academic professions show that the gender distribution of Wikidata is no more biased than real world notability judgments in either coverage or quality." Obviously that doesn't speak directly to possible gender biases on Wikipedia instead of Wikidata, but it is still interesting e.g. because of similarities in the editor populations that people often like to draw quick conclusions from.)
Regards, HaeB (talk) 17:44, 8 May 2023 (UTC)[reply]
Without going into a whole list of pet peeves I have with the gender data on Wikidata, I will say it would be useful to query for the age of the subject at the time of item creation with the same moment recorded for each specific Wikipedia project with an actual article (minus any redirects) about the subject. If you could query that data and cross-reference it per Wikipedia language you could get some more specific information about gender bias on Wikipedia. My gut feeling for women in sports is that we have a large quantity of women thanks only to the wp:WikiProject Olympics in some Wikipedia languages. Currently I have no way to measure that. On the whole, I don't think studying AfD data is very useful for gender issues, but possibly it is for occupational issues (controversial (garage bands), new (vloggers) or individual-based (artists) and so forth). Jane (talk) 07:54, 10 May 2023 (UTC)[reply]

Thanks to XOR'easter, with editorial guidance from Tilman Bayer, for exposing the shortcomings of the abysmally bad paper by Mackenzie Lemieux, Rebecca Zhang and Francesca Tripodi about Wikipedia procedures for assessing the notability of scholars and researchers that was recently published in the journal Big Data & Society. I hope that we shall see a response by Lemieux, Zhang and Tripodi to the many allegations of inaccuracy and misrepresentation that exist in the paper. The editor of the journal Matthew Zook is invited to explain how this egregiously erroneous paper got through the journal’s peer review process. Xxanthippe (talk) 08:52, 8 May 2023 (UTC).[reply]

Great read. One minor thing I think might discourage people from sharing it outside Wikipedia is the use of the jargon "!voter" and "!vote". They're probably better replaced with "participant" and "comment" or the like. Nardog (talk) 09:13, 8 May 2023 (UTC)[reply]

Thanks! Just to clarify though, the text is all XOR'easter's (apart from the explanatory intro that quotes from the paper). On the other hand, it seems that some credit for identifying issues in the paper should also go to other participants in the discussion at the Women in Red talk page that had been started there at the beginning of April. (I myself only read the paper this past weekend, in the context of reviewing XOR'easter's Signpost submission, and subsequently contributed these observations about the authors' misinterpretation of p-values in one of their central quantitative conclusions.) Regards, HaeB (talk) 18:01, 8 May 2023 (UTC) (Tilman)[reply]
Thanks for the feedback! In the original version, I explained about deletion debates not being votes and what "!vote" means, but that got cut when bringing the column over to the Signpost. The original was even longer than this version and included more background for people coming to the topic without experience in Wikipedia behind-the-scenes stuff. XOR'easter (talk) 12:19, 8 May 2023 (UTC)[reply]
Excellent work. I would encourage you to submit the longer version for publication in a scholarly journal (perhaps in Big Data & Society itself). Compassionate727 (T·C) 16:11, 8 May 2023 (UTC)[reply]
Over at the Women in Red project discussion page, we've been contemplating what to do next (and also going further into the statistics, which are ... puzzling: just how many biographies did they look at?). It's the end of the semester for me, so what with the usual obligations I haven't much spare time or energy. And writing about this isn't simple, on account of the tough balancing act: the problem of systemic bias is obviously important, but a specific paper about it can still be methodologically flawed beyond repair. Wikipedia's coverage of minority populations evidently isn't where it should be if we are to live up to our ideals, but a paper with that premise can still go so far astray from how things actually happen here that it enters the empyrean level of confident wrongness more typically associated with ChatGPT and men at cocktail parties. XOR'easter (talk) 16:32, 8 May 2023 (UTC)[reply]

I find it weird for a publication to present views of some importance when they can be manipulated. The Google search is more of a litmus test and never was a set guideline. The publication finding out should have been mentioned as a positive because we don't look at the info that SEO could easily sway. We look at more specific guidelines from GNG for people in their fields because it should make sense. – The Grid (talk) 13:22, 8 May 2023 (UTC)[reply]

This blithe and dismissive take on a well-established and important issue is unfortunate. It may be fair to criticise some of the methodology of their study, but the arguments in response to their criticisms presented above are specious. It would be far better to acknowledge that Wikipedia has well-established systemic biases and then we can figure out why that is the case. To refer to policy obfuscates from how that policy plays out in practice; and I think that most honest editors would willingly acknowledge that Wikipedia suffers from severe bias problems. Jack4576 (talk) 15:10, 8 May 2023 (UTC)[reply]

That's a lot to assume. People will quote essays, policies, and guidelines in AfD discussions to get the ball rolling on discussion. Note that essays and guidelines are not policies. Sometimes people point to WP:AFDOUTCOMES but always note citing any of the above with no explanation can be weighed lower than explanations made. The study seems to really not go into the meat of AfD discussions where discussion is always considered more important than people simply stating essays, policies, or guidelines. There's a tally but it's always been a very rough estimate. The criticism is valid. – The Grid (talk) 16:46, 8 May 2023 (UTC)[reply]
@Jack4576: this reasoning would seem to indicate that no paper should receive detailed criticism for its failings so long as it raises an overall issue of importance. Do you believe that specific criticisms raised in this article are incorrect? How does their being raised undermine the overall desire to combat bias? I also would like you to expand upon your indication of XOR's response being blithe, which means "lacking due thought or consideration". I can't see how this stands up, so seems to me more of an unsupported attack on the quality of their work. "Wikipedia suffers from severe bias problems" is certainly possible, but I would firmly dispute that AfD isn't policy driven. AfD is one of the most policy-backed fora on the entire site. While I'd appreciate a response now, I'd also be interested to see if your views remain unchanged after you've expanded your AfD experience set further, as you appear to be just starting into. Nosebagbear (talk) 16:58, 8 May 2023 (UTC)[reply]
In many cases, ostensibly policy driven. Which I believe is the whole point of this piece. It’s unsurprising that editors are keen to dismiss these critiques out of hand. Jack4576 (talk) 23:19, 8 May 2023 (UTC)[reply]
it’s not a very strong critique to point to ostensible policies to dismiss an empirical study about the outcomes of said policy. No one claims Wikipedia’s policies are on their face biased; the core of the piece is what happens in practice with the implementation of said policies. Jack4576 (talk) 23:22, 8 May 2023 (UTC)[reply]
@Jack4576 But the core of the piece isn't about the actual practice. Otherwise it wouldn't spend large periods of time talking about non-policies/guidelines, it wouldn't need to engage in quote mining (the full quotes would give everything they needed), they wouldn't need to raise the idea that AfDs are biased for occurring and leave out they were closed for sexism in less than a couple of hours, and in particular they wouldn't need to make their own flagrantly incorrect statements of what various PAGs mean.
You could absolutely have a really good piece of research talking about whether the !votes people make execute the letter & spirit of the notability policies - but where the paper makes comparisons they do so to clearly incorrect statements of wikipedia policy, rendering them meaningless. I would also like your evidence of the scale of incorrect !votes due to biased interpretations that you say are occurring. Nosebagbear (talk) 12:05, 9 May 2023 (UTC)[reply]
This discussion is unproductive at this point. Obviously I don't have empirical evidence and am running off anecdotal experience like everyone else here. The article's conclusions ring true to me. Selective enforcement of policy can create bias, even if the policy itself is unbiased. This is obviously true and those who dispute that as occurring on Wikipedia just aren't paying close enough attention IMO. I suppose you need to wait for an empirical study to be convinced. You may be waiting a while. Jack4576 (talk) 12:41, 9 May 2023 (UTC)[reply]
I have looked at some of your AfD comments and you don't even provide Wikipedia policy for your reasons. It's really a blunder what you're going on about. – The Grid (talk) 12:55, 9 May 2023 (UTC)[reply]
I thought we were discussing this article. Feel free to chime in over on AfD if you have constructive feedback. Jack4576 (talk) 13:57, 9 May 2023 (UTC)[reply]
There's plenty of sources that discuss the problem of systemic bias on Wikipedia. But, this paper is worthless for determining its extent, nature, and causes, because its methodology is wrong. Most people would agree that there are issues of bias in, say, the criminal justice system, but an analysis paper on such bias that was based around, say, the magazines left in the police station waiting room rather than the briefings & trainings the police chief gave to the officers would be too nonsensical to be useful. SnowFire (talk) 19:51, 8 May 2023 (UTC)[reply]
worthless is overstating the case. Jack4576 (talk) 23:20, 8 May 2023 (UTC)[reply]
It really isn't. The errors described are catastrophic, along with a touch of nonsensical fake-math from people who don't know statistics. Your user page says you're interested in Australian law - imagine reading a paper with a conclusion you agree with, but the argument cites the 19th century Code Napoleon as if it were Australian law, and thinks that Ned Kelly was Prime Minister of Australia. Maybe the conclusion is correct, but the paper is still worthless. The authors clearly never ran their draft past an experienced Wikipedian. SnowFire (talk) 12:50, 9 May 2023 (UTC)[reply]
Personally, I would be shocked if there wasn't some degree of systemic bias in our handling of notability. I think there is absolutely a conversation to be had there. But we cannot reasonably have that conversation when the facts have been so badly muddied that essays are being described as guidelines and the search engine test is being held up as an example of How Notability Works. If your factual information is that poor, then just about the only thing you've told us is that there is systemic bias, somewhere, maybe, which I imagine most of us probably already knew. --NYKevin 04:05, 9 May 2023 (UTC)[reply]
And if we didn't know, we sure aren't going to be convinced by reading that. Compassionate727 (T·C) 08:38, 9 May 2023 (UTC)[reply]

The selective quoting in the paper is so precisely selective in every single instance in order to omit actual statements of substance about notability from those being quoted that I can't help but think it was entirely purposeful by Lemieux et al. in order to farm quotes that present a viewpoint that supports the claims of the paper itself. Pretty disgraceful, if you ask me. SilverserenC 16:46, 8 May 2023 (UTC)[reply]

Great review article, kudos to the contributors CactiStaccingCrane (talk) 01:02, 18 May 2023 (UTC)[reply]


The Signpost · written by many · served by Sinepost V0.9 · 🄯 CC-BY-SA 4.0