The Signpost
Single-page Edition
WP:POST/1
11 February 2013

Op-ed
An article is a construct – hoaxes and Wikipedia
Featured content
A lousy week
WikiProject report
Just the Facts
In the media
Wikipedia mirroring life in island ownership dispute
News and notes
UK chapter governance review marks the end of a controversial year
Discussion report
WebCite proposal
Technology report
Wikidata client rollout stutters
 



2013-02-11

Wikipedia mirroring life in island ownership dispute

By Andreas Kolbe
The location of the disputed islands.

On 5 February 2013, Foreign Policy published a report by Pete Hunt on editing of the Wikipedia articles on the Senkaku Islands and Senkaku Islands dispute. The uninhabited islands are under the control of Japan, but China and Taiwan are asserting rival territorial claims. Tensions have risen of late—and not just in the waters surrounding the actual islands:


As the Foreign Policy article reports, the talk page of the Senkaku Islands article is replete with accusations of bias and censorship, with each side claiming to uphold Wikipedia policy—conduct which, Hunt says, mirrors that of Japanese and Chinese officials citing international law to back up their claims and counterclaims.

The growth of the on-wiki dispute paralleled that of the real-world conflict. Created in 2003 by User:Menchi, the Senkaku Islands article originally gave preference to the traditional Chinese name in its lead sentence, with the Japanese name mentioned second, and it was short, at just 300 words. By January 2010, it had grown to more than ten times that size, with 43 sources cited. In October 2010, User:Tenmei created a standalone article on the conflict.

As the political conflict around the islands intensified, so did the conflict at the Wikipedia article. The first point of contention was the islands' very name—should it be Diaoyutai Islands (the Taiwanese name), Diaoyu Islands (preferred in China), or the Japanese name, Senkaku Islands? Some editors advocated using the English name, Pinnacle Islands, to avoid the appearance of bias, but as Hunt reports:


The second area of dispute was the question of who owned the islands; over time, the article grew to describe, "in long, excessively detailed sections", the grounds on which three different governments came to argue that the islands were rightfully theirs.

The third point of contention, Hunt says, has been editorial neutrality, with editors using the supposed nationality of their opposite numbers as a focus for attacks. But in the end, Hunt concludes, the unappealing, time-consuming and emotionally exhausting process delivers a result:


Hunt ends with the suggestion that for this and similar political disputes, Wikipedia forms what he calls a "kinetic diplomatic front":


In brief

Orbit of 274301 Wikipedia


2013-02-11

Wikidata client rollout stutters

January engineering report published

In January:
  • 112 unique committers contributed patchsets of code to MediaWiki (no change from December)
  • The total number of unresolved commits stood at 650 (no change).
  • About 45 shell requests were processed (up 6).
  • Wikimedia Labs now hosts 155 projects (up 7) and has 931 registered users (up 84).

—Adapted from Engineering metrics, Wikimedia blog

The WMF's engineering report for January was published this week on the Wikimedia blog and on the MediaWiki wiki ("friendly" summary version), giving an overview of all Foundation-sponsored technical operations in that month (as well as brief coverage of progress on Wikimedia Deutschland's Wikidata project, whose phase 1 is in the process of going live on the English Wikipedia). Of the five headlines picked out for the report, one (the data centre migration) had already received detailed Signpost coverage. The other four highlight, respectively, updates to the mobile site to allow primitive editing, upload and watchlist functionality; "progress on input methods and our upcoming translation interface"; a restructuring of the way MediaWiki stores co-ordinates; and a testing event to assess how VisualEditor handles non-Latin characters.

In many respects, then, January was a quieter month for Wikimedia Engineering, reflecting in part the uncertainty of the data centre migration (though in the event very little actively broke). Of the Foundation's own core projects (that is to say, excluding the Wikimedia Deutschland-led Wikidata project), only the nascent Echo project showed visible improvement over the month. Flow – the Foundation's latest attempt to fix talk pages, particularly with respect to user-to-user communications – did however enter the design stage, while the VisualEditor project saw another month of refinements and bugfixes. In addition, as previously reported, the Foundation's Editor Engagement Experiments (E3) team launched the Guided Tours extension in January, allowing users to be "walked through" their first edit.

In any case, the report allows for a detailed look at some of the smaller-name projects receiving the Foundation's support. In January, that included work on a tool for Unix/Linux users to allow them to import copies of Wikimedia sites more easily by converting the current XML output to a more data-friendly form. The tool came after WMF developers realised the current process for mirroring a Wikimedia wiki was "painful and cumbersome at best, and unfathomable for the end-user in the worst case". The WMF's involvement with the Outreach Program for Women also began on January 3, with six women new to open-source programming taking on three-month micro-projects; this month, the Foundation also reaffirmed its intention to apply to be part of the Google Summer of Code programme, which targets intermediate-level developers of either gender.
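For readers curious what converting MediaWiki's XML dump output to a more "data-friendly" form involves, the following Python sketch illustrates the general idea: walking the export XML and emitting one JSON record per page. This is an illustration only, not the WMF tool; the sample dump and function names are hypothetical, though the element names (`page`, `title`, `revision`, `text`) follow the MediaWiki export schema.

```python
import json
import xml.etree.ElementTree as ET

# A tiny stand-in for a MediaWiki XML export. Real dumps are far larger
# and namespace their elements; the local-name matching below copes with
# any export schema version.
SAMPLE_DUMP = """<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.8/">
  <page>
    <title>Senkaku Islands</title>
    <revision><text>The islands are disputed...</text></revision>
  </page>
</mediawiki>"""

def local(tag):
    """Strip the XML namespace prefix from a tag name."""
    return tag.rsplit('}', 1)[-1]

def pages_to_records(xml_text):
    """Yield one JSON-friendly dict per <page> element in the dump."""
    root = ET.fromstring(xml_text)
    for page in root:
        if local(page.tag) != 'page':
            continue
        record = {}
        for child in page.iter():
            name = local(child.tag)
            if name in ('title', 'text') and child.text:
                record[name] = child.text
        yield record

if __name__ == '__main__':
    # Emit newline-delimited JSON, one record per page.
    for rec in pages_to_records(SAMPLE_DUMP):
        print(json.dumps(rec))
```

A real mirroring tool would stream the multi-gigabyte dump with `ET.iterparse` rather than loading it whole, but the shape of the transformation is the same.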

In brief

Not all fixes may have gone live to WMF sites at the time of writing; some may not be scheduled to go live for several weeks.



2013-02-11

UK chapter governance review marks the end of a controversial year

Wikimedia UK (WMUK), the national non-profit organization devoted to furthering the goals of the Wikimedia movement in the United Kingdom, has published the findings of a governance review conducted by management consultancy Compass Partnership.

This review was partially the result of a conflict-of-interest controversy revolving around Roger Bamkin, whose roles as English Wikipedia editor, trustee of WMUK, creator of QRpedia, and paid consultant for MonmouthpediA and GibraltarpediA received much press coverage, including a Signpost report. Bamkin subsequently resigned from WMUK's Board of Trustees.

WMUK's turbulent year was dotted with other trustee resignations as well. Ashley Van Haeften resigned from the position of chair in August 2012 after his ban from the English Wikipedia. Later that month, Joscelyn Upendran resigned from the board itself, stating that "personal loyalties may be getting in the way of what is really best for the charity and of dealing with any actual or perceived conflict of interest issues" in regard to Bamkin's actions.

Following these events, the chapter and the Wikimedia Foundation (WMF) published a joint statement on September 28, 2012, where they laid out their plan to appoint an independent expert to review and report on the governance practices of WMUK, along with its handling of the controversy. The WMF's head of communications Jay Walsh posted a blog post on February 7, which said in part:


Compass Partnership was selected to conduct the review through a collaborative dialogue between the WMF and WMUK, and its fee was covered in full by the WMF.

Compass reported that while WMUK had conflict of interest guidelines, and individual trustees had typically stated their conflicts of interest—including Bamkin—the former were "not always implemented to the standard expected by the movement" and the latter could have been made much more transparent (pp. 13–14). In particular, with regard to the Bamkin controversy, the report found no "indication that the Wikimedia UK board formally asked to know the monetary value of any personal contracts to permit an assessment of the material extent of Roger Bamkin's consultancy work" (p. 8). While some individuals interviewed by Compass believed that the foundation would have known of the conflicts of interest through various postings on WMUK's website, Compass found that the declarations were only posted after discussions with the WMF had already begun, and there was no reference to conflicts of interests in WMUK's reports to the foundation.

Compass laid out 50 recommendations that it believes WMUK should implement to better capitalize on previous positive actions and tackle areas identified as needing work (pp. 17–26). Conflicts of interest were principally dealt with in recommendations 26 through 32, where Compass stated that WMUK should observe the "highest standard" in dealing with potential conflicts of interest.

To do this, Compass recommended that if WMUK trustees thought that there could be "any potential for the perception of a conflict of interest", they should contact the chair. Furthermore, when judging this, the board should gather all of the necessary information before coming to a decision, which includes "the size and extent of the personal or financial interest and the identity of relevant business associates." If this is not possible, Compass believes that WMUK should automatically assume that there is a conflict, and possibly request the resignation of the trustee.

Roger Bamkin, when contacted by the Signpost, told us that recommendation 32 may make it difficult to use otherwise perfectly suited candidates in the short term, but as recommended by the review, he believes that the "role of trustees will change and staff members will be available to take on more of the management roles." He also found that recommendation 47 (pp. 25–26), which regards the negotiations required for the use of the Wikimedia trademark and the role of conflict of interest declarations in them, "is a very good idea that will add to the important and essential safeguards of due diligence, the need to make no assumptions about contracts, and to check when the trademark agreement is required."

When asked about recommendation 50, which reads in part that "Wikimedia UK should swiftly come to agreement with the owners of QRpedia on the future ownership of this software", Bamkin pointed to a recent agreement with WMUK that will transfer the domain names and intellectual property of QRpedia to WMUK, while retaining for Bamkin and its coder, Terence Eden, moral rights of attribution, without financial compensation.

The current chair of WMUK's Board of Trustees, Chris Keating, stated to the Signpost via email:


The governance review, which also gave recommendations on items like the size of WMUK's board, how to run board meetings, and the relationship of WMUK with the Wikimedia movement, is available on Commons. A centralized discussion of it is taking place on meta, and there is a questions and answers page on the WMUK blog.

In brief

  • Picture of the Year: The Wikimedia Commons Picture of the Year contest has entered round two, in which editors with more than 75 edits may vote for one picture. Voting will be open until 14 February.
  • Fundraising: The Wikimedia Foundation (WMF) is planning to start testing new fundraising banners on 5% of anonymous users. No banners will be shown to logged-in users, nor those in previously targeted countries. Last year's fundraiser was conducted in December, but only in the top five English-speaking countries: the United States, United Kingdom, Canada, Australia, and New Zealand. The banners were taken down early after the foundation hit its US$25 million target.
  • Echo: The WMF has published a blog post introducing a new notifications system, called Echo. The Editor Engagement Team hopes that it will answer the question, "How can our users learn about events that affect them, so they can contribute more productively to MediaWiki sites like Wikipedia?"
  • Individual Engagement Grants: Applications for IEGs, the new WMF grant scheme, are due by February 15 and can be reviewed on Meta.
  • Steward election: The annual election of stewards, who have complete access on all WMF wikis to deal with cross-project vandalism, among other matters, is open for voting until February 27.
  • English Wikipedia



2013-02-11

An article is a construct – hoaxes and Wikipedia

The views expressed in this op-ed are those of the author only; responses and critical commentary are invited in the comments section. The Signpost welcomes proposals for op-eds at our opinion desk.

Wikipedia gets quite a bit of press attention from drive-by vandalism, incoherent scribbles, rude gestures, and plain page blanking perpetrated by Internet trolls and schoolchildren who take the site's free-to-edit model as an invitation to cause as much havoc as possible. The public perception that Wikipedia is riddled with errors and perpetually vandalized was a major hindrance in the site's formative years, when it first engaged in its still-central battle for relevance and accuracy.

But this is a battle that, on the whole, Wikipedia has been winning for quite some time. Years of nearly unchecked growth and explosive expansion have made Wikipedia not only the largest but also the most expansive information compendium the world has ever seen. Editing is tightly watched by users armed with tools like Twinkle, Huggle, rollback, semiprotection, and bots. Vandalism as we most commonly think of it is anything but dead—visible pages still regularly get as much as 50 percent of their edits reverted[1]—but today's arsenal of anti-vandalism tools has confined it, in lesser form, to the furthest and most overtaxed fringes of Wikipedia.

The dearth of vandalism lasting more than a few seconds has done much to improve our image. Five years ago, a project as enterprising as the Wikipedia Education Program could never have existed, let alone thrived as it does today.[2] The days when being a regular editor on Wikipedia was seen as unusual are slowly becoming more distant, its use ever more mainstream, and its editing body ever more academic. But another, subtler form of vandalism persists, and with the decline of its more visible cousin may even be spreading—fabrication.[3] Wikipedia has a long, dare one say storied, history with the spinning of yarns; our internal list documents 198 of the largest hoaxes caught as of 4 January 2013. This op-ed will attempt to explain why.

It's frighteningly easy

Wikipedia's policy on vandalism is complex and extensive. Coming in at 41 KB, it is best remembered by the {{nutshell}} wrapper that adorns its introduction, stating that "Intentionally making abusive edits to Wikipedia will result in a block", a threat carried through more often than not. At just over 5k, the guideline on dealing with hoaxes is comparatively slim, and readily admits that "it has been tried, tested, and confirmed—it is indeed possible to insert hoaxes into Wikipedia". It is not hard to tell which is the more robust of the two policies.

First and foremost, this is a consequence of Wikipedia's transitional nature. The site has become mired somewhere between the free-for-all construction binge it once was and the authoritarian, accuracy-driven project it is quickly becoming. The days of rapidly developing horizontal sprawl are long gone, swallowed up by the project's own growth; increasingly narrow redlink gaps and ever deeper vertical coverage are the new vogue, spearheaded by the raising of standards and the creation of such initiatives as GLAM and the Education Initiative. Wikipedia gets better, but it also gets much more specialist in nature, and this has a major impact on its editing body. Explosive growth in both the number of articles and the number of editors, once the norm, has been superseded by a more than halved rate of article creation and a declining number of active editors, both despite bullish, frankly unrealistic growth projections by the Wikimedia Foundation.[4] The project has reached its saturation limit—put another way, there simply aren't enough new people out there with both the will and the smarts to sustain growth—and the result is that an increasingly small, specialized body of editors must curate an increasingly large, increasingly sophisticated project.[5]

A sparser, more specialized editing body dealing with highly developed articles and centered mainly on depth has a harder time vetting edits than a larger, less specialized one focused more on article creation. Take myself as an example: while I have the depth of field to make quality tweaks to Axial Seamount, I could never do as good a job fact-checking Battlecruiser as a Majestic Titan editor could, and I cannot even begin to comprehend what is going on at Infinite-dimensional holomorphy. This hasn't mattered much for pure vandalism: the specialization of tools has proved more than adequate to keep trollish edits at bay. But vetting tools have not improved to match; the best solution available, pending changes, has received a considerable amount of flak for various reasons, and has so far been rolled out only in extremely limited form. On pages not actively monitored by experienced editors, falsified information can and indeed does slide right through; with an ever-shrinking pool of editors tending an ever-growing pool of information, this problem will only get worse for the foreseeable future.

The relative decline in editor vetting capacity is paralleled by the ease with which falsehoods can be inserted into Wikipedia. Falsified encyclopedic content can exist in one of three states, by its potential to fool editors examining it: inserted without a reference, inserted under a legitimate (possibly offline) reference that doesn't actually support the content, and inserted under a spurious (generally offline) reference that doesn't actually exist. While unreferenced statements added to articles are often quickly removed, or at least tagged with {{citation needed}} or {{needs references}}, editors passing over a page who aren't particularly knowledgeable about the topic at hand are extremely unlikely to check newly added references, even online ones, to make sure the information is legitimate. This is doubly true for citations to offline sources that don't even exist. Taking citations at face value is standard operating procedure on Wikipedia: think of the number of times you have followed a link through, looked up a paper, or fired off an ISBN search to ascertain the credibility of a source in an article you are reading; for most of us, the answer is probably "not many". After all, we're here to write content, not to pore over other articles' sourcing, a tedious operation most of us would rather not perform.

This is why complex falsifications can be taken further than mere insertions: they can achieve the kinds of quality standards that ought to speedily expel any such inaccuracies with great prejudice. The good article nominations process is staffed in large part by two parties: dedicated reviewers who are veterans of the process, and experienced bystanders who want to do something relatively novel and assist with the project's perennial backlog. In neither case are the editors necessarily taking up topic matters they are familiar with (most of the time they are not), and in neither case are the editors obligated to vet the sourcing of the article in question (they rarely do; otherwise who would bother?[6]), whatever the standards on verifiability may be. And when a featured article nomination is carried through without the contribution of content experts (entirely possible), or the falsification is something relatively innocent like a new quote, such articles may even scale the heights of the highest standard of all on Wikipedia, that much-worshiped bronze star! Nor are hoaxes necessarily limited to solitary pages; they can spread across Wikipedia, either through intentional insertions by the original vandal, or through the process of "organic synthesis"—the tendency of information to disseminate between pages on Wikipedia, either through copypaste or the addition of links.

Then why aren't we buried?

Readers of this op-ed may well note its alarmist tone, but they need not be worried: studies of Wikipedia have long shown that it is very accurate, and, by extension, that false information is statistically irrelevant. Well, if, as I have striven to show, manufacturing hoaxes on Wikipedia is so strikingly easy, why isn't it a major problem?

Answering this question requires asking another one: who are vandals, anyway? The creation of effective, long-lasting hoaxes isn't a matter of shifting a few numbers; it requires an understanding of citations and referencing, the manufacture of plausible sources, and the investment of real intellectual effort in an activity mostly perpetrated by unsophisticated trolls and bored schoolchildren. As it turns out, the difficulty of making a believable case for misinformation is a high wall for would-be vandals. And even when real hoaxes are made, studies have shown that Wikipedia is generally fairly effective (if not perfect) at keeping its information clean and rid of errors. Hoaxes have reached great prominence, true, but they are small in number, and they can be caught.

But there is nonetheless a lesson to be learned. Wikipedia is extremely vulnerable. If a sophisticated actor wanted to launch a smear campaign on the site, falsification would be the way to do it; and that should concern us. The continual unveiling and debunking of hoaxes long after their creation is a drag on the project's credibility and on its welfare, and when news about hoaxes on the site breaks in the media, it takes a toll on our mainstream acceptance. This is not a problem that can be easily solved; but nor is it one that should be, as it is now, easily ignored.

Addendum: some highlights

Sorted by date of discovery, here is a selection of what I consider to be fifteen of the most impactful and notable hoaxes known to have existed on Wikipedia.

  • November 6, 2003 – February 23, 2004: Uqbar. One of the earliest hoaxes to have been debunked, the kingdom of Uqbar is a historical hoax (a story within a story) that was passed off as real early in Wikipedia's history.
  • December 2004 – April 2005: Roylee. A request for comment on four months of activity from a user who "has carried out a sustained introduction of fringe theories and original research into a large number of articles (145 listed at User:Mark Dingemanse/Roylee [defunct]) since December 2004."
  • May 26 – September 22, 2005: Wikipedia biography controversy. To quote from the article: "a series of events that began in May 2005 with the anonymous posting of a hoax article ... about John Seigenthaler, a well-known American journalist. The article falsely stated that Seigenthaler had been a suspect in the assassinations of U.S. President John F. Kennedy and Attorney General Robert F. Kennedy. Then 78-year-old Seigenthaler, who had been a friend and aide to Robert Kennedy, characterized the Wikipedia entry about him as "Internet character assassination". The hoax was not discovered and corrected until September 2005... after the incident, Wikipedia co-founder Jimmy Wales stated that the encyclopedia had barred unregistered users from creating new articles."
  • October 5 – 26, 2005: Alan Mcilwraith. A former call center worker who created a new identity for himself as a decorated military man on Wikipedia, complete with an in-uniform portrait (now known to have been bought on eBay). The story hit headlines in April 2006, and the article was recreated—now about the hoax he perpetrated (see Signpost coverage).
  • ? – March 3, 2007: Essjay controversy. The only fabrication on Wikipedia major enough to have a 39k Good article to call all of its own, this was a hoax not in the classical sense—that is, not carried out across the mainspace—but in an extremely prominent editor's falsified credentials; when combined with a poorly timed promotion to ArbCom, the result was a spectacular fireworks display.
  • November 2005 – 21 June 2007: Baldock Beer Disaster. A disaster in more ways than one; the article appeared on the Main Page as a Did you know? entry on November 25, 2005, and was not rooted out until more than a year and a half later.
  • November 18 – December 18, 2008: Edward Owens hoax. A fisherman turned pirate who never really existed, created by students as part of a class exercise at George Mason University; now has its own article.
  • September 13–14, 2010: Roger Vinson. An addition was made claiming that the man in question, a federal judge in Florida, is an avid taxidermist who displays mounted bear heads in his courtroom. When Rush Limbaugh used this erroneous information on his talk show, it sparked a media reaction—a demonstration of how even relatively short-lived pieces of vandalism can be damaging.
  • Spring 2009 – October 2011: Cohen-Cruse Ruse. "A number of apparent sock puppets seem to be creating an elaborate set of fake pages around a few members of a "Cohen" and a "Cruse" family. It involved a number of completely (very carefully) faked biographies, other faked things (like synagogues) and a lot of associated edits to real pages that attempted to justify and contextualize those fake people." It lasted two years, and a major community clean-up followed.
  • ? – February 15, 2012: Legolas2186. Allegations of impropriety were brought against Legolas2186, a prolific (and supposedly trustworthy) writer with a large number of Madonna-related article credits to his name. As was eventually discovered, Legolas had been manufacturing sources, inventing information, and generally doing as he damn well pleased with his sourcing. A permanent ban and months of clean-up by the community followed (see Signpost coverage).
  • March 8, 2006 – March 21, 2012: Brierfield, Lancashire. An addition was made claiming that the small town was the primary inspiration for Tolkien's Mordor. By the time it was removed in March 2012, it had been on the page for a good six years.
  • June 9, 2004 – July 13, 2012: Gaius Flavius Antoninus. Created on June 9, 2004 and lasting eight years and one month before discovery, this purported assassin of Julius Caesar has the honor of being the longest-lasting hoax ever created on Wikipedia. Given the level of dissemination that happened in that time and the prominence of Caesar's (historically classical) assassination, it's also probably one of the most illustrative of the failings of Wikipedian vetting.
  • September 25, 2005 – November 19, 2012: Chen Fang. Chen Fang was the mayor of a small town in China, but he was also a student at an American university who created a fictional article about himself to make a statement about Wikipedian inaccuracy, and his case was cited in a Harvard University writing guideline on the topic. It took seven years and two months for someone to notice.
  • July 4, 2007 – January 28, 2013: Bicholim conflict. The primary inspiration for this op-ed, the Bicholim conflict is (was) one of the most complex and well-crafted hoaxes to have existed on Wikipedia, and spent half a decade, most of its life, as a supposedly verified Good article. A complete fabrication, in 4,500 words it described a clash between colonial Portugal and the Indian Maratha Empire in an undeclared war that supposedly helped cement Goa's independence (see Signpost coverage).
  • ? – February 1, 2013: Bonō Pusī Kalnapilis. A hoax created on our sister project, the German Wikipedia, that was not discovered to be a hoax until it was selected as a Did you know? entry, spending two hours on the main page before being caught.

Notes

  1. ^ See the rough guide to semi-protection.
  2. ^ Not to imply that it has been unilaterally successful, but rather that it is quite voluminous.
  3. ^ The difference between fabrication and hoaxes on Wikipedia is not strictly defined, as Wikipedia hoaxes are technically articles that are spurious. This op-ed will treat the matter in a wider sense and include smaller bits of misinformation.
  4. ^ Per the movement goals of the Strategic Planning Initiative.
  5. ^ For more information on the why of Wikipedian editing trends, refer to this op-ed: "Openness versus quality: why we're doing it wrong, and how to fix it". For more details on the Wikimedia Foundation's response, refer to this special report: "Fighting the decline by restricting article creation?".
  6. ^ Good article reviewers are as much regular editors as the next fellow, which means that they find vetting references about as fun as the next fellow—that is to say, not at all. But see revisions made to the reviewing guideline in light of recent discussion on the topic.





       

The Signpost · written by many · served by Sinepost V0.9 · 🄯 CC-BY-SA 4.0