The Signpost

Opinion

Trusting Everybody to Work Together

By Pete Forsyth
This article by Pete Forsyth, a former editor-in-chief of The Signpost, is an edited version of an article published by Wikipedia @ 20 (CC BY 4.0)

I first embraced Wikipedia's radical, open invitation to write an encyclopedia in 2006. Most other web platforms at that time featured restrictive permission schemes. Their software, policies, and culture sharply limited users' ability to express their ideas. Wikipedia's platform, by contrast, mostly stayed out of my way. I was free to explore my interests and share my knowledge. I quickly got to know and work with other people who shared my interests, without needing to seek permission first. The site's documentation and policies encouraged me to use my own judgment and contribute what I felt was worthwhile; and its inner workings reinforced that ethos, enabling me to simply follow my own conscience and collaborate.

Wikipedia's creators eliminated the editorial bottleneck in traditional encyclopedia writing by using a flexible software platform that empowered contributors. But the original wiki software they had adopted in 2001 wasn't fully up to the task. Key policies and software features had to be developed before Wikipedia could truly become "the encyclopedia anyone can edit"—that is, before "anyone" could come to include hundreds of thousands of people working in concert. Once implemented, those early improvements to the platform also benefited discerning readers, providing insights into the production process. This element—facilitating reader insight into writers' actions and motivations—may at the time have seemed an ancillary benefit, secondary to the need to support a burgeoning community of writers and editors. But transparency to readers has become a key component of Wikipedia's identity, and as Wikipedia's star has risen, many of us have come to expect greater transparency from more traditional publications as well.

Those building the early versions of Wikipedia's software were guided more by instinct than by careful planning. Staff and volunteers deliberated individual features on the Wikipedia-L email list and on the wiki itself. Reading through those old messages, which are publicly archived, one gets to know a community in pursuit of a shared ideal, rather than a businesslike group pursuing a carefully planned strategic roadmap. The email list discussions are rife with aspirations and idealism, but they lack the kind of gravitas one might expect in the founding days of what would become one of the world's top websites. Wikipedia's earliest architects, both staff and volunteer, managed to construct a robust core of policies, procedures, and software that in many ways outstripped the projects planned out by the executives and investors of other major websites.

Two decades into Wikipedia's existence, the importance of some of the key software features introduced in those early days is not widely understood. As I'll discuss, Wikipedia's early software empowered writers and readers alike with a complete picture of the activities involved in producing the site's content; but more recent changes to the software have at times eroded the completeness of information available to those users, seemingly with little appreciation for its significance. Maybe the early architects' scorn for such norms as formal hypotheses, approval mechanisms, analysis of risk vs. opportunity, and detailed reports was a double-edged sword; it enabled them to accomplish incredible things at an inflection point in the Internet's evolution, but it did little to establish broad buy-in to a set of principles that could guide future efforts. Since there was no clearly articulated central theory in the early days, we have little to go on in explaining how Wikipedia's basic infrastructure has come to support so much generative collaboration. A clear understanding of Wikipedia's early success would be of interest to Internet enthusiasts and historians, but it would have great practical value as well. Such an understanding should inform the kind of strategic vision and planning that a project of Wikipedia's current scale requires.

A theory explaining the unexpected success of Wikipedia could also inform other projects focused on organizing people around the globe to build an information commons. Other sites have tried and failed to replicate Wikipedia's success, but have those efforts been driven by a full understanding of why Wikipedia succeeded? In the absence of a clear, widely accepted theory, we should view with skepticism any definitive statement that Wikipedia's success cannot be replicated. We should hesitate to dismiss the possibility of other projects emulating Wikipedia's success, and instead continue to ask basic questions until we arrive at satisfactory answers.

The appeal of transparency in a time of uncertainty

By the time I started editing in earnest in 2006, the software and policy infrastructure I discuss in this essay was largely in place. Wikipedia was well known, and well on its way to mainstream acceptance. Its English edition had nearly a million articles. It was growing by every measure and would soon become one of the top 10 most visited sites in the world. The masses, however, had still not grasped Wikipedia’s grand gesture of trust, or the possibilities it opened up for writers everywhere to influence the way knowledge is shared. The openness that made Wikipedia possible also made it the butt of numerous jokes, from academia to late night television.

The seemingly preposterous Wikipedia policy "Ignore All Rules" (IAR) held the same status as sensible policies like "Wikipedia is an Encyclopedia" and "Wikipedia is written from a Neutral Point of View". An early formulation of IAR stated: "if rules make you nervous and depressed, then simply use common sense as you go about working on the encyclopedia". Wikipedia dared to defy conventional wisdom, in order to magnanimously welcome the good faith of anyone and everyone.

My own nervous depression, however, had nothing to do with Wikipedia's rules. I worried about rules and traditions more broadly conceived. I worried that society's vast collection of rules, both written and unwritten, might not provide a framework robust enough to bring about a peaceful, respectful, sustainable civilization. Because some crazy stuff was happening.

The early 2000s were defined, for me and for many others around the world, by the horrific attacks on the U.S. on September 11, 2001, and by the political response to them. That day began with the inconceivable news that terrorists had hijacked multiple airplanes, and used them as weapons to kill thousands of civilians and strike at the heart of the country's government and financial institutions. But this was just the first wave of attacks on civilized life: the U.S. government, in its various responses, seemed all too ready to sacrifice honesty, transparency, and civil liberties in the name of security, causing further institutional damage from within.

In 2006, the U.S. Senate—a body often praised for its rigorous adherence to elaborate, time-tested rules ensuring rational and accountable decisions—passed a significant bill, the reauthorization of the USA PATRIOT Act. After the bill was signed into law, though, the public suddenly learned that last-minute changes to its text permitted the President to appoint prosecutors unilaterally. How could such significant changes be made in secret, especially for such a consequential bill? Who had made the changes? Amazingly, nobody seemed to know.

News outlets initially reported that Arlen Specter, chair of the Senate Judiciary Committee, was responsible. But the senator disavowed all knowledge. Reporters took his denials at face value, even before any alternative theory emerged. I found this confusing. Eventually, we learned that the committee's legal counsel had made the changes. This occurred without the senator's knowledge, but under his authority. Who was responsible for the changes? When it came down to it...nobody was.

Months later, President George W. Bush took advantage of the bill's provisions; in so doing, he ignited a scandal around the politicization of the court system. Senators on the committee, however, professed to have been no more aware of the changes to the law than the public and the press. The opacity of the entire situation was baffling. Wasn't this law, like all our laws, deliberated and passed in public view, and weren't records carefully preserved? Didn't legislators, or at least their staffs, know the first thing about using software, or any number of more traditional tools, to keep track of who made what changes? What was the point of a democratic system of government if a single, unelected person could slip a momentous provision into a law unnoticed? I longed for a system that did a better job of standing up for principles of transparency and accountability.[1]

I began dabbling with Wikipedia that same year. I didn't realize it yet, but working on this platform would gradually restore my sense of hope that humans could self-organize to make the world a better place. The philosophy behind Wikipedia, as expressed through policies and software design, drew on many familiar traditions; but the site's idiosyncratic take on longstanding concepts was new and refreshing.

Wikipedia's software, in contrast to the workings of our nominally democratic system of government, exposed data about who was making changes to what. As soon as a change was made, anybody on the Internet could learn who (or at least, what user account) had taken action and exactly what he or she had done. On Wikipedia, one didn't have to rely on civil servants or the press to make such information available; the information, by design and by reliable automated processes, was just a few clicks away.

Wikipedia's wiki software had roots in a software movement founded in the early 1980s. At its core, the "free and open source software" (FOSS) movement was based not on anything technical, but on a basic assertion of rights on behalf of the individuals who build and use software. Wiki software was invented in 1995 by engineer Ward Cunningham, who aimed to enable software engineers to link their experiences and work collaboratively to document patterns in programming. The principles driving these software initiatives reflected the ethos of empowerment supposedly built into our democratic system of government, but seemingly on the wane in the broad public sphere. Even as I developed doubts about whether democratic values could survive in our social and government institutions, I was heartened to see them taking root in the tech world.

I had followed FOSS and wiki software for years, but opportunities to participate in earnest had eluded me. I wasn't a hardcore programmer, and in those days it was pretty difficult to even install and use most FOSS software. With Wikipedia, for the first time, my writing and editing abilities were enough to allow me to get involved.

In 2008, I attended RecentChangesCamp, a wiki conference in Palo Alto. Ehud Lamm, an academic from Tel Aviv, convened a discussion about whether wiki and its principles could help resolve conflict between Israelis and Palestinians. And so, there it was: confirmation that I wasn’t alone in finding parallels and connections between the world of wiki, and the most pressing political problems in the wider world.

The next year, the Wikimedia Foundation hired me to design a pilot program for putting Wikipedia to work in higher education. As I interviewed university teachers across the U.S., I learned that many saw great promise in Wikipedia's ability to impart skills that were difficult to teach in a traditional classroom setting. Wikipedia permitted people all over the world to communicate about the process of building and improving encyclopedic content. Teachers wanted to empower their students to interact, in substantive ways, with people of varying expertise and backgrounds across the world, all in the course of a single term. In academia, I was told, this kind of discourse was generally confined to academic journals, where response articles might be published year upon year. But only scholars advanced enough to successfully submit academic papers, and invested enough in the academic lifestyle to follow discourse on a scale of years or decades, could participate. With proper guidance in Wikipedia's unique environment, undergraduate students could more readily engage in a collaborative dynamic with feedback on a minute-by-minute basis, reaching far beyond the classroom for valuable learning opportunities.

Conditions that support collaboration

Early authors writing about Wikipedia emphasized the importance of tools that support community and communication. The wiki software Wikipedia originally employed in 2001, of course, already included many collaboration features. But important innovations were needed before Wikipedia's contributors could collaborate effectively at scale, without active intervention by traditional expert editors.

Ward Cunningham had built the first wiki, called WikiWikiWeb, to help programmers exchange knowledge about their field more freely. But Wikipedia's purpose was more specific and more ambitious—to enlist the masses to produce a coherent, canonical summary of human knowledge. Several principles guided Wikipedia from the early days; some have been formalized into lasting policies and documentation. Founder Jimmy Wales emphasized "fun for the contributors" as the most important guiding principle in a June 2001 email, predicting that without it, Wikipedia would die.[2]

Respecting human judgment and good intentions also held central importance. Around the time he first formulated the "Ignore All Rules" principle discussed previously, Lee Daniel Crocker, an active participant in Wikipedia's original email list who built an early version of the MediaWiki software, emphasized the importance of using judgment, rather than adopting "hard-and-fast rules."[3] Wales generally concurred.

A web platform's software influences and constrains how users engage with one another, so it plays an outsized role in defining the platform's identity. It establishes the context in which people do or don't enjoy themselves or feel empowered. Web software is often designed to strategically accomplish specific desired outcomes; the Google Analytics tool, for example, is popular in part because it permits a website operator to design campaigns, and then test how closely the site's users follow the desired paths through the site.

Wikipedia's early days, by contrast, were characterized by an absence of explicit strategic planning. In 2006, a Wikipedia user quipped: "The problem with Wikipedia is that it only works in practice. In theory, it's a total disaster."[4] The half-serious notion that no theory can capture the magic of Wikipedia caught on, and is often repeated by Wikipedians and Wikipedia scholars in presentations and in informal conversation. The idea even survived the Wikimedia Foundation's major strategic planning effort of 2009–10. As recently as 2015, respected Internet scholar Jonathan Zittrain used a variant of the phrase to anchor a video praising the value of Wikipedia.[5]

It's not so unusual, though, for new phenomena to initially elude theoretical or scientific explanation. This kind of mystery is what drives scientific inquiry. Scholars do not typically accept defeat in the face of unexplainable phenomena, but rather work to construct new theories that can explain them. Wikipedia, surprising and unfamiliar as it may have been when it emerged, should be no exception. If and when a robust theory of Wikipedia's "magic" emerges, I believe it will give prominent attention to a collection of about eight mutually supporting software features. Many of these eight features, noted below, emerged in the early days of refining Wikipedia's software.

MediaWiki, the wiki software tailored to serve Wikipedia's needs in the early 2000s, grew to incorporate these eight features. These features support one another, providing a complete picture of what's going on. Working with a fairly complete and self-supporting set of data, a Wikipedian can know about the activities of those working on the same content, without relying on other humans or advanced software permissions.

For a more familiar analogy, imagine that a board member newly appointed to help run a company is initially provided only a few tax returns. The board member wants to familiarize herself with the company's finances, and analyze which clients are most important to the company's future; but the tax returns only provide so much information. She won't learn much about individual clients from the tax returns, which provide only a certain kind of view of the company's finances. So she has to rely on a cumbersome process of asking questions of the accounting staff and her peers before she can really start to learn about the finances. What this sophisticated board member really needs, to efficiently accomplish her goal, is direct access to the company's accounting data. That data is complete; it can be audited in various ways to verify that it is internally consistent, that the amount of money in the bank aligns with the flow of income and expenses, and so forth.

Wikipedia users all have access to the site's underlying data, analogous to the company's accounting data. If they want to understand how things work, they don't need to first earn special privileges or build relationships with the proper executives or support staff. Wikipedia provides that data to everyone; so a writer or editor can easily learn such things as who changed an article, when, why, and in what ways, through the software features discussed below.

The ability to find this kind of information can feed a Wikipedian's sense of confidence. When an editor thinks something in an article needs to be adjusted, but wants to fully understand how the article got to its present state, there's no need to email previous contributors, or an editor in charge of the article, and wait for a response, hoping that one hasn't inadvertently caused offense; the software quietly and reliably provides the relevant data. Those same data can help Wikipedians find others interested in the same topics, can help them resolve their own disagreements, and can help them learn technical processes by looking at one another's code.
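All of this is not merely visible on the site; it is openly queryable. As a concrete illustration (my own sketch, not part of the original essay), the following Python snippet uses MediaWiki's public Action API to retrieve who made an article's most recent edits, when, and with what edit summaries. It assumes the third-party requests library is installed, and the article title is just an example:

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

# Request the five most recent revisions of an article, including
# each revision's ID, author, timestamp, and edit summary.
params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Patriot Act",  # example article title
    "rvprop": "ids|user|timestamp|comment",
    "rvlimit": 5,
    "format": "json",
    "formatversion": 2,  # the flatter, list-based response shape
}

data = requests.get(API, params=params).json()
for rev in data["query"]["pages"][0]["revisions"]:
    print(rev["timestamp"], rev["user"], "--", rev.get("comment", ""))
```

The same data underlies the "view history" screen described next; any reader or editor can consult it directly, without asking anyone's permission.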

The "view history" screen displays every edit chronologically, like the version tracking features in a word processor. It can reveal whether a block of text was added all at once or piece by piece, by one author or by several. If the Wikipedian is curious about a specific change, the view history screen will guide her to a "diff" screen, which conveys exactly what changed between any two revisions, and indicates who's responsible. If that earlier editor chose to enter an edit summary, that summary is conveyed in the diff as well, reducing the guesswork involved in figuring out what motivated the change.
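The diff screen draws on the same public record. As another hedged sketch of mine, MediaWiki's action=compare endpoint returns the exact change between any two revision IDs; the IDs below are placeholders:

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

# Compare two revisions by ID; the response carries the same HTML
# diff table that the "diff" screen renders in the browser.
params = {
    "action": "compare",
    "fromrev": 111111111,  # placeholder: an older revision ID
    "torev": 222222222,    # placeholder: a newer revision ID
    "format": "json",
}

data = requests.get(API, params=params).json()
print(data["compare"]["*"][:500])  # first 500 characters of the diff HTML
```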

In addition, anyone consulting the encyclopedia has access to this information. If a sophisticated reader familiar with the features outlined above sees something in an article that doesn't ring true, they can learn something about how the text evolved to that point, including what the article's authors argued about, or what sources they considered but dismissed.

Had the Senate Judiciary Committee used an open installation of MediaWiki to conduct its work, no reporter would ever have had to ask who added a sentence to a bill; they could simply have consulted the software's record and seen exactly who entered the text. If the lawyer entering the text had wanted to demonstrate their diligence, they could have used the edit summary field to provide deeper insights, for instance: "New provision entered per discussion with both Illinois senators."
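As a hedged sketch of how that would work mechanically (the wiki URL and page title here are hypothetical), MediaWiki's standard edit API records the summary alongside the change itself:

```python
import requests

API = "https://legislature.example.org/w/api.php"  # hypothetical MediaWiki installation

session = requests.Session()

# Step 1: fetch a CSRF token (a private wiki would require a login step first).
token = session.get(API, params={
    "action": "query", "meta": "tokens", "type": "csrf",
    "format": "json", "formatversion": 2,
}).json()["query"]["tokens"]["csrftoken"]

# Step 2: save the edit, recording the motivation in the edit summary field.
result = session.post(API, data={
    "action": "edit",
    "title": "PATRIOT Act reauthorization/Draft",  # hypothetical page
    "appendtext": "\n(new provision text)",
    "summary": "New provision entered per discussion with both Illinois senators.",
    "token": token,
    "format": "json",
})
print(result.json())
```

The summary would then appear permanently alongside the change in the page history, exactly as it does on Wikipedia.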

But let's stick with Wikipedia. Let's imagine that sophisticated reader wants to influence the Wikipedia article. They can easily do so, by communicating with the editors or even becoming an editor themselves; MediaWiki software makes both paths easy, beginning with the talk page.

Every Wikipedia article has a "talk page" for in-depth discussion. This permits editors to collect links to source materials, describe their longer-term efforts to expand a section, or talk through any differing views about what belongs in the article. Any Wikipedia contributor can raise complex topics for discussion, or ask her peers about their previous changes. This is a feature that didn't exist in Cunningham's original wiki.
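Mechanically, a talk page is simply a parallel page in its own namespace, reached by prefixing the article's title with "Talk:". A short sketch of mine, using the same public API as above, fetches the current wikitext of an article's discussion page:

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

# A talk page lives in its own namespace: "Talk:" + the article title.
params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Talk:Patriot Act",  # example talk page title
    "rvprop": "content",
    "rvslots": "main",
    "rvlimit": 1,
    "format": "json",
    "formatversion": 2,
}

data = requests.get(API, params=params).json()
page = data["query"]["pages"][0]
wikitext = page["revisions"][0]["slots"]["main"]["content"]
print(wikitext[:300])  # the opening of the ongoing discussion
```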

In 2016, I invited Cunningham to discuss the ways wiki software had evolved since his first wiki, including the emergence of the talk page. He was quick to acknowledge that the ambition to build an encyclopedia introduced new needs and prompted the introduction of important new software features. He told me he had not considered separate talk pages when he designed wiki software.

Pages on the WikiWikiWeb and other early wikis reveal a free-flowing evolution of ideas, which might not have emerged if content and discussion were segregated. Wikipedia, however, aimed for certain standards in its main content; so a separate venue for discussion would be needed. Just a couple months after Wikipedia launched, Wikipedia founder Larry Sanger said: “I think the ‘talk’ page for every page should be automatic, one of the default links in the header/footer.”[6] The talk page became an indispensable component of Wikipedia's platform. With the benefit of hindsight, Cunningham credits the talk page as an important innovation that allowed wiki software to be used to build an encyclopedia.

The above-mentioned page version history is another Wikipedia software innovation that wasn't present in the first iteration of the WikiWikiWeb. Cunningham initially didn't preserve old versions at all; trusting in the good intentions of his site's user community, he resisted writing code to capture them, and added code to retain only the most recent prior version, merely to work around a flaw in web browser software. He believed his users' contributions would tend to be more generative than destructive, and for his purposes that was enough. But he came to recognize that tens or hundreds of thousands of people building an encyclopedia would need ready access to version information, and he later expanded the version history features in response to a kind of malicious edit that had emerged. His perspective as a software designer was no different from that of any seasoned wiki editor: when others grafted changes onto the core he built, he was quick to observe the advantages. He acknowledged that the ability to observe patterns in a user's work informs working relationships, and that version histories can make such patterns visible to all users.

Based on my experiences as a Wikipedia editor, I explained to Cunningham that I believed Wikipedia’s software supported collaboration by giving editors ready access to a fairly complete set of relevant data—the eight software features listed above. He agreed, and told me about a design concept, the "observe–orient–decide–act" (OODA) loop, which originated in military strategic theory. The OODA loop theory emphasizes that easy access to relevant data is a crucial component of effective decision-making. This theoretical notion has been referenced in a wide variety of fields in the decades since it was first presented; scholars have applied the OODA concept to such topics as nursing home design and self-managed construction teams. I was taken with this concept. After all, what is the work of a Wikipedia editor, if not making a string of decisions? And what decision doesn't benefit from better access to relevant information?

Over the Internet's first few decades, many websites aiming to generate self-sustaining collaborative activity have come and gone. The absence of the eight features identified above—of the ability to perceive and act without editorial intervention—seems to have played a role in the demise of some of these sites.

For instance, in the early 2010s I dabbled with the Open Directory Project (ODP), a predecessor of Wikipedia that built a web directory through broad-based peer production. Accustomed to wikis, I was frequently frustrated by the challenges of determining exactly what had changed and when, who had done it and why, and how I could effectively engage them in discussion if I disagreed. I would struggle to find the right venue to inquire about such issues. When a response finally came, sometimes weeks after I had moved on to something else, it could be unhelpful or even dismissive. Compared to my experience on Wikipedia, I had to spend more time, and expose myself to more frequent judgment by fellow volunteers, to accomplish even the most basic tasks. In my view, the challenges inherent in contributing to the ODP did not bode well for its long-term survival, especially when compared to sites like Wikipedia, where working together is easier.[7]

Wikipedia users have ready access to information, such as what changes have been made and who made them, that allows them to meet their own needs. By default, users have a great deal of autonomy; to accomplish basic tasks, they don't need to seek out special privileges or privileged information. To use the terminology of OODA, the software helps them observe their surroundings and orient themselves prior to deciding and acting.

In contrast to traditional content production models, users encounter fewer obstacles to creating high-quality content.

Mobile Wikipedia: Core design principles forsaken

In the present decade, new hardware has expanded our options for editing Wikipedia. Many of us now carry powerful computers, known as smartphones, everywhere we go. We've become accustomed to using them to change things on the web, whether by updating our social media profiles, buying and selling on an e-commerce platform, or publishing blog posts.

Wikipedia is no exception. Since the advent of Internet-enabled phones, people have been using mobile devices to create and edit Wikipedia content. This was initially possible, if cumbersome, by using Wikipedia's normal web interface. Beginning about 2013, the Wikimedia Foundation began offering two new interfaces specially designed for mobile devices: the Wikipedia mobile web interface and the Wikipedia mobile app.

The mobile-specific options, however, leave out vital parts of the desktop experience. Specifically, they leave out some of the eight core software features listed above. They offer no access to talk pages and only minimal insight into version history. If a Wikipedian wants to understand the motivations and actions of her fellow editors, the mobile interface offers few clues about what they are doing or why they are doing it. A Wikipedia editor (or a Wikipedia reader, for that matter) is at a disadvantage when using the mobile interfaces.

How did that come to be? One might expect the experience of Wikipedia editors to play a central role in designing new software. In the early days, ideas flowed freely on a public email list and on wiki talk pages. But as Wikipedia entered its teens, the collaborative spirit between Wikimedia Foundation personnel and active editors declined, leading to more complex dynamics in determining what features best serve software users. As the Wikimedia Foundation developed more sophisticated software for Wikipedia, its leaders at times disregarded feedback from actual Wikipedia editors. I had this experience in 2014, when a thousand Wikimedians joined me in expressing our shared concern about the Wikimedia Foundation's approach to deploying a new piece of software. The Foundation never even acknowledged receipt of the letter, nor did it respond to the two requests it contained.[8]

The next year, Wikipedian and author Andrew Lih argued in a widely circulated op-ed that mobile devices were crucial to Wikipedia's future.[9] But the division between those designing the software and those using it was again on display. Lih quoted a member of the foundation board, who said: "some communities have become so change-resistant and innovation-averse" that they risk staying "stuck in 2006 while the rest of the Internet is thinking about 2020 and the next three billion users".[10]

Later that year, veteran Wikipedia author Jim Heaphy published an essay outlining a number of reasons why he uses the desktop interface when editing from his smartphone, rather than the specialized interfaces created by the Wikimedia Foundation.[11] But his words, it would seem, have gone unheeded; years later, the substantial differences between Wikipedia's desktop and mobile interfaces remain. When using Wikipedia's mobile interfaces, editors are exposed to only a fraction of the valuable information contained in its pages. The products they use obscure crucial features, such as talk pages, policy pages, and edit histories, thereby handicapping their experience and limiting their ability to collaborate with their peers.

Collaboration software is also a basis for reader trust

Writers working on a project together need data to help understand one another's motivations and activities. But they are not the only ones who value data beyond the text itself. Critical readers of a published work also need data, for a slightly different purpose: to inform how much trust they should place in what they read.

For centuries, and with renewed urgency in recent years, media scholars and the general public have questioned how far we can trust what we read, and what factors should influence that trust. The term "fake news" is prominent in contemporary discourse, which often centers on the challenges we face, as individuals and as a society, in discerning the quality of information sources.

In a traditional context, readers looking for data to inform trust were typically limited to familiarity with a publisher's or a writer's reputation. Did you read something in the New York Times? You could probably trust that it was true. Did you read it scribbled on a bathroom wall? Not so much. Wikipedia, as a product subject to continuous editing by almost anybody, takes a different path; it does not aspire to the kind of reputational trust enjoyed by the Times, but the eight software features discussed in this essay separate it from the wholly unaccountable bathroom wall.

One of Wikipedia’s core traits is that it blurs the traditional lines between producer and consumer. So with Wikipedia, the kind of trust needed within the community of producers inevitably overlaps with the audience’s trust in Wikipedia. Fortunately, the kind of trust needed to build a working relationship is one of the things supported by Wikipedia’s desktop and server software, and by its attention to the OODA loop.

In one sense, Wikipedia makes things more complicated and messier by blending production and publication. But in so doing, it forces us to address the issues of trust inherent in both. One set of software tools provides editorial insights to writers and readers alike. In that sense, Wikipedia might just point the way toward a more coherent way to address issues of trust. In the articles and talk pages of Wikipedia, I have seen editors firmly committed to opposing views resolve seemingly intractable disputes. By resolving such conflicts, they produce better articles, which serve readers well by helping them understand and contextualize competing views. These dynamics bring my thoughts back to that 2008 discussion with Ehud Lamm, and to the kind of trust that will be needed if we are to overcome violence on a global scale.

Furthermore, as is often said, trust is a two-way street. Treating someone with respect, empowering them, and showing trust in them can often engender reciprocal trust. When Wikipedia takes steps that help its readers and contributors trust its content, it also expresses trust in them.

Throughout society, we are currently grappling with basic epistemic questions. How can we differentiate between "real" and "fake" news? What's the proper role of scientific studies in shaping policy decisions or our day-to-day decisions? Individual judgment—that same quality encouraged by Wikipedia's core policies for editors—is a key asset in charting a path forward. A reader literate in the scientific method is better equipped to evaluate a scientific study than one who has to rely on external authorities; a television viewer well-versed in the techniques of video manipulation or rhetorical trickery will be less susceptible to deception.

Wikipedia’s structure invites individuals and institutions to build literacy skills and develop trust. To the degree that we can put Wikipedia’s tools to appropriate use, we may just have the ability to build trust throughout society and generally make the world work better. Wikipedia doesn’t promise any of this to us, but its software and policies do nudge us in the right direction. Perhaps more than any other factor, those frequent nudges make Wikipedia a valuable resource worth protecting and worth exploring.

As some of the world’s largest technology platforms field tough questions about what value they ultimately provide, Wikipedia stands apart. Its idealistic roots in traditions like wiki and free and open source software, and its ability to build on the lessons of longer-standing social institutions, have served it well. Wikipedia empowers its editors and its readers, and its software encourages everyone involved to find ways to trust one another.

To fully appreciate the value of Wikipedia, a reader needs to consult features like talk pages and edit histories. As Wikipedia has grown, and as MediaWiki and similar software has proliferated across numerous websites, an ability to work with these software features has become a core part of information literacy. They should be taught in our formal educational institutions, and curious readers should investigate them on their own.

Trust in technical and strategic matters lags

In twenty short years, Wikipedia has had a substantial influence on the way software functions and the ways we interact online. From the start, Wikipedia has given its users ready access to relevant data, and has encouraged them to take action. Hundreds of thousands of people have taken up the challenge, and they have produced an enormous amount of useful information.

In spite of the central role of trust in Wikipedia's basic structure, though, an absence of trust has characterized many of the strategic and software design decisions involving Wikipedia. Too often, decisions and actions of the Wikimedia Foundation occur with little apparent connection to the loose-knit communities of volunteer Wikipedia users. In an ideal world, we would be able to recapture something of the collaborative spirit that characterized the early Wikipedia-L email list, which often enabled the experience of the site's heaviest users to directly influence how the software evolved. To apply principles that worked in a community of a few dozen to the sprawling behemoth that Wikipedia has become, of course, is no easy task; but for Wikipedia to truly reach its potential, both the volunteers and the foundation that guides its future should work tirelessly to ensure that relevant expertise is captured and put to good use. The spirit of collaboration that can sometimes work so well in generating encyclopedic content should guide the wider movement as well.

Those building new software intended to support and nurture collaboration would do well to study the interplay of the specific software features described here. This applies as much within the Wikipedia world as outside it: a mobile interface that obscures the "view history" screen, for instance, deprives the reader of a key element required for critical reading and thereby presents an incomplete view of the richness of Wikipedia's content. The platforms that support communication around the world, if they are to serve society well, must take careful stock of the kind of information needed by their editors and readers and ensure that it is presented in useful and coherent ways.

As for the USA PATRIOT Act reauthorization, President Bush signed a law the following year rescinding its most problematic components. The United States Attorney General resigned shortly thereafter, under pressure for his implementation of those components. Perhaps this was a simple case of democratic institutions holding each other accountable. But in such stories, the level of public unrest is always a factor, and I can't help thinking about the many thousands of people who navigated that complex story of power and policy by consulting a weird encyclopedia, less than seven years into its existence, patched together by writers and researchers from all over the world. Let's hope their sustained commitment to the ideals of Wikipedia is enough to launch the site into future decades, in which we collaborate as effectively on matters strategic as on matters encyclopedic.

References

  1. ^ See Dahlia Lithwick, "Specter Detector: U.S. attorney scandal update: Who's to blame for those alarming Patriot Act revisions?" Slate, March 5, 2007. Archived at https://slate.com/news-and-politics/2007/03/u-s-attorney-scandal-update-who-s-to-blame-for-those-alarming-patriot-act-revisions.html Accessed May 1, 2019.
  2. ^ Jimmy Wales, "Controversial thoughts," Wikipedia-L (electronic mailing list), June 13, 2001. Archive of message available at: https://lists.wikimedia.org/pipermail/wikipedia-l/2001-June/000187.html
  3. ^ Lee Daniel Crocker, "Reciprocal system…," Wikipedia-L (electronic mailing list), May 25, 2002. Archive of message available at: https://lists.wikimedia.org/pipermail/wikipedia-l/2002-May/002132.html
  4. ^ Gareth Owen, Gareth Owen (Wikipedia user page), January 20, 2006. Archive of posting available at: https://en.wikipedia.org/wiki/Special:Diff/35978744
  5. ^ Jonathan Zittrain, "Why Wikipedia Works Really Well in Practice, Just Not in Theory" (video), Big Think, April 7, 2015. https://bigthink.com/videos/the-model-for-wikipedia-is-truly-unique
  6. ^ Larry Sanger, "Web links subpage," Wikipedia-L (electronic mailing list), May 6, 2001. Archived at https://lists.wikimedia.org/pipermail/wikipedia-l/2001-May/000106.html
  7. ^ Andrew Lih, The Wikipedia Revolution, Hyperion. ISBN 978-1-4013-0371-6 (2009).
  8. ^ Pete Forsyth, "Letter to Wikimedia Foundation: Superprotect and Media Viewer," Meta Wiki, August 19, 2014. Archived at https://meta.wikimedia.org/wiki/Letter_to_Wikimedia_Foundation:_Superprotect_and_Media_Viewer
  9. ^ Andrew Lih, "Can Wikipedia Survive?" (opinion), The New York Times, June 20, 2015. Archived at: https://www.nytimes.com/2015/06/21/opinion/can-wikipedia-survive.html
  10. ^ María Sefidari, answer to "Use of Superprotect and respect for community consensus", Wikimedia Foundation board elections questions, June 2015. Archived at https://meta.wikimedia.org/wiki/Wikimedia_Foundation_elections/Board_elections/2015/Questions/1
  11. ^ Jim Heaphy, "Smartphone editing," Wikipedia (user page), first published December 2015. Archived at https://en.wikipedia.org/wiki/User:Cullen328/Smartphone_editing



Discuss this story

These comments are automatically transcluded from this article's talk page.

Trust

There are some systemic factors that make trust difficult:
  1. The lack of adequate references. While there are certainly a lot of online archives containing historical and technical data, there are a huge number of documents available only in hardcopy or available only to members of particular organizations.
  2. The lack of an effective dispute resolution mechanism. I've given up on large edits because content is often determined by who has more stamina.
  3. The lack of volunteers. Wikipedia badly needs more editors with a technical background, but not enough are willing to volunteer and not enough organizations are willing to offer the editorial services of their staff pro bono. This leads to articles with significant errors that nobody is willing to correct in a timely fashion.
  4. The presence of circle squarers and other cranks, and plain vandals. Shmuel (Seymour J.) Metz Username:Chatul (talk) 19:07, 1 May 2020 (UTC)[reply]


I only read a fraction of this long editorial. I find the internet resists long-form writing, preferring bits and snippets of hard fact, and the last thing Wikipedia needs is one more long opinion piece. It needs doers, not talkers. Software is secondary at best. If you don't have skilled, mature people of good will who like rolling up their sleeves to work together, it doesn't matter what tools they have. Technical improvements are far down my wish list for Wikipedia. It looks like a lot of wasted time and effort that could be better spent somewhere else. "Ask not what..."
Vmavanti (talk) 03:59, 3 May 2020 (UTC)[reply]
"If you don't have skilled, mature people of good will who like rolling up their sleeves to work together, it doesn't matter what tools they have." I see, so you argue that Wikipedia should never have been created, we should have insisted on Nupedia instead of trying a new software. Nemo 07:53, 3 May 2020 (UTC)[reply]
Did I say that? Where? I don't see how anything you wrote proceeds logically from anything I wrote. When I was kid, my mom used to say to me and my siblings, "Aren't your little legs tired from jumping to conclusions?" The temptation to read between the lines, interpret, and extrapolate is common, but, like this article, it helps very little.
Vmavanti (talk) 14:11, 3 May 2020 (UTC)[reply]
We could certainly use better tools. Teaching at editathons I spend much time explaining the things Visual Editor does poorly, and often newbies fail to understand how to use Talk Pages because they need to learn an old-fashioned markup language before they can ask questions about such matters as, uhh, how to use the markup language. And all this silly stuff about indent and outdent and reply-links and four tildes and "My talk page or yours".
Last year I started doing Quora and it's quite easy. Many writers there do excellent long-form essays on complex topics, probably because they get paid for good work. Some participants there make the suggestion that Quora is better than Wikipedia, which of course is dumb since that's not the purpose, but if a similar level of technical development work could go into Wikipedia forums and mobile usage we would attract, or anyway keep and develop, more new, smart, content editors. Jim.henderson (talk) 00:46, 4 May 2020 (UTC)[reply]
No one needs to learn markup to edit Wikipedia. Or the Visual Editor. Just some basic templates. Colored syntax helps. Sourcing is easy with the pop-up menu templates, but many people don't use them. I don't know why. I don't know what the big deal is about colon indents. I see experienced editors use asterisks. Why? A colon is easier to type than an asterisk. Four tildes: the upper left corner of the keyboard below the ESC key. That's easy. Let's get real. If people can't get the easy stuff right, should they even be editing? No. Talk pages? There's a misnomer. Many people don't respond, and when they do, it's not helpful. Strangers pop up to insult you. Administrators pop up to insult you. There ought to be discussions, but it's usually more like a wrestling match. To repeat my point, tech matters are not high on my long list of priorities for Wikipedia. A lot of people might want to take a look at the proper use of the comma. Maybe stop using the words "launch" and "subsequently" every other sentence. Study impartiality. Maybe stop writing love letters to people and causes they admire. I have found Quora to be nearly useless. Vmavanti (talk) 02:46, 4 May 2020 (UTC)[reply]



