Owing to the impact of Sue Gardner's comments on Triage (the development project for Page Curation, for New Page Patrolling), this interview is reprinted, unabridged, from the newspaper's January 2012 issue. Gardner was executive director of the Wikimedia Foundation from December 2007 through May 2014.
Sue Gardner is [in 2011] the executive director of the Wikimedia Foundation, the non-profit, non-commercial organization that operates nearly 300 online projects. Originally from Canada, she worked with the Canadian Broadcasting Corporation in production, journalism and documentaries.
In 2007, she took up consulting for the Foundation on operations and governance, and within a year was in the top job. Her tenure has seen a rapid increase in staffing at the San Francisco office, which now employs some 100 people, up from 65 just six months ago, and a budget well in excess of $20M a year. In October 2009, the Huffington Post named Gardner one of ten “media game changers of the year” for the impact of her work for Wikimedia on new media.
The Signpost interviewed Gardner on her fourth anniversary as executive director. In person, the boss of one of the world's most powerful websites is all charm and professionalism. Much of the interview concerned the issues she raised in a landmark address in November to the board of Wikimedia UK,[1] in which she said the slide showing a graph of declining editor retention (below) is what the Foundation calls “the holy-shit slide”. This is a huge, “really really bad” problem, she told Wikimedia UK, and is worst on the English and German Wikipedias.
A prominent issue on the English Wikipedia is whether attempts to achieve high quality in articles – and perceptions that this is entangled with unfriendly treatment of newbies by the community – are associated with low rates of attracting and retaining new editors. Gardner believes that high quality and attracting new editors are both critical goals, but her view is that quality has not been the problem (though she didn't define exactly what article quality is). What we didn’t know in 2007, she said, was that “quality was doing fine, whereas participation was in serious trouble. The English Wikipedia was at the tail end of a significant drop in the retention of new editors: people were giving up the editing process more quickly than ever before.”
There aren't enough people to do the work ... people are stressed and they're burned out ... you still have lots of [older editors] doing scut-work ... So where are the new generations of people, relieving them of the need to do all this scut-work?
— Sue Gardner, UK address
Participation matters because it drives quality. People come and go naturally, and that means we need to continually bring in and successfully orient new people. If we don’t, the community will shrink over time and quality will suffer. That’s why participation is our top priority right now.
At the core of Gardner's philosophy, then, is an intrinsic connection between editor retention and what she calls openness. But The Signpost wanted to know more about where article quality fits into this model – specifically whether the three factors are sometimes at odds with each other and whether a purely one-way causality is involved. Deletions and reversions might be distasteful to new editors, but how can we, for instance, maintain strict standards about biographies of living people (BLP) without reverting problematic edits and deleting inappropriate articles? Gardner rejected the premise:
I don’t believe that quality and openness are inherently opposed to each other. Openness is what enables and motivates people to show up in the first place. It also means we’ll get some bad faith contributors and some who don’t have the basic competence to contribute well. But that’s a reasonable price to pay for the overall effectiveness of an open system, and it doesn’t invalidate the basic premise of Wikipedia: that openness will lead to quality.
What do you say, we asked, to editors whose focus has been on improving quality and who may have taken your comments and the recent focus of Foundation initiatives as an indication that their contributions aren't valued, or even that they are part of the problem?
If you believe there’s an inherent tension between quality and openness, then yes, you might believe that when I advocate for openness, I’m speaking against quality; but I don’t believe that. Quality improvement work – like page patrolling, the FAC, developing content partnerships, and staging competitions like Wiki Loves Monuments – makes Wikipedia better and more valuable for readers. Where we run into problems is when we do things that repel or frustrate good-faith new editors. But I’m not sure there’s a fixed relationship between activities designed to improve quality and activities that hurt new editor retention, so I don’t think editors who focus on quality improvement should feel attacked or unappreciated when openness is being emphasized.
... we’re not falling off a cliff; but we are having serious difficulty retaining good faith new editors, and that will cause our community to dwindle if we don’t fix the problem.
— Sue Gardner
Does the Foundation have any solutions to enable the editing community to address the cultural issues that might be driving editors away – beyond the WMF's technical initiatives such as an easier editing interface and means of empowering kitten distribution, and external initiatives such as outreach and institutional partnerships? For Gardner, "The editor retention problem is our shared problem. ... it's easiest for the Foundation to help when there's a technical or tools aspect to the problem. But when the issue is purely editorial or cultural, it’s harder for us to play a role." She singled out two areas: the first is behavioural problems, and the second the sheer quantity of policy and instructional text ("simplifying it would help everyone").
We queried her take on this second area, pointing out that all publishers that aim to present high-quality information find they need complex rules, whether explicit or via accepted standards of writing and scholarship. Could she give specific examples of areas where we could simplify policy without sacrificing standards?
Yes, the premise of this question is absolutely correct. The analogy I often use is the newsroom. Anybody who’s curious and reasonably intelligent can be a good journalist, but you do need some orientation and guidance. Just like a newsroom couldn’t invite in 100 random people off the street and expect them to make an immediate high-quality contribution, neither can Wikipedia expect that.
So to say that becoming an editor should be easy is, really, a little delusional. And that’s exactly why people need easy instructional text and videos. The resources used by Wikipedia Ambassadors aren’t ideal in every respect, but they’re increasingly road-tested and optimized for the real-world instruction of new contributors. They're pretty good. In general, to the extent that we’re showing instructions as part of the user interface, we need to make them concise, and emphasize the must-read items instead of trying to cover every edge case.
I don't have specific examples of where policy should be simplified, but I do think it would be helpful for us to visibly embrace "be bold" again, as well as "break all rules." People get embarrassed when they make mistakes, and some of our policies seem almost impossibly intricate. So, I think one helpful thing we could do is to tell people that making mistakes is normal and okay.
We need to be able to experiment, to do stuff. We’re going to consult when we think it’s helpful and necessary, … but we need to do tiny bits of experimentation ...
— Sue Gardner, UK address
At this point, Gardner stepped back to take a big-picture view of how the community and the WMF should interface, saying that the Foundation isn't the expert in either the behavioural or the cultural aspects:
The community understands them better than we do, and will probably have better ideas about how to solve them. ... The Foundation will lead on technical initiatives such as the visual editor, and I think the editing community should lead on others. There are other initiatives where we’re partnering with the editing community – for example, the Foundation built the feedback dashboard to make new editors’ experiences visible to experienced editors, and to give experienced editors an easy mechanism for coaching the new people. We’ve started working with editors to create New Page Triage, a tool that will make page patrolling easier and will offer options that support and encourage new editors as well as repel vandalism and other bad edits.
While staking the Foundation's claim to the more technical side of the equation, Gardner doesn't shrink from providing advice on how we can fix the cultural problem.
If you look at new editors’ talk pages, they can be pretty depressing – they’re often an uninterrupted stream of warnings and criticisms. Experienced editors put those warnings there because they want to make Wikipedia better: their intent is good. But the overall effect, we know, is that the new editors get discouraged. They feel like they’re making mistakes, that they’re getting in trouble, that people don’t want their help. And so they leave, and who can blame them? We can mitigate some of that by toning down the intimidation factor of the warnings: making them simpler and friendlier. We can also help by adding some praise and thanks into the mix. When the Foundation surveys current editors, they tell us one of the things they enjoy most about editing Wikipedia is when someone they respect tells them they’re doing a good job. Praise and thanks are powerful.
What, then, does Sue Gardner believe are our significant social challenges? She put these questions in response:
How do we counter systemic bias when it comes to defining reliable sources and notability – that is, in a context where decisions are made by consensus, and in which many types of people are underrepresented, how do we ensure systemic bias doesn’t weaken and harm the quality of the decisions we collectively make? How can we better distinguish in the patrolling process between good faith new-user mistakes, and bad faith edits? What are the three most essential pieces of advice that every new editor should be given, and how do we make them front and centre, early in their experience with us? In general, how can we best equip new editors to edit well, as quickly and enjoyably as possible?
[Around the time of the Seigenthaler and Essjay controversies] Jimmy went to Wikimania and said "quality … we need to do better", [and through the distortions of the ripple-effect in the projects] there was this moral panic created around quality … what Jimmy said gave a whole lot of people the license to be jerks. ... Folks are playing Wikipedia like it's a video game and their job is to kill vandals ... every now and again a nun or a tourist wanders in front of the AK-47 and gets murdered ...
— Sue Gardner, UK address
Many people have complained that Wikipedia patrollers and administrators have become insular and taken on a bunker mentality, driving new contributors away. Do you agree, and if so, how can this attitude be combated without alienating the current core contributors?
I wouldn’t characterize it as bunker mentality at all. It’s just a system that’s currently optimized for combating bad edits, while being insufficiently concerned with the well-being of new editors who are, in good faith, trying to help the projects. That’s understandable, because it’s a lot easier to optimize for one thing (no bad edit should survive for very long) than for many things (good edits should be preserved and built upon, new editors should be welcomed and coached, etc.). So I don’t think it’s an attitudinal problem, but more an issue of focusing energy now on re-balancing to ensure our processes for patrolling edits, deleting content, etc. are also designed to be encouraging and supportive of new people.
If the presumably less important issue of controversial content merits outside study by consultants, why isn't the Foundation putting resources into having social scientists diagnose the problems of editor retention and offer suggestions to the community to reform its internal culture? Why aren't there Foundation usability experts recommending overhauls of dense and daunting policy and procedural pages and not just of technical aspects such as the interface?
As I said, we do want to work with community members to figure out how to reduce policy and instructional cruft: probably the first step in doing that, though, would be a convening of interested editors. The community department has been talking about doing that. I think it would be very difficult for most social scientists to help us with a highly specific problem like new editor retention, because it's a deep community issue, and not many social scientists understand very much about how Wikipedia works.
Having said that, Gardner pointed out that the Foundation tries to persuade social scientists who do understand Wikipedia to study it, and has done "lots of informal consultation on participation-related issues" with them.
We put it to Gardner that much of the supporting research she cited to the Wikimedia UK board was informal, small-scale or not entirely rigorous (such as the study based on a dubious assumption that any editor who was not a sockpuppet, vandal or spammer was editing in "good faith" – ignoring self-promoters and POV-pushers). Now that we know editor decline is a very serious problem, is a comprehensive and rigorous research initiative being planned to analyze the phenomenon?
No. The Foundation isn’t a research organization, and programs like the Summer of Research aren’t intended to conduct comprehensive, authoritative studies, but to produce actionable insights that guide and support our work. That’s why we tend to focus on small, fast, tightly defined research projects that will answer simple questions, such as "Which warning templates deter new editors the least?" and "Which help avenues are most used by new editors?"
How can a culture that has a heavy status quo bias be changed? How can the community be persuaded to become less risk-averse?
My hope is that the community will become less risk-averse as the Foundation makes successful, useful interventions. I believe the Vector usability improvements are generally seen as successful, although they of course haven't gone far enough yet. WikiLove is a small feature, but it’s been adopted by 13 Wikipedia language versions, plus Commons. The article feedback tool is on the English Wikipedia and is currently being used in seven other projects. The new-editor feedback dashboard is live on the English and Dutch Wikipedias. New warning templates are being tested on the English and Portuguese Wikipedias. And the first opt-in user-facing prototype of the visual editor will be available within a few weeks. My hope is that all this will create a virtuous circle: support for openness will begin to increase openness, which will begin to increase new editor retention, which will begin to relieve the workload of experienced editors, which will enable everyone to relax a little and allow for more experimentation and playfulness.
Regaining our sense of openness will be hard work: it flies in the face of some of our strongest and least healthy instincts as human beings. People find it difficult to assume good faith and to devolve power. We naturally put up walls and our brains fall into us-versus-them patterns. That’s normal. But we need to resist it. The Wikimedia projects are a triumph of human achievement, and they’re built on a belief that human beings are generally well-intentioned and want to help. We need to remember that and to behave consistently with it.