The Signpost

Special report

An advance look at the WMF's fundraising survey

By Gamaliel and Tony1
Lisa Gruwell, the WMF's chief revenue officer

The Wikimedia Foundation gave the Signpost an advance copy of the results of a survey of English Wikipedia readers regarding Wikimedia fundraising, due for official release today, and shared details that do not appear in the final report. The survey, conducted by Lake Research Partners in February, asked a number of questions about readers' attitudes towards the WMF's fundraising to gauge the awareness and effectiveness of those efforts.

This is not the first WMF fundraising survey. After donating to Wikipedia, donors can complete a survey if they wish; more than 250,000 of them did so in December 2014. Professional surveys have also been conducted before, such as the 2011 readership survey, which included questions about fundraising—but Lisa Gruwell, the Foundation's chief revenue officer, told the Signpost that "this is the first professional, randomized survey of Wikipedia fundraising to include donors and non-donors."

Methodology

Country                  Readers surveyed (% of respondents)   % of total English Wikipedia page views, December 2014
United States            1,000 (41.7%)                         36.4%
United Kingdom             500 (20.8%)                          9.7%
Canada                     400 (16.7%)                          5.8%
India                        0                                  5.8%
Australia/New Zealand      400 (16.7%)                          3.4%
Germany                      0                                  2.0%
Philippines                  0                                  1.5%
France                       0                                  1.2%
China                        0                                  1.2%
Ireland                      0                                  0.7%

The survey questioned a sample of 2,300 people who said they used Wikipedia at least once a month. They were from five primarily English-speaking countries: the United States, the United Kingdom, Canada, Australia, and New Zealand, with the last two countries conflated into one sample group. Celinda Lake of Lake Research told the Signpost that "the sample was stratified geographically by region and the data were weighted by gender, age, region, and race where appropriate to reflect the population of internet users in each country."
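As a rough illustration of what such weighting involves, here is a minimal sketch of cell-based post-stratification weighting in Python. The numbers and the single weighting variable are invented for illustration; they are not drawn from the survey data or from Lake Research's actual procedure.

```python
# Minimal sketch of post-stratification weighting (illustrative numbers only).
from collections import Counter

# Hypothetical respondents, each tagged with a single weighting cell (gender).
respondents = ["female"] * 900 + ["male"] * 1400

# Assumed shares of internet users in the target population (illustrative).
population_share = {"female": 0.50, "male": 0.50}

# Share of each cell in the sample as collected.
n = len(respondents)
sample_share = {cell: count / n for cell, count in Counter(respondents).items()}

# A respondent's weight is the ratio of population share to sample share,
# so that weighted tallies reproduce the assumed population mix.
weights = {cell: population_share[cell] / sample_share[cell] for cell in population_share}

for cell in sorted(weights):
    print(f"{cell}: sample share {sample_share[cell]:.1%}, weight {weights[cell]:.2f}")
```

In practice a survey firm weights on several variables at once (for example via raking), but the principle is the same: cells that are under-represented in the sample receive weights above 1, and over-represented cells receive weights below 1.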

Lake said: "Respondents for the survey were drawn from an international panel of over 2 million Internet users in the target countries who have agreed to participate in online surveys, supplied by GMI", a market research company that specializes in selecting participants for surveys. The extent of any selection bias among people who had already agreed to GMI's request to participate in unspecified online surveys is unclear.

Lake Research did not survey readers in countries where English is not the primary language but that nevertheless have significant populations of English Wikipedia readers, such as India, Germany, the Philippines, France, and China. Gruwell said the Foundation chose to focus on only five English-speaking countries for a number of reasons, including making the survey more manageable and producing more comparable results.

The 2,300 participants ranged from daily users of the encyclopedia to casual visitors; the largest share said they use it several times a week (27–35%, depending on the country), followed by several times a month (21–24%). Those who indicated they use Wikipedia less than once a month were not included in the survey.

Perceptions of Wikipedia

Surprisingly, up to 40% of respondents were unaware that Wikipedia is a non-profit operation. While this is an improvement on the 2011 readership survey, which found that about half of Wikipedia readers were unaware of this, it is troubling given the importance the community and the Foundation attach to Wikipedia's non-profit status.

Wikipedia's revenue source was similarly not well understood. UK readers were the only group in which a majority identified reader donations as Wikipedia's primary funding source. Significant numbers of respondents in every country surveyed had no idea where the money comes from, or identified it as coming from a mix of sources, including government funding. A disturbing percentage (13–20%) thought that Wikipedia is supported by advertising.

Perhaps respondents were confusing fundraising appeals with paid advertising. Lake told the Signpost: "They may be confusing other online ads they see in close proximity to their use of Wikipedia pages. They are also possibly making an assumption that since many major websites are supported by ads, Wikipedia must be also." Lake noted that the survey also asked respondents whether they had seen commercial advertising on Wikipedia itself. While 56–60% said no, some 20% either thought they had seen ads on the encyclopedia or were unsure.

The result may point to a lack of awareness of Wikipedia's commitment to neutrality and independence. Gruwell said: "This is a fairly consistent misunderstanding about Wikipedia that we see across the board and we have seen it for years, and it's an ongoing challenge for the movement. We think it is important to communicate the non-profit or non-commercial message in banners and other communications to help better educate readers about this."

Motivations

Most respondents recall seeing a fundraising appeal on Wikipedia.

Depending on the country, 55–63% of respondents recalled seeing a fundraising appeal on Wikipedia. When asked how many they remembered, around one-third said two or three, and slightly fewer said four to nine. Despite some vocal complaints about the frequency of the banners last year, only 4–5% regarded the appeals as "too intrusive".

Perhaps the most surprising result was respondents' primary reason for donating to Wikipedia: by overwhelming numbers, they said their motivation was simply their frequent use of the site. Given that support for Wikipedia is often expressed as a strong commitment to free and open access to knowledge, it is surprising that relatively few donors cited this. Perhaps those who hold those principles are already avid users of Wikipedia, or perhaps it reflects how widespread Wikipedia's use has become – as the sixth-most-used website in the world for so long, it may have simply become part of the furniture for most people. Familiarity may be a reason for donating, but it may also lead to apathy or indifference in those who do not donate, something the survey was not designed to explore.

Targeting donors

It is important to know why readers donate to Wikipedia, but other questions need to be asked to reach people who do not donate. The survey identified about a quarter of respondents as "donor targets": people who donate elsewhere but not to the encyclopedia.

The survey examined which types of causes were favored among both donors and non-donors. In most of the countries, those who donate to Wikipedia appear to be generally more altruistic than non-donors, in that they are more likely to donate to other charitable causes too. Popular charities among those who did not donate to Wikipedia were those related to poverty, health and medicine, and, in the US, religion, causes which are generally perceived to have little to do with Wikipedia's mission of encyclopedia building and the free dissemination of knowledge.

Given the nature of these causes, specialized appeals may be needed to reach these donor target groups, as they may be unlikely to be swayed by typical fundraising banners or testimonials featuring sad-eyed Wikipedia editors. Targeted banners including statements that Wikipedia is perhaps the world's most frequently consulted medical resource or touting the ability of free access to information to help alleviate poverty may be effective. Gruwell said "It is an interesting finding and we may explore how to talk about this area of work with those who care most about certain kinds of information in Wikipedia."

Fundraising messages: responses and effectiveness

A large fundraising banner

The survey concludes that "Users are not turned off by Wikipedia's fundraising messages", and the results appear to bear that out. Despite vocal criticism of the campaign from some quarters, those sentiments do not appear to be widely shared among respondents. More than half of respondents feel that Wikipedia asks for less money than other organizations and that fundraising messages do not appear "very often".

When asked about the statement "I don't mind the fundraising messages on Wikipedia because I know the fundraising is necessary", up to 70% in each country indicated agreement. The rationale built into the statement – "because I know the fundraising is necessary" – arguably makes it a loaded question. Asked whether the response would have been different had the last seven words been omitted, Gruwell replied: "It's possible. I don't think we should try to guess how readers would have answered a different question."

Similar questions, worded differently, yielded smaller majorities, though they still indicate a generally positive attitude towards the fundraising appeals. For example, 42–51% indicated they "enjoyed learning" about Wikipedia from the fundraising campaign (another potentially loaded question compared with more neutral wording – "learned" rather than "enjoyed learning"). Fundraising messages "annoyed" 19–31% of people; this might seem to conflict with the "I don't mind" result, or it might indicate that while respondents are personally annoyed by the messages, they also see the need for them.

Desktop banner donation rate

Up to half of the respondents said the more the fundraising messages appear, the less they notice them. However, 34–46% said they pay attention to the messages. This may indicate a widespread concern among Wikipedia readers about desensitization and overexposure, even if those readers don't feel they are personally susceptible.

Several of the questions focused on the size of the banner ads. Readers rated the larger banners as clearer and more convincing, but only by slight margins, and they rated banners of different sizes as about equally intrusive.

Gruwell said: "This survey helps us understand readers' opinions of the banners. We pair this with what we know about what readers do when they see the banners through our donation rates." She pointed to results in the Foundation's latest quarterly review of fundraising, which indicate that in December 2014 the large banners produced a donation rate five times that of the smaller ones. Those results speak for themselves, so it appears the large banners are here to stay.


Discuss this story

This is with reference to a WMF banner last year that listed a minimum recommended donation of 3 pounds, described as "the price of buying a programmer a coffee", which became the topic of a Wikipediocracy article. ResMar 06:07, 13 March 2015 (UTC)[reply]
A cup of coffee isn't particularly cheap in countries like mine. --NaBUru38 (talk) 00:00, 14 March 2015 (UTC)[reply]

I see ads on Wikipedia fairly frequently. Maybe this report supports that we need to do more about removing / preventing them. Doc James (talk · contribs · email) 06:21, 13 March 2015 (UTC)[reply]

    • "(13–20%) thought that Wikipedia is supported by advertising." 13-20% is in my opinion a good estimate of the proportion of wikipedia articles that are primarily promotional. Probably half are worth rewriting. DGG ( talk ) 06:38, 13 March 2015 (UTC)[reply]
  • There are a fair number of WP mirrors that present WP content (using the same formatting as we do), and often these mirrors do have ads (which to me are different than articles that are promotional/POV). This may explain a part of those 13-20% --Randykitty (talk) 10:37, 13 March 2015 (UTC)[reply]
I'd be surprised if that were the case as I can't imagine very many people use the mirrors. If I had to guess I'd say it'd be that people are classifying global messages they get in the header (particularly fundraising prompts) as advertisements. ResMar 14:59, 13 March 2015 (UTC)[reply]

The report from the researchers is on Commons and also on the Fundraising Meta page. Lgruwell-WMF (talk) 15:56, 13 March 2015 (UTC)[reply]

It's very good that the WMF did a professional survey of readers. We can count on these results. It should be noted that the results in general are technically negative - we haven't been doing anything to turn off potential donors. Sometimes negative results are really positive, and they are worth the money spent on the survey. After all, we really want to be sure that we're not killing the goose that lays the golden egg!
I'm wondering how much the survey cost? Not that I want to complain, but it may be worthwhile having a professional survey of readers and editors on other topics as well. For example, how many of our readers are women? How many of our editors are women? Evidence on the 1st question is likely hidden in the current survey, but wasn't published as far as I can tell.
The second question is more difficult to answer. The same sample selection method as the current survey wouldn't work - but another could be devised. This question is at the heart of a current controversy - how to get more women to contribute. The evidence to my reading currently adds up to "somewhere between 10-20% of editors are women" but any details such as "is the percentage of women editors increasing?" are well beyond analysis with the current data. It would be very nice to get a better handle on these questions. A professional survey using reasonable sampling methods should do the job. Smallbones(smalltalk) 17:30, 13 March 2015 (UTC)[reply]
Thanks for the questions Smallbones. I think WMF as a whole is interested in having better, randomized opinion research. This has already been very informative to the fundraising team and has sparked a lot of ideas that we want to test. We decided not to ask any questions at all about editing in this reader survey. It became clear that with a sample size of 2,300 we would not find very many people who had edited Wikipedia from the audience of people who read Wikipedia at least once a month in these five countries. The number would not likely be statistically significant, so we really couldn't draw many conclusions from it. However, the number that I am excited about is that 49,123 of the 250,000 donors who completed the donor survey in December said they wanted to learn how to edit Wikipedia. With regard to gender, we do have demographic data paired with the responses in this reader survey. We are just beginning to get that analysis and will share if we find any interesting differences on the gender front. Again though, this was focused on understanding reader opinions of our fundraising efforts. We did not touch on editing in this research.--Lgruwell-WMF (talk) 22:47, 13 March 2015 (UTC)[reply]
Yes, it is very good to focus on one major issue at a time, especially to keep the survey short enough so that people will answer the last half as seriously as the first half. So far, so good.
But I do think that further surveys of readers, donors (I haven't seen the 2014 donor survey - is that 250,000 who completed the survey?), and editors would be very useful. In general I think folks don't take readers' opinions seriously enough here. You can get a random sample just from the people who enter the site without logging in, though there will be a selection bias just from people who don't answer. Just don't try to get everyone who enters the site. Perhaps folks who view 3 articles in a row might be a better starting population. Also sample in proportion to the readership according to different times-of-day (so you don't get a higher sample from Europe, the US and Canada, Australia, or India than is normal).
Surveying editors can be important for issues related to the site's governance. Have editors been subjected to gender discrimination? Do they see sexism in our articles? How do they feel about the editing environment (or civility in particular) on the site? Do they see commercial advertisements hidden in our articles? Are the admins and other governance mechanisms responsive to their needs? Tough issues, but sooner or later we're going to have to get a handle on how our editors view them.
The sample of editors will have to have input from the WMF (e.g. the list of editors who made more than 5 edits last month or similar), so there likely will be some concern that the data be kept confidential. I'd suggest something like the WMF lists the overall population to be sampled, a computer selects those to be sampled and sends them to an outside contractor with a key. The contractor has no idea who from the population has been sampled and only gives aggregate results to the WMF. In any case a method can be devised to keep the sample and individual responses totally confidential.
I'll also suggest several small sample surveys per year rather than one big survey. For example, 3 surveys per year (1 every 4 months) of 400 editors each will give more information than an annual survey of 1200 editors. You'll get to see if there are changes over time. The increase in the confidence interval for the smaller samples (say ±5% from ±2%) would not be that important for many issues.
But I'm not saying - follow my guidelines or else you won't get anything meaningful. I'm just saying that there are questions that are commonly discussed on Wikipedia as being important for governance, but nobody has tried to get the answers according to some of the fairly standard statistical methods. Survey professionals should be able to guide the WMF on how to do this properly. Smallbones(smalltalk) 02:38, 14 March 2015 (UTC)[reply]
@Smallbones: With regards to editor surveys, that is definitely on the radar for the new Community Engagement Department. Exact form still to be determined. —Luis V. (WMF) (talk) 18:46, 16 March 2015 (UTC)[reply]

Countries

"The survey questioned a sample of 2,300 people who said they used Wikipedia at least once a month. They were from five primarily English-speaking countries: the United States, the United Kingdom, Canada, Australia, and New Zealand, with the last two countries conflated into one sample group."

The survey is interesting from a journalistic point of view, but it's hardly statistically significant. --NaBUru38 (talk) 23:53, 13 March 2015 (UTC)[reply]

Back of the envelope math:
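A minimal sketch of the calculation being referred to, assuming the standard margin-of-error formula with worst-case p = 0.5, z ≈ 1.96, and the smallest per-country subsample of n = 400:

\[
\mathrm{MOE} = z\sqrt{\frac{p(1-p)}{n}} = 1.96\sqrt{\frac{0.5 \times 0.5}{400}} \approx 0.049 \approx \pm 5\%
\]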
So with 97.5% confidence the capped inaccuracy for the smallest sample-size population is ±5%. With α = 0.05 that is very reasonable...maybe the Foundation is using the same maths I am! ResMar 01:59, 14 March 2015 (UTC)[reply]
Dear Resident Mario, I agree that 2,300 is a great sample size. I was questioning the small selection of countries, which doesn't fit the great diversity of Wikipedia readers. --NaBUru38 (talk) 03:23, 16 March 2015 (UTC)[reply]
I'd guess the issues of importance on the English, Russian, and Indonesian Wikipedias (as well as for many other languages) can be quite different. I'll suggest that we survey one language WP at a time to avoid mixing up issues and getting the issues of the Indonesian Wikipedias watered down. For example, start with surveys (400 is a great sample size - it allows many surveys to be taken) of the English WP, then the Spanish, German, French, Russian, ... Indonesian, ... , Vietnamese, ... Catalan, etc. We wouldn't be able to do all language versions, but if we did one per month, within a year we'd be able to see how the important issues vary and be able to tell how the needs of the different language versions can be addressed. Mixing it up into one big bag would likely just confuse things. Smallbones(smalltalk) 14:40, 16 March 2015 (UTC)[reply]
Since this is a fundraising survey, you would expect them to exclude all of the (many) countries and languages that the fundraising campaign doesn't target. In fact, this particular survey was about a single campaign, called "Big English", which was (a) only on the English Wikipedia and (b) only shown to logged-out users of the English Wikipedia whose IP address geolocated to those five countries. As a survey of that particular campaign, the survey's limitations were a perfect match. WhatamIdoing (talk) 17:34, 16 March 2015 (UTC)[reply]



