The title of last week's piece, "The Tragedy of Wikipedia's commons", was perhaps rather more ironic than its author intended. One of the truly great tragedies of medieval England was not so much the tragedy of the commons in its original sense but the forcible enclosure by powerful outside interests of the historic common land that had for centuries been available as a free resource for all. If there is any tragedy here, it is in the author's wish to use Wikipedia to take over Wikimedia Commons and to do very much the same thing online.
Commons always has had, and always will have, a far broader free-content remit than that of supporting the narrow focus of an encyclopaedia. Commons provides media files in support of not just the English Wikipedia but all of the WMF projects, including Wikisource, Wikibooks, Wikivoyage and many more. These sister projects of Wikipedia often have a need to use media on Commons that could never be used on the Wikipedias as they are not - in Wikipedia's narrow sense - "encyclopaedic". Some of Commons' detractors like to give the impression that its collections are nothing more than a dumping ground for random non-educational content. Nothing could be further from the truth, and the energy expended by those who would criticise from the outside (but who are strangely reluctant to engage on wiki) bears little relation to the extremely small proportion of images that could in any way be considered contentious.
Commons' policies are of necessity different from, and more wide-ranging than, those of any of the individual projects. We hold many images that will never be useful to the English Wikipedia, and that is not only OK, but should be welcomed as Commons' contribution to the overall mission of the Wikimedia Foundation, "to empower and engage people around the world to collect and develop educational content under a free license or in the public domain, and to disseminate it effectively and globally". Note that the overall mission of the WMF is not "to write an encyclopaedia", but rather to develop and disseminate educational content. Supporting the English Wikipedia is one way, but by no means the only way, in which we do that, and the idea that Commons should be forcibly subjugated to the policies of a specialist encyclopaedia project would do immeasurable harm to the mission which I had hoped we were all working to support.
Contrary to the suggestion that the Commons policy on scope of 2008 was an "unchallenged action by a tiny group of people", it was in fact largely an exercise in documenting for the first time the unwritten long-established practices of the community. The policy attracted very little controversy (despite it being very widely advertised, on Wikipedia and elsewhere) largely because the vast majority of it was uncontentious. Indeed, the fact that it has retained very wide community support since then indicates that we didn't do too bad a job.
With its specialised emphasis on media curation and the niceties of copyright law, Commons will never be as popular a place for editors to hang out as some of the bigger encyclopaedias. It requires not only a particular set of interests, but also, at least for admins, some level of specialist knowledge which not everyone has or is interested in acquiring. Those outside the local community who only see the external carping may not realise that we have thousands of very committed editors who work tirelessly in the background curating and categorising content and bringing to the attention of the admins non-educational content that has no place in our collections.
Commons has never (as was claimed last week) been merely a repository that supports its sister WMF projects. Right from the start it had a remit to make content freely available to external re-users. As early as 2006 there was a formal proposal (since implemented as InstantCommons) to integrate into MediaWiki a mechanism specifically designed to support users on non-WMF projects. Perhaps the real worry of last week's author was that Commons currently holds too many non-encyclopaedic images of a sexual nature. But even assuming that is true, a proposal to revoke one of the fundamental free content aims of Commons hardly seems proportionate. Instead, let's have a proper discussion on what Commons' scope should be. Times change, as do priorities, and what made sense five years ago may now perhaps need to be revisited.
Over the last few months especially there has been a lot of discussion within Commons as well as outside about issues concerning the small proportion of our holdings that relate to sexual imagery and to privacy/the rights of the subject. Both have complex moral and legal dimensions, and neither has yet been fully resolved. I've set out the main strands of argument below, as objectively as I can, for those who may not be familiar with them. Of course, these summaries are by no means the whole story, and many of the discussions are far more subtle than I have space for here, so please bear with me if you are familiar with this and feel I have mis-characterised or omitted any important point that may be close to your own heart. I deliberately make no comment on the validity of any of these arguments.
Some argue that pornographic images (as defined in some way) are never appropriate for any of the Wikimedia projects and are simply not educational.
Others argue that we should keep most images, almost whatever the subject matter, as we need to show the whole range of human experience if we are to call ourselves a comprehensive educational resource. Anything else would be censorship.
Yet others suggest that not all the sexual images held by Commons are "educational", properly defined. Some are photographs that have been taken for non-educational purposes, for example personal gratification/entertainment, and/or have been uploaded for the same purpose or by users who wish to push an extreme view that equates any limits at all with unacceptable "censorship".
Finally, some hold that Commons has too many images in certain marginally-educational areas that, taken overall, create an oppressive or threatening environment (e.g. for women) which may be harming the project as a whole.
One strand of argument is that we should do more to respect the rights of individuals who are identifiable in a photograph, and recognise that, even where the image may be legal, it can be highly damaging to the individual. Even when an outsider might naively think the image unremarkable, it may still be considered threatening, harassing or oppressive by its subject.
Another strand is that allowing the subject of a photograph a say on whether it should stay on Commons or not opens the door to all sorts of censorship. Proponents argue it's essential that we are able to collect all types of educational image, including those that may offend the subject.
If there is indeed a problem with the boundaries of Commons' scope - perceived or otherwise - we should tackle it head-on with open community discussion. Commons should be and I believe is receptive to the views of everyone within the Wikimedia community in reviewing its curatorial policies. But the way to get things changed is to engage rather than to criticise from afar.
A comprehensive review of Commons' scope is just starting now, and you need never say again that your voice cannot be heard. Please talk.
Please visit Commons' Review of Scope pages now, and make your views known for the sake of all the Wiki communities.
Commons has proved to be a phenomenal success in the years since its introduction, and we should be proud of what has been achieved. We should keep it, improve it, and celebrate it.
Last week, the Signpost published a rather scathing op-ed about Wikimedia Commons, the Wikimedia project which seeks to be a resource of free, educational media. Perhaps you feel it presented a valid argument, perhaps not; that's for you to make up your mind on. I would like to take this chance to offer a defence of Commons.
As you probably know, Wikimedia Commons acts as a central repository for images. Once an image is on Commons, any project can use it, exactly the same way they can use their own images. It's an incredibly valuable tool for the Wikimedia project as a whole, as it prevents duplication and provides a central place to search. You want an image of something for your Wikipedia article? Commons probably has a category for it. And that is the same whether you're editing in English, German, Arabic or even Tagalog.
I first joined Commons back in October 2007, when I was working on an eclectic mix of the Ffestiniog Railway and McFly. About six months later I became a Flickrreviewr, checking uploads from Flickr that for some reason couldn't be checked by a bot, and a month or so after that I became an admin, primarily so I could deal with all the copyright violations I came across with the Flickr work. In the five years since, my interest in admin duties has waxed and waned, and I have had little side-projects, but Commons swiftly became my home wiki. My watchlist has some 60,000 pages on it, of which 10,000 are my own photos.
Commons has its problems, I cannot deny that. The number of people who believe that because they found a photo on Google it can be uploaded to Commons is simply staggering. The search engine is designed for pages, not images (a limitation of the software). The community can be a bit fractured; it can be very difficult to get people blocked for being terminally incapable of working with others (even when their name comes back to the admin noticeboards week after week after week), and we have remarkably little in the way of actual policy. Indeed, our main guiding principles boil down to two pages: Commons:Licensing and Commons:Project Scope. The former tells us which files we're allowed, the latter which ones we want. Scope is the real issue of the moment, and in a nutshell it says that Commons collects educational media. Which raises the question, "what is educational?"
A similar problem has existed on Wikipedia for years - what is notable? There are even factions - deletionists, who think articles must prove their notability, and inclusionists, who think that there's no harm in letting potentially non-notable articles stay. And so it is on Commons - those who adhere to a strict definition of educational, and those who accept a somewhat looser guide.
And this dispute would be fine, if it were argued on Commons and in the abstract. But that is not what happens. The major rift happened a few years ago, when, apparently due to a disparaging Fox News article about the amount of "porn" on Wikipedia, Jimbo Wales, co-founder of Wikimedia, came onto Commons and started deleting sexuality images. That didn't really go over well with the Commons community, of which Jimbo has never been a part, especially when it was found he was deleting images which were in use on multiple projects. To cut a long story short, the deleted images were restored and Jimbo lost admin rights at Commons, as did several admins who had joined him in his purge. Many of the images Jimbo deleted were in fact subsequently deleted again, following deletion requests to allow for community input. But the deed had been done, and for a large proportion of the Commons community, it appeared that Jimbo was not to be trusted to have the best interests of the project at heart.
The issue stewed for a few years, and reemerged with a vengeance last year. Again, it has been fought almost entirely over what some describe, disparagingly, as "porn". As I mentioned earlier, the Commons search engine is not really designed for images, and so it tends to give unexpected results. One of those results was the search "toothbrush" returning a picture of a woman using an electric toothbrush for self-pleasure as one of the top results. This was entirely a legitimate result - it was a picture of a toothbrush, and it was titled as such. And while the so-called "principle of least astonishment" can easily be applied to categories - Commons has a whole proliferation of "nude or semi-nude people with X" categories on the grounds that nudity should not appear in the parent category "X" - it doesn't really work for a search algorithm, not if you want to continue with correct categorisation. Until the Wikimedia Foundation develops some form of search content filter (which itself brings up issues of what exactly should be filtered - should images of Muhammad be filtered out? What about Nazi images due to German law?) all that can really be done is to either delete the image or rename it to try and reduce the chances of an innocuous search returning it. I personally favour keeping the images, and this has led me to be named as part of a "porn cabal" by people, most of whom rarely if ever edit on Commons, who favour deleting the images.
But the issue, for me, is that these issues so rarely get brought up on Commons. Instead of using the deletion request system to highlight potentially problematic images (which is after all what the process is for), the detractors would rather just soapbox on Wikipedia - usually on Jimbo's talk page - about how awful Commons is, and how this latest penis photo proves once and for all that I (or some other member of the "porn cabal") am the worst admin in the history of forever and deserve to be shot out of a cannon into a pit of ravenous crocodiles. What people don't seem to understand is that in large part, I do agree. Commons has problems. We do have too many low quality penis pictures - so many that we even have a policy on it - and so I have a bot which searches new uploads for nudity categories and creates a gallery so I can see any problematic ones, and thus nominate them for deletion. This somehow seems to make me an even worse admin in many people's eyes. We should indeed have better checks to ensure that people in sexual pictures consented to having their pictures uploaded, and I would like to see a proper policy on this. I'd like to see the community as a whole have a reasoned discussion on the matter, for a policy to be drafted, amended, voted on and finally adopted. But that is very difficult when you feel you are under attack all the time, and your attackers are not willing to actually work with you to create a better project.
Wikimedia projects are based around collaboration and discussion within the community. I would urge those of you who feel that Commons is "broken" to come to Commons and offer constructive advice. Attacking long-term Commons users will get you nowhere, nor will pasting links on other projects, or on Jimbo's talk page. If you truly want to make Commons a better place, and are not in fact just looking for any reason to tear it down, then come to Commons. Come to the village pump - tell us what is wrong, and how you feel we could do better. Use the systems we have in place for project discussions to discuss the project. Sitting back and sniping from afar does nothing for your cause, and it only embitters the Commons community.
Come and talk to us.
Discuss this story
Contentious images
I do not see any issue with contentious images being filtered by choice, as it is all personal preference. One technical way to resolve this is to tag those photos and let users decide what photos they want to see. For example, tag the "contentious images" with broad categories (sexually explicit / violence / gory / discretion). Ordinary users would have to explicitly select/tick an option for their search results to include photos that fall under those categories; otherwise only untagged photos would be displayed. This is not censorship, because it is the user's decision what they want to see, without being forced to see photos that they do not want to see in the first place. Users have the right to choose. Why should a group of admins decide on behalf of users what photos they must or cannot see? User power. Yosri (talk) 01:26, 21 June 2013 (UTC)
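To make the opt-in tagging idea above concrete, here is a minimal sketch of how it might work. It is purely illustrative: the tag names, types and default preference are hypothetical and do not correspond to any existing Commons or MediaWiki code.

```typescript
// Hypothetical sketch of tag-based opt-in filtering of search results.
// All names and tags are invented for illustration only.

type ContentTag = "sexually-explicit" | "violence" | "gore";

interface MediaFile {
  title: string;
  tags: ContentTag[]; // empty for ordinary, untagged photos
}

interface UserPreferences {
  // Tags the user has explicitly ticked to include in search results.
  includeTagged: Set<ContentTag>;
}

// Default preference: nothing ticked, so only untagged photos are shown.
const defaultPrefs: UserPreferences = { includeTagged: new Set() };

function filterSearchResults(results: MediaFile[], prefs: UserPreferences): MediaFile[] {
  return results.filter(file =>
    // Keep a file only if every one of its tags has been opted into.
    file.tags.every(tag => prefs.includeTagged.has(tag))
  );
}

// A user with default preferences sees only the untagged result...
const results: MediaFile[] = [
  { title: "Toothbrush.jpg", tags: [] },
  { title: "Example_explicit_file.jpg", tags: ["sexually-explicit"] },
];
console.log(filterSearchResults(results, defaultPrefs).map(f => f.title));

// ...while a user who has ticked "sexually explicit" sees both.
const optedIn: UserPreferences = { includeTagged: new Set<ContentTag>(["sexually-explicit"]) };
console.log(filterSearchResults(results, optedIn).map(f => f.title));
```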
[…] "filter". The "filter" flag would be disabled by default but made available to anyone. Logged in users could indefinitely opt for this flag by toggling an option in their preferences. Anonymous users could similarly toggle the feature for their session by way of a workaround that stores a temporarily valid key in their cookie data. The "filter" flag would then function much as does any limited access flag like those that affect Special:DeletedContributions and would allow new types of content to be generated differently on the server side according to the user's preference. As for the classification side of things, let's now suppose that the "filter" flag depends on a database column called "classification" and its data for any given file is always 0 by default. We then create a new Special:Classification page that contains a simple form to change the "classification" code for any given file. — C M B J 23:04, 22 June 2013 (UTC)
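For readers who find the proposal above hard to picture, this is a minimal sketch of how a per-user "filter" flag and a "classification" column defaulting to 0 could fit together. Every name and type here is hypothetical and is not drawn from the actual MediaWiki schema or API.

```typescript
// Illustrative-only sketch of the flag-plus-column idea described above.
// None of these names correspond to real MediaWiki tables or functions.

interface FileRow {
  name: string;
  classification: number; // hypothetical column; 0 = unclassified (default)
}

interface RequestContext {
  // true when the user has enabled the hypothetical "filter" flag,
  // whether via account preferences or a temporary session cookie key.
  filterEnabled: boolean;
}

// Server-side gate: decide whether to serve a file for this request.
function serveFile(file: FileRow, ctx: RequestContext): FileRow | null {
  if (!ctx.filterEnabled) {
    return file; // flag disabled (the default): behaviour is unchanged
  }
  // Flag enabled: suppress anything with a non-zero classification code.
  return file.classification === 0 ? file : null;
}

// A Special:Classification-style form would simply update the column.
function setClassification(file: FileRow, code: number): FileRow {
  return { ...file, classification: code };
}

// Example
const img: FileRow = { name: "Example.jpg", classification: 0 };
const flagged = setClassification(img, 1);
console.log(serveFile(flagged, { filterEnabled: false })?.name); // "Example.jpg"
console.log(serveFile(flagged, { filterEnabled: true }));        // null
```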
Historically and Economically Inaccurate
One of the truly great tragedies of medieval England was not so much the tragedy of the commons in its original sense but the forcible enclosure by powerful outside interests of the historic common land that had for centuries been available as a free resource for all. - no, that's still wrong. But whatever. Volunteer Marek 03:24, 21 June 2013 (UTC)
Whether the description of the medieval English commons is accurate or not, the problem with this discourse is apparent ignorance of basic economic terminology. The Tragedy of the Commons is an economic concept which is relevant to a discussion about limited resources. However: Wikimedia Commons, like the Creative Commons, is a functionally unbounded public good. It was perhaps clever, but not helpful, for the author of "The Tragedy of Wikipedia's Commons" to conflate a limited with an unbounded resource in an apparent effort to score rhetorical points. I hope the Signpost will continue to strive NOT to publish articles with contrived arguments such as this. Thank you for publishing these more nuanced and considered responses. ChristineBushMV (talk) 16:30, 21 June 2013 (UTC)
Opinion from Dcoetzee
I came to this party a bit late so I didn't submit an op-ed, but wanted to give my thoughts briefly. I'm a long-time administrator on both Commons and English Wikipedia, and I refer to both as home wikis. There is substantial overlap between us in the area of image curation and dealing with media licensing issues - I have seen a lot of great work going into Wikipedia:Possibly unfree files here, and Commons itself is quite reliant upon the excellent table at Wikipedia:Non-U.S. copyrights. I believe many of the image admins here on En would be great Commons admins, and vice versa. On the other hand, Commons' understanding of copyright law, U.S. and international, and policies surrounding it are in many ways more nuanced than En's, with extensive pages on issues like non-copyright restrictions, de minimis, and freedom of panorama, and as such it's no surprise that not everyone who excels here has the specialist understanding to administrate content on Commons.
But the main thrust of the original essay was, as these responses suggest, about scope. I want to emphasize what MichaelMaggs referred to as the "small proportion of our holdings that relate to sexual imagery and to privacy/the rights of the subject". Commons does receive a lot of low-quality penis uploads by white first-world males, for whatever reason, and we purge these without prejudice; this inspired the part of the scope policy reading: "poor or mediocre files of common and easy to capture subjects may have no realistic educational value, especially if Commons already hosts many similar or better quality examples." At the same time, Commons struggles to acquire a variety of high-quality and/or distinctive media of sex, anatomy, and pornography topics, such as medical images, images of non-whites or women, documentary photographs and videos of sexual acts, portraits of porn stars, and so on. Contrary to the moral panic that frequently surrounds the presence of sexual content at Commons, we actually need a lot more of it, just the right kind.
Our policy on photographs of identifiable people addresses many of the typical cases where a person's image may be used unethically, particularly images taken in a private setting without consent. In addition to this, there is a de facto policy that persons who request deletion of an image of themselves, which is not in use or easily replaced by another image, typically have their request honored (we call this "courtesy deletion"). We also provide courtesy deletion in some cases when users make it clear that they didn't understand the meaning of the free license at the time they used it. Photos of people online can damage reputations and be very disturbing, so we take these kinds of issues very seriously, and always weigh the benefit of works to the public carefully against the risk to the individual.
That said, much of this practice is encoded only as folk knowledge gained through experience, and deserves more thorough documentation as official policies and guidelines. Policy development on Commons can be a struggle, with a small number of users split among a huge number of tasks, and in many cases practice shifts before policy comes along to document it, as has happened with the more aggressive deletion of URAA-violating images, or with the 2257 tag for sexually explicit works. But when policy development founders, it is not through lack of attention so much as because new policies have to be effective at carving out a new area that is not adequately addressed by our core policies. As anyone who's frequented Wikipedia talk:Criteria for speedy deletion would know, rules that seem intuitive are often found to have important exceptions. As an international project, it's also important that policies on Commons are culturally-neutral and guided by the common needs of all projects.
Some of the misunderstandings between Commons and other projects arise because of poor communication: we sometimes delete files without warning users on local projects, or fully explaining to them the intricacies of the laws that require the deletion; we sometimes do not delete works that one project finds inappropriate for its local culture, but others find useful. I think an important part of our mission going forward should be to communicate our intentions and rationales to all affected parties at all times. Dcoetzee 04:41, 21 June 2013 (UTC)
Good only in speech
Although they (Commons admins) are big philosophers in speech, they never miss a chance to humiliate someone, neglecting all personality rights. JKadavoor Jee 11:46, 21 June 2013 (UTC)
Communication is the key
The original op-ed and the two responses as well as many of the comments above point to the lack of communication among parties being the source of contention. Perhaps a way to forestall future disagreements is to make sure the lines of communication are always open. Even though we're all focused on the projects, we have to go the extra distance and focus a bit more on individuals' perceived displeasure in order to see that sometimes elusive consensus. -- kosboot (talk) 12:51, 21 June 2013 (UTC)
{ {Keep Local} }
There is a way to opt out of the dysfunctional Commons asylum. Whenever one uploads a file, never upload to Commons, always upload straight to En-WP and include (in addition to a proper Rights tag) the template { {keep local} }, which will prevent the speedy deletion of the En-WP version of the file in the event that it is moved over to Commons. All files should be housed by the various language WPs, in my estimation, and Commons written off as a good idea gone terribly wrong. Carrite (talk) 17:02, 21 June 2013 (UTC)
What is the question to be answered?
NSFW: Masturbating_with_a_toothbrush.jpg
From what I have understood, these two answers to the "Pink Parrot Incident" can be summarized as "come and fix it". But this is not so simple. What is to be fixed? Let us go back to the "Tooth Brush Incident". You type "toothbrush" in the Commons search bar and you get a lot of toothbrushes. A simple request gives you 20 of them. If you look further, you obtain File:Toothbrush regurgitated by albatross on Tern Island, Hawaii - 20060614.jpg (the 40th toothbrush, on the left) and three pictures later you obtain another great moment in the life of a toothbrush, File:Masturbating_with_a_toothbrush.jpg (the 43rd toothbrush, on the right).
The long "history". of this file is *educative*. It was created on 6 May 2011. Quite immediately (2 June 2011) the "Tooth Brush Incident" appeared, and part of the people has tried to fix this incident by renaming the file into File:Woman masturbating with improvised vibrator.jpg (and removing it from the Commons:Category:Toothbrushes). Another part of the people has worked hard to keep alive this "Tooth Brush Incident", introducing again and again the searchkey "toothbrush" into the name, the categorization, a link to the file or whatever. At 19:27, 22 May 2012 (UTC), a group of admins has stated that ""We agree that there is a problem when a search for toothbrush on Commons returns this image on top of the results"".. But the problem has not been fixed as now.
From that, we can see that this long-lasting "Tooth Brush Incident" is not the result of a poor search tool, nor even of the mere existence of that file on 'commons.wikimedia.org'. It's a bigger problem, one that cannot be solved by a simple increase of the workforce at Commons. In fact, this place doesn't look like a workplace, and that may be the key problem.
Isn't this barnstar a great *educative* picture, in the context? Pldx1 (talk) 19:22, 21 June 2013 (UTC)
Toothbrushgate, Part 47
"One of those results was the search "toothbrush" returning a picture of a woman using an electric toothbrush for self-pleasure as one of the top results. This was entirely a legitimate result - it was a picture of a toothbrush, and it was titled as such."
... no, it was NOT a "legitimate result." Is this what someone actually wants to see when searching for toothbrushes? No? Then it's not a good result. The user is always right. Let me add that I am one of the people who roll their eyes at proposals for general "content filters," but something akin to your average Google SafeSearch should have been in place, and if it isn't now, it should be added. Nothing to do with images of Muhammad or whatever, which is a giant distraction. SnowFire (talk) 22:26, 22 June 2013 (UTC)
An observation
I was waiting to see if anyone would replace the images above of a woman masturbating with an electric toothbrush and the "hot sex barnstar" with links, but no one has. A reasonable interpretation of WP:NOTCENSORED is that if you look at WP articles about topics dealing with sexuality or anatomy one should expect to see images of nudity or sexuality. In practice, a rather silly but popular invocation of WP:NOTCENSORED as nothing but a slogan means that our readers should follow a "principle of most astonishment" where at any time you should expect to see images of nudity or sexuality. I'm not offended by it personally, but it seems obvious to me that it is not appropriate for readers of the Signpost -- hopefully read by many of the other "millions" of contributors -- to be faced with an image of a masturbating woman in the comments. Delicious carbuncle (talk) 14:25, 23 June 2013 (UTC)
Are these pictures really free pictures?
Item one, the Colgate toothbrush [1]. From what I understand, this Colgate toothbrush picture was taken by someone working at the United States Fish and Wildlife Service (USFWS), Hawaiian Islands NWR, as part of that person's official duties. As a work of the U.S. federal government, the image is *in the public domain*. Yes, this is the truth. But this is only a part of the truth.
Because this picture doesn't appear to be a *free* picture. This picture carries a prominent "Colgate" trademark. Moreover, this prominent trademark on the handle of the brush is the only thing that is clearly identifiable in the picture. How do you even know that the central part of the bolus is a toothbrush? From the trademark on the handle. This is largely not de minimis. Not convinced? Let us replace the trademark "Colgate" by the common name "commons". This gives the right picture. Convinced now?
The description attached to the picture says: An albatross bolus – undigested matter from the diet such as squid beaks and fish scales. This bolus from a Hawaiian albatross (either a Black-footed Albatross or a Laysan Albatross) found on Tern Island, in the French Frigate Shoals, Northwestern Hawaiian Islands, has several ingested flotsam items, including monofilament fishing line from fishing nets and a discarded toothbrush. Ingestion of plastic flotsam is an increasing hazard for albatrosses.
Therefore it is *fair* to use such a picture, among many other ones, to describe that "even an innocent toothbrush can turn into a fatal weapon". This is done in , and could be done in en:Marine debris, etc. But it is *unfair* to use the same picture in a way suggesting that Colgate is the worst among all these wrongful killers of innocent albatrosses, as done at and . The use at en:Bolus (digestion) as the only picture illustrating a one line article stating that "Under normal circumstances, the bolus then travels to the stomach for further digestion" is unclear.
Conclusion: the picture seems to be relevant (with the proper statements) at some pages inside Wikipedia, and irrelevant at 'commons.wikimedia.org'. Pldx1 (talk) 12:11, 23 June 2013 (UTC)
Add Google Search link to commons:Special:Search
Why reinvent the wheel? Just add a Google Search link to the commons:Special:Search page:
Then the nudity, gore, and sex challenged could use Google's SafeSearch option if they so choose.
I am sure one of the various gadgeteers on the Commons could come up with some gadget to add the link to the bottom of commons:Special:Search. It could be enabled at commons:Special:Preferences#mw-prefsection-gadgets.
Or better yet it could be added by default for all users, whether registered or anonymous. Google often does much better searches of the Commons than the MediaWiki search engine. So I and others would love to have it enabled by default.
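As a rough illustration of the kind of gadget being suggested here, the sketch below adds a site-restricted Google search link to Special:Search. It is a sketch only: the DOM selectors are assumptions about the page markup rather than verified Commons internals, and the script would need to be adapted before being enabled as an actual gadget.

```typescript
// Rough sketch of a user script/gadget that appends a "Search Commons with
// Google" link on Special:Search. Selectors below are assumptions, not
// verified against the live Commons skin.

declare const mw: any; // provided by MediaWiki's ResourceLoader at runtime

function addGoogleSearchLink(): void {
  // Only act on Special:Search.
  if (mw.config.get("wgCanonicalSpecialPageName") !== "Search") {
    return;
  }

  // Assumption: the search form exposes its query in an input named "search".
  const input = document.querySelector<HTMLInputElement>('input[name="search"]');
  const query = input ? input.value : "";

  // Restrict Google to Commons; SafeSearch is deliberately not forced on,
  // so users can toggle it themselves on the Google side.
  const url =
    "https://www.google.com/search?q=" +
    encodeURIComponent("site:commons.wikimedia.org " + query);

  const link = document.createElement("a");
  link.href = url;
  link.textContent = "Search Commons with Google";

  // Assumption about where to attach the link; fall back to the page body.
  const container = document.querySelector("#search") ?? document.body;
  container.appendChild(link);
}

addGoogleSearchLink();
```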
I never use SafeSearch though,
and so please do not add a Google search link with SafeSearch enabled by default. --Timeshifter (talk) 14:27, 24 June 2013 (UTC)
To MattBuck