The Signpost

Op-ed

Two responses to "The Tragedy of Wikipedia's Commons"

By MichaelMaggs and Mattbuck
Following last week's op-ed by Gigs ("The Tragedy of Wikipedia's Commons"), the Signpost is carrying two contrary opinions from MichaelMaggs, a bureaucrat on Wikimedia Commons, and Mattbuck, a British Commons administrator.

MichaelMaggs

The true tragedy

The title of last week's piece, "The Tragedy of Wikipedia's Commons", was perhaps rather more ironic than its author intended. One of the truly great tragedies of medieval England was not so much the tragedy of the commons in its original sense as the forcible enclosure, by powerful outside interests, of the historic common land that had for centuries been available as a free resource for all. If there is any tragedy here, it is in the author's wish to use Wikipedia to take over Wikimedia Commons and to do very much the same thing online.

Background and remit

Commons always has had, and always will have, a far broader free-content remit than that of supporting the narrow focus of an encyclopaedia. Commons provides media files in support not just of the English Wikipedia but of all the WMF projects, including Wikisource, Wikibooks, Wikivoyage and many more. These sister projects of Wikipedia often need to use media on Commons that could never be used on the Wikipedias because they are not - in Wikipedia's narrow sense - "encyclopaedic". Some of Commons' detractors like to give the impression that its collections are nothing more than a dumping ground for random non-educational content. Nothing could be further from the truth, and the energy expended by those who would criticise from the outside (but who are strangely reluctant to engage on wiki) bears little relation to the extremely small proportion of images that could in any way be considered contentious.

Commons' policies are of necessity different from, and more wide-ranging than, those of any of the individual projects. We hold many images that will never be useful to the English Wikipedia, and that is not only OK, but should be welcomed as Commons' contribution to the overall mission of the Wikimedia Foundation, "to empower and engage people around the world to collect and develop educational content under a free license or in the public domain, and to disseminate it effectively and globally". Note that the overall mission of the WMF is not "to write an encyclopaedia", but rather to develop and disseminate educational content. Supporting the English Wikipedia is one way, but by no means the only way, in which we do that, and the idea that Commons should be forcibly subjugated to the policies of a specialist encyclopaedia project would do immeasurable harm to the mission which I had hoped we were all working to support.

Contrary to the suggestion that the 2008 Commons policy on scope was an "unchallenged action by a tiny group of people", it was in fact largely an exercise in documenting for the first time the long-established but unwritten practices of the community. The policy attracted very little controversy (despite being very widely advertised, on Wikipedia and elsewhere), largely because the vast majority of it was uncontentious. Indeed, the fact that it has retained very wide community support since then indicates that we didn't do too bad a job.

With its specialised emphasis on media curation and the niceties of copyright law, Commons will never be as popular a place for editors to hang out as some of the bigger encyclopaedias. It requires not only a particular set of interests but also, at least for admins, some level of specialist knowledge that not everyone has or is interested in acquiring. Those outside the local community who see only the external carping may not realise that we have thousands of very committed editors who work tirelessly in the background curating and categorising content and bringing to the attention of the admins non-educational content that has no place in our collections.

Commons has never (as was claimed last week) been merely a repository that supports its sister WMF projects. Right from the start it had a remit to make content freely available to external re-users. As early as 2006 there was a formal proposal (since implemented as InstantCommons) to integrate into MediaWiki a mechanism specifically designed to support users on non-WMF projects. Perhaps the real worry of last week's author was that Commons currently holds too many non-encyclopaedic images of a sexual nature. But even assuming that is true, a proposal to revoke one of the fundamental free-content aims of Commons hardly seems proportionate. Instead, let's have a proper discussion on what Commons' scope should be. Times change, as do priorities, and what made sense five years ago may now perhaps need to be revisited.

Over the last few months especially, there has been a lot of discussion within Commons as well as outside about issues concerning the small proportion of our holdings that relate to sexual imagery and to privacy/the rights of the subject. Both have complex moral and legal dimensions, and neither has yet been fully resolved. I've set out the main strands of argument below, as objectively as I can, for those who may not be familiar with them. Of course, these summaries are by no means the whole story, and many of the discussions are far more subtle than I have space for here, so please bear with me if you are familiar with this and feel I have mischaracterised or omitted any important point that may be close to your own heart. I deliberately make no comment on the validity of any of these arguments.

Sexual imagery

Some argue that pornographic images (as defined in some way) are never appropriate for any of the Wikimedia projects and are simply not educational.

Others argue that we should keep most images, almost whatever the subject matter, as we need to show the whole range of human experience if we are to call ourselves a comprehensive educational resource. Anything else would be censorship.

Yet others suggest that not all the sexual images held by Commons are "educational", properly defined. Some are photographs that have been taken for non-educational purposes, for example personal gratification/entertainment, and/or have been uploaded for the same purpose or by users who wish to push an extreme view that equates any limits at all with unacceptable "censorship".

Finally, some hold that Commons has too many images in certain marginally-educational areas that, taken overall, create an oppressive or threatening environment (e.g. for women) which may be harming the project as a whole.

Privacy and the rights of the subject

One strand of argument is that we should do more to respect the rights of individuals who are identifiable in a photograph, and recognise that, even where the image may be legal, it can be highly damaging to the individual. Even when an outsider might naively think the image unremarkable, it may still be considered threatening, harassing or oppressive by its subject.

Another strand is that allowing the subject of a photograph a say on whether it should stay on Commons or not opens the door to all sorts of censorship. Proponents argue it's essential that we are able to collect all types of educational image, including those that may offend the subject.

Review

If there is indeed a problem with the boundaries of Commons' scope - perceived or otherwise - we should tackle it head-on with open community discussion. Commons should be, and I believe is, receptive to the views of everyone within the Wikimedia community in reviewing its curatorial policies. But the way to get things changed is to engage rather than to criticise from afar.

A comprehensive review of Commons' scope is just starting now, and you need never say again that your voice cannot be heard. Please talk.

Please visit Commons' Review of Scope pages now, and make your views known for the sake of all the Wiki communities.

Conclusion

Commons has proved to be a phenomenal success in the years since its introduction, and we should be proud of what has been achieved. We should keep it, improve it, and celebrate it.



Mattbuck

Last week, the Signpost published a rather scathing op-ed about Wikimedia Commons, the Wikimedia project which seeks to be a resource of free, educational media. Perhaps you feel it presented a valid argument, perhaps not; that's for you to make up your own mind about. I would like to take this chance to offer a defence of Commons.

As you probably know, Wikimedia Commons acts as a central repository for images. Once an image is on Commons, any project can use it, exactly the same way they can use their own images. It's an incredibly valuable tool for the Wikimedia project as a whole, as it prevents duplication and provides a central place to search. You want an image of something for your Wikipedia article? Commons probably has a category for it. And that is the same whether you're editing in English, German, Arabic or even Tagalog.

I first joined Commons back in October 2007, when I was working on an eclectic mix of the Ffestiniog Railway and McFly. About six months later I became a Flickrreviewr, checking uploads from Flickr that for some reason couldn't be checked by a bot, and a month or so after that I became an admin, primarily so I could deal with all the copyright violations I came across with the Flickr work. In the five years since, my interest in admin duties has waxed and waned, and I have had little side-projects, but Commons swiftly became my home wiki. My watchlist has some 60,000 pages on it, of which 10,000 are my own photos.

Commons has its problems; I cannot deny that. The number of people who believe that because they found a photo on Google it can be uploaded to Commons is simply staggering. The search engine is designed for pages, not images (a limitation of the software). The community can be a bit fractured; it can be very difficult to get people blocked for being terminally incapable of working with others (even when their name comes back to the admin noticeboards week after week after week); and we have remarkably little in the way of actual policy. Indeed, our main guiding principles boil down to two pages: Commons:Licensing and Commons:Project Scope. The former tells us which files we're allowed, the latter which we want. Scope is the real issue of the moment, and in a nutshell it says that Commons collects educational media. Which raises the question, "what is educational?"

A similar problem has existed on Wikipedia for years - what is notable? There are even factions - deletionists, who think articles must prove their notability, and inclusionists, who think that there's no harm in letting potentially non-notable articles stay. And so it is on Commons - those who adhere to a strict definition of educational, and those who accept a somewhat looser guide.

And this dispute would be fine, if it were argued on Commons and in the abstract. But that is not what happens. The major rift happened a few years ago, when, apparently due to a disparaging Fox News article about the amount of "porn" on Wikipedia, Jimbo Wales, co-founder of Wikipedia, came onto Commons and started deleting sexuality images. That didn't really go over well with the Commons community, of which Jimbo has never been a part, especially when it was found he was deleting images which were in use on multiple projects. To cut a long story short, the deleted images were restored and Jimbo lost admin rights at Commons, as did several admins who had joined him in his purge. Many of the images Jimbo deleted were in fact subsequently deleted again, following deletion requests to allow for community input. But the deed had been done, and for a large proportion of the Commons community, it appeared that Jimbo was not to be trusted to have the best interests of the project at heart.

The issue stewed for a few years, and reemerged with a vengeance last year. Again, it has been fought almost entirely over what some describe, disparagingly, as "porn". As I mentioned earlier, the Commons search engine is not really designed for images, and so it tends to give unexpected results. One of those results was the search "toothbrush" returning a picture of a woman using an electric toothbrush for self-pleasure as one of the top results. This was entirely a legitimate result - it was a picture of a toothbrush, and it was titled as such. And while the so-called "principle of least astonishment" can easily be applied to categories - Commons has a whole proliferation of "nude or semi-nude people with X" categories on the grounds that nudity should not appear in the parent category "X" - it doesn't really work for a search algorithm, not if you want to continue with correct categorisation. Until the Wikimedia Foundation develops some form of search content filter (which itself brings up issues of what exactly should be filtered - should images of Muhammad be filtered out? What about Nazi images, due to German law?) all that can really be done is to either delete the image or rename it to try to reduce the chances of an innocuous search returning it. I personally favour keeping the images, and this has led me to be named as part of a "porn cabal" by people, most of whom rarely if ever edit on Commons, who favour deleting the images.

But the problem, for me, is that these issues so rarely get brought up on Commons. Instead of using the deletion request system to highlight potentially problematic images (which is after all what the process is for), the detractors would rather just soapbox on Wikipedia - usually on Jimbo's talk page - about how awful Commons is, and how this latest penis photo proves once and for all that I (or some other member of the "porn cabal") am the worst admin in the history of forever and deserve to be shot out of a cannon into a pit of ravenous crocodiles. What people don't seem to understand is that in large part, I do agree. Commons has problems. We do have too many low-quality penis pictures - so many that we even have a policy on it - and so I have a bot which searches new uploads for nudity categories and creates a gallery so I can see any problematic ones, and thus nominate them for deletion. This somehow seems to make me an even worse admin in many people's eyes. We should indeed have better checks to ensure that people in sexual pictures consented to having their pictures uploaded, and I would like to see a proper policy on this. I'd like to see the community as a whole have a reasoned discussion on the matter, for a policy to be drafted, amended, voted on and finally adopted. But that is very difficult when you feel you are under attack all the time, and when your attackers are not willing to actually work with you to create a better project.

Wikimedia projects are based around collaboration and discussion within the community. I would urge those of you who feel that Commons is "broken" to come to Commons and offer constructive advice. Attacking long-term Commons users will get you nowhere, nor will pasting links on other projects, or on Jimbo's talk page. If you truly want to make Commons a better place, and are not in fact just looking for any reason to tear it down, then come to Commons. Come to the village pump - tell us what is wrong, and how you feel we could do better. Use the systems we have in place for project discussions to discuss the project. Sitting back and sniping from afar does nothing for your cause, and it only embitters the Commons community.

Come and talk to us.


Discuss this story

General
  • There's a minor typo: "free-content remit than that that of supporting". Mohamed CJ (talk) 00:34, 21 June 2013 (UTC)[reply]
  • MichaelMaggs's article doesn't really tell us his/her opinion on sexual images, while Mattbuck is basically supporting the status quo. My main interest in Commons is uploading images for use on Wikipedia, but I also upload images knowing that they may never be used here. Although I !voted for Muhammad images to be used in the article [1], I find the excessive collection of seemingly useless sexual content on the Commons disruptive, repellent and pointy. Mohamed CJ (talk) 01:04, 21 June 2013 (UTC)[reply]
    • No one is supporting the status quo. The theme of both responses is: "If you have a problem with Commons, engage with us instead of sniping from afar." Which is good advice. Powers T 01:22, 21 June 2013 (UTC)[reply]
      • I can see how "Come talk to us" is a definite vote for the status quo, since it seems the only well-developed policy on Commons is "Your problem isn't our problem". These responses consider my op-ed scathing about Commons, and maybe I did use harsh language in places, but ultimately, neither of them addresses how Commons can continue to assert policy autonomy while still serving the inter-wiki media-sharing function. My op-ed offered a solution that in my eyes is win-win, allowing autonomy for Commons, and removing the repercussions of Commons-local policy or lack thereof from the other projects Commons serves. Gigs (talk) 02:36, 21 June 2013 (UTC)[reply]
  • I'm pleased to see these two responses to the original op-ed. I won't comment on the sexual images debate, but otherwise the responses reflect my long-held view that commons is not just a source of images for wikimedia projects, but also a useful library of free licence images, etc., that anyone can use. I have uploaded many of my own images to commons. Although I often immediately add an image I have uploaded to an appropriate wikipedia article, that is not always the case, as I also frequently upload images on the basis that someone might find them useful somewhere at some time in the future. If I have one criticism of commons, it is that many of the images in commons are not of particularly high quality. But I suspect that that problem is merely one of many reasons for us to upload better images to commons than many of the ones that are already there. Bahnfrend (talk) 02:03, 21 June 2013 (UTC)[reply]

Contentious images

I do not see any issue with contentious images being filtered by choice, as it is all personal preference. One technical way to resolve this is to tag those photos and let users decide what photos they want to see. For example, tag the "contentious images" in broad categories, e.g. sexually explicit, violent, gory, or otherwise requiring discretion. Ordinary users would have to explicitly select/tick a search option to include photos that fall under those categories; otherwise only untagged photos would be displayed. This is not censorship, because it is the user's decision what they want to see, without being forced to see photos that they did not want to see in the first place. Users have the right to choose. Why should a group of admins decide on behalf of users what photos they must or cannot see? User power. Yosri (talk) 01:26, 21 June 2013 (UTC)[reply]

Unfortunately, this would be censorship, because it enables entities other than the user to manipulate the classification system to forcibly impose it through technical and/or punitive means.   — C M B J   01:32, 21 June 2013 (UTC)[reply]
The tagging must not be done arbitrarily or by a single person; it must follow guidelines set by a committee or by voting. Users can still choose what to see or ignore. Why should somebody force me to see things that I do not want to see? Yosri (talk) 01:42, 21 June 2013 (UTC)[reply]
Consider the following scenario. The New Foo State Education Agency (NFSEA) boasts a promise of zero tolerance for prohibited activities and commissions a task force to implement the policy across all educational institutions in New Foo. NFSEA's task force then concludes that content-control software will be necessary to enforce a provision that forbids, among other things, accessing online pharmacies. Accordingly, the task force recommends acquisition of a compliant software suite; one such suite, "Foo Filter", it notes, is a government off-the-shelf product made available through an NFSEA-approved vendor, FuTek. The NFSEA then negotiates with FuTek and procures a license to use Foo Filter over the next ten fiscal years. The NFSEA deploys Foo Filter at all educational institutions across New Foo, including institutions ranging in scope from elementary schools to public research universities, then concludes that the implementation has been completed. Everything seems to be in order and life goes on as usual. Several weeks later, class is back in session at Foo University, and Joseph, a sophomore at FU, is at the computer science laboratory reading Wikipedia articles pertaining to an upcoming assignment he has on human rights. He is particularly moved by Abu Ghraib torture and prisoner abuse and plans to make his presentation on the subject. However, upon visiting that article, he soon realizes that the article's twelve images are all inaccessible except for one: File:Navy consolidated brig -- Mirimar CA.jpg. He raises the point with a member of the laboratory's staff and asks her why students aren't allowed to access these images. "I'm sorry," she says, "these images are restricted because Foo Filter automatically blocks all images classified as offensive in nature." Joseph replies, "but doesn't that go against the idea of free speech?" "Yes," she says, "but Foo Filter's use is mandated on all state campuses and there are stiff penalties for noncompliance." "So you're saying that I'm going to have to walk back to my dorm if I want to use one of these images in my presentation?" "No," she says, "the dormitories actually use the same network, so you won't be able to access it there, either." "Ridiculous," Joseph says. "Rules are rules," she sighs.   — C M B J   13:37, 21 June 2013 (UTC)[reply]
That was really long and unhelpful. -mattbuck (Talk) 14:16, 21 June 2013 (UTC)[reply]
Sorry that it did not resonate well with you. The point was to illustrate the concept that I outlined above, which is that the practical effects of an optional filter extend beyond that of user choice.   — C M B J   14:27, 21 June 2013 (UTC)[reply]
If someone wanted to prevent others from seeing parts of Wikipedia, they could simply block all of WP or block all images from WP. You describe a scenario in which only specific images are blocked - which is worse? Delicious carbuncle (talk) 14:56, 21 June 2013 (UTC)[reply]
Don't Google and other large image-hosting websites do the same? It's common sense. Mohamed CJ (talk) 06:57, 21 June 2013 (UTC)[reply]
The owner of the network should be allowed to block it. I.e., if I open up my wifi network, I might want to block some IPs at the router. If you want to access those, go to a CC ("Cyber Cafe") or Starbucks. Yosri (talk) 11:23, 22 June 2013 (UTC)[reply]
  • What is amazing is just how little interest there is in technical mechanisms to allow users to control what they find offensive. Knowing very little about Javascript, I wrote up a tiny little script [2] that actually hid all the images in Muhammad. This was proof-of-principle of an idea I had gone on about at considerable length in User:Wnt/Personal image blocking. We could allow people who are offended to form networks, transclude together huge lists of blacklisted images, doing so collaboratively without requiring any Official View of what is a Bad Image. It doesn't seem like that is of any interest to anyone though. Despite talk of people being offended, the cause seems to be more about trying to win power to affect what other people see. If you don't have personal choice of what to block and whose blocklists to transclude into your own - if you have a project-wide set of categories to include and exclude content - then inevitably people will disagree on those categories, and someone has to so regretfully place himself in charge of saying who is right and who has to be banned to keep him from disagreeing with the others. Wnt (talk) 07:54, 21 June 2013 (UTC)[reply]
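For illustration, here is a minimal sketch of the kind of personal image-blocking user script described above. It is not Wnt's actual script: the blocklist contents are hypothetical, though mw.hook is a standard part of MediaWiki's JavaScript environment. A fuller version could merge in blocklists transcluded from other users' pages, as suggested.

    // Minimal sketch of a personal image-blocking user script (hypothetical example).
    // Whenever page content is rendered, hide any image whose file name appears
    // on the user's own blocklist.
    mw.hook( 'wikipage.content' ).add( function ( $content ) {
        var blocklist = [ 'Example_image_1.jpg', 'Example_image_2.png' ]; // hypothetical entries
        $content.find( 'img' ).each( function () {
            var src = decodeURIComponent( this.src );
            var blocked = blocklist.some( function ( name ) {
                return src.indexOf( name ) !== -1;
            } );
            if ( blocked ) {
                $( this ).hide();
            }
        } );
    } );

Because a userspace .js page can be edited only by its owner and by admins, each such blocklist stays under the individual user's control, with no project-wide judgement of what counts as a Bad Image.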
Decentralizing this sort of scheme is actually the best suggestion I've heard yet, although it does still worry me that if lists gain enough popularity they will be used to do harm. If China catches wind of such a scheme, for example, they could exploit our work to easily block access to more legitimate content than would otherwise be feasible. There are also considerations in places like Iran, where homophobic lists, for example, could theoretically contribute to persecution efforts.   — C M B J   13:42, 21 June 2013 (UTC)[reply]
Some of those things worry me too, but the point is, if users write up some user scripts to do what they want, that's their right and I can't stop it, nor should I want to. There are technical refinements that might be helpful (such as ensuring that it is hidden from others whether a user is actually running a script he seems to have active on his page) but they might only provide a false sense of security anyway (since the connection could be spied on). If people are that intimidated in an area, they probably are already being effectively censored anyway. Wnt (talk) 15:22, 21 June 2013 (UTC)[reply]
Come to think of it, the decentralized model actually gave rise to another thought in my mind, which is that there may be some technical ways around this problem of exploitation. The most important part would be to disincentivize unauthorized attempts to interface with the system, and this could be implemented in several different ways. One such example would be to make classifications a hidden attribute. The system could then allow users to synchronize their filter preference using an automatically generated key that is unique to either their session or account, which, if valid, would allow embedded files to be first checked against the classification table. This would presumably prevent the vast majority of abuse while still allowing users to have control over what they see.   — C M B J   11:37, 22 June 2013 (UTC)[reply]
I'm not quite sure I understand that, but my thought is that the list could be kept as a userspace .js file (perhaps in a JSONP format to permit use on multiple WMF projects). Those files already have a special advantage that no one else can edit them but the user and admins; it is possible that a small technical measure might also prevent others from reading them. But again, it sort of asks for trouble because who knows if an admin will be co-opted to check up on how people in the faith are doing, etc. - it might reduce privacy rather than increase it when such things are figured in. So long as people can simply not log in or not enable Javascript these seem like better options for a user in such a strange position. Wnt (talk) 16:34, 22 June 2013 (UTC)[reply]
Basically, one way of thinking about this would be that a new user flag could be created, let's call it "filter". The filter flag would be disabled by default but made available to anyone. Logged in users could indefinitely opt for this flag by toggling an option in their preferences. Anonymous users could similarly toggle the feature for their session by way of a workaround that stores a temporarily valid key in their cookie data. The filter flag would then function much as does any limited access flag like those that affect Special:DeletedContributions and would allow new types of content to be generated differently on the server side according to the user's preference. As for the classification side of things, let's now suppose that the filter flag depends on a database column called classification and its data for any given file is always 0 by default. We then create a new Special:Classification page that contains a simple form to change the classification code for any given file.   — C M B J   23:04, 22 June 2013 (UTC)[reply]
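To make the shape of that proposal concrete, here is a toy sketch of the check it implies. Every name here (shouldServeFile, hasFilterFlag, classification) is hypothetical - nothing like this exists in MediaWiki - and the real logic would run server-side:

    // Toy sketch of the opt-in filter logic proposed above (all names hypothetical).
    // classification 0 = unclassified, the default for every file; any other value
    // means the file has been classified via the proposed Special:Classification form.
    function shouldServeFile( file, user ) {
        if ( !user.hasFilterFlag ) {
            return true; // the filter flag is off by default for everyone
        }
        return file.classification === 0; // opted-in users skip classified files
    }

    // An opted-in user does not receive a classified file;
    // a user with default preferences still does.
    console.log( shouldServeFile( { classification: 3 }, { hasFilterFlag: true } ) );  // false
    console.log( shouldServeFile( { classification: 3 }, { hasFilterFlag: false } ) ); // true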
"If China catches wind of such a scheme, for example" This is offensive and, pardon my bluntness, stupid. "If the United States Navy catches wind of them airplane things, they might get some !
The statement is retarded and the ridiculous propganda behind it is even worse. Cisco built the Great Firewall, okay ? "The Chinese" have already got wind of that filtration concept, fella. And they do a lot better job than a bunch of twits living in Mummy's basement.
I live in China. The GFW can be a pain in the behind. But we all know it's there and we all know why and we all account for the fact that the bureaucracy is exercising a thousand-year-old cultural imperative to put a happy smiling face on all public events. The ridiculous crap that Westerners (esp. Americans) believe about the Communist Party trying to retain control of their power is so ludicrous it doesn't deserve comment.
On the other hand, the NSA playing Big Brother hiding in the closets of *all* Americans is not a tinfoil-beanie paranoid fantasy. So please take the ridiculous spew about "if China got wind of this !!" and place it where the sun doesn't shine, along with all the rest of the fascist propaganda that the Amerikanski i-dot-tens love to spew.
Someone, somewhere in this long discussion claimed that "there is no technical answer." Of course there is. Both BeOS and OS/2 had extended attributes in their file systems that could determine all sorts of things about any given file. Use the same idea: all one would have to do would be to tag items of a sexual or violent nature and let *the user* decide what he or she wishes to see.
But the point is that neither party to the discussion wants to allow that. One group wants to remove anything they don't like; the other group wants to force people who find sexual or violent materials uncomfortable to wade through them anyway. "That's FREEDOM!"
No it isn't. Freedom is being able to make the decision for yourself. The US is big on FREEDOM! as long as it's the freedom that the powers-that-be have decided is okay. As a Jewish friend once told me, "Nazi Germany really wasn't a bad place - as long as you were a nazi." Or as George W freedomiciously informed the Palestinians, "You better democratically elect someone more acceptable to us than Yasser Arafat."
Either you believe in freedom or you are a fascist. If you believe in freedom, then you give people the tools that allow them to make their own choices. You don't ram your own ideas down their throats, no matter how much you think it "would be good for them." It's that simple.
I have 100 times more freedom in Axis of Evil Commie Red China than Americans do in the US. Honest. That's sick. 210.22.142.82 (talk) 07:35, 24 June 2013 (UTC)[reply]
I'm very sorry for the distress that my comments caused you. The reason I mentioned China is because the government has sought to censor Wikipedia in the past and because the Ministry of Industry and Information Technology recently discussed plans to strengthen its control over some mainstream content providers. I did not intend to suggest that China is inherently bad or socially inferior, or that the unique consideration you speak of is without merit. Incidentally, your comment actually took the place of a clarification that I had written but withdrawn for further thought; it is now added above. I also believe that your idea of file attributes is nearly identical in principle to what I was trying to say.   — C M B J   10:07, 24 June 2013 (UTC)[reply]
I'm not really distressed :) but thanks for the concern. I am just *so* tired of the hypocrisy that runs rampant in the US. Yes, China censors the news. Guess what? We are all aware of it. AND, at this time I am not so sure I am opposed. They didn't toss Google over "censorship"; they tossed Google over spying and commercial issues. They have no intention whatsoever of letting a foreign quasi-governmental establishment have that kind of control in China. And they do not like the kind of hate-filled rabble-rousing news that is all the rage in the US. A news media that everyone knows is slanted and no one really believes is better than the pretentiously uncensored yet in fact *very* slanted so-called news that a large portion of the populace does really believe. Which is worse for society, CCTV or Faux News?
The truth is, the US is worse and censors in a much more insidious manner. The US has feet of clay ...no, worse than clay. The Establishment of the US is ruthless, corrupt, murderous (John F, Robert, Martin Luther, Bobby Seale, &c &c) deceitful (Vietnam, Iraq) ... you name it. Open your eyes. It's not China that you need to be concerned about. In the US you have the total freedom to say what the Establishment wants to hear. Or act weird enough that they can call you a crackpot and use you to whitewash their filthy lies.
If you want to use a country as an example of censorship, please don't start with China.
About technical issues, you are right - solving this problem would be simple if people actually wanted to solve it. Just tag all files, somewhat like the exif info on a photograph, only simpler. Make ten categories if you want. BFS, HPFS, and XFS can all do that right in the file system. Probably other file systems also. Then a user could click a radio button if he chooses not to see any items in those categories. It would solve both the toothbrush issue and the vaginal Simpsons painting. File is tagged "sexual", and suddenly both the Mennonites and Anton LaVey's disciples can be happy. (Except truth is, neither party would be happy. The Mennonites don't care if they can't see sex; they don't want *anyone* to see sex. And the Satanists the same, except opposite. That's the root problem :)
Wake up, America. The US is **worse** than China about freedom, equality, and human rights. That's why I moved here. Not joking. 210.22.142.82 (talk) 13:24, 24 June 2013 (UTC)[reply]

Historically and Economically Inaccurate

One of the truly great tragedies of medieval England was not so much the tragedy of the commons in its original sense as the forcible enclosure, by powerful outside interests, of the historic common land that had for centuries been available as a free resource for all. - no, that's still wrong. But whatever. Volunteer Marek 03:24, 21 June 2013 (UTC)[reply]

Whether the description of the medieval English commons is accurate or not, the problem with this discourse is apparent ignorance of basic economic terminology. The Tragedy of the Commons is an economic concept which is relevant to a discussion about limited resources. However: Wikimedia Commons, like the Creative Commons, is a functionally unbounded public good. It was perhaps clever, but not helpful, for the author of "The Tragedy of Wikipedia's Commons" to conflate a limited with an unbounded resource in an apparent effort to score rhetorical points. I hope the Signpost will continue to strive NOT to publish articles with contrived arguments such as this. Thank you for publishing these more nuanced and considered responses. ChristineBushMV (talk) 16:30, 21 June 2013 (UTC)[reply]

I'd defend my title by saying that administrator and volunteer effort is not an unlimited resource; it's definitely scarce in economic terms. I think that many of the problems of Commons do stem from a lack of resources. Much of the lack of content-policy development comes from a desire not to get drawn into content disputes on the encyclopedias, combined with a lack of administrative resources to develop a nuanced policy that prevents Commons from becoming a Flickr-esque dumping ground. All of these things require a great deal of administrative effort, and when you are spending all your time weeding the firehose of license problems that comes from the encyclopedias, you don't have much time for nuanced policy discussion or development. Gigs (talk) 17:21, 21 June 2013 (UTC)[reply]
The resource is the media themselves, their storage and distribution. A modest proposal: perhaps if encyclopedia editors focused more on developing well-written encyclopedic content in summary style with rigorous citations, and less on policing the commons, it would be a win-win? ChristineBushMV (talk) 18:24, 21 June 2013 (UTC)[reply]
You're missing the point. Or, actually, points, plural.Volunteer Marek 04:25, 22 June 2013 (UTC)[reply]
You're missing the point. People have the right to oppose enclosures, and did, for example in a series of enclosure riots in the 1500s described in that article. They realized - as too many today seem to have forgotten - that as more and more things are taken out of the public domain and sold off as "rights", the common inheritance of every human being becomes smaller and smaller, so those born poor become poorer and poorer, until you have the absurdity of whole countries running automatically by the power of metals and fossil fuels and machines, but all the profit goes to a tiny elite that "owns" all the resources God provided under this Earth (and next, in space), and the others are called parasites for wishing they had a way to live.
Though your original point itself, criticizing this history, may seem like a distraction, it isn't really, because the whole point of Commons and of a wide-ranging and open Commons is to try to provide the poor of the world (and the others, whose rights, it turns out, depend on the rights of the poor to exist) with open access to at least think and read and write about a larger legacy of ideas, against those who believe that the power of learning and thought itself should be rationed to an elite and its favored pets, and forever foreclosed from the larger part of humanity, whose purpose is solely to be made extinct or reduced to the status of mere raw material, a sacrifice to Moloch under the name of Spencerism. To some, of course, the entire mission of Commons is therefore illegitimate, and they need merely begin somewhere. Wnt (talk) 16:51, 22 June 2013 (UTC)[reply]
This is probably as good place as any to let you know that I have a standing personal policy to ignore and not respond to your nonsensical rants, Wnt. Here and elsewhere. Volunteer Marek 21:13, 24 June 2013 (UTC)[reply]

Opinion from Dcoetzee

I came to this party a bit late so I didn't submit an op-ed, but wanted to give my thoughts briefly. I'm a long-time administrator on both Commons and English Wikipedia, and I refer to both as home wikis. There is substantial overlap between us in the area of image curation and dealing with media licensing issues - I have seen a lot of great work going into Wikipedia:Possibly unfree files here, and Commons itself is quite reliant upon the excellent table at Wikipedia:Non-U.S. copyrights. I believe many of the image admins here on En would be great Commons admins, and vice versa. On the other hand, Commons' understanding of copyright law, U.S. and international, and the policies surrounding it are in many ways more nuanced than En's, with extensive pages on issues like non-copyright restrictions, de minimis, and freedom of panorama, and as such it's no surprise that not everyone who excels here has the specialist understanding to administrate content on Commons.

But the main thrust of the original essay was, as these responses suggest, about scope. I want to emphasize what MichaelMaggs referred to as the "small proportion of our holdings that relate to sexual imagery and to privacy/the rights of the subject". Commons does receive a lot of low-quality penis uploads by white first-world males, for whatever reason, and we purge these without prejudice; this inspired the part of the scope policy reading: "poor or mediocre files of common and easy to capture subjects may have no realistic educational value, especially if Commons already hosts many similar or better quality examples." At the same time, Commons struggles to acquire a variety of high-quality and/or distinctive media on sex, anatomy, and pornography topics, such as medical images, images of non-whites or women, documentary photographs and videos of sexual acts, portraits of porn stars, and so on. Contrary to the moral panic that frequently surrounds the presence of sexual content at Commons, we actually need a lot more of it, just the right kind.

Our policy on photographs of identifiable people addresses many of the typical cases where a person's image may be used unethically, particularly images taken in a private setting without consent. In addition to this, there is a de facto policy that persons who request deletion of an image of themselves, which is not in use or easily replaced by another image, typically have their request honored (we call this "courtesy deletion"). We also provide courtesy deletion in some cases when users make it clear that they didn't understand the meaning of the free license at the time they used it. Photos of people online can damage reputations and be very disturbing, so we take these kinds of issues very seriously, and always weigh the benefit of works to the public carefully against the risk to the individual.

That said, much of this practice is encoded only as folk knowledge gained through experience, and deserves more thorough documentation as official policies and guidelines. Policy development on Commons can be a struggle, with a small number of users split among a huge number of tasks, and in many cases practice shifts before policy comes along to document it, as has happened with the more aggressive deletion of URAA-violating images, or with the 2257 tag for sexually explicit works. But when policy development founders, it is not through lack of attention so much as because new policies have to be effective at carving out a new area that is not adequately addressed by our core policies. As anyone who's frequented Wikipedia talk:Criteria for speedy deletion would know, rules that seem intuitive are often found to have important exceptions. As an international project, it's also important that policies on Commons are culturally-neutral and guided by the common needs of all projects.

Some of the misunderstandings between Commons and other projects arise from poor communication: we sometimes delete files without warning users on local projects, or without fully explaining to them the intricacies of the laws that require the deletion; we sometimes do not delete works that one project finds inappropriate for its local culture but that others find useful. I think an important part of our mission going forward should be to communicate our intentions and rationales to all affected parties at all times. Dcoetzee 04:41, 21 June 2013 (UTC)[reply]

Dcoetzee, I agree with you that policy documentation on Commons is very lacking, mostly due to a lack of administrative resources. There's an open RfC over there right now and I have proposed the beginnings of a working draft on a more nuanced inclusion policy. If you have time, come check it out. Gigs (talk) 17:25, 21 June 2013 (UTC)[reply]

Good only in speech

Although they (Commons admins) are big philosophers in speech, they never miss a chance to humiliate someone, neglecting all personality rights. JKadavoor Jee 11:46, 21 June 2013 (UTC)[reply]

Your judgement is quite indiscriminate: neither of the two op-ed authors voted to keep that image. Of the admins participating in the deletion discussion, 4 voted to delete and 5 voted to keep. --Túrelio (talk) 12:20, 21 June 2013 (UTC)[reply]
It was a contentious DR, and was decided on the grounds that the subject is a public figure and the media are not easily replaceable. I'm sure if I tried I could find many XfDs on en.wp that I disagree with, but that just shows that my own opinions are not the consensus ones. -mattbuck (Talk) 12:22, 21 June 2013 (UTC)[reply]
I said "Commons admins", not the above admins. The DR was closed by an admin; am I right? Please appoint admins who have the common sense to read and understand what is written in our policies: "While some aspects of ethical photography and publication are controlled by law, there are moral issues too. They find a reflection in the wording of the Universal Declaration of Human Rights, Article 12: "No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation." Common decency and respect for human dignity may influence the decision whether to host an image above that required by the law. The extent to which an image might be regarded as "unfairly obtained" or to be "intrusive", for example, is a matter of degree and may depend on the nature of the shot, the location, and the notability of the subject." JKadavoor Jee 12:31, 21 June 2013 (UTC)[reply]
"none of the 2 op-editors voted to keep that image"- One voted; and later strike off. This too interesting (not sure though). JKadavoor Jee 12:45, 21 June 2013 (UTC)[reply]
This isn't the first time I've seen it, but the use of the UDHR as an excuse for censorship is an unrivalled act of Wikilawyering chutzpah. The declaration - which could have been better written there - was speaking of actions of governments that single out individual citizens for systematic society-wide ostracism and abuse based on their political beliefs, race, religion, etc. (This is due to an insufficient recognition of positive rights, which made it inconvenient for the document to speak of what is denied when the government makes such attacks.) It clearly does not mean, in any country, that attacks on people's "honor" have ever stopped - especially not if something as trivial as doing an unconventional painting offends your notion of honor. Wnt (talk) 15:29, 21 June 2013 (UTC)[reply]
FYI, personality rights do not relate to privacy rights - they relate to control of a person's image for publicity or advertising purposes, and are inapplicable on Commons. See commons:Commons:Photographs_of_identifiable_people#The_right_of_publicity or Personality rights for more information. Privacy rights vary widely by jurisdiction and are largely inapplicable in a case like this one, although there are ethical concerns. As I explained in the DR at some length, I do not believe the works were intended to insult anyone, and I believe their educational value to the public exceeds any risk or discomfort experienced by Wales. Dcoetzee 20:29, 21 June 2013 (UTC)[reply]
Are you consciously ignoring what was written a few paragraphs below? What is the educational value? No need to buy a brush, since God has already gifted you a multipurpose brush with built-in gray-white paint? JKadavoor Jee 03:10, 22 June 2013 (UTC)[reply]
It is an example of the work of a notable artist. If you want to argue that "Pricasso" is not notable, that's something you'd have to take up with the Wikipedia community, not the Commons community. Powers T 23:57, 22 June 2013 (UTC)[reply]
So Article 12 is not applicable to notable people; only to the poor and ignorant? I hope Pricasso can portray Muhammad and stay alive, because he is notable. JKadavoor Jee 05:26, 23 June 2013 (UTC)[reply]
I'm not following you. Are you saying it violates "Pricasso's" privacy to include work that he specifically released under a free license in a repository of freely licensed work? Powers T 12:15, 24 June 2013 (UTC)[reply]
It is difficult to talk to people who pretend they didn't understand. Template:Did you know nominations/Pricasso and Wikipedia_talk:Did_you_know#Pricasso may tell you more. JKadavoor Jee 15:24, 24 June 2013 (UTC)[reply]
Don't blame me if you're communicating your ideas poorly. Where did I suggest that "Article 12 is not applicable to notable people"? Article 12 refers to privacy of individuals; since I was referring to Pricasso's notability, I assumed you thought Pricasso's privacy was at issue. Is it? Powers T 14:12, 25 June 2013 (UTC)[reply]
Sorry. Privacy doesn't only mean do not portray an identifiable living person in a private place or situation without permission. "When something is private to a person, it usually means there is something within them that is considered inherently special or personally sensitive." Bodily integrity, modesty, and a lot of things are related to it. Here, what is compromised is Jimbo's personal rights: his honour and reputation. We already have a resolution (http://wikimediafoundation.org/wiki/Resolution:Images_of_identifiable_people): "Treat any person who has a complaint about images of themselves hosted on our projects with patience, kindness, and respect, and encourage others to do the same", which was neglected here. JKadavoor Jee 15:52, 25 June 2013 (UTC)[reply]
So you're talking about Jimbo's privacy rights? What does that have to do with Pricasso's notability? Powers T 17:20, 25 June 2013 (UTC)[reply]
Please read the deletion request; it was closed as kept by an admin, who said: "The artworks despicted here are made by a notable artist, and therefore his works are within COM:SCOPE". JKadavoor Jee 05:10, 26 June 2013 (UTC)[reply]
Update: Commons:Administrators are in more trouble, but refusing to accept it. JKadavoor Jee 06:58, 26 June 2013 (UTC)[reply]
Yes, Pricasso is notable. What does any of that have to do with Jimmy Wales' privacy rights? Those are two entirely separate issues; why have you conflated them? Powers T 23:40, 26 June 2013 (UTC)[reply]
Not me; it was Com:Admins and 'crats who conflated them and said "because Pricasso is notable, we can ignore Jimbo's personal rights". JKadavoor Jee 03:51, 27 June 2013 (UTC)[reply]

Communication is the key

The original op-ed and the two responses, as well as many of the comments above, point to the lack of communication among parties as the source of contention. Perhaps a way to forestall future disagreements is to make sure the lines of communication are always open. Even though we're all focused on the projects, we have to go the extra distance and focus a bit more on individuals' perceived displeasure in order to reach that sometimes elusive consensus. -- kosboot (talk) 12:51, 21 June 2013 (UTC)[reply]

{ {Keep Local} }

There is a way to opt out of the dysfunctional Commons asylum. Whenever one uploads a file, never upload to Commons, always upload straight to En-WP and include (in addition to a proper Rights tag) the template { {keep local} }, which will prevent the speedy deletion of the En-WP version of the file in the event that it is moved over to Commons. All files should be housed by the various language WPs, in my estimation, and Commons written off as a good idea gone terribly wrong. Carrite (talk) 17:02, 21 June 2013 (UTC)[reply]

"Keep local" doesn't even say not to copy to Commons, only that you should also keep a local copy. If your meaning is that people shouldn't free-license their work, then you're basically proposing a "take your ball and go home" approach.
To be sure, that is an approach that works both ways, and when artists can't be sure whether their upload will be kept or discarded because it offends somebody, the odds of them uploading to Commons will indeed be reduced. Indeed, even now, I would not suggest an artist simply put his work straight on Commons, because he will only be demeaned by a community that puts no value on what it gets for free - I'd say that it makes far more sense for him to pay a little to set up a high quality web site (like http://www.thescarproject.org/) to showcase his work, control and optimize its presentation, and then if someday he wants to free license some, maybe when he's ready for a publicity blitz, maybe in his will, he can put a cc-by on it and maybe someone will notice and upload it to Wikipedia. Wnt (talk) 17:31, 21 June 2013 (UTC)[reply]
The problem is that quite frequently, as I recall from the 2009-10 period when I was more active in administrative areas, a local image is uploaded to commons and deleted locally, then afterwards deleted at commons because of copyright or other issues, even though the image satisfied local project policy (e.g. an acceptable license or justifiable as fair use), and most of the time the commons admins who delete the image don't bother informing the local project. I don't know if commons has made efforts on this matter since that time. Of course, there is also the fact that, with the huge number of 'inappropriate images' on commons, they are used for vandalism on local projects. On wikipedia, we know how to use the mediawiki blacklist (though it sometimes takes a while), but some smaller projects aren't aware of this possibility, and are hit harder. The problem with commons is mostly the lack of communication with the local projects, but not only that. Commons has become a strong independent project, but the emphasis on being primarily a service for local projects has largely disappeared. In order to grow as a project, people at commons are prepared to push their responsibility to local projects into the background, and it's not just with Wikipedia. Cenarium (talk) 20:29, 21 June 2013 (UTC)[reply]
The { {keep local} } template doesn't stop importation to Commons (although I believe it slows the process); it does prevent speedy deletion of the original En-WP image as a duplicate, however. Then, just for instance, citing an incident in my own personal experience, when Administrator No. 1 at Commons engages in personal retaliation by deleting or attempting to delete uploaded material in an "up yours" power play, he can delete away on Commons all day long and the En-WP image is preserved. Carrite (talk) 04:04, 24 June 2013 (UTC)[reply]
There we have an example from March: File:EnthanasiePropaganda.jpg, uploaded to commons and deleted there, when we had used it for years and it was acceptable under fair use. Why would this be moved to commons and then deleted years later? Even though I (maybe excessively) emphatically asked in the deletion request for wp to be informed if it was going to be deleted, no one bothered. I have just undeleted the image on wp. How can this be explained? Cenarium (talk) 21:43, 21 June 2013 (UTC)[reply]
For such cases we have a special template, {{fair use delete}}, at Commons, which starts a process of copying the file to :en into the fair-use queue and finally tagging it for deletion on Commons. However, as this is a :en-only process, developed by Dcoetzee 1 or 2 years ago, it is an additional item to have in mind when processing the deletion queues on Commons, and possibly not all my colleagues are even aware of this special process. --Túrelio (talk) 22:14, 21 June 2013 (UTC)[reply]
As an English Wikipedia user I generally oppose use of Keep Local on the grounds that it defeats the purposes Commons was created for: consolidation of media used on many projects, and more importantly, making it easy to make improvements to media and/or the file description page in a single centralized location. Divergence between the Commons version and English Wikipedia version is a serious maintenance issue. Dcoetzee 06:19, 25 June 2013 (UTC)[reply]

What is the question to be answered?

NSFW: Masturbating_with_a_toothbrush.jpg

From what I have understood, these two answers to the "Pink Parrot Incident" can be summarized as "come and fix it". But this is not so simple. What is to be fixed? Let us go back to the "Tooth Brush Incident". You type "toothbrush" in the Commons search bar and you get a lot of toothbrushes. A simple request gives you 20 of them. If you look further, you reach File:Toothbrush regurgitated by albatross on Tern Island, Hawaii - 20060614.jpg (the 40th toothbrush), and three pictures later you reach another great moment in the life of a toothbrush, File:Masturbating_with_a_toothbrush.jpg (the 43rd).

The long "history". of this file is *educative*. It was created on 6 May 2011‎. Quite immediately (2 June 2011‎) the "Tooth Brush Incident" appeared, and part of the people has tried to fix this incident by renaming the file into File:Woman masturbating with improvised vibrator.jpg (and removing it from the Commons:Category:Toothbrushes). Another part of the people has worked hard to keep alive this "Tooth Brush Incident", introducing again and again the searchkey "toothbrush" into the name, the categorization, a link to the file or whatever. At 19:27, 22 May 2012 (UTC), a group of admins has stated that ""We agree that there is a problem when a search for toothbrush on Commons returns this image on top of the results"".. But the problem has not been fixed as now.

From that, we can see that this long-lasting "Tooth Brush Incident" is not the result of a poor search tool, nor even of the mere existence of that file on 'commons.wikimedia.org'. It's a bigger problem, one that cannot be solved by a simple increase of the workforce at Commons. In fact, this place doesn't look like a workplace, and that may be the key problem.

NSFW: Hot sex barnstar.png

Isn't this barnstar a great *educative* picture, in context? Pldx1 (talk) 19:22, 21 June 2013 (UTC)[reply]

That "hot sex barnstar", incidentally, was created by User:Beta M, who was banned from all projects by the WMF. Delicious carbuncle (talk) 19:30, 21 June 2013 (UTC)[reply]
Sorry, but I don't understand what you are implying by your remark. Do you suggest that the Wikimedia Foundation has ordered the deletion of this barnstar and that 'commons.wikimedia.org' has not complied? Pldx1 (talk) 22:19, 21 June 2013 (UTC)[reply]
There was a deletion discussion about this barnstar, but it was kept. One day a journalist with nothing better to write about will do a story on Wikimedia Commons and include this barnstar as an example of the culture there. The fact that it was created by someone with a conviction for distribution of child porn will just be another salacious detail for them to add. They might even mention that Commons could not reach a consensus to ban that user, which may be what encouraged the WMF to act. If they are good researchers, they might also note that despite banning him from all WMF projects, we still link to Beta M's porn site in Anarchism and issues related to love and sex (I removed it once, but someone put it back). Delicious carbuncle (talk) 04:22, 22 June 2013 (UTC)[reply]
  • Our photo of a toothbrush regurgitated by an albatross is core content. There are few things that we can serve from Commons which are more important than ongoing documentation by citizens of the ecological impact of the pollution that clogs our oceans. Is it disgusting? Sure. The world comes in a lot of flavors, and most of them are expressible only in scatological epithets. Our job is to cover it all.
  • Toothbrush masturbation is a little more peculiar, but it goes to show the inventiveness of the human mind in these things. If you can say "I'd like to see a picture of ---" then Commons' educational mission is to provide some, at least within the limits of what it can legally get away with. Wnt (talk) 20:01, 21 June 2013 (UTC)[reply]
  • But you see, sex and human bodies are dirty and sinful, so we should remove all depictions of such things, or else we may tempt people into sin. I mean, humans have used improvised masturbatory and sex aids all the time for as long as humans have existed, so much so that palaeontologists turn them up all the time on digs, and there are ribald jokes about women sitting on running washing machines, horseback riding, and emergency room personnel regularly have to treat people who've injured themselves by using an ill-chosen sex aid, and so on. But this just shows how corrupted and sinful humans are. We certainly don't want to be giving people the idea that sex and masturbation are normal, ordinary parts of human life and as deserving of depiction as anything else, or else people might wind up touching themselves and going to Hell. We need to only depict things that are acceptable in respectable Christian society, like this and this. --108.38.191.162 (talk) 20:52, 21 June 2013 (UTC)[reply]
I believe the issue is protecting children from those images, unless you want to tag Wikipedia 18SX. Yosri (talk) 12:49, 23 June 2013 (UTC)[reply]
How exactly are these images damaging to children? Powers T 12:14, 24 June 2013 (UTC)[reply]
Children have a tendency to imitate; and it hurts their private parts. :( JKadavoor Jee 12:26, 24 June 2013 (UTC)[reply]
[citation needed] Powers T 12:45, 24 June 2013 (UTC)[reply]
http://else.econ.ucl.ac.uk/papers/uploaded/302.pdf JKadavoor Jee 12:52, 24 June 2013 (UTC)[reply]

Toothbrushgate, Part 47

"One of those results was the search "toothbrush" returning a picture of a woman using an electric toothbrush for self-pleasure as one of the top results. This was entirely a legitimate result - it was a picture of a toothbrush, and it was titled as such."

... no, it was NOT a "legitimate result." Is this what someone actually wants to see when searching for toothbrushes? No? Then it's not a good result. The user is always right. Let me add that I am one of the people who roll their eyes at proposals for general "content filters", but something akin to your average Google SafeSearch should have been in place, and if it isn't now, it should be added. Nothing to do with images of Muhammad or whatever, which is a giant distraction. SnowFire (talk) 22:26, 22 June 2013 (UTC)[reply]

Define what is "safe" without reference to your particular culture. Powers T 00:01, 23 June 2013 (UTC)[reply]
A) The picture wasn't really of a toothbrush. It was loosely related at best. So... a bad result, which is why the above line in the editorial is so infuriating. A Google Image search for the topic wouldn't have found it unless you cued it (so sexuality is okay if you specifically include a word asking for it).
B) If we assume that the picture was already cross-categorized as both sexual & related to something non-sexual... of which there probably are good examples... must the perfect be the enemy of the good? I've already said that I largely think that custom search filters by country or whatever would be a waste of time, a mess, and easily abused. However, the fact that a wide-ranging general filter is bad doesn't mean that filtering out sexual material that is objectionable to the vast majority of human culture is bad. I doubt the toothbrush would have been an "expected" result in the most liberal Scandinavian country anyway, but if it was, oh well, see above: you can't be perfect. Don't serve porn without explicit user sign-off; it shouldn't be that hard. Letting these kinds of results continue will only strengthen the case for idiotic censorship filters. SnowFire (talk) 00:18, 23 June 2013 (UTC)[reply]
To present an analogy: by default, I have SafeSearch disabled on Google Images search, because it tends to accidentally exclude some useful results. On one occasion, I searched for images of Homer Simpson for an article, and one of the results was a picture of someone's vagina painted as Homer Simpson. The technology does not exist to reliably distinguish pornographic and non-pornographic images - there will always be false positives and false negatives - and some users who are deliberately seeking pornographic content will inevitably be confused by the need to deactivate a filter. The question is whether we, like Google, should make the presence of such filters a default, despite its disadvantages - or if not, what other alternatives might exist. Dcoetzee 02:35, 23 June 2013 (UTC)[reply]
Wiki should make the presence of such filters a default. If users are old enough (to decide they want to see those pictures), they should be able to deactivate those filters. Yosri (talk) 12:49, 23 June 2013 (UTC)[reply]

An observation

I was waiting to see if anyone would replace the images above of a woman masturbating with an electric toothbrush and the "hot sex barnstar" with links, but no one has. A reasonable interpretation of WP:NOTCENSORED is that if you look at WP articles about topics dealing with sexuality or anatomy, you should expect to see images of nudity or sexuality. In practice, a rather silly but popular invocation of WP:NOTCENSORED as nothing but a slogan means that our readers should follow a "principle of most astonishment", where at any time you should expect to see images of nudity or sexuality. I'm not offended by it personally, but it seems obvious to me that it is not appropriate for readers of the Signpost -- hopefully read by many of the other "millions" of contributors -- to be faced with an image of a masturbating woman in the comments. Delicious carbuncle (talk) 14:25, 23 June 2013 (UTC)[reply]

It is difficult to open Wikimedia pages in front of children nowadays. JKadavoor Jee 16:54, 23 June 2013 (UTC)[reply]
I've replaced the images with links. Colin°Talk 18:00, 27 June 2013 (UTC)[reply]

Are these pictures really free pictures?

Undigested toothbrush

Item one, the Colgate toothbrush [1]. From what I understand, this Colgate toothbrush picture was taken by someone working at the United States Fish and Wildlife Service (USFWS), Hawaiian Islands NWR, as part of that person's official duties. As a work of the U.S. federal government, the image is *in the public domain*. Yes, this is the truth. But it is only a part of the truth.

Because this picture doesn't appear to be a *free* picture. It carries a prominent "Colgate" trademark. Moreover, this prominent trademark on the handle of the brush is the only thing that is clearly identifiable in the picture. How do you even know that the central part of the bolus is a toothbrush? From the trademark on the handle. This is largely not de minimis. Not convinced? Let us replace the trademark "Colgate" by the common name "commons". This gives the right picture. Convinced now?

The description attached to the picture says: An albatross bolus – undigested matter from the diet such as squid beaks and fish scales. This bolus from a Hawaiian albatross (either a Black-footed Albatross or a Laysan Albatross) found on Tern Island, in the French Frigate Shoals, Northwestern Hawaiian Islands, has several ingested flotsam items, including monofilament fishing line from fishing nets and a discarded toothbrush. Ingestion of plastic flotsam is an increasing hazard for albatrosses.

Therefore it is *fair* to use such a picture, among many other ones, to describe that "even an innocent toothbrush can turn into a fatal weapon". This is done in , and could be done in en:Marine debris, etc. But it is *unfair* to use the same picture in a way suggesting that Colgate is the worst among all these wrongful killers of innocent albatrosses, as done at and . The use at en:Bolus (digestion) as the only picture illustrating a one-line article stating that "Under normal circumstances, the bolus then travels to the stomach for further digestion" is unclear.

Conclusion: the picture seems to be relevant (with the proper statements) on some pages inside Wikipedia, and irrelevant at 'commons.wikimedia.org'. Pldx1 (talk) 12:11, 23 June 2013 (UTC)[reply]

The name of the brand is incidental, and even if it weren't, it's just text, which means it's not eligible for copyright. As for trademark issues, we just tag it with {{trademark}} and leave it at that. -mattbuck (Talk) 12:37, 23 June 2013 (UTC)[reply]
If it's the Colgate trademark on a Colgate brand toothbrush, then that's nominative use. You could make a weak case for tarnishment of the brand, but I doubt that would go far since we didn't shove the toothbrush down the albatross's throat to try to make Colgate look bad. Gigs (talk) 03:52, 25 June 2013 (UTC)[reply]
Good God, I thought everybody who lobbied for that kind of suppression of anything bad about a company in the name of trademark law was making $100,000 a year. If people will do it for free nowadays, there goes the last best hope of the middle class. Wnt (talk) 12:30, 27 June 2013 (UTC)[reply]

Why reinvent the wheel? Just add a Google Search link to the commons:Special:Search page:

Then the nudity-, gore-, and sex-challenged could use Google's SafeSearch option if they so choose.

I am sure one of the various gadgeteers on the Commons could come up with some gadget to add the link to the bottom of commons:Special:Search (a rough sketch of what such a gadget might look like follows below). It could be enabled at commons:Special:Preferences#mw-prefsection-gadgets.

Or better yet it could be added by default for all users, whether registered or anonymous. Google often does much better searches of the Commons than the MediaWiki search engine. So I and others would love to have it enabled by default.

I never use SafeSearch though, and so please do not add a Google search link with SafeSearch enabled by default. --Timeshifter (talk) 14:27, 24 June 2013 (UTC)[reply]
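
For concreteness, here is a minimal sketch of what such a gadget could look like. This is not an existing Commons gadget; it assumes the standard MediaWiki JavaScript environment that gadgets run in (the mw global, the mediawiki.util module, and jQuery), and the link text, placement, and Google URL parameters are illustrative only.

    // Sketch of a gadget that appends a Google image-search link to
    // Special:Search, restricted to Commons. Assumes the mediawiki.util
    // module is loaded (gadgets can declare it as a dependency).
    $(function () {
        // Only act on the search page.
        if (mw.config.get('wgCanonicalSpecialPageName') !== 'Search') {
            return;
        }
        // Reuse whatever the user typed into the search box.
        var query = mw.util.getParamValue('search') || '';
        // tbm=isch asks Google for an image search; append safe=active
        // for strict SafeSearch, or drop it for unfiltered results.
        var url = 'https://www.google.com/search?tbm=isch&safe=active&q='
            + encodeURIComponent('site:commons.wikimedia.org ' + query);
        $('#mw-content-text').append(
            $('<p>').append(
                $('<a>').attr('href', url)
                        .text('Search Commons images with Google (SafeSearch on)')
            )
        );
    });

A variant without the safe parameter could be offered alongside it, matching the preference expressed above for leaving SafeSearch off by default.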

It is an interesting suggestion, but Google refined their image search last year so that people who aren't searching for porn see less of it diluting their results. An image search for "toothbrush" doesn't turn up File:Woman masturbating with improvised vibrator.jpg, nor does a search for "vibrator" (although it does turn up File:Report-a-File step 1.png, which incorporates that image). A search for "masturbating" does find the image. Note that this is with Google's "safe search" disabled. Delicious carbuncle (talk) 15:36, 24 June 2013 (UTC)[reply]
Yes, according to the SafeSearch article, Google's default image search now turns up less nudity, gore, and sex unless more explicitly searched for. Those search algorithms are up to Google.
The default image search with Google is without SafeSearch turned on. One can tell this by doing an image search; the option to turn on the filter then shows up.
My main point is that there is no need for the Commons to tag images somehow for filtering. Google already filters for sex, nudity, and gore with SafeSearch. So let us use it. We could even add 3 links to commons:Special:Search:
See this Google page. It explains what I did with the last link in the above list: "append &safe=active or &safe=on directly to all search URLs. This will enable strict SafeSearch." --Timeshifter (talk) 16:41, 24 June 2013 (UTC)[reply]
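For illustration, the difference between a plain link and a SafeSearch-enabled one is just the appended parameter. The site: restriction and the tbm=isch image-search parameter shown here are assumptions based on Google's public search URLs; the safe parameter itself is the one documented on the Google page quoted above:

    https://www.google.com/search?tbm=isch&q=site:commons.wikimedia.org+toothbrush
    https://www.google.com/search?tbm=isch&q=site:commons.wikimedia.org+toothbrush&safe=active

The first returns Google's default (lightly filtered) image results for Commons; the second, with &safe=active appended, enables strict SafeSearch.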
Those Google image search links can be put on Special:Search on every Wikimedia wiki. An additional Google site search can also be set to search for images particular to a specific Wikimedia wiki. That would allow Google search of fair use images too. --Timeshifter (talk) 04:04, 26 June 2013 (UTC)[reply]
It would not be desirable to feature Google to the exclusion of rivals like Ixquick and Bing, but of course, there's no reason why we couldn't fit all the commercial search sites, with all their options, onto a page for the purpose of searching. Wnt (talk) 06:55, 26 June 2013 (UTC)[reply]
We could add a few site search links to Special:Search from the major search engines such as Google and Bing. Then we could link to a page with more links. There is no reason we shouldn't make Wikipedia and the Commons more accessible to all. Making people go elsewhere defeats the purpose of a search engine in speeding up access to what people want. --Timeshifter (talk) 16:06, 26 June 2013 (UTC)[reply]
I'm just so grateful that there is someone who has a bot to take note of uploaded genitalia photos to mark them for deletion...and that person isn't me. God bless you for your work! 69.125.134.86 (talk) 23:24, 29 July 2013 (UTC)[reply]
1. ^ To be continued. Maybe. Remember: 'commons.wikimedia.org' is not a workplace.

The Signpost · written by many · served by Sinepost V0.9 · 🄯 CC-BY-SA 4.0