The Signpost

News and notes

Picture of the Year 2010; data challenge; brief news

By Tom Morris, TheGrappler, Dank and Tilman Bayer

Commons Picture of the Year 2010 announced

Commons Picture of the Year 2010: "Laser towards Milky Way's Centre", taken last August by Yuri Beletsky at the high-altitude Paranal Observatory in Chile. The laser "creates an artificial star at an altitude of 90 km high in the Earth's mesosphere", a reference point enabling the telescope's adaptive optics to correct the blurring effect of the atmosphere, thus improving observations of phenomena such as the giant black hole at the centre of the galaxy.

On the Wikimedia Foundation's blog, the results of the fifth annual Picture of the Year competition were officially announced last week. 2,463 votes were cast, and all 783 pictures promoted to featured picture status in 2010 were entered into the competition. The winning image is a photo by Yuri Beletsky, Photo Ambassador of the European Southern Observatory (ESO): File:Laser Towards Milky Ways Centre.jpg (above; Signpost readers may recognize it as the "choice of the week" in the November 8 "Features and admins" section).

Reacting to the news by email, Beletsky expressed his thanks: "I am really honored and delighted with the results of the poll. I am happy ESO released the image under a free license." ESO (which had itself featured the photo as its "Picture of the Week" in September) highlighted the result in an announcement on its website: "ESO Picture of the Paranal Observatory Voted Wikimedia Picture of the Year 2010".

Wikipedia data analysis challenge

Last week, the Wikimedia Foundation announced "the launch of the Wikipedia Participation Challenge, a data-modeling competition to develop an algorithm that predicts future editing activity on Wikipedia", hosted by Kaggle, a platform for crowdsourced predictive modeling. Based on data derived from Wikipedia's public XML dump, contestants are to "develop a model to predict the number of edits a given editor will make in six months' time", competing for $10,000 in prize money provided by an anonymous donor.

The challenge was noted on various blogs, such as Revolution Analytics and New Scientist. User:Protonk observed that the dataset has been anonymized "to obscure editor identity and article identity, simultaneously adding focus to the challenge and robbing the dataset of considerable richness", and gave detailed advice to participants, especially those not familiar with Wikipedia editing processes. A blog post by a former collaborator of the WMF's data scientist Diederik van Liere, titled "Mind. Prepare to be blown away. Big data, Wikipedia and government", compared the challenge to an earlier Kaggle competition that had significantly improved existing models in HIV research, and noted that "Within 36 hours of the wikipedia challenge being launched the leading submission has improved on internal Wikimedia Foundation models by 32.4%". By July 1, the dataset had been downloaded more than 200 times. At the time of writing, 17 teams have submitted models.
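To give a sense of the kind of entry the challenge invites, here is a minimal, purely hypothetical baseline in Python: it fits a linear regression that predicts an editor's edit count over the next six months from counts in earlier six-month windows. The synthetic data and variable names below are invented for illustration and are not drawn from the actual competition dataset or any submitted model.

    # A hypothetical baseline for the Wikipedia Participation Challenge:
    # predict an editor's edit count over the next six months from edit
    # counts in earlier six-month windows. All data here is synthetic;
    # the real anonymized dataset and its fields differ.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(seed=0)

    # Synthetic stand-in for training data: one row per editor,
    # columns are edit counts in three consecutive past six-month windows.
    past_counts = rng.poisson(lam=20, size=(1000, 3))

    # Assume (for illustration only) that future activity loosely tracks
    # the most recent window, plus noise.
    future_counts = 0.6 * past_counts[:, -1] + rng.poisson(lam=5, size=1000)

    # Fit the baseline model.
    model = LinearRegression().fit(past_counts, future_counts)

    # Predict the next six months for a hypothetical editor who made
    # 30, 25 and 10 edits in the three most recent windows.
    print(model.predict(np.array([[30, 25, 10]])))

Actual entries would presumably engineer much richer features from the anonymized edit histories, which is where the reported gains over the Foundation's internal models are likely to come from.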

In brief

An offensive image as it would look when hidden by the proposed personal image filter

Discuss this story

These comments are automatically transcluded from this article's talk page.

Re the DMCA take-down issue, whatever happened to the doctrine of ostensible authority? – ukexpat (talk) 13:09, 5 July 2011 (UTC)

On the same issue, why haven't the pictures been uploaded as fair use to en Wikipedia? --Piotr Konieczny aka Prokonsul Piotrus| talk 15:48, 5 July 2011 (UTC)
Although there was apparent authority at the time, the copyright owner has directly refused permission. As such we have no right to use the images. As for using them on enwiki, they'd be hosted on US servers, so we'd just get a new take-down notice, surely? If the original permission-giver had been the copyright owner themselves rather than a representative, it'd be harder for a take-down notice to be enforced, as he would have directly given a release under a suitable license. Although it's a shame to lose the images, they presumably aren't so essential as to make any articles they were used in useless; or can we expect some article deletions as a result of the illustrative pictures no longer being available? -- PhantomSteve.alt/talk\[alternative account of Phantomsteve] 18:07, 5 July 2011 (UTC)
If anyone wishes to restore one or more of these images as NFC-compliant images on En, mail me and I will send them to you. The firm should also be informed of these uses. If another take-down request is issued, you then have the option under the DMCA of issuing a counter-claim claiming it as fair use. See How to File a DMCA Counter Claim. Dcoetzee 19:25, 5 July 2011 (UTC)
A far more insidious problem occurs with TI calculator encryption keys. These were uploaded by an anonymous source, then taken down under the DMCA as an "office action". Because it is an office action no-one can reverse it, the "office action" won't be withdrawn unless a DMCA counter-claim is made, and only the anonymous editor is allowed to make a counter-claim. Therefore the "office action" rules, as they stand, allow anyone to effectively protect the unprotectable, by uploading it to WP via a puppet, then issuing a DMCA takedown. Rich Farmbrough, 15:15, 16 July 2011 (UTC).
I dislike the offensive image filter idea. First of all, as a certain cretin has demonstrated repeatedly, any attempt at controlling the use of offensive images can be easily circumvented (per BEANS I won't say how, except that it isn't a hack or exploit). This means that if someone wants to shock or offend people, we can't stop them from doing so; we can only strive to get better and better at catching and removing the offending images. As for the permanent images, who becomes the arbiter of values for Wikipedia? Do we allow anyone to set up their own personal filters? If so, that does not prevent first-time exposure. Do we just do it to items on the bad image list? Do we establish a working group of experts/parents/concerned bystanders? How do we decide what's inappropriate to whom? Does it become censorship? I would most certainly fear a few prudes going haywire and deeming large swaths of images as needing filtering, leading to an all-out brawl when those changes are reverted. The offensive image filter will cause nothing but trouble. Sven Manguard Wha? 20:49, 5 July 2011 (UTC)


The Signpost · written by many · served by Sinepost V0.9 · 🄯 CC-BY-SA 4.0