The Signpost

Opinion

A photo on Wikipedia can ruin your life

Andreas Kolbe is a former co-editor-in-chief of The Signpost and has been a Wikipedia contributor since 2006. The views expressed in this opinion article are those of the author and do not necessarily reflect the views of The Signpost, its staff or of any other Wikipedian. Responses and critical commentary are invited in the comments section.
WMF Legal Director Jacob Rogers

The Wikimedia Foundation's Legal Director, Jacob Rogers, this month published a triumphant essay on Wikimedia's Diff blog, titled "A victory for free knowledge: Florida judge rules Section 230 bars defamation claim against the Wikimedia Foundation". As he says in his post describing this legal victory for the Foundation,

The case began when plaintiff Nathaniel White sued the Wikimedia Foundation in January 2021, claiming that the Foundation was liable for the publication of photos that incorrectly identified him as a New York serial killer of the same name. Because of its open nature, sometimes inaccurate information is uploaded to Wikipedia and its companion projects, but the many members of our volunteer community are very effective at identifying and removing these inaccuracies when they do occur. Notably, this lawsuit was filed months after Wikipedia editors proactively corrected the error at issue in September 2020. Wikimedia moved to dismiss the amended complaint in June, arguing that plaintiff's claims were barred by Section 230.
In its order granting the Wikimedia Foundation's motion to dismiss, the court affirmed that "interactive computer service providers" such as the Foundation generally cannot be held liable for third-party content like Wikipedia articles and photographs. ... the plaintiff argued that the Foundation should be treated like a traditional offline publisher and held responsible as though it were vetting all posts made to the sites it hosts, despite the fact that it does not write or curate any of the content found on the projects. The court rejected this argument because it directly conflicts with Section 230 ...
This outcome perfectly demonstrates how critical Section 230 remains to crowdsourced projects and communities.
— Diff

So what actually happened on-wiki?

The case against the Wikimedia Foundation was dismissed by the Second Judicial Circuit court for Leon County, Florida. Picture shows Leon County Courthouse.

The case discussed in Rogers' essay on Diff concerned the Wikipedia biography of New York serial killer Nathaniel White. For more than two years this Wikipedia article had as its lead image a police photograph of a quite different Nathaniel White, an African-American man resident in Florida whose picture had also, equally erroneously, been used in a Discovery Channel broadcast about the New York serial killer of the same name.

The image was inserted into the Wikipedia article by User:Vwanweb on 28 May 2018, incorrectly identified as originating from the New York State Department of Corrections and Community Supervision. It was removed from the article on 4 September 2020 – an edit attributed by Wikipedia only to an American IP address, rather than a registered Wikipedia user account.
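For readers who want to verify this kind of claim themselves, the public MediaWiki API exposes both a page's revision history and the full contribution record of any username or IP address. The following Python sketch is illustrative only and not part of the original record: the article title is assumed, and the IP address shown is a documentation-range placeholder rather than the actual editor.

import requests

API = "https://en.wikipedia.org/w/api.php"

def revisions(title, limit=20):
    """Return recent revisions (user, timestamp, edit summary) for a page."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "user|timestamp|comment",
        "rvlimit": limit,
        "format": "json",
        "formatversion": 2,
    }
    page = requests.get(API, params=params).json()["query"]["pages"][0]
    return page.get("revisions", [])

def contributions(user, limit=50):
    """Return the public contribution list for a username or IP address."""
    params = {
        "action": "query",
        "list": "usercontribs",
        "ucuser": user,
        "ucprop": "title|timestamp|comment",
        "uclimit": limit,
        "format": "json",
        "formatversion": 2,
    }
    return requests.get(API, params=params).json()["query"]["usercontribs"]

if __name__ == "__main__":
    # Assumed article title; adjust if the page has since been renamed.
    for rev in revisions("Nathaniel White (serial killer)"):
        print(rev["timestamp"], rev["user"], rev.get("comment", ""))
    # Placeholder IP (RFC 5737 documentation range), not the actual editor.
    # An empty or near-empty contribution list is what "never edited before
    # and has not edited since" looks like in the public record.
    print(contributions("192.0.2.1"))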

The removal of the image occurred about a week after Karl Etters, writing for the Tallahassee Democrat, reported that the Florida Mr. White had sued the Discovery Channel for defamation. In his article, Etters wrote that Wikipedia was also using the wrong picture to illustrate its article on the serial killer: "A Google search turns up the name of the Florida Nathaniel White with a Wikipedia page showing his photo and label as a serial killer."

Taken together, these facts contradict Rogers' characterization of how well Wikipedia deals with cases such as this:

  1. The photo was in the article for over two years. That a man had his face presented to the world as that of a serial killer on a top-20 website for such a significant amount of time can hardly be described as indicative of "very effective" quality control on the part of the community.
  2. The picture was only removed after a press report pointed out that Wikipedia had the wrong picture. This means the deletion was in all likelihood reactive rather than "proactive", as it was described in the Diff essay.
  3. The wrong photograph appears to have been removed by an unknown member of the public, an IP address that had never edited before and has not edited since. The volunteer community seems to have been completely unaware of the problem throughout.

Image sourcing

Vwanweb captioned Mr. White's picture as originating from the New York State Department of Corrections and Community Supervision in the Wikipedia article but named crimefeed.com, a site associated with Discovery, Inc., as the source in the picture upload. The Florida Nathaniel White first sued the Discovery Channel, with the Wikimedia Foundation added as a Defendant later on.

Now, surely no individual editor can be blamed for having failed to see the Tallahassee Democrat article. But it is just as surely inappropriate in a case like this, where real harm has been done to a living person – on which more below – to praise community processes. It would seem more appropriate –

  1. to acknowledge that community processes failed Mr. White to a quite egregious degree, and
  2. to alert the community to the fact that its quality control processes are in need of improvement.

The obvious issue is image sourcing, and especially the sourcing of photographs of criminals. The original upload by User:Vwanweb cited crimefeed.com as the source of the picture. Crimefeed.com today redirects to investigationdiscovery.com, a site owned by Discovery, Inc., which also owns the Discovery Channel. The Internet Archive's Wayback Machine shows that an article on Nathaniel White was indeed published on the site on August 2, 2017. The article itself is not in the archive, but its URL matches the truncated "http://crimefeed.com/2017/08/31713..." URL listed in the log of the upload.
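The Wayback Machine check described above can be reproduced with the Internet Archive's public "availability" endpoint. This short Python sketch is offered only as an illustration; the URL passed in is a placeholder, since the upload log preserves only a truncated address.

import requests

def closest_snapshot(url, timestamp="20170802"):
    """Ask the Wayback Machine for the archived capture closest to a date."""
    resp = requests.get(
        "https://archive.org/wayback/available",
        params={"url": url, "timestamp": timestamp},
    )
    # Returns None when the URL has never been captured.
    return resp.json().get("archived_snapshots", {}).get("closest")

if __name__ == "__main__":
    # Placeholder: the real upload log lists only a truncated crimefeed.com URL.
    print(closest_snapshot("http://crimefeed.com/2017/08/"))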

If this, then, was Vwanweb's source, subsequent events clearly showed that it was unreliable. And even less trustworthy sites (such as murderpedia.org) have been, and still are, used on Wikipedia to source police photographs. Surely Wikipedia's guidelines, policies and community practices for sourcing images, in particular images used to imply responsibility for specific crimes, would benefit from some strengthening, to ensure that such images actually depict the correct individual.

Correctly indicating image provenance in an article, along the lines of the "Say where you read it" guideline that applies to written texts, is another aspect that may require attention: according to the upload information, the picture came from a "true crime" site, not the New York State Department as was indicated in the article.

Section 230: a quick recap

Section 230 of the Communications Decency Act has come under fire of late, from both sides of the political spectrum.

As Rogers explains in his Diff essay, Section 230 of the Communications Decency Act is essential to the way Wikipedia and other Wikimedia sites have operated for the past twenty years. The key sentence in Section 230 is this: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." But the law has come under fire lately in the US, both from the political right and from the political left.

Republicans who feel their views are being censored online argue that social media websites have abandoned the ideals of plurality and political diversity, and that as a consequence websites should no longer enjoy Section 230 privileges that were originally designed to benefit neutral hosts. Some Democrats, meanwhile, have criticized sites for hiding behind Section 230 and doing too little about problematic content. In their view, Section 230 was created to enable sites to moderate content without liability risk to them, and if they don't do so, then the law is not fit for its purpose.

A common but mistaken idea about Section 230 in this context is that site operators like the Wikimedia Foundation would "lose" their protection if they started to moderate more content than they were legally required to remove (i.e. if they went beyond copyright infringements, child pornography, court-ordered removal of defamatory content, etc.). This notion is often expressed as follows: "If the Foundation were to start moderating content, it would no longer be a platform, but a publisher, and would become liable for everything posted on its sites."

This is almost the exact opposite of the truth. As Mike Godwin, former General Counsel of the Wikimedia Foundation, explained in Slate last year, Section 230 was actually "designed to empower internet companies to remove offensive, disturbing, or otherwise subscriber-alienating content without being liable for whatever else their users posted. The idea was that companies might be afraid to censor anything because in doing so, they would take on responsibility for everything." Section 230 was designed to remove that risk.

Interested readers can find more information on this issue in the following articles:

Who can the victim hold responsible?

This picture, uploaded by Vwanweb in the same week in 2018, purports to show Paula Angel, a woman said to have been hanged in New Mexico in 1861. According to historian John Boessenecker, it is a fabrication of far more recent origin by Gladwell Richardson alias Maurice Kildare, a man described by Boessenecker as "a leader in publishing fake stories and fake photos of the history of the Southwest". Significantly, perhaps, the image is not – or no longer – present on the page cited as its source in the Wikipedia upload.

The Diff essay contains another paragraph related to Section 230 that is worthy of particular attention. It implies that Mr. White would have done better to direct his complaint at User:Vwanweb. Let's look at this passage in detail. Rogers states:

It is important to note that Section 230's broad protection of Wikimedia projects and other online services does not leave litigants like Mr. White without options. Instead, the law simply requires that litigants direct their complaints at the individuals who made the statements at issue, rather than the forums where the statements were made. This both allows litigants to challenge the appropriate parties responsible for their harm and protects online hosting companies like the Wikimedia Foundation from the costs associated with liability for user-generated content.
— Diff

This may sound plausible and equitable enough to the general reader, but Rogers surely knows that Wikipedia editors, by and large, write under the cover of pseudonymity – a practice which the Wikimedia Foundation explicitly encourages and vigorously defends. Identifying contributors is no easy task – and certainly not one the Foundation wants people to pursue. According to the Wikimedia Foundation's Universal Code of Conduct, which is in the process of being adopted, determining and sharing a contributor's identity is "unacceptable". So, how genuine is this advice given to Mr. White?

Moreover, there is no reason to assume that User:Vwanweb, the editor concerned, would have been able to give appropriate compensation to Mr. White. To cite a precedent, when John Seigenthaler learned the identity of his pseudonymous Wikipedia defamer, Brian Chase, Seigenthaler ended up feeling sorry for Chase, and interceded with Chase's employer, who had fired Chase, to give him his job back.

Nor is there any reason to assume any malice or racist motives on the part of Vwanweb. That user had been very involved in Wikipedia's crime articles for a while, frequently requesting and uploading police photographs. In 2016, Vwanweb argued passionately (and unsuccessfully) for including criticism of an instance of all-white jury selection in a criminal case in which the perpetrator was white and all the victims were black. Their insistence on including criticism of this practice eventually earned them a warning for edit-warring. If there was any race whose failings this editor was likely to highlight on Wikipedia, judging by that episode, it was Caucasians.

I believe that like many other editors, Vwanweb simply followed community practices they had observed here. In this subject area, this involves widespread use of "true crime" sources that present crime as entertainment, and whose level of reliability is akin to that of tabloids and other types of publications that are banned or deprecated as sources in other parts of Wikipedia.

When asked for comment by The Signpost, the WMF legal department responded that it is not trying to encourage victims to sue Wikipedia contributors, only pointing out that there may be parties other than the WMF who can be held responsible.

In this particular case the Discovery Channel was sued and is not protected by Section 230. But in the general case, would the majority of victims be able to find another responsible party?

The effect on Nathaniel White of Florida

As an interactive computer service provider, the Wikimedia Foundation is not considered to be exercising a publisher's traditional editorial functions. Instead, the court order said, "the relevant content was provided by another information content provider" – in other words, the volunteer who uploaded the picture and added it to the article.

Here are some excerpts from Mr. White's complaint. It states that after the 2018 Discovery Channel broadcast,

… friends and family contacted Plaintiff concerning the broadcast and asking Plaintiff if he actually murdered people in the state of New York.
Plaintiff assured these friends and family that even though he acknowledged his criminal past, he never murdered anyone nor has he ever been to the state of New York. …
Plaintiff has been threatened with harm to his person and shunning by members of the public who, because of the broadcast and social and digital media imagery, assumed that Plaintiff was the vicious killer who committed the murders in New York state. …
Plaintiff has resorted to dressing incognito so he is not recognized in order to preserve his life and damp down the threats he received.
Defendants published this false and defamatory image, photo and information regarding Plaintiff to a third party which is and was the public at large on its television broadcast, social media and digital & electronic audience which encompasses millions of people in Florida and billions of people around the world.
Plaintiff is an African-American man and Defendants appear to believe that all African-American men are interchangeable and that no one would notice or care Defendants were defaming an innocent man, not even other African-Americans, in their description of Plaintiff in this matter.
It is obvious in this case that Plaintiff is not the gruesome murderer that was supposed to be depicted in Defendants' broadcasts and media platforms and that this is more than a simple, excusable or inadvertent error.
African-Americans have always borne an unequal brunt of punishment in this country and this behavior continues from these private Defendants upon Plaintiff.
— Nathaniel White's complaint

This has clearly been an extremely harrowing experience for Mr. White, as it would surely have been for anyone.

While to the best of my belief the error did not originate in Wikipedia, but was imported into Wikipedia from an unreliable external site, for more than two years any vigilante Googling Nathaniel White serial killer would have seen Mr. White's color picture prominently displayed in Google's Knowledge Graph panel (multiple copies of it still appear there at the time of writing). And along with it they would have found a prominent link to the serial killer's Wikipedia biography, again featuring Mr. White's image – providing what looked like encyclopedic confirmation that Mr. White of Florida was indeed guilty of sickening crimes.

Moreover, it can be shown that Mr. White's image spread to other online sources via Wikipedia. On the very day the picture was removed from the article here, a video about the serial killer was uploaded to YouTube – complete with Mr. White's picture, citing Wikipedia. At the time of writing, the video's title page with Mr. White's color picture is the top Google image result in searches for the serial killer. All in all, seven of Google's top-fifteen image search results for Nathaniel White serial killer today feature Mr. White's image. Only two black-and-white photos show what seems to have been the real killer.

Black Lives Matter

The Wikimedia Foundation has declared its solidarity with the worldwide George Floyd protests.

The Wikimedia Foundation has in the recent past cited the fate of George Floyd and the resulting Black Lives Matter protests as its inspiration for the Knowledge Equity Fund, a $4.5 million fund set up last year to support racial equity initiatives outside the Wikimedia movement. It has declared "We stand for racial justice", expressing the hope that the Wikimedia projects would "document a grand turning point – a time in the future when our communities, systems, and institutions acknowledge the equality and dignity of all people. Until that day, we stand with those who are fighting for justice and for enduring change. With every edit, we write history." A subsequent blog post on the AfroCROWD Juneteenth Conference again referenced the Black Lives Matter movement.

Yet here we have a case where a very real black life was severely harmed, with Wikipedia playing a secondary, but still highly significant part in the sorry tale. The Wikimedia blog post contains no acknowledgement of this fact. Instead it is jubilant – jubilant that the Wikimedia Foundation was absolved of all responsibility for the fact that Mr. White was for over two years misrepresented as a serial killer on its flagship site, the result of a pseudonymous Wikimedian trusting a source that proved unreliable.

Now we can shrug our shoulders and say, "This sort of thing will happen once in a while." Would we have accepted this sort of response from the police force in George Floyd's case?

The Seigenthaler case resulted in changes to Wikipedia's referencing requirements for biographies of living people. Will this present case result in similar changes to sourcing practices for images, especially those implying responsibility for a crime? Who will help Mr. White clean up his continuing Google footprint as a serial killer?

There is also a deeper moral question here. What kind of bright new world is this we are building, in which it is presented to us as a cause for celebration that it was possible for a black man – a man, perhaps, not unlike George Floyd – to be defamed on our global top-20 website with absolute impunity, without his having any realistic hope of redress for what happened to him here?


Discuss this story

These comments are automatically transcluded from this article's talk page. To follow comments, add the page to your watchlist. If your comment has not appeared here, you can try purging the cache.

Comments

There's an earlier article, dated July 1, 2017, on the Monsters and Critics website that also sports the same photo of the wrong Nathaniel White, so Mike McPadden's article at crimefeed.com can't have been the original source of the misidentification (or not the only one, at least).
2001:8003:1DF2:D00:2482:6F1C:C109:DAC6 (talk) 07:02, 4 November 2021 (UTC)[reply]
Thanks for mentioning it, you're absolutely correct. I was aware of that page but had missed that it preceded the crimefeed article by a month. The show referred to on that monstersandcritics page, CopyCat Killers, was produced by another defendant in the suit, Reelz. This also seems to have a Discovery connection. The program is listed e.g. as episode 4 on https://www.discoveryuk.com/series/copycat-killers/?ss=2#episodes It should be obvious that Wikipedia has no business citing this sort of material. Discovery used to be reputable but has descended into the realms of pseudoscience and reality TV. --Andreas JN466 09:51, 4 November 2021 (UTC)[reply]

What could the WMF have done better

BLP policy implications

Sourcing and verification on Wikipedia is poor all round

Before people get too hung up on thinking this is about image sourcing, or policy or images in general, can I direct your attention to the very next edit after the mysterious stranger who removed the image. Another IP editor has changed the date of birth. They didn't say why, and didn't provide a source. Which is perhaps immaterial, because the original date, added all the way back in 2008, never had a source either. Who knows which is correct, if either even is. Why would that be important? Well, obviously, dates of birth are one way you can prove you're not the serial killer that Wikipedia says you are. On current evidence, a malicious actor, on seeing the press coverage, could have altered the date to make it look even more like the misidentified man is the serial killer. There's nothing here that suggests anyone would have even noticed. Even now, even after this article has been subject to much attention for lack of editorial oversight, still nobody appears to have noticed (or perhaps they have but just don't care) that the date of birth is unsourced. A basic and obvious violation of BLP, if ever there was one. Policy is clear, and has been for a very long time. That unsourced DOB should be removed immediately. Would take seconds. And yet, it hasn't happened. There have allegedly been efforts to improve Wikipedia's sourcing. If those efforts have not even reached a quick first pass over dates of birth, it doesn't instill much confidence. One wonders what's behind the delay. Perhaps it is lack of editors all round (see the section below for a related observation). Mackabrillion (talk) 18:17, 2 November 2021 (UTC)[reply]

Agreed, and while BLPs are rightly a particular concern, it's poor sourcing all around. Most of my editing time on Wikipedia is on STEM articles, mostly geology and mostly noncontroversial. Much of that time is supplying missing sourcing. I find myself hesitant to rip out entire sections of articles that lack sourcing, out of fear that some source, somewhere, might support it, and because I am as prone as anyone to the magical thinking that if I simply put a "citation needed" mark on the statement, the source will magically appear. In spite of the fact that I can count the times that has happened on one hand with fingers to spare.
But, not long ago, I found that most of a geology article I had started reviewing was extremely well-written but suspiciously devoid of any sourcing. Sure enough, after some digging on Google Scholar, I found that it was a flagrant, extensive, and 16-year-old copyright violation. Copyright violation may not be as serious a moral issue as libelous BLP, but it's nonetheless a serious legal issue.
I am strongly inclined now to put a "citation needed" on an unsourced statement only if it really, really sounds right, is interesting and important, but for some reason I can't immediately dredge up a supporting source. Otherwise I just cut it and be done with it.
I am coming round to the view that if a statement in an article is unsourced (whether a BLP article or a definition of an igneous rock type) it should be removed immediately. It's nice if the editor makes a good-faith effort to find a supporting source first, but I'm beginning to think there is no obligation to do so. Sure, if we pull every "citation needed" statement out of Wikipedia today, the encyclopedia will lose half its content -- but I suggest that this content will not be much missed and the rest will acquire a much better reputation.
Apologies for the long rant. --Kent G. Budge (talk) 19:13, 2 November 2021 (UTC)[reply]
Kent G. Budge is right on the money here. Large numbers of Wikipedia articles, even highly viewed ones, have tons of uncited material that is very difficult to deal with; usually the easiest solution is to rip it out and start again, but it's so much work. Hemiauchenia (talk) 23:36, 3 November 2021 (UTC)[reply]
Mackabrillion, the article actually had two birth dates for more than six years, ever since this 2014 edit: [2] – 26 July in the infobox, 28 July in the lead paragraph. No one noticed or cared. The edit you mention made the two dates the same ... both wrong. The date in the article now (28 July in both lead paragraph and infobox) is sourced ... The article had nearly 12,000 views in a single day in August 2018, when the TV program aired. Overall, there were more than 128,000 pageviews during the time when it contained Mr. White's photograph. As a system for writing a reliable reference work, at least in this subject area, it's surely a very, very far cry from being "very effective at identifying and removing these inaccuracies when they do occur". Cheers, --Andreas JN466 21:28, 2 November 2021 (UTC)[reply]

Low participation is also a factor

What should also be noted is that although a mysterious IP editor did notice and remove the incorrect image, in all likelihood because of the press coverage of the lawsuit, they didn't detail why they were doing it. It could just as easily have been vandalism for all anyone knew. And after that edit was made, no other edits were made to either the article or the talk page for at least a year. While we can't know for sure, it seems quite likely that no established editor noticed the removal, or they did but weren't sufficiently moved by the unevidenced claim that Wikipedia had misidentified a serial killer to investigate and provide their colleagues with a positive indication that they had chased this down and found the press report. There was, as far as I can tell, nothing stopping the image from being readded from one of the sources that had reused it from Wikipedia. Most likely by someone who might never have even seen the IP editor's note in the edit history. Mackabrillion (talk) 18:17, 2 November 2021 (UTC)[reply]

The folly of being mostly right, most of the time

The Wikimedia Counsel cited a study in Wikipedia's defence that says Wikipedia editors are quite quick to remove most bad edits. And who knows, maybe that study is true. I am dubious of the Berkman Klein Center's findings, ever since it was quite obvious that in the case of a publication of theirs that mentioned Gateway Pundit, they had sourced basic descriptive information from Wikipedia. Information that didn't have a source here, and appeared nowhere else on the internet before it appeared on Wikipedia, suggesting quite strongly that this had been simply made up by a Wikipedia editor, and other Wikipedia editors either never noticed or simply didn't care (for obvious reasons, the Gateway Pundit isn't exactly going to get a gold star service from Wikipedia). The folly of resting on some assurance that you'll get it right most of the time is that the few times you do get it wrong, it can quite easily be very, very harmful. It has been suggested that BLP articles should be fully protected. And to be honest, when there is nothing to suggest that in similar cases the editor community would not make the same mistake here again (chiefly, failing to notice press coverage of a lawsuit that mentions Wikipedia), that looks like a wise move. As this case showed quite well, other often suggested changes, such as preventing IP editing, might not yield the same level of protection. Mackabrillion (talk) 18:17, 2 November 2021 (UTC)[reply]

The identity of the IP editor

Probably futile, but an explicit on-the-record denial from the Wikimedia Counsel that, to the best of their knowledge, neither he nor anyone at the Wikimedia Foundation is behind the IP that removed the image from the Wikipedia article, would go some way to better understanding the true weaknesses of the Wikipedia model. Mackabrillion (talk) 18:17, 2 November 2021 (UTC)[reply]

Comment from Wikimedia Foundation

First, we note that the editorial makes some thoughtful suggestions regarding the possibility of taking greater care when dealing with articles and photographs of people accused of crimes. We agree with the editorial that the most ideal outcome would be that the mistaken photograph on Wikipedia had been caught and removed immediately after upload and we are supportive of the community's ongoing efforts to develop processes which improve the quality of potentially sensitive articles. At the same time, we also believe that the overall Wikipedia process for this case worked once the problem was discovered. It appears that the broader internet was not aware of the original research mistake until after the Discovery TV program came out. Once the mistake was reported on, Wikipedia’s open structure allowed someone (an IP editor) to remove the image quickly and there was no further problem on Wikipedia. This is a considerably better outcome than how the TV program responded and compared with other websites that hosted the same content. We would point to this as an example of Wikipedia’s open structure working well while agreeing that it could still be further improved to catch future mistakes sooner.

Second, we wish to offer a clarification regarding a misunderstanding in the Signpost editorial. The original blog stated “Instead, the law simply requires that litigants direct their complaints at the individuals who made the statements at issue, rather than the forums where the statements were made.” The Signpost editorial interprets this to mean that one should sue a specific Wikipedia editor, but this was neither the intent nor what the section stated. In this case, the “statement at issue” is the original research conducted by Discovery Television that misidentified Mr. White. Our intent was to highlight the fact that Mr. White actually did sue Discovery Television in his lawsuit and the case against them was not dismissed by the motion that ended the case against the Foundation. It is fairly likely that Mr. White will be able to proceed with the court and receive a full hearing regarding his claims of harm from the misidentification or to reach an agreeable settlement with Discovery Television. We would also note that in the cases that are most difficult to address on Wikipedia because they have an existing reliable source, there is very often a corporation that can be held responsible for what they originally published.

The goal of the people from multiple departments within the Foundation that came together to help Jacob author and factcheck the Diff article was to show that this is a successful case where the law worked well: Mr. White still has a route to compensation for his harm from the business that appears to have made the original mistake while the Wikimedia Foundation was protected from legal liability that could have significantly disrupted open community editorial processes. While many other websites have feedback links to report errors, this is yet another example of how and why it often takes less time to address these issues on open platforms like Wikipedia in comparison to closed platforms.

--Stephen LaPorte, Wikimedia Foundation Associate General Counsel — Preceding unsigned comment added by GVarnum-WMF (talkcontribs) 10:36, 3 November 2021 (UTC)[reply]

For reference, the relevant passage of the WMF blog post reads:

It is important to note that Section 230's broad protection of Wikimedia projects and other online services does not leave litigants like Mr. White without options. Instead, the law simply requires that litigants direct their complaints at the individuals who made the statements at issue, rather than the forums where the statements were made. This both allows litigants to challenge the appropriate parties responsible for their harm and protects online hosting companies like the Wikimedia Foundation from the costs associated with liability for user-generated content.
— Diff

If Wikipedia is the forum that was improperly sued here, then the Discovery Channel is not "the individual who made the statement at issue" in this forum.
I am certainly not alone in having interpreted the post this way. For reference, the top-rated post in last month's Slashdot discussion of the Diff post (which I was unaware of until a couple of days ago) reads as follows:

It is important to note that Section 230's broad protection of Wikimedia projects and other online services does not leave litigants like Mr. White without options. Instead, the law simply requires that litigants direct their complaints at the individuals who made the statements at issue, rather than the forums where the statements were made.
So they want him to subpoena the IP address and sue the contributor who posted his image.
Seems like a fair compromise.

The effectiveness or otherwise of the quality control system is amply covered by others in the discussion above. I don't think the system works at all well in the True Crime area. Also, why should it matter what the "broader internet" is aware of? Reality matters. People matter. Mr. White matters. (And there actually has long been what I believe is a real photo of the killer on the Internet. It does not look like Mr. White, beyond their both being African-American men.) Regards, --Andreas JN466 11:38, 3 November 2021 (UTC)[reply]

Some thoughts

If we learn any lessons from this, one of them ought to be that the nonprofit status of the Foundation does not automatically keep us from doing evil. --Kent G. Budge (talk) 17:46, 3 November 2021 (UTC)[reply]
An additional point I forgot to make, in addition to my first 2 points above. Since it is easy to make mistakes with articles about living people, the dangers of editing them could have a chilling effect on new editors -- which is not desirable. I would hope that the Foundation has some plan of action for those cases where an editor is being sued over a good-faith mistake. As I wrote above, lawsuits are expensive to defend, & editors should not be penalized for such mistakes. Even experienced editors can make mistakes that less haste or more sleep would help them avoid. -- llywrch (talk) 15:27, 4 November 2021 (UTC)[reply]

Invalid fair use rationale

Fair use rationales are not valid for images of living people, so this image should never have been uploaded in the first place. Hemiauchenia (talk) 21:05, 3 November 2021 (UTC)[reply]

I can't see deleted images, but the image was likely uploaded under a claim of being public domain - which is accurate for mugshots taken by the federal government, and I have no idea for mugshots taken by New York State Corrections. (See File:Thomas_Hagan.jpg for a suspicious example that's still current - is this really public domain?). That said, even if it was copyrighted, there are rare exceptions that do allow fair use images for living people when taking a fresh compatibly licensed photo is not reasonable, and prisoners may qualify. (As usual, no guarantees, FFD is a roulette wheel.) SnowFire (talk) 22:38, 3 November 2021 (UTC)[reply]
Here is the log entry, which you don't need special permissions for: [3]. It says,
  • "(== Summary == This is an inmate photograph (mugshot) of American serial killer Nathaniel White, by the New York State Department of Corrections and Community Supervision, United States, dated: after 1993. == Rationale information == {{Non-free use rationale 2 |description = After 1993 inmate photograph (mugshot) of American serial killer Nathaniel White. |author = New York State Department of Corrections and Community Supervision, United States. |source = http://crimefeed.com/2017/08/31713...)" (The picture was only deleted because it was an unused non-free fair use image.) --Andreas JN466 23:02, 3 November 2021 (UTC)[reply]
Never mind, then. Guess it was uploaded as fair-use, surprising. SnowFire (talk) 00:24, 4 November 2021 (UTC)[reply]

This part of the edit that added the rationale was truncated in the edit summary:

|commercial  = If this image is subject to copyright, it belongs to the New York State Department of Corrections and Community Supervision.  Which is a government (or government contracted) public service and not a commercial entity nor artist, and by using this low-resolution image it does not impact the department of any monetary value.  
|other information = Source (no image):  provides inmate data from the state’s department of incarceration:<br>
http://nysdoccslookup.doccs.ny.gov/kinqw00 <br>
DIN (Department Identification Number):   93A4050 <br>
Fair use: Ensured conformance with the image’s “low-resolution” sizing by using the recommended [https://tools.wmflabs.org/cp/resize.php Image Resize Calculator].

Vwanweb acknowledged that there was "(no image)" on the NYS Department of Corrections' website, and wrongly assumed that this was where crimefeed.com obtained the image from, despite the failed verification. – wbm1058 (talk) 13:49, 5 November 2021 (UTC)[reply]

An observation: Good intentions do not justify cutting corners. I have to remind myself of that frequently. --Kent G. Budge (talk) 14:55, 5 November 2021 (UTC)[reply]
Thanks. I've added a sentence covering images and media files to WP:SWYGT. Also, I learned today that the Discovery program only featured the picture very briefly, right at the end of the program. Of course that's long enough for anyone who knows a person to recognise them, but for everyone else it would have been far harder to memorise the face from that than from a stationary image on an internet page. --Andreas JN466 18:44, 5 November 2021 (UTC)[reply]

Discussions of this story elsewhere

Bottom line: We have just provided Exhibit A in favor of repealing Section 230, which already has broad bipartisan support. --Kent G. Budge (talk) 15:39, 8 November 2021 (UTC)[reply]
Note that there is a difference between social media sites like Facebook and Twitter, where each post appears with the name of its author prominently identified, and Wikipedia. In Wikipedia, the names of the authors are obscured. Everything is published under the name "Wikipedia". And the Wikimedia Foundation collects hundreds of millions of dollars on behalf of "Wikipedia". A lawyer could actually argue that the volunteers are not social media users, but unpaid Wikimedia Foundation staff. --Andreas JN466 15:51, 8 November 2021 (UTC)[reply]



       
