The Signpost

News and notes

Wikimedia enters US Supreme Court hearings as "the dolphin inadvertently caught in the net"

By Andreas Kolbe, Bri, Ganesha811, and HaeB
US Supreme Court seal
The United States Supreme Court is hearing two cases that will be crucial for Wikimedia

Cautious optimism from "the dolphin inadvertently caught in the net" at US Supreme Court hearings "crucial for Wikimedia"

For prior Signpost coverage, see Section 230 report (February 2023)

On the Wikimedia-l mailing list, two members of the Wikimedia Foundation's "Global Advocacy" team drew attention to important hearings happening this week at the United States Supreme Court:

"The hearings on two cases that will be crucial for Wikimedia have just started: NetChoice, LLC v. Paxton and Moody v. NetChoice, LLC. Both cases are challenges to state laws in Texas and Florida, which impact content moderation on social media websites. [...] As they are written, these laws prohibit website operators from banning users or removing speech and would generally risk Wikipedia’s volunteer-led systems of content moderation. That’s because these laws were designed to prevent social media platforms from engaging in politically motivated content moderation, but were drafted so broadly that they would also impact Wikipedia. The case is also important beyond the impact it might have on our projects. It represents a scenario that is part of a trend globally, where governments introduce legislation to address harms from big tech actors, yet Wikimedia ends up as the dolphin inadvertently caught in the net."

The Foundation has previously weighed in on these cases with an amicus brief and several blog posts, and is present at the current hearings "in person talking to stakeholders and observing the proceedings. We expect the Court to rule this year and will be providing updates as we know more."

Asked about the worst-case scenario (from a Wikimedia perspective), Stan Adams of the Global Advocacy team elaborated:

"Perhaps the worst long-term outcome would be if several other states or even the US Congress replicated the Texas or Florida laws. If those laws were enforced against Wikipedia editors or the Foundation – say, for editors' regular work of removing content that is inaccurate, unsourced, or that violates NPOV policies – it could become increasingly difficult to operate and maintain Wikipedia."

However,

"based on what I observed at the Court yesterday [February 26, mentioning comments by Justice Brett Kavanaugh in particular], I think most of the Justices would be reluctant to uphold the Texas and Florida laws. That said, these cases won't be the end of legislative attempts to regulate social media and other venues for expression online – I expect to see the Court considering more cases like these as states continue to enact laws that raise First Amendment questions in the online context."

– H

U4C Charter vote

The Wikimedia Foundation advised on Meta-Wiki that –

A vote to ratify the charter for the Universal Code of Conduct Coordinating Committee (U4C) was held from 19 January until 2 February 2024 via SecurePoll. Voting is now closed. Thank you to all who voted. The result was 1249 voters in support and 420 voters opposed. 69 voters did not choose an option. Voter statistics and a summary of voter comments will be published soon.

You can find more information on the U4C's purpose and scope here. – AK

2024 Requests for adminship review

Cleanup on aisle 24...

Are more changes afoot for the Requests for adminship process? Several proposals from Phase I remain open (some others have already been closed as unsuccessful).

Phase I is still open, and you may weigh in with your thoughts here: Wikipedia:Requests for adminship/2024 review. – B

Stylized, black white and blue picture of a human brain made up of PCB-type connections
AI is changing the way people use the Internet ...

WMF publishes draft "research agenda on the implications of artificial intelligence (AI) for the knowledge commons"

From February 19 to February 23, 2024, "a group of 21 Wikimedians, academics, and practitioners" met at the Rockefeller Foundation's Bellagio Center in Northern Italy "to draft an initial research agenda on the implications of artificial intelligence (AI) for the knowledge commons." The aim is "to focus attention (and therefore resources) on the vital questions volunteer contributors have raised, including the promise, as well as risks and negative impacts, of AI systems on the open Internet." The agenda is available on Meta-Wiki, together with a brief report on the meeting.

Members of the "Wikimedia AI" Telegram group expressed surprise at hearing about this effort first from organizations outside the Wikimedia movement, and at the fact that the term "open source" isn't mentioned in the document (despite open-source AI being an important topic of current debate, and WMF's general commitments to the use of open source software). While the announcement appears to be speaking on behalf of "volunteer contributors", the "Wikimedians" involved in drafting the document appear to have consisted exclusively of Wikimedia Foundation staff (largely from its Research department), according to the attendee list. Wikimedia Foundation CEO Maryana Iskander subsequently clarified that this "effort to contribute to a shared research agenda on AI [...] was created by a small group working in the open who rushed to publish a ‘bad first draft’ that will benefit from more input."

In other AI-related news, the Wikimedia Foundation recently received a $2.2 million grant from the Sloan Foundation (a longtime supporter) for the purpose of "leverag[ing] AI for the benefit of Wikipedia's readers and contributors, including tools to address vandalism" over the next three years. (These funds come on top of a $950,000 grant announced in April 2023 by WMF's own Wikimedia Endowment for "building and strengthening AI and machine learning infrastructure on Wikipedia and Wikimedia projects", similarly highlighting "the development of algorithms to measure the quality of Wikipedia articles and machine learning models that help catch incidents of vandalism on Wikimedia projects.")

– AK, H


Brief notes

Group photo from the EduWiki Conference 2023 in Belgrade, Serbia
Community service

Discuss this story

U4C Charter vote

"The result was 1249 voters in support and 420 voters opposed. 69 voters did not choose an option" – Who let Elon Musk rig the UCOC result? (and who let Elon Musk name it in the first place?) BilledMammal (talk) 11:03, 2 March 2024 (UTC)[reply]

Ah, operation Votey McVoteFace. I am actually impressed, even if that’s a coincidence. ASUKITE 13:56, 2 March 2024 (UTC)[reply]
Based. voorts (talk/contributions) 20:55, 2 March 2024 (UTC)[reply]
Came here to ask the same thing. Clear violation of WP:NOTSOCIALNETWORK :) pythoncoder (talk | contribs) 21:55, 3 March 2024 (UTC)[reply]
Can confirm, did result in quite the chuckle at tally-time. Total coincidence of course. Joe Sutherland (WMF) (talk) 19:27, 4 March 2024 (UTC)[reply]

In brief

Many congratulations to Sdkb for gaining admin powers! Oltrepier (talk) 11:40, 2 March 2024 (UTC)[reply]

Thanks, Oltrepier! I wasn't expecting it to spawn a round of RfA reform haha, but hope it leads to some improvements! Sdkbtalk 16:42, 2 March 2024 (UTC)[reply]
Sdkb, and congratulations on the new signature! I presume to help fellow editors recognize the new powers responsibilities, lest they confuse you with the old {{u|pill-shaped}} Sdkb. —⁠andrybak (talk) 13:56, 3 March 2024 (UTC)[reply]
I had been overdue for a change for a while, Andrybak, and when Q21 was asked it seemed like an opportune moment! Eagle Lemur-eyed observers may also notice the new avatar(s) on my user page 🙂 Sdkbtalk 16:48, 3 March 2024 (UTC)[reply]

Politically motivated content moderation

"these laws were designed to prevent social media platforms from engaging in politically motivated content moderation" – Are you sure? Polygnotus (talk) 23:53, 15 March 2024 (UTC)[reply]

Leading up to the 2020 United States elections, there was a rise of misinformation on these services related to topics such as claims of election fraud and conspiracy theories related to the COVID-19 pandemic. Most of this misinformation originated from conservative parties including the far right and alt right.[2] Because of this, services like YouTube, Twitter and Facebook took action to moderate these posts from users, either by tagging them as misinformation or outright removal.[2] Some of this misinformation was put forth by Republican party members, including then-President Donald Trump, leading the Republican Party to seek legal review of Section 230 believing that this law allowed politically-motivated moderation.... Two state laws passed by Florida and Texas in 2021 created state-level challenges to Section 230.

One of those is Texas House Bill 20. It is certainly not "designed to prevent social media platforms from engaging in politically motivated content moderation". It is designed to ensure that platforms can't stop the spread of fake news and misinformation (lies about the election, about Biden, about Covid et cetera). Polygnotus (talk) 00:04, 16 March 2024 (UTC)[reply]

"If we have to negotiate the terms of the negotiation, we will never get anywhere." - The Elder Scrolls V: Skyrim. Or, more prosaically: We can discuss the law, or we can discuss how the law is described, but if we try to do both at once, it's not going to work very well. I happen to agree that the law is intended to favor conservative speech, but I don't think it would be useful to argue over whether it is intended to favor false speech specifically, and (considering the current state of the Republican Party) that is arguably a distinction without a difference anyway. --NYKevin 09:36, 17 March 2024 (UTC)[reply]
@NYKevin: All conversations in 2024 are metaconversations dixit the Zuckster. Anyway, when quoting something that is obviously false it is wise to throw a little [sic] or "We know this is bullshit but we are reproducing the quote as written" in there. A law intended to favor a particular political party != a law designed to prevent social media platforms from engaging in politically motivated content moderation. Not something to argue about, but worth pointing out so that people are not misled. Polygnotus (talk) 10:17, 17 March 2024 (UTC)[reply]
I've read the Texas bill, albeit a little hastily. It's not clear to me that, even if it applies to any Wikimedia sites, there is any onerous condition. I'd be interested to know which clauses are the problem here. I haven't seen the other piece of legislation. All the best: Rich Farmbrough 14:36, 19 March 2024 (UTC).[reply]
I take a more radical view, failing to see why there should be any law on the topic of censorship by social media. No need to carve out an exception for Wikipedia and the like. If The Onion or Facebook or Truth Social or whoever, want to operate a website free of dissent or balance or truth or whatever they may decide to dislike, they should be allowed to do that. We have our values; others value something else. Jim.henderson (talk) 23:34, 29 March 2024 (UTC)[reply]






The Signpost · written by many · served by Sinepost V0.9 · 🄯 CC-BY-SA 4.0