The Signpost

News and notes

Information considered harmful

By Bri, Mhawk10, Andreas Kolbe and EpicPupper
A book wrapped in chains.
Wants to be free, but can it be?

WMF publishes Human Rights Impact Assessment

Human Rights Impact Assessment (English)

On 12 July, the Wikimedia Foundation announced the publication of its Human Rights Impact Assessment.

What has been published is an edited, public version of a report originally submitted to the Wikimedia Foundation in July 2020 by Article One Advisors, an external consultancy. The public version was jointly edited by the WMF and Article One.

Article One's assessment produced a number of "priority recommendations" in the areas detailed below.

Some of these recommendations have already been implemented over the past two years (e.g., Universal Code of Conduct, Human Rights Policy), while others have not. Some require community discussion.

Commenting on the two-year delay in publication as well as the progress made since the report was received, the WMF has said,

Unfortunately, due to capacity constraints and disruptions caused by the COVID-19 pandemic, publication of this report has been significantly delayed. However, the Foundation has continued moving forward on important human rights work in the two years since the report was submitted. The Foundation has taken steps (more information below) to advance human rights work that aligned with existing organizational priorities, including some recommendations made in the human rights impact assessment.

The WMF has provided a status report on Meta, which is copied below:

Wikimedia Foundation HRIA Foreword + Executive Summary (Arabic)
Wikimedia Foundation HRIA Foreword + Executive Summary (Chinese (Traditional))
Wikimedia Foundation HRIA Foreword + Executive Summary (French)
Wikimedia Foundation HRIA Foreword + Executive Summary (Russian)
Wikimedia Foundation HRIA Foreword + Executive Summary (Spanish)
Strategies for the Foundation
  1. Develop a standalone Human Rights Policy that commits to respecting all internationally recognized human rights by referencing the International Bill of Human Rights. Status: Complete
  2. Conduct ongoing human rights due diligence to continually assess risks to rightsholders. A Foundation-level HRIA should be conducted every three years or whenever significant changes could have an effect on human rights. Status: Ongoing
  3. Develop rights-compatible channels to address human rights concerns, including private channels, and ensure alignment with the UNGPs’ effectiveness criteria. Status: Complete
Harmful content
  1. Develop an audit protocol to assess projects that are at high risk of capture or government-sponsored disinformation. Status: Ongoing
  2. Develop a Content Oversight Committee (COC) to review content with a focus on bias and have the ability to make binding editorial decisions in line with ICCPR 19. Status: No action, community input needed
  3. Continue efforts outlined in the Knowledge Integrity white paper to develop: a) a machine-readable representation of knowledge that exists within Wikimedia projects along with its provenance; b) models to assess the quality of information provenance; and c) models to assess content neutrality and bias. Ensure that all AI/ML tools are designed to detect content and action that would be considered illegal under international human rights law, and that the response aligns with that law. Status: Ongoing
  4. Provide access to a geotargeted suicide prevention hotline at the top of articles on suicide methods. Status: Ongoing
Harassment
  1. Develop and deploy training programs for admins and volunteers with advanced rights on detecting and responding to harassment claims. Status: Ongoing
  2. Commission a “social norms marketing” research project to assess what type of messaging is likely to reduce and prevent harassing comments and actions. Status: No action, community input needed
  3. Explore opportunities to rate the toxicity of users, helping to identify repeat offenders and patterns of harassment. Consider awards for projects with the lowest toxicity levels. Status: No action, community input needed
  4. Consider developing admin metrics focused on enforcing civility and applying the forthcoming Universal Code of Conduct (UCoC). Status: No action, community input needed
  5. Ensure that the UCoC and its accompanying governance mechanism are reviewed by human rights experts, including experts on free expression and incitement to violence. Status: Ongoing
Government surveillance and censorship
  1. Continue efforts underway as part of the IP-masking project to further protect users from public identification. Status: Ongoing
  2. Develop awareness-raising tools and programs for all volunteers to understand and mitigate risks of engagement. Tools should be made publicly available and should be translated into languages spoken by volunteers in higher risk regions. Status: Ongoing
Risks to child rights
  1. Conduct a child rights impact assessment of Wikimedia projects, including conducting interviews and focus groups with child contributors across the globe. Status: Ongoing
  2. Create child safeguarding tools, including child-friendly guidance on privacy settings, data collection, reporting of grooming attempts, the forthcoming UCoC, as well as a "Child's Guide to Editing Wikimedia Projects" to help advance the right of children to be civically engaged. Status: No action, pending full child rights impact assessment
Limitations on knowledge equity
  1. Support retention by developing peer support and mentoring for under-represented contributors. Status: Ongoing
  2. Engage stakeholders on how the “notability” requirement may be shifted to be more inclusive of oral histories, and to identify what definitions resonate with under-represented communities. Status: No action, community input needed
  3. Adapt Wikimedia projects to be more accessible via mobile phones. Status: Ongoing

This month, the WMF hosted a series of conversation hours on this topic.

The WMF has also invited questions and feedback on the discussion page on Meta-Wiki as well as through the Movement Strategy Forum. AK

Internet Archive files for summary judgment in publishers' lawsuit

The Internet Archive preserves access to artifacts in digital form, and book citations on Wikipedia commonly link to versions hosted there. In 2020, four publishers (Hachette, HarperCollins, Wiley, and Penguin Random House) sued the non-profit, alleging that its practice of Controlled Digital Lending (CDL) violates their rights; they claim CDL has cost their companies millions of dollars and threatens their businesses. The Electronic Frontier Foundation (EFF), representing the Archive, filed a motion for summary judgment on July 7, asking a federal judge to end the case in the Archive's favor. In an email, the EFF declined to comment on the implications for Wikipedia. – E

Russian government escalates fight against Wikipedia, orders search engines to label Wikipedia as disinformation

Russian telecommunications agency Roskomnadzor has escalated its fight against Wikipedia's uncensored war-related content

On July 20, Russian telecommunications agency Roskomnadzor issued a statement saying that it was taking new action against Wikipedia for what it deems false information about the Russo-Ukrainian war. The statement, which said that it would require search engines to label Wikipedia as containing false war-related content, may mark a new front in Russia's fight to de-legitimize Wikipedia as a reputable and trusted source of information.

While Wikipedia currently remains available in Russia, Roskomnadzor has previously taken legal action against the Wikimedia Foundation for its decision not to remove information banned in Russia from several Russian Wikipedia articles; a Moscow judge issued a ₽5,000,000 fine for non-compliance with Russian law earlier this year. The Wikimedia Foundation has appealed this court decision, arguing that Russia does not have territorial jurisdiction over the Wikimedia Foundation and that the Russian court's decision violates rights to free expression and access to knowledge.

And it's not only Russian bureaucrats going on the offensive against Wikipedia; legislators are applying public pressure to compel internet companies to change how they treat the site. In a July 20 post on a channel on the messaging application Telegram, State Duma member Anton Gorelkin of the ruling United Russia party pressed Russian search engines to go beyond warning labels and affirmatively downgrade Wikipedia's search ranking, while boosting the rankings of websites in full compliance with Russian censorship laws. Gorelkin criticized the Wikimedia Foundation's refusal to take down information banned in Russia and described the decision to defy censorship orders as damaging to the credibility of the project.

Russian search engine Yandex has already rolled out warning labels relating to battles in Ukraine, though the text of existing warning labels does not appear to be specific to Wikipedia. One such warning placed atop Yandex search results reads "[н]екоторые материалы в интернете могут содержать недостоверную информацию. Пожалуйста, будьте внимательны", which in English means "some materials on the Internet may contain false information. Please be careful." Thus far, the implementation of these warning labels is far from comprehensive, even on Russia's largest homegrown search engine. For example, a Yandex search for "Битва за Киев (2022)" ("Battle of Kyiv (2022)") displays this warning atop its results page, while a direct search for "Битва за Киев" shows no such warning, even though a Wikipedia article on the 2022 battle is its first result.

Despite pressure from the Russian government, the Wikimedia Foundation has not complied with any Russian court orders thus far, spokeswoman Samantha Lien told The New York Times. Instead of complying, per the spokeswoman, the Wikimedia Foundation remains "committed to our mission to deliver free knowledge to the world". The Wikimedia Foundation has previously said in a statement that "the articles flagged for removal uphold Wikipedia’s standards of neutrality, verifiability, and reliable secondary sources to ensure articles are based in fact".

Wikipedia remains accessible in Russia, for now. Over the past decade, Wikipedia has been under threat of being blocked by the Russian Federation, and it was briefly blocked by Roskomnadzor in 2015. Stanislav Kozlovsky, a co-founder of the Wikimedia Russia chapter, said in an interview with Russian opposition news outlet Meduza that a block of Wikipedia in Russia now seems more plausible than it traditionally has. That said, Kozlovsky told Meduza, "when someone points a gun at you for ten years straight, you get used to it". See further coverage in this issue's "In the media" column. – M

Brief notes

A win and a loss for Wikimedia at the UN this month – S

The Signpost · written by many · served by Sinepost V0.9 · 🄯 CC-BY-SA 4.0