The Signpost


Community view

A Deep Dive Into Wikimedia (part 4): The Future Of Wikimedia and Conclusion

By Feed Me Your Skin
User Feed Me Your Skin agreed to re-publish on The Signpost an original guide to Wikimedia, which can be found on his personal blog. It has been presented as a multi-part series of columns in this space over the last few issues; this is the final part. – Signpost Editors

Part 4.1: The Future Of Wikimedia


Wikimedia is immensely large, but it's not done growing. The community has ambitious plans to improve the features the Wikimedia projects already have and to innovate on the user experience.

Annual Plan


Rather than covering a single year, the annual plans that the Wikimedia movement drafts straddle two consecutive years (e.g. 2021-2022). Because of that, this section showcases two plans, not one.

2023-2024


Every year, the foundation devises and releases a plan for short-term goals, which is ratified by the community. For the 2023-2024 period, the Wikimedia Foundation prepared for radical shifts in the movement driven by the rapid adoption of the internet in the 3rd world and the rise of generative AI, among other things. At a high level, the foundation planned to continue its commitment to equity, to prioritize the user experience of established editors so they can better run the projects, and to prepare for long-term changes in its financial model. Of course, there's a lot more to the plan than that.

External Trends

Wikimedia doesn't exist in a vacuum, and the foundation has to plan accordingly. A surprising trend that the plan is shaped around is the tendency of younger audiences to use social media to get information. Many social platforms now have built-in search features that threaten traditional search engines and potentially harm the SEO of Wikimedia projects.

Besides losing market share to social media, both search engines and the Wikimedia projects also suffer from people directly asking LLMs for information instead of looking it up. The foundation would like to leverage LLMs where possible, but there are difficulties caused by copyright, hallucinations, and cost. LLMs also cause damage by allowing bad actors to spread disinformation at scale. All in all, it's clear that LLMs are going to have a major impact on the movement, but nobody knows whether it will be positive.

Infrastructure

The infrastructure goals for the 2023-2024 period are heavily based on a listening tour undertaken by Selena Deckelmann, Chief Product and Technology Officer of the Wikimedia Foundation, after she was hired. Broadly speaking, the goals are to improve the experience for volunteers (whether technical or otherwise), to provide better insight into Wikimedia using the data collected by the foundation, and to increase the spread of Wikimedia into new demographics, particularly people who live in the 3rd world.

Equity

As part of a larger drive to increase equity within the Wikimedia movement, the foundation has several equity goals for 2023-2024. To begin with, the plan sets out initiatives for each of the 8 main regions of the world. For example, a goal for Sub-Saharan Africa is to increase editor retention, while for North America, where many Wikimedia editors come from, the goal is to work with large-scale organizations like the Digital Public Library of America to create contribution pipelines. On top of that, there are thematic goals that improve equity across the board by encouraging volunteers to contribute content relating to culture and heritage.

Safety

In a similar vein to the push towards equity, the Wikimedia Foundation also wants to make sure that every user feels safe and welcome within the movement. Part of that is lobbying politicians around the world and informing them about how Wikimedia communities work, so that the laws they pass protect those communities. This has become incredibly important in recent years as governments around the world create regulations aimed at Big Tech that affect Wikimedia projects without considering their unique needs and purposes.

Disinformation is also treated as a safety issue within the annual plan, since the Wikimedia Foundation sees efforts to prevent people from accessing truthful information as a human rights violation. This is particularly difficult because the open nature of Wikimedia makes it exceptionally easy to intentionally add disinformation compared to more traditional projects. The foundation plans to use machine learning to help volunteers identify disinformation and to increase the reliability of sources, but maintains that the best way to counter disinformation is a safe and diverse community that can fill in knowledge gaps and identify disinformation itself. Accordingly, the foundation also plans to find ways to prevent surveillance of volunteers, so that bad actors can't intimidate or coerce them into adding disinformation.

Effectiveness

A big problem with running a foundation as large as Wikimedia is that inefficiencies tend to creep in. This problem is made even worse by the fact that the foundation has multiple projects that more or less act independently of each other, which means that multiple teams often work to solve the same problem without collaborating. Because of this, a major goal for the foundation this year has been improving effectiveness. Additionally, the Wikimedia Foundation will continue to work on refreshing and implementing its new values, which have been progressively adjusted for years in collaboration with the community.

Foundation Details

To improve transparency, the 2023-2024 plan comes with an explanation of what the foundation actually does. This is essentially a breakdown of how money and human resources are allocated, as well as an overview of what every group in the foundation is doing this year. The report also details how employee salaries are determined and gives some interesting stats, such as how many employees there are.

Reports

As part of its continuing dedication to transparency, the foundation publishes quarterly reports so the community can be assured that its needs are being adequately met. Metrics were only reported up to the third quarter; the progress report for Q4 was included in the foundation's annual review.

2024-2025


Overlapping with the 2023-2024 plan is the more forward-looking 2024-2025 plan. Because it looks further ahead, it's not as fleshed out as its predecessor, but there's a clear overarching theme of sustainability. Specifics aside, the plan for this period is to make Wikimedia "multigenerational" by improving the technical infrastructure, encouraging new editors to join, and decreasing reliance on donations.

Part 4.2: Conclusion


When I decided to make this blog post, I did it assuming that I already knew most of what there was to know about Wikimedia, and all I had to do was write it down. After doing an absurd amount of research and looking into various rabbit holes I had no idea even existed, I realized that I knew nothing and I still know nothing. Even though this blog post is called "A Deep Dive Into Wikimedia", it still feels shallow. There are things that I don't really understand, and because I'm only fluent in English, I can't explore any of the rabbit holes in non-English projects. More than that, I've come to the conclusion that Wikimedia can't ever really be understood by any one person. Sure, Jimmy Wales and a few other highly prominent people probably have a very thorough understanding of what's going on at a high level, but can they tell you all the FOSS software they use, the people who maintain it, or the degree of support they've received from actual employees? No, they can't. They can't tell you how the many communities that spawn around Wikimedia interact with the projects either, and it seems that nobody really can, considering how many grants the foundation gives specifically to fund research on the movement. There are just too many moving parts, so many things that change faster than you can learn them. That's not even getting into the history of Wikimedia, which is surely rich and possible to piece together from the many archived documents scattered around the projects, if you have the time. At well over 5000 words, I've merely scratched the surface of what there is to know about Wikimedia.

I'll be honest here: when I first came up with the idea to investigate the movement and how it works, I planned for my blog post to be negative about the movement from the outset. I was going to be the cool-guy contrarian who showed off how much he knew by pointing out how much it sucks that Wikipedia has a bureaucracy, that Wikipedia has annoying powerusers, that the Wikimedia Foundation doesn't need as much money as it pretends to. And yeah, those complaints are totally valid. As I learned more about the movement, I found other flaws that I could focus on, like how amazing projects were just straight up abandoned, or how the foundation is too focused on Wikipedia to the point where it's almost a detriment to the rest of the movement. There's a lot that's wrong in the movement, but you know what? I don't care. What I've learned from doing this blog post is that I'm glad that Wikimedia exists. I'm glad that I can get free encyclopedic information without being nickel-and-dimed by a corporation. I'm glad that in an internet taken over by ragebait meant to make people miserable in exchange for engagement, there's a place online where people can peacefully benefit from projects designed for the betterment of mankind. I'm glad that somebody is trying to make an academic journal that doesn't charge readers to see the latest research, even if it seems ill-advised. I'm glad that people are actually trying to improve education in 3rd world countries through the internet instead of snobbishly looking down on non-traditional sources of information. And you know what? I'm glad that I can see parts of the movement that I don't like, because that means the movement is transparent enough not to hide or sugarcoat its flaws. It's easy to be a hater who talks about how Wikipedia isn't accurate enough or that "Wikipedia has cancer" because you don't like how the foundation spends its money, but it's even better to be a fixer who works to make the movement better today than it was yesterday, and I'm glad that Wikimedia gives me that opportunity.

When I started this blog post, I said that Wikimedia is an online movement dedicated to making access to knowledge equitable. That's certainly how the movement presents itself, but honestly, I don't think that fully captures what the movement is about. To end this blog post, I want to introduce a new paradigm for understanding and interacting with the movement. Rather than being a place to provide equitable access to information, I want Wikimedia to be seen as a place for people to innovate and create new ways of sharing information. The Wikimedia movement is a sandbox where anybody can experiment with new ways of learning, both as the learner and the teacher. Some experiments will fail, but others will succeed, and people can carefully contribute to the experiments that work until they become mainstream sources of information. Rather than seeing the Wikimedia projects as websites that passively provide us with content, we should see the content as the product of people and organizations actively building and maintaining an entire educational ecosystem. People who contribute to Wikimedia projects shouldn't be seen as volunteers; they should be seen as leaders taking control and ownership of the projects. Researchers should look at the Wikimedia movement as a subculture with its own unique history and form, not as a mere set of websites used to learn about things. Last but not least, we should stop looking at Wikimedia as being something totally separate from us. By virtue of its open nature, so many of us have contributed in so many ways to this wonderful, impossibly ambitious movement. Even if you haven't, you've at least used Wikipedia before, and therefore allowed yourself to be influenced by the people who work to provide information to the world. From here on out, I want everybody reading this blog post to stop looking at Wikimedia as something that's static and start looking at it as something that's dynamic. Don't take anything that you see for granted; think deeply about who wrote the content, who wrote the code, and who hosts the software. Question their motives, but also don't become paranoid and start instinctively distrusting one of the greatest movements in internet history. Above everything, see Wikimedia as a collection of people doing amazing things, not just pixels on your computer screen.

