While the concept of “fake news” has recently become a topic of global conversation, it’s not new. Propaganda and lies for political and economic gain date back to ancient history. The reason it has become such a buzzword lies not in its novelty, but in its capacity to do serious harm. From politics to global health, misinformation — especially that which spreads rapidly online — threatens democracy, truth, and lives across the globe.


As a movement committed to empowering people with access to free, objective knowledge, Wikimedia works hard to combat online misinformation through several different initiatives. As part of the 2030 strategy process, Wikimedia has worked with researchers to assess how misinformation threatens the future of free knowledge. The resulting insights not only help the movement fight misinformation on Wikipedia and related projects, but they can also support other professionals tasked with protecting objective knowledge, including academics, journalists, and policymakers.


On Wikipedia, a global community of volunteers works hard to ensure that the free encyclopedia remains a source of accurate, trustworthy information about critical topics. For example, the COVID-19 pandemic article on the English version of Wikipedia has seen more than 24,500 edits by over 3,300 editors since the start of the pandemic — and both figures continue to grow every day. Given the topic’s polarizing nature, COVID-19 articles on Wikipedia are now subject to a higher standard of administrator oversight. This extra step helps ensure that the articles contain accurate, fact-based information, rather than the speculation and conspiracy theories prevalent on many other online channels.


Wikipedia’s collaborative, peer-edited approach has even been touted by The Atlantic as a model that could serve as inspiration for governments and other sources of authority to spread information on important topics.


To further strengthen Wikipedia’s model, a group of more than 1,500 volunteers recently created a Universal Code of Conduct for Wikipedians. The document aims to create a safe, healthy environment for Wikipedia editors, while also helping ensure the content on the platform is as factual and objective as possible. To this end, it includes a section on “content vandalism” to specifically address the proliferation of fake news and censorship.


Here at Wikimedia CH, we’re combating misinformation online on three fronts.

  • We advance the above efforts to ensure objective, factual content on Wikipedia and related projects by supporting our community of Swiss and international Wikipedians.
  • We enable access to free educational and knowledge platforms for educators and students of all ages through our Education program.
  • We organize events and activities that teach young people how to distinguish between different sources online, building their digital and information literacy.


This work will continue to be critical to protecting free knowledge, democracy, and, in the face of a global pandemic, human lives. To support our work to fight misinformation online, please consider making a tax-deductible donation to Wikimedia CH by clicking the button below.