The foundation behind Wikipedia has introduced its first global code of conduct, responding to critics who say the site has failed to combat harassment and lacks diversity among its contributors. Maria Sefidari, chair of the Wikimedia Foundation’s board of trustees, said the foundation’s projects need to be more inclusive.
She added that Wikimedia is missing a lot of voices, including women and marginalized groups.
Online platforms are under intense scrutiny for violent rhetoric, abusive behavior, and other problematic content, and they are being pushed to rework their content rules and enforce them strictly.
A process of change
In contrast with Facebook and Twitter, which take a top-down approach to content moderation, the online encyclopedia, which turned 20 in January, relies on unpaid volunteers to handle issues surrounding users’ behavior.
Wikimedia reported that more than 1,500 Wikipedia volunteers, spanning five continents and 30 languages, participated in creating the new rules, following a May 2020 vote by the board of trustees to develop binding standards.
Katherine Maher, the executive director of the Wikimedia Foundation, told Reuters that there had been a process of change throughout the communities.
What the code says
The newly announced code of conduct bans harassment both on- and off-site, and bars hate speech, slurs, stereotypes, and attacks based on personal characteristics. The rules also prohibit threats of physical violence and hounding (following someone across different articles to criticize their work).
It also bans the deliberate addition of false or biased information to content. Wikipedia remains relatively trusted by users, compared with social media platforms where misinformation spreads like wildfire.
Wikipedia has more than 230,000 volunteer editors who work on crowdsourced articles and more than 3,500 admins who have the power to block or restrict edits on some pages.