Exclusive: Wikipedia launches new global rules to combat site abuses

A Wikipedia webpage in use on a laptop computer is seen in this photo illustration. REUTERS/Gary Cameron

(Reuters) - The foundation that operates Wikipedia will launch its first global code of conduct on Tuesday, seeking to address criticism that it has failed to combat harassment and suffers from a lack of diversity.

"We need to be much more inclusive," said María Sefidari, the chair of board of trustees for the non-profit Wikimedia Foundation. "We are missing a lot of voices, we're missing women, we're missing marginalized groups."

Online platforms have come under intense scrutiny for abusive behavior, violent rhetoric and other forms of problematic content, pushing them to revamp content rules and more strictly enforce them.

Unlike Facebook Inc and Twitter Inc, which take more top-down approaches to content moderation, the online encyclopedia, which turned 20 years old last month, largely relies on unpaid volunteers to handle issues around users' behavior.

Wikimedia said more than 1,500 Wikipedia volunteers, spanning five continents and 30 languages, participated in the creation of the new rules after the board of trustees voted in May last year to develop new binding standards.

"There's been a process of change throughout the communities," Katherine Maher, the executive director of the Wikimedia Foundation, said in an interview with Reuters. "It took time to build the support that was necessary to do the consultations for people to understand why this is a priority."

The new code of conduct bans harassment on and off the site, barring behaviors like hate speech, the use of slurs, stereotypes or attacks based on personal characteristics, as well as threats of physical violence and 'hounding,' or following someone across different articles to critique their work.

It also bans deliberately introducing false or biased information into content. Wikipedia is a relatively trusted site compared to major social media platforms which have struggled to curb misinformation.

Maher said some users' concerns that the new rules meant the site was becoming more centralized were unfounded.

Wikipedia has 230,000 volunteer editors who work on crowdsourced articles and more than 3,500 'administrators' who can take actions like blocking accounts or restricting edits on certain pages. Sometimes, complaints are decided on by panels of users elected by the communities.

Wikimedia said the next phase of the project would be working on the rules' enforcement.

"A code of conduct without enforcement...is not going to be useful," said Sefidari. "We're going to figure this out with the communities," she said.

Maher said there would be training for communities and interested task-forces of users.

Wikimedia has no immediate plans to beef up its small 'trust and safety' team, a group of about a dozen staff which currently acts on urgent matters such as death threats or the sharing of people's private information, Maher said.

(Reporting by Elizabeth Culliford, Editing by Kenneth Li and Edwina Gibbs)
