Wikipedia’s New Code Of Conduct Gets One Thing Right; Another Will Be A Struggle


A large social network announcing a new set of rules for its members on a Tuesday might not normally count as news.

Wikipedia is not just any social network, however, and its new rules differ from the terms of use handed down by commercial social platforms like Facebook and Twitter.

The Universal Code of Conduct announced Tuesday by the Wikimedia Foundation, the San Francisco nonprofit that hosts Wikipedia and related projects, is not a top-down product. Instead, Wikipedians worked together to write it: nearly 356,000 of them regularly create or edit entries in the online encyclopedia.

“More than 1,500 Wikipedia volunteers from 19 different Wikipedia projects, representing five continents and 30 languages, participated in the creation of the universal code of conduct,” Wikimedia’s announcement said.

This goes well beyond previous moves by commercial social platforms to borrow the collective wisdom of their crowds. See, for example, Twitter adopting the core features of @ mentions and hashtags from its early users, or Facebook letting users vote on new terms of service before canceling that experiment in 2012 after too few people bothered to cast virtual ballots.

Wikimedia began work on the new code by collecting input from around the world on the need to revise the previous rules, followed by months of collaborative drafting.

“They are an alternative model to the private social experience that exists almost everywhere else,” said Alex Howard, director of the Demand Progress Education Fund’s Digital Democracy Project.

The result also differs from many other codes of conduct in that it is unusually short: less than 1,700 words, or under 1,300 if you don’t count the opening paragraphs.

The operative text begins not with a list of thou-shalt-nots but with a list of behavior expected of a user: “Practice empathy”; “Assume good faith and engage in constructive edits”; “Respect the way that contributors name and describe themselves”; and “Recognize and credit the work done by contributors,” among others.

“The organization says here are our values,” said Howard. “They give people a framework to interact with each other.”

That is followed by a list of “unacceptable behavior” that leads with a general prohibition of harassment. This covers the usual categories, such as insults targeting personal traits, threats, and doxxing, but also the broader offense of simply being a jerk.

That broader prohibition is both necessary, because people who are mildly abusive in public are often worse in private, and tricky, because those lesser fouls aren’t as obvious.

“People sometimes assume this is unintentional,” said Caroline Sinders, founder of Convocation Design + Research and an expert in online-harassment research who has worked with the Ford Foundation, Amnesty International, and others (including an earlier stint at Wikimedia itself).

Or, she added, the offense goes unrecorded and is then forgotten, absent a “ladder of accountability” that recognizes how unchecked minor abuses can escalate into more toxic behavior.

These provisions also cover behavior outside of Wikimedia projects. The doxxing clause, for example, rules out disclosing “other contributors’ private information, such as name, place of work, physical address or e-mail address” without their express consent, “either in the Wikimedia projects or elsewhere.”

One complicating factor here is Wikipedia’s understandable lack of a real-names policy: enforcing one would put marginalized contributors at risk, especially those living under abusive governments. Wikipedia doesn’t even require an email address to create a contributor account.

Chantal De Soto, head of communications at the Wikimedia Foundation, acknowledged that problem in an email: “Enforcing against behavior violations that occur on other platforms is often very difficult. Verifying connections between Wikimedia accounts and, for example, a Twitter account is often not easy.”

However, it is important that Wikimedia’s communities make this effort, given all the evidence now available of how online radicalization can erupt into the physical world.

“We just have to look at January 6th to get a feel for what happens if this goes too far,” said Howard, referring to the riot at the U.S. Capitol.

The next chapter in Wikimedia’s effort will involve further collaboration on enforcement policies and mechanisms. That may be the hardest part, since it requires building structures that can work at scale and across cultures.

“A community needs to think about how it will document these cases, who will have access to them, how it will keep track of them, and how it will respond to harassment,” said Sinders.

Doing this right may require hiring more dedicated trust-and-safety professionals.

“In open source communities, much of this tedious work is done by volunteers,” warned Sinders. “And that leads to burnout in the community.”
