The use of software tools and autonomous bots against vandalism: eroding Wikipedia’s moral order?

Ethics and Information Technology 17 (3):175-188 (2015)

Abstract

English-language Wikipedia is constantly being plagued by vandalistic contributions on a massive scale. In order to fight them, its volunteer contributors deploy an array of software tools and autonomous bots. After an analysis of their functioning and the ‘coactivity’ in use between humans and bots, this research ‘discloses’ the moral issues that emerge from the combined patrolling by humans and bots. Administrators provide the stronger tools only to trusted users, thereby creating a new hierarchical layer. Further, surveillance exhibits several troubling features: questionable profiling practices, the use of the controversial measure of reputation, ‘oversurveillance’ where quantity trumps quality, and a prospective loss of the required moral skills whenever bots take over from humans. The most troubling aspect, though, is that Wikipedia has become a Janus-faced institution. One face is the basic platform of MediaWiki software, transparent to all. Its other face is the anti-vandalism system, which, in contrast, is opaque to the average user, in particular as a result of the algorithms and neural networks in use. Finally, it is argued that this secrecy impedes a much-needed discussion from unfolding; a discussion that should focus on a ‘rebalancing’ of the anti-vandalism system and the development of more ethical information practices towards contributors.

Author's Profile

Paul B. De Laat
University of Groningen
