Abstract
To fight massive vandalism, the English-language Wikipedia has developed a system of surveillance carried out by humans and bots, supported by various tools. Central to the selection of edits for inspection is the use of filters or profiles. Can this profiling
be justified? On the basis of a careful reading of Frederick Schauer’s books on rules in general (1991) and profiling in particular (2003), I arrive at several conclusions. Profiling greatly increases the effectiveness, efficiency, and risk-aversion of edit selection. The argument for increasing
predictability suggests making all details of profiling
manifestly public. In addition, a wider distribution of the more sophisticated anti-vandalism tools appears warranted. As to
the specific dimensions used in profiling, several critical
remarks are developed. When patrollers use ‘assisted editing’ tools, severe ‘overuse’ of several features (anonymity, having been warned before) is a real possibility, undermining profile efficacy. The remedy suggested is simple: render these features invisible in the interfaces displayed to patrollers.
Finally, concerning not only ‘assisted editing’ tools but anti-vandalism tools generally, it is argued that the anonymity feature is a sensitive category: anonymous editors (‘anons’) have long been a contested group within Wikipedia, even though they are more prone to vandalism. Targeting them as a special category violates the social contract upon which Wikipedia is based. The feature is therefore a candidate for mandatory ‘underuse’: it should be banned from all anti-vandalism filters and profiling algorithms, and should no longer be visible as a special edit trait.