References
  • The use of software tools and autonomous bots against vandalism: eroding Wikipedia’s moral order?Paul B. de Laat - 2015 - Ethics and Information Technology 17 (3):175-188.
    English-language Wikipedia is constantly being plagued by vandalistic contributions on a massive scale. In order to fight them, its volunteer contributors deploy an array of software tools and autonomous bots. After an analysis of their functioning and the ‘coactivity’ in use between humans and bots, this research ‘discloses’ the moral issues that emerge from the combined patrolling by humans and bots. Administrators provide the stronger tools only to trusted users, thereby creating a new hierarchical (...)
  • Trusting the (ro)botic other.Paul B. de Laat - 2015 - ACM SIGCAS Computers and Society 45 (3):255-260.
    How may human agents come to trust artificial agents? At present, since the trust involved is non-normative, this would seem to be a slow process, depending on the outcomes of the transactions. Some more options may soon become available though. As debated in the literature, humans may meet bots as they are embedded in an institution. If they happen to trust the institution, they will also trust them to have tried out and tested the machines in their back corridors; as (...)
  • From open-source software to Wikipedia: ‘Backgrounding’ trust by collective monitoring and reputation tracking.Paul B. de Laat - 2014 - Ethics and Information Technology 16 (2):157-169.
    Open-content communities that focus on co-creation without requirements for entry have to face the issue of institutional trust in contributors. This research investigates the various ways in which these communities manage this issue. It is shown that communities of open-source software (continue to) rely mainly on hierarchy (reserving write-access for higher echelons), which substitutes (the need for) trust. Encyclopedic communities, though, largely avoid this solution. In the particular case of Wikipedia, which is confronted with persistent vandalism, another arrangement has been pioneered instead. (...)
  • Evaluating Google as an Epistemic Tool.Thomas W. Simpson - 2012 - Metaphilosophy 43 (4):426-445.
    This article develops a social epistemological analysis of Web-based search engines, addressing the following questions. First, what epistemic functions do search engines perform? Second, what dimensions of assessment are appropriate for the epistemic evaluation of search engines? Third, how well do current search engines perform on these? The article explains why they fulfil the role of a surrogate expert, and proposes three ways of assessing their utility as an epistemic tool—timeliness, authority prioritisation, and objectivity. “Personalisation” is a current trend in (...)
  • Coercion or empowerment? Moderation of content in Wikipedia as 'essentially contested' bureaucratic rules.Paul B. de Laat - 2012 - Ethics and Information Technology 14 (2):123-135.
    In communities of user-generated content, systems for the management of content and/or their contributors are usually accepted without much protest. Not so, however, in the case of Wikipedia, in which the proposal to introduce a system of review for new edits (in order to counter vandalism) led to heated discussions. This debate is analysed, and arguments of both supporters and opponents (English-, German- and French-speaking) are extracted from Wikipedian archives. In order to better understand this division of the (...)
  • Navigating between chaos and bureaucracy: Backgrounding trust in open-content communities.Paul B. de Laat - 2012 - In Karl Aberer, Andreas Flache, Wander Jager, Ling Liu, Jie Tang & Christophe Guéret (eds.), 4th International Conference, SocInfo 2012, Lausanne, Switzerland, December 5-7, 2012. Proceedings. Springer.
    Many virtual communities that rely on user-generated content (such as social news sites, citizen journals, and, in particular, encyclopedias) offer unrestricted and immediate ‘write access’ to every contributor. It is argued that these communities do not just assume that the trust granted by that policy is well-placed; they have developed extensive mechanisms that underpin the trust involved (‘backgrounding’). These target contributors (stipulating legal terms of use and developing etiquette, both underscored by sanctions) as well as the contents contributed by them (...)