References
  • Algorithmic Decision-Making Based on Machine Learning from Big Data: Can Transparency Restore Accountability? Paul B. de Laat - 2018 - Philosophy and Technology 31 (4):525-541.
    Decision-making assisted by algorithms developed by machine learning is increasingly determining our lives. Unfortunately, full opacity about the process is the norm. Would transparency contribute to restoring accountability for such systems as is often maintained? Several objections to full transparency are examined: the loss of privacy when datasets become public, the perverse effects of disclosure of the very algorithms themselves, the potential loss of companies’ competitive edge, and the limited gains in answerability to be expected since sophisticated algorithms usually are (...)
  • Data Derivatives. Louise Amoore - 2011 - Theory, Culture and Society 28 (6):24-43.
    In a quiet London office, a software designer muses on the algorithms that will make possible the risk flags to be visualized on the screens of border guards from Heathrow to St Pancras International. There is, he says, ‘real time decision making’ – to detain, to deport, to secondarily question or search – but there is also the ‘offline team who run the analytics and work out the best set of rules’. Writing the code that will decide the association rules (...)
  • Information Privacy and Social Self-Authorship. Daniel Susser - 2016 - Techné: Research in Philosophy and Technology 20 (3):216-239.
    The dominant approach in privacy theory defines information privacy as some form of control over personal information. In this essay, I argue that the control approach is mistaken, but for different reasons than those offered by its other critics. I claim that information privacy involves the drawing of epistemic boundaries—boundaries between what others should and shouldn’t know about us. While controlling what information others have about us is one strategy we use to draw such boundaries, it is not the only (...)
  • Opening Black Boxes Is Not Enough - Data-based Surveillance in Discipline and Punish and Today. Tobias Matzner - 2017 - Foucault Studies 23:27-45.
    Discipline and Punish analyzes the role of collecting, managing, and operationalizing data in disciplinary institutions. Foucault’s discussion is compared to contemporary forms of surveillance and security practices using algorithmic data processing. The article highlights important similarities and differences regarding the way data processing plays a part in subjectivation. This is also compared to Deleuzian accounts and Foucault’s later discussion in Security, Territory, Population. Using these results, the article argues that the prevailing focus on transparency and accountability in the discussion of (...)
  • Living by Algorithm: Smart Surveillance and the Society of Control. Sean Erwin - 2015 - Humanities and Technology Review 34:28-69.
    Foucault’s disciplinary society and his notion of panopticism are often invoked in discussions regarding electronic surveillance. Against this use of Foucault, I argue that contemporary trends in surveillance technology abstract human bodies from their territorial settings, separating them into a series of discrete flows through what Deleuze will term the surveillant assemblage. The surveillant assemblage and its product, the socially sorted body, aim less at molding, punishing and controlling the body and more at triggering events of in- and ex-clusion from (...)