The Particularized Judgment Account of Privacy

Abstract

Questions of privacy have become particularly salient in recent years due, in part, to information-gathering initiatives precipitated by the 2001 World Trade Center attacks, the increasing power of surveillance and computing technologies, and massive data collection about individuals for commercial purposes. While privacy is not new to the philosophical and legal literature, there is much to say about its nature and value. My focus here is on the nature of informational privacy. I argue that the predominant accounts of privacy are unsatisfactory and offer an alternative: for a person to have informational privacy is for there to be limits on the particularized judgments that others are able to reasonably make about that person.

Notes

  1. See DeCew (1997, pp. 46–60); Allen (1988, pp. 82–122); Solove (2002, pp. 1116–1118).

  2. Similarly, Elizabeth Beardsley understands the ‘conceptual core’ of privacy as the right to selective disclosure, such that having privacy is having the ability to selectively disclose information (Beardsley 1971). See also Moore (2010); Gross (1971); Van Den Haag (1971); Westin (1967, p. 7).

  3. Note that any such belief will require inference, if only about the reliability of our belief-forming mechanisms; the difference I refer to here is one of degree. Nonetheless, some beliefs require more inference than others, and here I wish to pick out those beliefs that require a substantial degree of inference. The point I wish to make is just that privacy may continue to decrease as judgments accrue; the whole picture of P’s privacy is not complete once observations are made.

  4. One could argue that falsehoods cannot decrease privacy because only true propositions constitute information (see Floridi 2004, p. 197): Q’s hearing a falsehood about P imparts no information, and hence P’s informational privacy cannot decrease. But information need not be true. James Fetzer maintains that for a proposition to constitute information, it need only be well-formed and meaningful (Fetzer 2004, pp. 224–225). Even if it were correct that information must be true, we would still need a concept to apply to well-formed and meaningful propositions that are either untrue or whose truth is unknown (Fetzer suggests ‘information’), and we would have to investigate whether such propositions are relevant to privacy. Concluding that privacy depends only upon truths on the grounds that information is by definition true would assume the answer to the question at hand—namely, whether falsehoods are relevant to personal privacy.

  5. See Christensen (1997) for a discussion of the conceptual difficulties regarding what it means for evidence to confirm a hypothesis relative to a set of background propositions.

  6. See Pick (1989, pp. 109–135); Young (1970, pp. 11–14). Pick and Young describe phrenologists’ attempts to establish connections between features of persons’ skulls and their mental faculties or traits.

  7. Richard Posner argues that no privacy violation occurs in data mining until a person has actually viewed the collected information (Posner 2006, pp. 96–97). More recently, Matthew Tokson has argued that no privacy loss occurs if information is merely sorted electronically (Tokson 2011).

  8. Computer scientist Latanya Sweeney was able to use summaries of hospital visits, with explicit identifying information redacted, in combination with voting records to identify patients, including the governor of Massachusetts. Analyzing 1990 census data, Sweeney determined that three pieces of information (postal code, birth date, and sex) uniquely identify 87 percent of people in the United States; city, birth date, and sex uniquely identify 53 percent; and county, birth date, and sex uniquely identify 18 percent (Sweeney 2000). In a widely publicized case, America Online released data regarding user searches, with ‘personal’ information redacted and a unique number assigned to each user. Because the queries often related to users’ lives, they were far from anonymous, and New York Times reporters tracked one such user down for an interview (Barbaro and Zeller 2006). See also Narayanan and Shmatikov (2008). A toy sketch of this sort of linkage attack follows these notes.

  9. Put another way, ‘making a particularized judgment about’ is referentially transparent. We can interchange ‘P’ and ‘the author of the journal’ without altering the truth value of ‘Q made a particularized judgment about P, viz., that P has ennui.’ To use a familiar example, Q might make a particularized judgment about Cicero, viz., that he is a gifted writer. Q has likewise made such a judgment about Tully, even though Q does not know that Tully and Cicero are the same person. Of course, Q does not believe that Tully is a gifted writer.
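To make the results in note 8 concrete, the following is a minimal Python sketch of the kind of linkage attack Sweeney describes: joining a ‘de-identified’ hospital table to a public voter roll on the quasi-identifier triple of postal code, birth date, and sex. The records, names, and field names are fabricated for illustration; nothing here reproduces Sweeney’s actual data or methods.

```python
# Toy linkage attack: re-identify "anonymized" records by joining on
# quasi-identifiers (postal code, birth date, sex). All data is synthetic.

# Hospital discharge summaries with names redacted.
hospital_records = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "M", "diagnosis": "hypertension"},
    {"zip": "53703", "dob": "1980-02-14", "sex": "F", "diagnosis": "asthma"},
    {"zip": "53703", "dob": "1991-11-02", "sex": "M", "diagnosis": "fracture"},
]

# Public voter roll listing names alongside the same three attributes.
voter_roll = [
    {"name": "A. Smith", "zip": "02138", "dob": "1945-07-31", "sex": "M"},
    {"name": "B. Jones", "zip": "53703", "dob": "1980-02-14", "sex": "F"},
    {"name": "C. Lee",   "zip": "53703", "dob": "1991-11-02", "sex": "M"},
    {"name": "D. Lee",   "zip": "53703", "dob": "1991-11-02", "sex": "M"},
]

def reidentify(records, roll):
    """Return (name, diagnosis) pairs for redacted records whose
    quasi-identifiers match exactly one person on the public roll."""
    index = {}
    for person in roll:
        key = (person["zip"], person["dob"], person["sex"])
        index.setdefault(key, []).append(person["name"])
    hits = []
    for rec in records:
        matches = index.get((rec["zip"], rec["dob"], rec["sex"]), [])
        if len(matches) == 1:  # a unique match defeats the redaction
            hits.append((matches[0], rec["diagnosis"]))
    return hits

print(reidentify(hospital_records, voter_roll))
# [('A. Smith', 'hypertension'), ('B. Jones', 'asthma')]
```

The third record stays ambiguous because two voters share its attributes. Sweeney’s 87 percent figure is the claim that, for most Americans, the triple picks out exactly one person, so the first two cases are the norm.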

References

  • Allen, Anita. 1988. Uneasy access: Privacy for women in a free society. Totowa, N.J.: Rowman & Littlefield.

  • Barbaro, Michael, and Tom Zeller Jr. 2006. A face is exposed for AOL searcher no. 4417749. New York Times, August 9.

  • Beardsley, Elizabeth. 1971. Privacy, autonomy, and selective disclosure. In NOMOS XIII: Privacy, ed. J. Roland Pennock, and John W. Chapman, 65–70. New York: Atherton Press.

  • Christensen, David. 1997. What is relative confirmation? Noûs 31: 370–384.

  • DeCew, Judith. 1997. In pursuit of privacy: Law, ethics, and the rise of technology. Ithaca, N.Y.: Cornell University Press.

  • Fetzer, James. 2004. Information: Does it have to be true? Minds and Machines 14: 223–229.

  • Floridi, Luciano. 2004. Outline of a theory of strongly semantic information. Minds and Machines 14: 197–221.

  • Fried, Charles. 1984. Privacy [a moral analysis]. In Philosophical dimensions of privacy, ed. Ferdinand Schoeman, 203–222. Cambridge: Cambridge University Press.

  • Gavison, Ruth. 1984. Privacy and the limits of law. In Philosophical dimensions of privacy, ed. Ferdinand Schoeman, 346–402. Cambridge: Cambridge University Press.

  • Gross, Hyman. 1971. Privacy and autonomy. In NOMOS XIII: Privacy, ed. J. Roland Pennock, and John W. Chapman, 169–181. New York: Atherton Press.

  • Inness, Julie. 1992. Privacy, intimacy, and isolation. New York: Oxford University Press.

  • Moore, Adam. 2010. Privacy rights: Moral and legal foundations. University Park, Pa.: Pennsylvania State University Press.

  • Narayanan, Arvind, and Vitaly Shmatikov. 2008. Robust de-anonymization of large sparse datasets. Proceedings of the 2008 IEEE Symposium on Security and Privacy: 111–122.

  • Ohm, Paul. 2009. Broken promises of privacy: Responding to the surprising failure of anonymization. UCLA Law Review 57: 1701–1777.

  • Parent, W.A. 1983. Privacy, morality, and the law. Philosophy and Public Affairs 12: 269–288.

  • Pick, Daniel. 1989. Faces of degeneration: A European disorder, c.1848-c.1918. Cambridge: Cambridge University Press.

  • Posner, Richard. 2006. Not a suicide pact: The constitution in a time of national emergency. Oxford: Oxford University Press.

  • Powers, Madison. 1996. A cognitive access definition of privacy. Law and Philosophy 15: 369–386.

  • Rachels, James. 1975. Why privacy is important. Philosophy and Public Affairs 4: 323–333.

  • Rubel, Alan. 2007. Claims to privacy and the distributed value view. San Diego Law Review 44: 921–956.

  • Solove, Daniel. 2002. Conceptualizing privacy. California Law Review 90: 1087–1155.

  • Sweeney, Latanya. 2000. Uniqueness of simple demographics in the U.S. population. Laboratory for International Data Privacy, Working Paper LIDAP-WP4.

  • Tokson, Matthew. 2011. Automation and the fourth amendment. Iowa Law Review 96: 581–647.

  • Van Den Haag, Ernest. 1971. On privacy. In NOMOS XIII: Privacy, ed. J. Roland Pennock, and John W. Chapman, 149–168. New York: Atherton Press.

  • Westin, Alan. 1967. Privacy and freedom. New York: Atheneum.

  • Young, Robert. 1970. Mind, brain and adaptation in the nineteenth century: Cerebral localization and its biological context from Gall to Ferrier. Oxford: Clarendon Press.

Acknowledgments

I very much appreciate the many helpful comments I’ve received from Claudia Card, Robert Streiffer, Russ Shafer-Landau, Harry Brighouse, Victoria Nourse, Fred Harrington, Madison Powers, and Tom Beauchamp; audiences at the University of Wisconsin, University at Albany, and Georgetown University; and the paper’s anonymous referees.

Author information

Correspondence to Alan Rubel.

Cite this article

Rubel, A. The Particularized Judgment Account of Privacy. Res Publica 17, 275–290 (2011). https://doi.org/10.1007/s11158-011-9160-4
