==Conceptions of privacy==

===Privacy as contextual integrity===
{{Main|Contextual integrity}}
The theory of [[contextual integrity]],<ref name="ci_book">{{cite book |last1=Nissenbaum |first1=Helen |title=Privacy in Context: Technology, Policy, and the Integrity of Social Life |date=2009 |publisher=Stanford University Press |location=Stanford, CA |isbn=978-0804772891}}</ref> developed by [[Helen Nissenbaum]], defines privacy as appropriate information flow, where appropriateness is in turn defined as conformance with legitimate informational norms specific to social contexts.

===Right to be let alone===
In 1890, the United States [[jurists]] Samuel D. Warren and Louis Brandeis wrote "The Right to Privacy", an article in which they argued for the "right to be let alone", using that phrase as a definition of privacy.{{sfn|Solove|2010|pp=15–17}} This concept relies on the theory of [[natural rights and legal rights|natural rights]] and focuses on protecting individuals. The article was a response to recent technological developments, such as photography, and to sensationalist journalism, also known as [[yellow journalism]].<ref name=Brandeis>Warren and Brandeis, [http://www.law.louisville.edu/library/collections/brandeis/node/225 "The Right To Privacy"] (1890) 4 Harvard Law Review 193</ref> There is extensive commentary over the meaning of being "let alone", and among other ways, it has been interpreted to mean the right of a person to choose [[seclusion]] from the attention of others if they wish to do so, and the right to be immune from scrutiny or being observed in private settings, such as one's own home.{{sfn|Solove|2010|pp=15–17}} Although this early legal concept was too vague to make it easy to design broad legal protections of privacy, it strengthened the notion of privacy rights for individuals and began a legacy of discussion on those rights in the US.{{sfn|Solove|2010|pp=15–17}}

===Limited access===
Limited access refers to a person's ability to participate in society without having other individuals and organizations collect information about them.{{sfn|Solove|2010|p=19}} Various theorists have imagined privacy as a system for limiting access to one's personal information.{{sfn|Solove|2010|p=19}} [[Edwin Lawrence Godkin]] wrote in the late 19th century that "nothing is better worthy of legal protection than private life, or, in other words, the right of every man to keep his affairs to himself, and to decide for himself to what extent they shall be the subject of public observation and discussion."{{sfn|Solove|2010|p=19}}<ref>{{cite journal|last1=Godkin|first1=E.L.|author-link1=Edwin Lawrence Godkin|title=Libel and its Legal Remedy|journal=[[Atlantic Monthly]]|date=December 1880|volume=46|issue=278|pages=729–739|url=http://digital.library.cornell.edu/cgi/t/text/pageviewer-idx?c=atla;cc=atla;rgn=full%20text;idno=atla0046-6;didno=atla0046-6;view=image;seq=0735;node=atla0046-6%3A1}}</ref>
Adopting an approach similar to the one presented by Ruth Gavison<ref name="transparency">{{cite journal|last1=Oulasvirta |first1=Antti |last2=Suomalainen |first2=Tiia |last3=Hamari |first3=Juho |last4=Lampinen |first4=Airi |last5=Karvonen |first5=Kristiina |date=2014 |title=Transparency of Intentions Decreases Privacy Concerns in Ubiquitous Surveillance |url=https://www.researchgate.net/publication/264638054 |journal=Cyberpsychology, Behavior, and Social Networking |volume=17 |issue=10 |pages=633–638 |doi=10.1089/cyber.2013.0585|pmid=25226054 }}</ref> nine years earlier,<ref>{{cite journal|last1=Gavison|first1=Ruth|author-link1=Ruth Gavison|title=Privacy and the Limits of Law|journal=Yale Law Journal|date=1980|volume=89|issue=3|pages=421–471|doi=10.2307/795891|jstor=795891}}</ref> [[Sissela Bok]] said that privacy is "the condition of being protected from unwanted access by others—either physical access, personal information, or attention."{{sfn|Solove|2010|p=19}}<ref>{{cite book|last1=Bok|first1=Sissela|author-link1=Sissela Bok|title=Secrets: On the Ethics of Concealment and Revelation|date=1989|publisher=Vintage Books|location=New York|isbn=978-0-679-72473-5|pages=10–11|edition=Vintage Books}}</ref>

===Control over information===
Control over one's personal information is the concept that "privacy is the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others."<ref>The quotation is from Alan Westin.{{cite book|last1=Westin|first1=Alan F.|last2=Blom-Cooper|first2=Louis|author-link1=Alan Westin|author-link2=Louis Blom-Cooper|title=Privacy and freedom|date=1970|publisher=Bodley Head|location=London|isbn=978-0-370-01325-1|page=7}}</ref> Generally, a person who has [[Implied consent|consensually formed an interpersonal relationship]] with another person is not considered "protected" by privacy rights with respect to the person they are in the relationship with.{{sfn|Solove|2010|p=24}} [[Charles Fried]] said that "Privacy is not simply an absence of information about us in the minds of others; rather it is the control we have over information about ourselves."{{quote without source|reason=This reference does not contain the quote.|date=June 2021}} Nevertheless, in the era of [[big data]], control over information is under pressure.<ref>{{cite web|url=https://openaccess.leidenuniv.nl/handle/1887/46935|title=Predicting Data that People Refuse to Disclose; How Data Mining Predictions Challenge Informational Self-Determination|website=openaccess.leidenuniv.nl|access-date=2017-07-19}}</ref><ref>{{Cite journal|date=2014-12-01|last1=Mantelero|first1=Alessandro|title=The future of consumer data protection in the E.U. Re-thinking the "notice and consent" paradigm in the new era of predictive analytics|url=https://www.sciencedirect.com/science/article/abs/pii/S026736491400154X|journal=Computer Law & Security Review|language=en|volume=30|issue=6|pages=643–660|doi=10.1016/j.clsr.2014.09.004|s2cid=61135032 |issn=0267-3649}}</ref>

===States of privacy===
Alan Westin defined four states—or experiences—of privacy: solitude, intimacy, anonymity, and reserve.
[[Solitude]] is a physical separation from others;<ref name=":0">{{Cite book|title = Privacy and Freedom|last = Westin|first = Alan|publisher = Atheneum|year = 1967|location = New York}}</ref> Intimacy is a "close, relaxed; and frank relationship between two or more individuals" that results from the seclusion of a pair or small group of individuals.<ref name=":0" /> Anonymity is the "desire of individuals for times of 'public privacy.'"<ref name=":0" /> Lastly, reserve is the "creation of a psychological barrier against unwanted intrusion"; this creation of a psychological barrier requires others to respect an individual's need or desire to restrict communication of information concerning themself.<ref name=":0" /> In addition to the psychological barrier of reserve, Kirsty Hughes identified three more kinds of privacy barriers: physical, behavioral, and normative. Physical barriers, such as walls and doors, prevent others from accessing and experiencing the individual.<ref name=":1">{{Cite journal|title = A Behavioural Understanding of Privacy and Its Implications for Privacy Law|last = Hughes|first = Kirsty|date = 2012|journal = The Modern Law Review|volume = 75|issue = 5|pages = 806–836|doi = 10.1111/j.1468-2230.2012.00925.x|s2cid = 142188960}}</ref> (In this sense, "accessing" an individual includes accessing personal information about them.)<ref name=":1" /> Behavioral barriers communicate to others—verbally, through language, or non-verbally, through personal space, body language, or clothing—that an individual does not want the other person to access or experience them.<ref name=":1" /> Lastly, normative barriers, such as laws and social norms, restrain others from attempting to access or experience an individual.<ref name=":1" /> === Privacy as personal control === [[Psychologist]] Carl A. Johnson has identified the psychological concept of “personal control” as closely tied to privacy. His concept was developed as a process containing four stages and two behavioural outcome relationships, with one’s outcomes depending on situational as well as personal factors.<ref name="Johnson">{{Cite journal |last=Johnson |first=Carl A. |date=1974 |title=Privacy as Personal Control |url=https://www.researchgate.net/publication/268370076 |journal=Man-environment Interactions: Evaluations and Applications: Part 2 |volume=6 |pages=83–100}}</ref> Privacy is described as “behaviors falling at specific locations on these two dimensions”.{{sfn|Johnson|1974|p=90}} Johnson examined the following four stages to categorize where people exercise personal control: outcome choice control is the selection between various outcomes. Behaviour selection control is the selection between behavioural strategies to apply to attain selected outcomes. Outcome effectance describes the fulfillment of selected behaviour to achieve chosen outcomes. Outcome realization control is the personal interpretation of one’s achieved outcome. The relationship between two factors– primary and secondary control, is defined as the two-dimensional phenomenon where one reaches personal control: primary control describes behaviour directly causing outcomes, while secondary control is behaviour indirectly causing outcomes.{{sfn|Johnson|1974|pp=85-89}} Johnson explores the concept that privacy is a behaviour that has secondary control over outcomes. 
[[Lorenzo Magnani]] expands on this concept by highlighting how privacy is essential in maintaining personal control over one's identity and consciousness.<ref name=":Magnani">{{Cite book|title = Morality in a Technological World: Knowledge as Duty|last = Magnani|first = Lorenzo|publisher = Cambridge University Press|year = 2007|location = Cambridge|isbn = 9780511498657|doi = 10.1017/CBO9780511498657|pages=110–118|chapter=4, "Knowledge as Duty: Cyberprivacy"}}</ref> He argues that consciousness is partly formed by external representations of ourselves, such as narratives and data, which are stored outside the body. However, much of our consciousness consists of internal representations that remain private and are rarely externalized. This internal privacy, which Magnani refers to as a form of "information property" or "moral capital," is crucial for preserving free choice and personal agency. According to Magnani,{{sfnp|Magnani|2007|p=116|loc=ch. 4, "Knowledge as Duty: Cyberprivacy"}} when too much of our identity and data is externalized and subjected to scrutiny, it can lead to a loss of personal control, dignity, and responsibility. The protection of privacy, therefore, safeguards our ability to develop and pursue personal projects in our own way, free from intrusive external forces. Acknowledging other conceptions of privacy while arguing that the fundamental concern of privacy is behavior selection control, Johnson converses with other interpretations including those of Maxine Wolfe and Robert S. Laufer, and Irwin Altman. He clarifies the continuous relationship between privacy and personal control, where outlined behaviours not only depend on privacy, but the conception of one’s privacy also depends on his defined behavioural outcome relationships.{{sfn|Johnson|1974|pp=90-92}} ===Secrecy=== Privacy is sometimes defined as an option to have secrecy. Richard Posner said that privacy is the right of people to "conceal information about themselves that others might use to their disadvantage".{{sfn|Solove|2010|p=21}}<ref>{{cite book|last1=Posner|first1=Richard A.|author-link1=Richard Posner|title=The economics of justice|date=1983|publisher=Harvard University Press|location=Cambridge, MA|isbn=978-0-674-23526-7|page=[https://archive.org/details/economi_pos_1981_00_0099/page/271 271]|edition=5. print|url=https://archive.org/details/economi_pos_1981_00_0099/page/271}}</ref> In various legal contexts, when privacy is described as secrecy, a conclusion is reached: if privacy is secrecy, then rights to privacy do not apply for any information which is already publicly disclosed.{{sfn|Solove|2010|pp=22–23}} When privacy-as-secrecy is discussed, it is usually imagined to be a selective kind of secrecy in which individuals keep some information secret and private while they choose to make other information public and not private.{{sfn|Solove|2010|pp=22–23}} ===Personhood and autonomy=== Privacy may be understood as a necessary precondition for the development and preservation of personhood. 
Jeffrey Reiman defined privacy in terms of a recognition of one's ownership of their physical and mental reality and a moral right to [[self-determination]].<ref name=":2">{{Cite journal|title = Privacy, Intimacy, and Personhood|last = Reiman|first = Jeffrey|date = 1976|journal = Philosophy & Public Affairs}}</ref> Through the "social ritual" of privacy, or the social practice of respecting an individual's privacy barriers, the social group communicates to developing children that they have exclusive moral rights to their bodies—in other words, moral ownership of their body.<ref name=":2" /> This entails control over both active (physical) and cognitive appropriation, the former being control over one's movements and actions and the latter being control over who can experience one's physical existence and when.<ref name=":2" /> Alternatively, Stanley Benn defined privacy in terms of a recognition of oneself as a subject with agency—as an individual with the capacity to choose.<ref name=":3">{{Cite book|title = Philosophical Dimensions of Privacy: An Anthology|last = Benn|first = Stanley|publisher = Cambridge University Press|location = New York|editor-first = Ferdinand|chapter = Privacy, freedom, and respect for persons|editor-last = Schoeman}}</ref> Privacy is required to exercise choice.<ref name=":3" /> Overt observation makes the individual aware of himself or herself as an object with a "determinate character" and "limited probabilities."<ref name=":3" /> Covert observation, on the other hand, changes the conditions in which the individual is exercising choice without his or her knowledge and consent.<ref name=":3" /> In addition, privacy may be viewed as a state that enables autonomy, a concept closely connected to that of personhood. According to Joseph Kufer, an autonomous self-concept entails a conception of oneself as a "purposeful, self-determining, responsible agent" and an awareness of one's capacity to control the boundary between self and other—that is, to control who can access and experience him or her and to what extent.<ref name=":4">{{Cite journal|title = Privacy, Autonomy, and Self-Concept|last = Kufer|first = Joseph|date = 1987|journal = American Philosophical Quarterly}}</ref> Furthermore, others must acknowledge and respect the self's boundaries—in other words, they must respect the individual's privacy.<ref name=":4" /> The studies of psychologists such as Jean Piaget and Victor Tausk show that, as children learn that they can control who can access and experience them and to what extent, they develop an autonomous self-concept.<ref name=":4" /> In addition, studies of adults in particular institutions, such as Erving Goffman's study of "total institutions" such as prisons and mental institutions,<ref>{{Cite book|title = Asylums: Essays on the Social Situation of Mental Patients and Other Inmates|last = Goffman|first = Erving|publisher = Doubleday|year = 1968|location = New York}}</ref> suggest that systemic and routinized deprivations or violations of privacy deteriorate one's sense of autonomy over time.<ref name=":4" /> === Self-identity and personal growth === Privacy may be understood as a prerequisite for the development of a sense of self-identity. Privacy barriers, in particular, are instrumental in this process. 
According to Irwin Altman, such barriers "define and limit the boundaries of the self" and thus "serve to help define [the self]."<ref name=":5">{{Cite book|title = The Environment and Social Behavior: Privacy, Personal Space, Territory, and Crowding|last = Altman|first = Irwin|publisher = Brooks/Cole Publishing Company|year = 1975|location = Monterey|isbn=}}{{ISBN?}}</ref> This control primarily entails the ability to regulate contact with others.<ref name=":5" /> Control over the "permeability" of the self's boundaries enables one to control what constitutes the self and thus to define what is the self.<ref name=":5" /> In addition, privacy may be seen as a state that fosters personal growth, a process integral to the development of self-identity. Hyman Gross suggested that, without privacy—solitude, anonymity, and temporary releases from social roles—individuals would be unable to freely express themselves and to engage in self-discovery and [[self-criticism]].<ref name=":4" /> Such self-discovery and self-criticism contributes to one's understanding of oneself and shapes one's sense of identity.<ref name=":4" /> ===Intimacy=== In a way analogous to how the personhood theory imagines privacy as some essential part of being an individual, the intimacy theory imagines privacy to be an essential part of the way that humans have strengthened or [[intimate relationships]] with other humans.{{sfn|Solove|2010|p=35}} Because part of [[Interpersonal relationship|human relationships]] includes individuals volunteering to self-disclose most if not all personal information, this is one area in which privacy does not apply.{{sfn|Solove|2010|p=35}} [[James Rachels]] advanced this notion by writing that privacy matters because "there is a close connection between our ability to control who has access to us and to information about us, and our ability to create and maintain different sorts of social relationships with different people."{{sfn|Solove|2010|p=35}}<ref>{{cite journal|last1=Rachels|first1=James|author-link1=James Rachels|title=Why Privacy is Important|journal=[[Philosophy & Public Affairs]]|date=Summer 1975|volume=4|issue=4|pages=323–333|jstor=2265077}}</ref> Protecting intimacy is at the core of the concept of sexual privacy, which law professor [[Danielle Citron]] argues should be protected as a unique form of privacy.<ref>{{Cite journal|last=Citron|first=Danielle |author-link=Danielle Citron|date=2019|title=Sexual Privacy|url=https://scholarship.law.bu.edu/faculty_scholarship/620/|journal=Yale Law Journal|volume=128|pages=1877, 1880}}</ref> ===Physical privacy=== Physical privacy could be defined as preventing "intrusions into one's physical space or solitude."<ref>{{cite book| title=Managing Privacy: Information Technology and Corporate America| author=H. Jeff Smith| year=1994| publisher=UNC Press Books| isbn=978-0807821473| url-access=registration| url=https://archive.org/details/managingprivacyi0000smit}}</ref> An example of the legal basis for the right to physical privacy is the U.S. [[Fourth Amendment to the United States Constitution|Fourth Amendment]], which guarantees "the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures".<ref name="autogenerated1">{{cite news|url=http://findarticles.com/p/articles/mi_qa3805/is_200206/ai_n9109326/pg_1 |work= Georgetown Law Journal |title= Fixing the Fourth Amendment with trade secret law: A response to Kyllo v. 
United States |year=2002 }}</ref> Physical privacy may be a matter of cultural sensitivity, personal dignity, and/or shyness. There may also be concerns about safety, if, for example one is wary of becoming the victim of crime or [[stalking]].<ref>{{cite web |url= http://www.privacyrights.org/fs/fs14a-stalking.htm |title= Security Recommendations For Stalking Victims |publisher= Privacyrights |date= 11 January 2012 |access-date= 2 February 2008 |archive-date= 11 January 2012 |archive-url= https://web.archive.org/web/20120111081006/http://www.privacyrights.org/fs/fs14a-stalking.htm |url-status= dead }}</ref> There are different things that can be prevented to protect one's physical privacy, including people watching (even through recorded images) one's [[Physical intimacy|intimate behaviours]] or [[intimate part]]s and unauthorized access to one's personal possessions or places. Examples of possible efforts used to avoid the former, especially for [[modesty]] reasons, are [[clothes]], [[wall]]s, [[fence]]s, privacy screens, [[cathedral glass]], [[window covering]]s, etc. ===Organizational=== Government agencies, corporations, groups/societies and other organizations may desire to keep their activities or secrets from being revealed to other organizations or individuals, adopting various [[security]] practices and controls in order to keep private information confidential. Organizations may seek legal protection for their secrets. For example, a government administration may be able to invoke [[executive privilege]]<ref>{{cite web|url=http://writ.corporate.findlaw.com/amar/20040416.html |title=FindLaw's Writ – Amar: Executive Privilege |publisher=Writ.corporate.findlaw.com |date=2004-04-16 |access-date=2012-01-01}}</ref> or declare certain information to be [[Classified information|classified]], or a corporation might attempt to protect valuable proprietary information as [[trade secret]]s.<ref name="autogenerated1" /> ====Privacy self-synchronization==== Privacy self-synchronization is a hypothesized mode by which the stakeholders of an enterprise privacy program spontaneously contribute collaboratively to the program's maximum success. The stakeholders may be customers, employees, managers, executives, suppliers, partners or investors. When self-synchronization is reached, the model states that the personal interests of individuals toward their privacy is in balance with the business interests of enterprises who collect and use the personal information of those individuals.<ref>Popa, C., et al., "Managing Personal Information: Insights on Corporate Risk and Opportunity for Privacy-Savvy Leaders", Carswell (2012), Ch. 6</ref> ===An individual right=== [[David Flaherty]] believes networked computer databases pose threats to privacy. He develops 'data protection' as an aspect of privacy, which involves "the collection, use, and dissemination of personal information". This concept forms the foundation for fair information practices used by governments globally. Flaherty forwards an idea of privacy as information control, "[i]ndividuals want to be left alone and to exercise some control over how information about them is used".<ref>Flaherty, D. (1989). Protecting privacy in surveillance societies: The federal republic of Germany, Sweden, France, Canada, and the United States. Chapel Hill, U.S.: The University of North Carolina Press.</ref> [[Richard Posner]] and Lawrence Lessig focus on the economic aspects of personal information control. 
Posner criticizes privacy for concealing information, which reduces market efficiency. For Posner, employment is selling oneself in the labour market, which he believes is like selling a product. Any 'defect' in the 'product' that is not reported is fraud.<ref>{{cite journal | last1 = Posner | first1 = R. A. | year = 1981 | title = The economics of privacy | journal = The American Economic Review | volume = 71 | issue = 2| pages = 405–409 }}</ref> For Lessig, privacy breaches online can be regulated through code and law. Lessig claims "the protection of privacy would be stronger if people conceived of the right as a property right",{{sfnp|Lessig|2006|loc=p. 229: "In my view, the protection of privacy would be stronger if people conceived of the right as a property right."}} and that "individuals should be able to control information about themselves".{{sfnp|Lessig|2006}} ===A collective value and a human right=== There have been attempts to establish privacy as one of the fundamental [[human rights]], whose social value is an essential component in the functioning of democratic societies.<ref>{{cite book |last=Johnson|first=Deborah|title=Ethical theory and business.|year=2009|publisher=Pearson/Prentice Hall|location=Upper Saddle River, NJ|isbn=978-0-13-612602-7|pages=428–442|edition=8th|author-link=Privacy|editor1=Beauchamp |editor2=Bowie |editor3=Arnold}}</ref> Priscilla Regan believes that individual concepts of privacy have failed philosophically and in policy. She supports a social value of privacy with three dimensions: shared perceptions, public values, and [[Collectivism and individualism|collective]] components. Shared ideas about privacy allows freedom of conscience and diversity in thought. Public values guarantee democratic participation, including freedoms of speech and association, and limits government power. Collective elements describe privacy as collective good that cannot be divided. Regan's goal is to strengthen privacy claims in policy making: "if we did recognize the collective or public-good value of privacy, as well as the common and public value of privacy, those advocating privacy protections would have a stronger basis upon which to argue for its protection".<ref>Regan, P. M. (1995). ''Legislating privacy: Technology, social values, and public policy''. Chapel Hill: The University of North Carolina Press.{{ISBN?}}{{page needed|date=April 2022}}</ref> Leslie Regan Shade argues that the human right to privacy is necessary for meaningful democratic participation, and ensures human dignity and autonomy. Privacy depends on norms for how information is distributed, and if this is appropriate. Violations of privacy depend on context. The human right to privacy has precedent in the [[United Nations Declaration of Human Rights]]: "Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers."<ref>{{cite web |url=https://www.un.org/Overview/rights.html | title=United Nations Universal Declaration of Human Rights | year=1948 | archive-url=https://web.archive.org/web/20141208080853/http://www.un.org/Overview/rights.html | archive-date=2014-12-08}}</ref> Shade believes that privacy must be approached from a people-centered perspective, and not through the marketplace.<ref>Shade, L.R. (2008). "Reconsidering the right to privacy in Canada". ''Bulletin of Science, Technology & Society'', 28(1), 80–91.</ref> Dr. 
Eliza Watt, Westminster Law School, University of Westminster in London, UK, proposes application of the International Human Rights Law (IHRL) concept of "virtual control" as an approach to deal with extraterritorial mass surveillance by state intelligence agencies. Dr. Watt envisions the "virtual control" test, understood as a remote control over the individual's right to privacy of communications, where privacy is recognized under the ICCPR, Article 17. This, she contends, may help to close the normative gap that is being exploited by nation states.<ref>Watt, Eliza. [http://eprints.bournemouth.ac.uk/30324/1/THE%20ROLE%20OF%20INTERNATIONAL%20LAW%20AND%20CYBER%20SURVEILLANCE-CYCON%20TALLIN%202017.pdf "The role of international human rights law in the protection of online privacy in the age of surveillance."] In 2017 9th International Conference on Cyber Conflict (CyCon), pp. 1–14. IEEE, 2017.</ref>

===Privacy paradox and economic valuation===
The ''privacy paradox'' is a phenomenon in which online users state that they are concerned about their privacy but behave as if they were not.<ref name="Swartz, J. 2000">Swartz, J., "'Opting In': A Privacy Paradox", The Washington Post, 03 Sep 2000, H.1.</ref> While this term was coined as early as 1998,<ref>Bedrick, B., Lerner, B., Whitehead, B. "The privacy paradox: Introduction", ''News Media and the Law'', Washington, DC, Volume 22, Issue 2, Spring 1998, pp. P1–P3.</ref> it was not used in its current popular sense until the year 2000.<ref>J. Sweat "Privacy paradox: Customers want control – and coupons", ''Information Week'', Manhasset, Iss. 781, April 10, 2000, p. 52.</ref><ref name="Swartz, J. 2000"/> Susan B. Barnes similarly used the term ''privacy paradox'' to refer to the ambiguous boundary between private and public space on social media.<ref>{{Cite web|url=https://firstmonday.org/ojs/index.php/fm/issue/view/203|title=Volume 11, Number 9 |date= 4 September 2006|website=firstmonday.org|access-date=2019-11-25}}</ref> When compared to adults, young people tend to disclose more information on [[#Social Media|social media]]. However, this does not mean that they are not concerned about their privacy. Susan B. Barnes gave a case in her article: in a television interview about Facebook, a student addressed her concerns about disclosing personal information online. However, when the reporter asked to see her Facebook page, she had posted her home address, phone numbers, and pictures of her young son on the page. The privacy paradox has been studied and documented in different research settings.
Several studies have shown this inconsistency between privacy attitudes and behavior among online users.<ref>{{Cite journal|last=Taddicken|first=Monika|date=January 2014|title=The 'Privacy Paradox' in the Social Web: The Impact of Privacy Concerns, Individual Characteristics, and the Perceived Social Relevance on Different Forms of Self-Disclosure|journal=Journal of Computer-Mediated Communication|language=en|volume=19|issue=2|pages=248–273|doi=10.1111/jcc4.12052|doi-access=free}}</ref> However, a growing number of studies have also shown that there are significant and at times large correlations between privacy concerns and information-sharing behavior,<ref>{{Cite journal|last1=Nemec Zlatolas|first1=Lili|last2=Welzer|first2=Tatjana|last3=Heričko|first3=Marjan|last4=Hölbl|first4=Marko|date=April 2015|title=Privacy antecedents for SNS self-disclosure: The case of Facebook|url=https://linkinghub.elsevier.com/retrieve/pii/S0747563214007274|journal=Computers in Human Behavior|language=en|volume=45|pages=158–167|doi=10.1016/j.chb.2014.12.012}}</ref> which speaks against the privacy paradox. A meta-analysis of 166 studies published on the topic reported an overall small but significant relation between privacy concerns and information sharing or the use of privacy protection measures.<ref>{{Cite journal|last1=Baruh|first1=Lemi|last2=Secinti|first2=Ekin|last3=Cemalcilar|first3=Zeynep|date=February 2017|title=Online Privacy Concerns and Privacy Management: A Meta-Analytical Review: Privacy Concerns Meta-Analysis|url=https://academic.oup.com/joc/article/67/1/26-53/4082433|journal=Journal of Communication|language=en|volume=67|issue=1|pages=26–53|doi=10.1111/jcom.12276}}</ref> So although there are several individual instances or anecdotes where behavior appears paradoxical, on average privacy concerns and privacy behaviors seem to be related, and several findings question the general existence of the privacy paradox.<ref>{{Cite journal|last1=Gerber|first1=Nina|last2=Gerber|first2=Paul|last3=Volkamer|first3=Melanie|date=August 2018|title=Explaining the privacy paradox: A systematic review of literature investigating privacy attitude and behavior|url=https://linkinghub.elsevier.com/retrieve/pii/S0167404818303031|journal=Computers & Security|language=en|volume=77|pages=226–261|doi=10.1016/j.cose.2018.04.002|s2cid=52884338}}</ref> However, the relationship between concerns and behavior is likely only small, and there are several arguments that can explain why that is the case. According to the [[Value-action gap|attitude-behavior gap]], attitudes and behaviors are ''in general'' and in most cases not closely related.<ref>{{Cite journal|last1=Kaiser|first1=Florian G.|last2=Byrka|first2=Katarzyna|last3=Hartig|first3=Terry|date=November 2010|title=Reviving Campbell's Paradigm for Attitude Research|url=http://journals.sagepub.com/doi/10.1177/1088868310366452|journal=Personality and Social Psychology Review|language=en|volume=14|issue=4|pages=351–367|doi=10.1177/1088868310366452|pmid=20435803|s2cid=5394359|issn=1088-8683}}</ref> A main explanation for the partial mismatch in the context of privacy specifically is that users lack awareness of the risks and the degree of protection.<ref>Acquisti, A., & Gross, R. (2006, June). Imagined communities: Awareness, information sharing, and privacy on the Facebook. In ''Privacy enhancing technologies'' (pp. 36–58).
Springer Berlin Heidelberg.</ref> Users may underestimate the harm of disclosing information online.<ref name="Cambridge University Press"/> On the other hand, some researchers argue that the mismatch comes from lack of technology literacy and from the design of sites.<ref>{{cite journal | author = S. Livingstone | year = 2008 | title = Taking risky opportunities in youthful content creation: teenagers' use of social networking sites for intimacy, privacy and self-expression | url = http://eprints.lse.ac.uk/27072/1/Taking_risky_opportunities_in_youthful_content_creation_%28LSERO%29.pdf| journal = New Media & Society | volume = 10 | issue = 3| pages = 393–411 | doi = 10.1177/1461444808089415 | s2cid = 31076785 }}</ref> For example, users may not know how to change their [[Privacy settings|default settings]] even though they care about their privacy. Psychologists Sonja Utz and Nicole C. Krämer particularly pointed out that the privacy paradox can occur when users must trade off their privacy concerns against impression management.<ref>Utz, S., & Kramer, N. (2009). The privacy paradox on social network sites revisited: The role of individual characteristics and group norms. ''Cyberpsychology: Journal of Psychosocial Research on Cyberspace,'' article 1. [http://www.cyberpsychology.eu/view.php?cisloclanku=2009111001&article=1] {{Webarchive|url=https://web.archive.org/web/20160413214515/http://cyberpsychology.eu/view.php?cisloclanku=2009111001&article=1|date=2016-04-13}}</ref>

====Research on irrational decision making====
{{further|#Social networking|#Advertising on Mobile Devices}}
A study conducted by Susanne Barth and Menno D.T. de Jong demonstrates that decision making takes place on an irrational level, especially when it comes to mobile computing. Mobile applications in particular are often designed in a way that spurs fast, automatic decision making without assessment of risk factors. Protection measures against these unconscious mechanisms are often difficult to access while downloading and installing apps. Even with mechanisms in place to protect user privacy, users may not have the knowledge or experience to enable these mechanisms.<ref name="Barth 1038–1058">{{Cite journal|last1=Barth|first1=Susanne|last2=de Jong|first2=Menno D. T.|date=2017-11-01|title=The privacy paradox – Investigating discrepancies between expressed privacy concerns and actual online behavior – A systematic literature review|journal=Telematics and Informatics|volume=34|issue=7|pages=1038–1058|doi=10.1016/j.tele.2017.04.013|issn=0736-5853|doi-access=free}}</ref> Users of mobile applications generally have very little knowledge of how their personal data are used. When they decide which application to download, they typically are not able to effectively interpret the information provided by application vendors regarding the collection and use of personal data.<ref name="Kokolakis 122–134">{{Cite journal|last=Kokolakis|first=Spyros|date=January 2017|title=Privacy attitudes and privacy behaviour: A review of current research on the privacy paradox phenomenon|journal=Computers & Security|language=en|volume=64|pages=122–134|doi=10.1016/j.cose.2015.07.002|s2cid=422308 }}</ref> Other research finds that this lack of interpretability means users are much more likely to be swayed by cost, functionality, design, ratings, reviews and number of downloads than by requested permissions for usage of their personal data.<ref>{{Cite journal|last1=Barth|first1=Susanne|last2=de Jong|first2=Menno D.
T.|last3=Junger|first3=Marianne|last4=Hartel|first4=Pieter H.|last5=Roppelt|first5=Janina C.|date=2019-08-01|title=Putting the privacy paradox to the test: Online privacy and security behaviors among users with technical knowledge, privacy awareness, and financial resources|journal=Telematics and Informatics|volume=41|pages=55–69|doi=10.1016/j.tele.2019.03.003|issn=0736-5853|doi-access=free}}</ref> ====The economic valuation of privacy==== {{see also|Surveillance capitalism|Mass surveillance industry}} The willingness to incur a privacy risk is suspected to be driven by a complex array of factors including risk attitudes, personal value for private information, and general attitudes to privacy (which are typically measured using surveys).<ref name="Frik">{{Cite journal|last1=Frik|first1=Alisa|last2=Gaudeul|first2=Alexia|date=2020-03-27|title=A measure of the implicit value of privacy under risk|journal=Journal of Consumer Marketing|volume=37 |issue=4 |language=en|pages=457–472|doi=10.1108/JCM-06-2019-3286|s2cid=216265480|issn=0736-3761}}</ref> One experiment aiming to determine the monetary value of several types of personal information indicated relatively low evaluations of personal information.<ref name="Kokolakis 122–134" /> Despite claims that ascertaining the value of data requires a "stock-market for personal information",<ref>{{Cite web|url=https://blog.mozilla.org/internetcitizen/2018/08/24/the-privacy-paradox-is-a-privacy-dilemma|title=The privacy paradox is a privacy dilemma|last=Burkhardt|first=Kai|website=Internet Citizen|language=en-US|access-date=2020-01-10}}</ref> [[surveillance capitalism]] and the [[mass surveillance industry]] regularly place price tags on this form of data as it is shared between corporations and governments. =====Information asymmetry===== {{see also|#User empowerment}} Users are not always given the tools to live up to their professed privacy concerns, and they are sometimes willing to trade private information for convenience, functionality, or financial gain, even when the gains are very small.<ref>{{Citation|last1=Egelman|first1=Serge|title=Choice Architecture and Smartphone Privacy: There's a Price for That|date=2013|work=The Economics of Information Security and Privacy|pages=211–236|publisher=Springer Berlin Heidelberg|isbn=978-3-642-39497-3|last2=Felt|first2=Adrienne Porter|author2-link=Adrienne Porter Felt|last3=Wagner|first3=David|doi=10.1007/978-3-642-39498-0_10|s2cid=11701552 }}</ref> One study suggests that people think their browser history is worth the equivalent of a cheap meal.<ref name="2. The Privacy Paradox">{{Citation|chapter=2. The Privacy Paradox|year=2018|pages=45–76|publisher=transcript Verlag|isbn=978-3-8394-4213-5|doi=10.14361/9783839442135-003|title=Network Publicy Governance|series=Digitale Gesellschaft|s2cid=239333913|last1=Belliger|first1=Andréa|last2=Krieger|first2=David J.|volume=20}}</ref> Another finds that attitudes to privacy risk do not appear to depend on whether it is already under threat or not.<ref name="Frik"/> The methodology of [[#User empowerment|user empowerment]] describes how to provide users with sufficient context to make privacy-informed decisions. ======Inherent necessity for privacy violation====== {{further|Privacy concerns with social networking services}} It is suggested by [[Andréa Belliger]] and David J. Krieger that the privacy paradox should not be considered a paradox, but more of a ''privacy dilemma'', for services that cannot exist without the user sharing private data.<ref name="2. 
The Privacy Paradox"/> However, the general public is typically not given the choice whether to share private data or not,<ref name=bloomberg-siri-alexa-listen/><ref name=state-arizona-redacted-complaint/> making it difficult to verify any claim that a service truly cannot exist without sharing private data. ====== Privacy calculus model====== {{Expand section|small=no|with=more description of the mechanics of the privacy calculus model and how it relates to the privacy paradox|date=June 2023}} The privacy calculus model posits that two factors determine privacy behavior, namely privacy concerns (or perceived risks) and expected benefits.<ref>{{Cite journal|last1=Laufer|first1=Robert S.|last2=Wolfe|first2=Maxine|date=July 1977|title=Privacy as a Concept and a Social Issue: A Multidimensional Developmental Theory|url=https://onlinelibrary.wiley.com/doi/10.1111/j.1540-4560.1977.tb01880.x|journal=Journal of Social Issues|language=en|volume=33|issue=3|pages=22–42|doi=10.1111/j.1540-4560.1977.tb01880.x}}</ref><ref>{{Cite journal|last1=Culnan|first1=Mary J.|last2=Armstrong|first2=Pamela K.|date=February 1999|title=Information Privacy Concerns, Procedural Fairness, and Impersonal Trust: An Empirical Investigation|url=http://pubsonline.informs.org/doi/abs/10.1287/orsc.10.1.104|journal=Organization Science|language=en|volume=10|issue=1|pages=104–115|doi=10.1287/orsc.10.1.104|s2cid=54041604 |issn=1047-7039}}</ref> By now, the privacy calculus has been supported by several studies.<ref>{{Cite journal|last1=Trepte|first1=Sabine|last2=Reinecke|first2=Leonard|last3=Ellison|first3=Nicole B.|last4=Quiring|first4=Oliver|last5=Yao|first5=Mike Z.|last6=Ziegele|first6=Marc|date=January 2017|title=A Cross-Cultural Perspective on the Privacy Calculus|journal=Social Media + Society|language=en|volume=3|issue=1|pages=205630511668803|doi=10.1177/2056305116688035|issn=2056-3051|doi-access=free}}</ref><ref>{{Cite journal|last1=Krasnova|first1=Hanna|last2=Spiekermann|first2=Sarah|last3=Koroleva|first3=Ksenia|last4=Hildebrand|first4=Thomas|date=June 2010|title=Online Social Networks: Why We Disclose|url=http://journals.sagepub.com/doi/10.1057/jit.2010.6|journal=Journal of Information Technology|language=en|volume=25|issue=2|pages=109–125|doi=10.1057/jit.2010.6|s2cid=33649999|issn=0268-3962}}</ref>
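A minimal sketch of the trade-off the model describes is shown below. The class name, attribute names, 0–1 scale, and simple comparison rule are illustrative assumptions for this sketch, not part of the cited formulations of the privacy calculus.

<syntaxhighlight lang="python">
from dataclasses import dataclass


@dataclass
class DisclosureSituation:
    """Illustrative-only model of a single disclosure decision.

    The two inputs mirror the two factors named by the privacy calculus:
    perceived privacy risk and expected benefit. The 0-1 scale and the
    comparison rule are assumptions made for this sketch.
    """
    expected_benefit: float  # e.g. convenience or social reward, scaled 0-1
    perceived_risk: float    # e.g. privacy concern about this recipient, scaled 0-1

    def predicts_disclosure(self) -> bool:
        # The calculus predicts disclosure when anticipated benefits
        # outweigh anticipated privacy risks.
        return self.expected_benefit > self.perceived_risk


# Hypothetical example: convenience is valued above the perceived risk,
# so the model predicts that the information is shared.
print(DisclosureSituation(expected_benefit=0.7, perceived_risk=0.4).predicts_disclosure())  # True
</syntaxhighlight>

Under such a reading, behavior that looks paradoxical can still be consistent with the calculus if, for a given disclosure, the expected benefit is judged to outweigh the perceived risk.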