AEJMC Archives

Subject: AEJ 05 ZwarunL CTM Exploring Peoples Conceptions of Privacy in the Virtual World
From: Elliott Parker <[log in to unmask]>
Reply-To: AEJMC Conference Papers <[log in to unmask]>
Date: Sat, 4 Feb 2006 08:59:06 -0500


This paper was presented at the Association for Education in Journalism and
Mass Communication in San Antonio, Texas August 2005.
         If you have questions about this paper, please contact the author
directly. If you have questions about the archives, email
rakyat [ at ] For an explanation of the subject line, 
send email to
[log in to unmask] with just the four words, "get help info aejmc," in the
body (drop the quotation marks).

(Jan 2006)
Thank you.
Elliott Parker

Exploring People's Conceptions of Privacy in the Virtual World

Lara Zwarun, Ph.D.
Assistant Professor
Department of Communication
University of Texas at Arlington
Box 19107
Arlington, TX 76019
[log in to unmask]

Mike Z. Yao, M.A.
Doctoral Candidate
Department of Communication
University of California at Santa Barbara
Santa Barbara, CA 93106-4020
(805) 893-4479
[log in to unmask]

Please direct all correspondence to the first author.

Submitted to the Communication Theory and Methodology Division of the
Association for Education in Journalism and Mass Communication
for the 2005 conference in San Antonio, TX
April 1, 2005

This study uses Q-methodology to explore whether five dimensions of 
privacy identified from extant literature are meaningful ways of 
organizing people's subjective concerns about online 
privacy.  Results indicate that a majority of people display a 
similar pattern when asked to organize the statements based on 
spatial and psychological views of privacy, but not on the 
informational, rights, and boundary management perspectives, 
suggesting that interpretation of privacy can be subjective and 
requires more examination.
By far, the most commonly studied online privacy issue is consumer 
concern (e.g., Federal Trade Commission, 2000; UCLA Center for 
Communication Policy, 2000). Hoffman, Novak, and Peralta (1999) found 
that more than 90% of Internet users either have declined to provide 
personal information or have fabricated information due to online 
privacy concerns. A recent analysis of over sixteen opinion polls 
taken between 1998 and 2002 reveals that nearly two-thirds of 
respondents were either "very" or "somewhat" concerned about privacy 
when they go on the Internet (Docter & Metzger, 2003).
	Although many recent studies of online privacy provide useful and 
generalizable descriptive findings about the scope and nature of the 
public's online privacy concerns, research on this topic, overall, 
suffers from a number of conceptual, theoretical and methodological 
limitations. As a result many critical research questions about 
online privacy remain unanswered.
Conceptually, several studies have examined the nature of online 
privacy concerns.  For example, Wang, Lee, & Wang (1998) suggest that 
Internet marketing could raise privacy concerns in the areas of 
access, solicitation, collection, monitoring, analysis, transfer, and 
storage of consumer information. The Federal Trade Commission (2000) 
identified five core principles of fair information practices (notice, 
choice, access, security, and enforcement) that Sheehan and Hoy (2000) 
extended as the major dimensions of privacy concerns among Internet 
users (see also Docter & Metzger, 2003). However, research on online 
privacy does not adequately address any of these issues.
	Moreover, because specific issues such as public outcry and policy 
advocacy drive most online privacy studies, there is no overarching 
theoretical framework that guides this body of research (Harper & 
Singleton, 2001).  The multiple perspectives of privacy offered by 
philosophers, legal scholars, and social scientists further 
complicate this matter. For example, one researcher may take a legal 
approach, examining violations of the right to privacy, while another 
may follow a social scientific model, examining users' perceptions, 
attitudes, and behaviors relating to privacy issues in 
the online environment. In this case, privacy is defined as specific 
acts by the first researcher, but as a psychological need that may 
vary from individual to individual by the second researcher. Without 
an overarching perspective to bridge these two intellectual 
traditions, the knowledge generated by these two researchers may not 
be easily synthesized in order to gain a fuller understanding of 
online privacy as a social phenomenon.
Methodologically, most online privacy studies rely on opinion polls 
and survey methods. Harper and Singleton (2001) point out that most 
online privacy surveys include many different concerns under the same 
heading of "privacy" without informing respondents about all the 
facets of the issue. Thus, results from these surveys are difficult 
to interpret. Further, surveys are unlikely to accurately reflect 
consumers' true preferences and concerns of privacy because 
respondents often do not have enough time to consider the complex 
issues and trade-offs involved in this topic (Harper & Singleton, 
2001). Also, while public opinion polls are useful in revealing the 
percentage of users who have privacy concerns, there is very little 
information about who is concerned about what under what 
circumstances and why.
An Apparent Paradox
Although Internet users report high levels of concern for online 
privacy, they have very little knowledge of marketing practices that 
may violate privacy specifically, and little knowledge of online 
privacy policies as a whole (Dommeyer & Gross, 2003). Moreover, while 
consumers claim to be fairly well informed about privacy protection 
strategies, they do not often adopt these strategies.  Consumer use 
of privacy protection software, consumer attention to privacy 
statements, and the incidence rate of taking measures to increase 
privacy such as prohibiting "cookies" is very low (Dommeyer & Gross, 
2003, Tavani, 2000). The apparent discrepancy between high levels of 
online privacy concerns and the lack of actual self-protection 
behaviors among users of the Internet limits the development of more 
useful privacy protection tools, and severely hinders policymakers' 
ability to offer effective legal protections of online privacy.
An insight into why this discrepancy may occur comes from Tavani's 
(2000) examination of how users of the Internet use privacy-enhancing 
technologies (PET) to protect their online privacy. Tavani (2000) 
found that although many PETs are designed to give Internet users 
control over the revelation of personal information, much education 
is needed before these technologies can be truly effective due to the 
lack of user knowledge of how to implement these technologies.
This is not necessarily just confusion about the technologies 
themselves: it is possible that people have difficulty applying 
conceptions of privacy, space and protection developed over years of 
experience with the physical world to the virtual world.  The term 
privacy has been loosely used in many social contexts to describe an 
array of activities (Solove & Rotenberg, 2003), and there is no 
research explicitly examining how the public defines online 
privacy.  For example, researchers rarely ask: How do users of the 
Internet understand spatial metaphors in cyberspace? How do these 
metaphors relate to privacy issues? What kind of legal rights do 
users believe they have with regard to online privacy?  Until it is 
known how people answer questions like, what turf am I defending, 
what information is getting out, do I have a legal right to keep it 
private, and do I care if it becomes public, it will be difficult to 
explain the consistent paradox between people's stated concern over 
online privacy and their failure to use privacy protection 
technologies to maintain their privacy.
Five Components of Privacy
Although the concept of privacy has been defined and studied in many 
different ways, almost all views of privacy, at the broadest level, 
include one of five basic components: the spatial component (privacy 
as separation of private and public physical spaces); the 
informational component (privacy as a matter of protecting identity, 
personal information, and decision making); the rights or personal 
liberty component (people have a right to privacy and should be able 
to protect it); the need component (psychologically, people desire 
privacy); and boundary management (the extent to which individuals are 
able to control the spatial and informational aspects of their 
private lives). Current research on online privacy fails to 
incorporate these five dimensions into an overarching theoretical 
framework.
The first component recognizes that privacy is considered to be a 
"spatial" problem in the physical world.  Here, thinking about 
privacy refers to separating the "private" and "public" space, or 
"spheres" of privacy that may literally surround the 
individual.  Here also is a concern for maintaining and protecting 
private land or property.
The second component reflects a respect for private information, 
thinking, writing and making decisions free of interference from the 
government or other people and the control of intellectual "property" 
or information and ideas. The third component is the rights 
component. Do I have a right to privacy?  Do others think I have a 
right to privacy? The fourth component is the psychological 
component.  How much privacy do I need?  Do I care if this 
information is made public? The fifth component is boundary 
management. What boundary is being broken, what space is being 
invaded, what information is being taken and how can I manage these 
intrusions? Privacy protection software is primarily addressed to the 
last component, although it is not known if this is a common 
conceptualization of privacy for the public.
Very little empirical research has been aimed at investigating the 
five dimensions of privacy in the online environment. However, 
communication technologies, such as the Internet, challenge these 
earlier definitions and make it necessary to ask whether digital 
technologies require a new understanding of privacy.  With respect to 
the Internet, it is not enough for existing views and legal 
protections of privacy to simply be adapted.  The unique 
characteristics of today's communication technologies, particularly 
the Internet, present new challenges to all five components of the 
notion of privacy. These components must be applied to the concept of 
online privacy so that a theoretical perspective can be gained.
The spatial component.   Nearly all influential thinkers within the 
Western philosophical tradition have made some sort of distinction 
between "public" and "private" spaces (Elshtain, 1995). The 
assumption generally has been that there is a boundary between the 
private realm and the public realm.  For example, Aristotle saw 
private homes and households as the private sphere, or "oikos," as 
contrasted with the public sphere defined by political activities 
(DeCew, 1997; Swanson, 1992).
Violations of privacy in the physical world usually involve the 
intrusion of some physical boundary that separates one's private 
space and public space. While the boundaries can be the physical 
walls surrounding one's home, an arbitrary property line drawn by 
land developers, or even an invisible buffer zone around one's body, 
what is considered private can, nevertheless, be measured by spatial 
properties of the physical world (e.g., length, distance, and area).
However, this spatial metaphor may be inadequate when it is extended 
to the digital universe because of a lack of spatial measurement in 
the virtual environment of the Internet, where boundaries are not 
observable and cannot be measured in the usual way. In cyberspace, 
there are no physical walls, there are no landmarks, and there is no 
differentiation between where one person's space begins and another's ends.
Despite this, programmers have devised metaphors based on the 
physical world such as "firewalls" and "cookies" to encourage 
computer users to adopt protection technologies to defend their 
boundaries or turf. But people may not really understand the 
metaphors. For example, computer users could "build" some sort of 
metaphoric "firewall" around an arbitrary space in cyberspace, but 
what this really means is to activate certain features of software to 
enable it to protect information. It is not really a wall, you cannot 
see it, and therefore this spatial metaphor for privacy may not work 
very well because users will not be able to actually define a space 
around which they should build a "wall."
Similarly, while "cookies" are literally strings of computer code 
deposited on a computer's hard drive, they are often explained as 
being similar to a camera in a room watching people's behavior. The 
spatial metaphor here is supposed to invoke a sense of physical 
protection so that people might turn on the protection features in a 
Web browser to reject "cookies". However, if users cannot conceive of 
the existence of a "room" in which "cameras" (cookies) are installed, 
they won't be able to protect themselves.
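The gap between metaphor and mechanism described above is easy to see in code: a cookie really is just a short named string that a server asks the browser to store and send back. A minimal sketch using Python's standard library (the cookie name and value are invented for illustration):

```python
from http.cookies import SimpleCookie

# A cookie is a short piece of text a Web site asks the browser to store;
# "visitor_id" and its value are hypothetical, chosen only for illustration.
cookie = SimpleCookie()
cookie["visitor_id"] = "abc123"

# The header a server would send to deposit this cookie on the user's machine.
print(cookie.output())  # Set-Cookie: visitor_id=abc123
```

Nothing resembling a "camera" or a "room" appears in this string, which is precisely the mismatch between the spatial metaphor and the underlying mechanism that may leave users unable to act on the metaphor.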
In summary, using spatial metaphors to understand online privacy 
issues may not work because people do not know how to think about 
their territory and how to defend it in virtual space.  They may not 
know how to defend their boundaries when the boundaries are 
informational rather than physical.
The informational component.  A second component that can be found in 
most definitions of privacy concerns the degree to which individuals 
have controls over their personal information and their private 
lives. Informational privacy relates to an individual's ability to 
determine when, how, and what private information about the self is 
released to another person (Westin, 1967) or an organization. 
Protections of private information have usually been treated as a 
moral and ethical issue instead of a matter of financial interest 
(Solove & Rotenberg, 2003).
However, the informational component of privacy is challenged by new 
communication technologies, as well as the increasing commercial 
value of personal information (Leino-Kilpi et al., 2001).  The main 
concerns here are with access to personal data stored in centralized 
locations (e.g., computer databases) and the extent to which they 
should be protected.  Today, personal information is a commodity that 
is bought and sold by businesses.  Whole industries and attendant 
bureaucracies have formed solely to collect and distribute sensitive 
information, such as medical records, personal shopping habits and 
credit histories that individuals once viewed as under their 
exclusive control. The commercial gains associated with one's 
identity are no longer limited to the advertising value as 
conceptualized by the appropriation privacy tort. Marketing companies 
and other commercial entities spend millions of dollars to gather 
personal information from consumers.
With the enormous amounts of personal information stored in computer 
databases, the ease, speed, and scale of information exchange allowed 
by the Internet, and the commercial value the data possess (e.g., 
Kornblum, 1997), a traditional sense of control of 
one's private information in this new environment is grossly 
inadequate. As such, it may be that individuals who fail to 
effectively use privacy enhancing technologies are those who do not 
know what information they want to protect, through what means this 
information is collected on the Internet, and to what extent this 
information is valued by others.
The right/liberty component.  The third component of privacy is the 
right or liberty component. The legal protections of privacy in the 
U.S. can at best be described as haphazard, and while there is an 
ongoing effort by legislators to expand the right of privacy to the 
Internet (Rotenberg, 2003; Turkington & Allen, 1999), the four 
privacy torts are geared toward very specific issues in the physical 
world and are not easily translated to the online environment. For 
example, the intrusion privacy tort assumes the existence of an 
observable space, physical or psychological, that is deemed to be 
private. In the digital world, the difficulty in defining such a 
space limits the usefulness of this tort in the online environment.
Federal statutory bodies have enacted a great number of statutes and 
regulations that protect informational privacy, but these statutes 
are confusing and are usually narrowly tailored to guard against 
specific types of businesses collecting or exchanging specific types 
of information (e.g., Right to Financial Privacy Act of 1978, Cable 
Communications Policy Act of 1984, Video Privacy Protection Act of 
1988, Telephone Consumer Protection Act of 1991, etc.). Furthermore, 
there is no consensus among lawmakers and private entities as to what 
personal information is private and what is not. The ease of data 
sharing and matching also allows new information about a person to be 
created by merging data from seemingly non-private sources, adding to 
the problem.  Therefore, an average consumer is unlikely to have 
extensive legal knowledge about his/her privacy rights offered by the 
legal system.
In order to protect one's privacy effectively using privacy enhancing 
technologies, a person must ask: Do I have a right to privacy? Do 
others think I have a right to privacy? What kind of information 
collection is considered legal? To what extent do the privacy laws 
protect me?   Difficulty in answering these questions may be 
reflected in a failure to adopt privacy protection technologies effectively.
The psychological need component.  Extensive research has been 
devoted to individuals' need to preserve their privacy in the 
physical world.  Social scientists have identified physical, 
psychological, informational, and social dimensions of individuals' 
psychological need for privacy, each of which has been examined 
closely through empirical observations (Burgoon, 1982; Parrott, 
Burgoon, Burgoon, & LePoire, 1989).
The psychological dimension of privacy concerns the ability of human 
beings to control cognitive and affective inputs and outputs, to form 
values, and the right to determine when, with whom, and how to share 
one's thoughts or emotions (Burgoon, 1982). For example, Westin 
(1967) posits that people have a need for privacy that, in concert 
with other needs, helps them adjust emotionally to daily life with 
other people. Privacy, according to Westin (1967), fulfills one's 
need for personal autonomy, emotional release, self-evaluation, and 
limited and protected communication.
Not only is very little known about what Internet users mean by 
online privacy, it cannot be assumed that every computer user has the 
same level of psychological need for privacy. Before users can 
effectively adopt privacy protection technologies they need to ask: 
How much privacy do I need?  Do I care if this information is made 
public?  Existing studies of consumers' concerns do not provide 
adequate understanding of the relationship between psychological 
characteristics (e.g., need for privacy, personal space, 
territoriality, etc.) and concerns about online privacy. The 
difference between those who adopt privacy enhancing technologies and 
those who don't may be that adopters have a greater 
psychological need for privacy. An important variable here is motivation.
The boundary management component.  Overall, research in social 
science would suggest that privacy can be generally defined as a 
physical or psychological boundary that separates a person's 
perceived private domain from the public domain (Altman, 1975; Buss, 
2001). Thus, a state of privacy can be achieved through processes of 
boundary management (Buss, 2001). Effectively managing personal 
boundaries requires answering questions such as, what boundary is 
being broken?  What space is being invaded?   What information is 
being taken? How can I manage these intrusions?
One of the most well-established boundary management strategies in 
psychological research is the process of self-disclosure (Buss, 2001; 
Caldwell & Peplau, 1982; Franzoi & Davis, 1985; Mikulincer & Nachson, 
1991). Another well-studied boundary management process concerns how 
individuals manage personal space. In this sense, the privacy 
boundary refers to the psychological border between one's self and 
the outer world (Hall, 1966; Sommer, 1959). In the physical world, 
self-disclosure and personal space management rely on basic social 
activities that we all perform with very little conscious effort.
As stated earlier, the Federal Trade Commission (2000) identified 
five core principles of fair information practice: notice, choice, 
access, security, and enforcement. Notice refers to information about 
what is collected, how it is collected, its purpose and disclosure to 
third parties. Choice refers to opportunity that allows individuals 
to choose not to have information used or disclosed. Access refers to 
the ability to access one's own personal data. Security concerns 
precautions against misuse of data. Enforcement concerns mechanisms 
for assuring compliance. Sheehan and Hoy (2000) extended these five 
principles as the major dimensions of privacy concerns among Internet 
users (see also Docter & Metzger, 2003).
These concerns all relate to the issue of boundary management in the 
online environment. They are similar to boundary management 
strategies in the physical world. Individual users want to maintain 
their ability to choose what information, to whom and when they want 
to disclose, and some sort of guarantee from the target of 
self-disclosure that their personal information would be kept safely. 
In the virtual world, however, protections of individual privacy 
involve conscious effort and the use of technologies. The difference 
between those who adopt privacy enhancing technologies and those who 
don't may be that savvy users of these technologies are more aware of 
the boundary management strategies that exist online.
	The goal of the current study is to determine whether or not the 
five components of privacy extracted from past literature are 
meaningful ways of organizing people's subjective concerns about 
online privacy.  We argue that if these five components of privacy 
are meaningful ways of thinking about individual's online privacy 
concerns, then people should be able to organize various online 
privacy concerns in ways that are consistent with these components. 
Specifically, we predict that participants will evaluate statements 
about online privacy in similar ways if they are required to organize 
them by instructions that are consistent with these components.
Overview of Analysis
	Q-method is a controlled technique for revealing structures of 
subjectivity. However, unlike other qualitative techniques for 
analyzing subjective data, Q-method is deeply rooted in the tradition 
of quantitative analysis due to its involvement with factor analysis 
(Brown, 1996). This technique takes advantage of powerful statistical 
tools typically used by quantitative researchers to examine people's 
subjective understandings of the surrounding world.  Unlike most of 
the common quantitative methods, such as traditional ranking opinion 
surveys, Q-method is less concerned with comparing patterns between 
different groups than with identifying what those patterns are and 
uncovering their underlying structures.
	As Stephenson (1935), the inventor of Q-methodology, notes, 
traditional quantitative methodologies deal with a selected 
population of n individuals, each of whom has been measured in m 
tests, whereas Q-methodology refers to a population of n different 
tests (or essays, pictures, traits or other measurable material), 
each of which is measured or scaled by m individuals. As such, 
statistical analysis of Q-sort data is not performed by variable, 
trait, or statement, but rather by person. People correlate to others 
with similar opinions based on their Q-sorts. Q-methodology results 
in the grouping of expressed opinion profiles based on the 
similarities and differences in which the statements are arranged by 
each participant (Brown, 1993; McKeown & Thomas, 1988).
	The current study is comprised of five separate Q-sort tasks, each 
with a sorting instruction that reflects a component of privacy 
(e.g., spatial, informational, etc.). For each Q-sort task in the 
current study, participants' rank-ordered sorts of statements are 
transformed into an array of numerical data. A by-person factor 
analysis determines the factors that represent clusters of 
participants with similar opinions. In the practice of Q-methodology, 
people who are associated with one factor have something in common 
that differentiates them from those who are associated with the other factors.
	In traditional Q-sort studies, after clusters of participants are 
identified within each Q-sort task, a factor score for each statement 
is calculated. The factor scores reveal the level of relevancy that 
each statement receives within each of the identified participant 
clusters. The interpretation of factors in Q-methodology uses 
statement scores rather than factor loadings. However, in the present 
study, the primary goal is not to interpret and categorize various 
views of online privacy.  Instead, we are mainly concerned with 
whether or not the five components of privacy can be used to organize 
online privacy concerns in meaningful and consistent ways without 
sacrificing people's subjectivity.
This study used an extensive person-sample (McKeown & Thomas, 1988), 
meaning that multiple people sorted the same group of statements (the 
Q-sample) using one of five similar sets of instructions. Eighty-seven 
students were recruited, in exchange for extra credit, from upper- and 
lower-division classes offered in a Communication Department; 80 of 
them completed the sort.
Participants were 40% male and 60% female, and came from different 
majors and class levels.  73% reported being "somewhat" private 
people, with 11% saying they were not very private, and 15% 
describing themselves as very private.  With respect to experience 
using computers, 5% said they were not very experienced, 45% said 
they were somewhat experienced, and 48% said they were very 
experienced (one participant did not answer the gender, privacy, or 
computer experience levels).
In an earlier study, a diverse group of 1189 participants in an 
online experiment were asked, "If you have ever provided false or 
phony information to a commercial Web site, why?" The question was 
open-ended, and generated 535 responses.
These 535 responses were examined and similar answers were 
eliminated. For example, many people answered that they had provided 
phony information because they did not want to receive Spam; another 
popular answer was "to prevent identity theft." Once duplicate 
reasons were eliminated, 91 discrete reasons remained.  These 91 
reasons were used as a quasi-naturalistic Q-sample.
This study is explorative, in that it tries using a method not 
previously applied to the topic of online privacy to shed light on 
how it is conceptualized.  The statements about online privacy were 
entered into an online software program known as "WebQ" that allowed 
participants the opportunity to virtually sort them into 'piles' 
based on how much each represented a particular dimension of 
privacy.  The software scrambled the order of the statements for each 
participant.
Participants came to a computer lab in groups of 15-20, where a 
research assistant demonstrated a short, fictitious Q-sort by sorting 
9 personality characteristics according to how well they described a 
person named Bill. Once participants had a basic understanding of how 
the sorting task was accomplished using the software, they were given 
a piece of paper and asked to read a short passage, representing one 
of the five perspectives on privacy to which they were randomly assigned.
After reading their passages, they were instructed to open the WebQ 
program on their computer and sort the statements found there 
according to how much they did or did not represent the perspective 
on privacy they had just read.
A pilot test had revealed that, including instructions, the task took 
about 40 minutes, so subjects were told to expect to spend about an 
hour on the task, longer if needed.  The WebQ software enables 
subjects to rearrange the statements as many times as they want to, 
and participants were encouraged to use as much time and as many 
iterations as needed until they felt satisfied with their sort.
Each subject was randomly assigned to one of the following five 
dimensions of privacy:
"The spatial component (privacy as separation of private and public 
physical spaces): Many social scientists believe that we divide our 
living environment into two separate domains: a private space and a 
public space. Entering a person's private space without consent is a 
form of intrusion of privacy. For example, we all consider our homes, 
bedrooms, bathrooms, or even our mailboxes to be private spaces where 
we can be free from intrusion. We often feel threatened when these 
private spaces are invaded.
Although the distinction of private and public spaces clearly exists 
in the real world, some researchers believe that it can also be 
extended to the virtual environment of the Internet. For example, in 
the virtual world, a published Web site can be considered a public 
place, but someone's personal computer or personal network would be 
considered a private place. From this point of view, it can be argued 
that providing some personal information to a Web site may increase 
the likelihood of our virtual private space being invaded by 
unauthorized people."

"The informational component (privacy as a matter of protecting 
identity, personal information, and decision making): Researchers 
believe that some information about us is central to the safety of 
our identities and lives, whereas other information about us may not 
be as central. For example, although we wouldn't mind sharing our 
names with someone we just met at a bar, we usually would not feel 
comfortable sharing our address out of fear that this person might 
harm us. We sometimes give false information or simply refuse to give 
certain information because we worry that our safety or identity will 
be threatened."

"The rights or personal liberty component (people have a right to 
privacy and should be able to protect it): Many people believe that we 
have a given right to privacy. This means that even if our personal 
property, safety, or identity is not at risk, we should still be able 
to keep certain things from others. According to this point of view, 
not disclosing certain personal information on the Internet is not 
necessarily a matter of safety; rather, it is a matter of protecting 
someone's right to privacy."

"The need component (psychologically, people desire privacy): 
Scientists believe that the desire to be private is an 
innate psychological tendency. It is a personality trait. Just like 
some of us are extroverts and some of us are introverts, some people 
need to have a lot of privacy and others may not need as much. 
According to this point of view, a person may decide not to disclose 
certain personal information on the Internet because he/she feels 
uncomfortable doing so. They may not have a specific reason why they 
do not want to give this information. They simply feel uneasy about it."

"The boundary management component葉he extent to which individuals 
are able to control the spatial and informational aspects of their 
private lives: Many people believe that disclosing (or not 
disclosing) certain personal information is a way to define and 
control relationships between people. For example, we can choose to 
disclose certain information to someone, but not to other people, 
because we feel closer to this individual. We may also choose not to 
disclose certain information to someone because we feel that this 
person is not important enough. From this perspective, an individual 
may refuse to disclose certain personal information to a Web site 
because they feel that this site may not be "close" enough to be 
given this information. It is important to remember that this is NOT 
a simple concern for safety. Just because we feel that someone is not 
important enough to be given some information does not mean that we 
feel threatened by them."

Participants were asked to sort each statement in the Q-sample 
according to how much it corresponded to the perspective they read on 
the paper. The scale used to sort the items was -5 to +5, with -5 
meaning, "This statement does not correspond to the perspective on 
privacy I read at all" and +5 meaning, "This statement is totally 
relevant to the perspective on privacy I read." As is customary with 
a Q sort, a forced distribution was used, and the computer would not 
let subjects submit their results until an appropriate number of 
statements received each score. Subjects had to put 6 statements in 
the -5 and +5 categories, 7 statements in the -4 and +4 categories, 8 
statements in the -3 and +3 categories, 9 in the -2 and +2 
categories, 10 in the -1 and +1 categories, and 11 statements in the 
0 category. Once this distribution was achieved, the results were 
emailed to one of the study's authors for analysis.
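The forced quasi-normal distribution described above can be sketched as a small validation routine of the kind the sorting software would need. This is an illustrative sketch only; the names (`REQUIRED`, `is_valid_sort`) are hypothetical and not part of the study's actual software.

```python
# Sketch of the forced quasi-normal Q-sort distribution described
# above: 91 statements scored from -5 to +5. Names are illustrative.
from collections import Counter

# Required number of statements at each score (symmetric around 0).
REQUIRED = {0: 11}
for score, n in zip(range(1, 6), (10, 9, 8, 7, 6)):
    REQUIRED[score] = n    # e.g. +1 takes 10 statements, +5 takes 6
    REQUIRED[-score] = n   # mirrored on the negative side

assert sum(REQUIRED.values()) == 91  # every statement is placed

def is_valid_sort(scores):
    """scores: one integer in -5..+5 per statement (91 in total)."""
    return Counter(scores) == Counter(REQUIRED)
```

A sort is accepted only when the tally of scores matches the required distribution exactly, which is why the computer refused incomplete submissions.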
After the sorting task (so that these questions could not influence 
the sort in any way), subjects reported their gender, how private they considered 
themselves to be, and how experienced with computers they considered 
themselves to be.
	This study predicted that if the five components extracted from past 
literature were meaningful ways of organizing online privacy 
concerns, then participants would organize statements about online 
privacy in similar ways according to the privacy components. Thus, we 
expected that under specific sorting instructions generated from the 
five components of privacy, a majority of participants would sort the 
91 online privacy statements in similar ways. Another way of 
expressing this is to say that the hypothesis would be supported if 
most participants loaded onto a single by-person factor.
Spatial Component
	Participants in group 1 (n = 15) received sorting instructions based 
on the spatial component of privacy. These instructions required 
participants to disregard their own opinions about online privacy and 
to sort the 91 statements based on their relevancy to the spatial 
component of privacy. A by-person principal component factor analysis 
with varimax rotation for Q-sorts suggested a three-factor solution 
explaining over 60% of the variance in the data. This result suggests 
that there are three distinctive patterns of sorting these 
statements. However, consistent with our prediction, 11 of the 15 
participants loaded onto factor 1. In other words, 11 of the 15 
participants sorted the 91 statements in the same pattern when asked 
to think about these statements from a spatial perspective.
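The by-person analysis reported in this and the following sections can be illustrated in code. This is a minimal sketch assuming a 91 x n data matrix with one row per statement and one column per participant; the function names and the plain-NumPy varimax implementation are assumptions, since the study does not specify its software.

```python
# Sketch of a by-person ("Q-mode") principal component analysis with
# varimax rotation. People, not statements, are the variables: they
# are correlated with one another, and factors group people who
# sorted the statements in similar patterns.
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Orthogonally rotate a (persons x factors) loadings matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    var_sum = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3 - (gamma / p) * rotated
                          @ np.diag((rotated ** 2).sum(axis=0))))
        rotation = u @ vt
        new_var_sum = s.sum()
        if new_var_sum < var_sum * (1 + tol):
            break  # rotation criterion no longer improving
        var_sum = new_var_sum
    return loadings @ rotation

def q_factor_loadings(data, n_factors):
    """By-person PCA: correlate people (columns), extract, rotate."""
    corr = np.corrcoef(data, rowvar=False)          # person x person
    eigvals, eigvecs = np.linalg.eigh(corr)         # ascending order
    order = np.argsort(eigvals)[::-1][:n_factors]   # largest first
    loadings = eigvecs[:, order] * np.sqrt(np.maximum(eigvals[order], 0))
    return varimax(loadings)
```

Each row of the returned matrix is one participant's loadings; a participant "loads onto" factor 1 when the first column carries most of his or her communality.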
Informational Component
	Participants in group 2 (n = 14) were instructed to sort the 91 
statements according to their relevance to the informational aspect 
of privacy. A by-person principal component factor analysis with 
varimax rotation for Q-sorts suggested a four-factor solution 
explaining over 58% of the variance in the data. This result suggests 
that there are four distinctive patterns of sorting these statements. 
Eight of the 14 participants loaded onto factor 1, and the second, 
third, and fourth factors each had 2 participants. Although a majority 
of participants did sort the statements in the same pattern, almost 
half of the participants in this group sorted the statements in 
distinctive patterns. This suggests that the participants did not 
have a consistent view of the 91 statements when asked to think about 
these statements from an informational perspective.
The Rights or Personal Liberty Component
	A third group of participants (n = 18) was instructed to sort the 91 
statements according to the rights or personal liberty component. A 
by-person principal component factor analysis with varimax rotation 
for Q-sorts returned a 7-factor solution explaining 67% of the 
variance in the data. Further, only 12 of the 18 participants' factor 
loadings exceeded .60 on a single factor with loadings below 
.40 on all other factors. The first and largest factor included only 
6 participants. This result clearly suggests that participants did 
not view the rights or personal liberty component in the same way. 
Sorting of the 91 statements in this group was largely guided by 
individuals' subjective understandings rather than a consistent, 
shared interpretation of this component.
The Psychological Component
	The fourth component is the psychological component. Participants (n 
= 17) in this condition were asked to sort the 91 statements based on 
a view that privacy is a psychological need. A by-person principal 
component factor analysis with varimax rotation for Q-sorts returned 
a 5-factor solution explaining 66% of the variance in the data. 
Interestingly, 11 of the 17 participants were in the first cluster, 
with loadings exceeding .60 on the first factor and below .40 on the 
others. Further, three of the five factors had only 1 participant 
each. This pattern suggests that although a few participants each had 
distinctive patterns of sorting the statements, a majority of 
participants judged these statements in similar ways based on 
the psychological perspective of privacy.
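The loading criterion used in these sections (above .60 on one factor, below .40 on all others) can be sketched as a small helper. The function name and signature are hypothetical; only the two thresholds come from the study.

```python
# Hypothetical helper applying the loading criterion used above: a
# participant is assigned to a factor only when the loading on it
# exceeds .60 and the loadings on all other factors stay below .40.
def assign_factor(loadings, high=0.60, low=0.40):
    """loadings: one participant's loadings, one value per factor.
    Returns the defining factor's index, or None for a mixed case."""
    candidates = [i for i, l in enumerate(loadings) if abs(l) > high]
    if len(candidates) != 1:
        return None          # no factor, or more than one, above .60
    i = candidates[0]
    if any(abs(l) >= low for j, l in enumerate(loadings) if j != i):
        return None          # a secondary loading is too high
    return i
```

Participants for whom this returns None are the "remaining" participants reported below who could not be assigned to any factor.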
The Boundary Management Component
	Finally, the fifth group of participants (n = 16) was instructed to 
sort the 91 statements based on the boundary management component.  A 
by-person principal component factor analysis with varimax rotation 
for Q-sorts in this group returned a 4-factor solution explaining 56% 
of the variance in the data. Seven participants had loadings above 
.60 on factor 1 and below .40 on the other factors. Three 
participants loaded onto the second factor, and the third and fourth 
factors each had 2 participants. The remaining 2 participants did not 
have factor loadings above .60 on any factor. This result suggests that the 
participants did not form a consistent view of the 91 statements when 
asked to think about them from a boundary management perspective.
Discussion
	A majority of people displayed a similar and consistent pattern 
when asked to organize various reasons why people do not wish to give 
personal information to a Web site based on spatial and psychological 
views of privacy. However, our results suggest that people were not 
able to organize these statements consistently based on 
the informational, rights, and boundary management perspectives. 
Especially for the rights component, several valid and distinctive 
patterns of Q-sort were generated, suggesting that interpretation of 
this privacy component was subjective and inconsistent.
	It is interesting to note which dimensions were conceived of 
consistently and which were not.  Although cyberspace is not 
physical, it appears that participants in this study are fairly 
capable of using physical metaphors to think about what is private 
and what is public.  This may be reflective of the common use of 
these metaphors in society; it may also be a function of the 
metaphors being spelled out in the definition provided to the 
participants.  Participants also appear fairly unified in their 
conceptualization of online privacy as a need, consistent with the 
many studies showing high levels of concern over this issue.
	On the other hand, the three dimensions with disparate sorts reflect 
the lack of privacy-protecting behavior exhibited by many people.  It 
does not appear to be widely agreed upon what information is private, 
what right to online privacy people have, or how to maintain the 
distinction between the private and the public.  While these results 
do not explain the discrepancy between people's concern and their 
behavior, their consistency with that discrepancy suggests that this 
method has potential for examining the issue.
	Indeed, this exploratory study employs a unique method to examine 
the five theoretical components of privacy extracted from past 
literature by taking into account both objective and subjective 
understandings of online privacy concerns. The between-group design 
with different instructions objectively tests the degree to which a 
particular privacy component is a meaningful way of organizing these 
online privacy concerns. However, by changing the research focus from 
traits, as in a conventional quantitative research paradigm, to a 
people-oriented view as permitted by the Q-methodology, we allowed 
people's subjectivity to be examined and interpreted in a meaningful way.
	Traditional quantitative methods relying on objective observations 
may overemphasize preconceived notions of privacy and miss the 
subjective nature of this concept. While preconceived theoretical 
components of privacy can be useful to understand individual concerns 
about online privacy, variations in people's subjective views of this 
issue should not be pushed aside under the label of an error term. 
Instead, future research should focus on the underlying structure of 
people's subjective perceptions of various issues related to online privacy.
	Although it was not a goal of the present study, the sorting 
patterns displayed by participants in different factors could be 
analyzed and compared in more detail. However, such an analysis 
requires a relatively larger sample of participants in each condition.
	In addition, sorting 91 statements into a forced quasi-normal 
distribution can be a long and tedious task. Variations in sorting 
patterns may be caused by participants' fatigue and lack of 
concentration. Replications with a reduced number of statements 
should be conducted in the future.  These replications should also 
try asking participants to sort the statements without reading about 
or being exposed to a particular dimension.  Not only would this 
allow more insight into people's existing conceptualizations, it 
would minimize the likelihood that results are affected by how the 
dimension is explained in the instructions, as may have happened with 
the spatial dimension in this study.  While the results here are 
useful in seeing how people conceptualize online privacy, they are at 
least as helpful at providing insight into how these 
conceptualizations might be elucidated through future research.
Altman, I. (1975). The environment and social behavior. Belmont, CA: 
Brown, S. R. (1993). A primer on Q methodology. Operant Subjectivity, 
16, 91-138.
Brown, S.R. (1996). Q methodology and qualitative research. 
Qualitative Health Research, 6, 561-567.
Burgoon, J. K. (1982). Privacy and communication. In M. Burgoon 
(Ed.), Communication Yearbook 6 (pp. 206-249). Beverly Hills, CA: 
Sage Publications.
Buss, A. (2001). Psychological dimensions of the self. Thousand Oaks, 
CA: Sage.
Caldwell, M. A., & Peplau, L. A. (1982). Sex differences in same-sex 
friendship. Sex Roles, 8, 721-732.
DeCew, J. W. (1997). In pursuit of privacy: Law, ethics, and the rise 
of technology. Ithaca, NY: Cornell University Press.
Docter, S., & Metzger, J. M. (2003). Public opinion and policy 
initiatives for online privacy protection. Journal of Broadcasting & 
Electronic Media, 47, 350-374.
Dommeyer, C. J., & Gross, B. L. (2003). What consumers know and what 
they do: An investigation of consumer knowledge, awareness, and use 
of privacy protection strategies. Journal of Interactive Marketing, 17, 34-51.
Elshtain, J. B. (1995). Democracy on trial. New York, NY: Basic Books.
Federal Trade Commission. (2000, May). Privacy online: A report to 
Congress. Retrieved December 20, 2003, from
Franzoi, S. L., & Davis, M. H. (1985). Adolescent self-disclosure and 
loneliness: Private self-consciousness and parental influences. 
Journal of Personality and Social Psychology, 48, 768-780.
Hall, E. T. (1966). The hidden dimension. New York: Doubleday.
Harper, J. R., & Singleton, S. (2001, June). With a grain of salt: 
What consumer privacy surveys don't tell us. Report prepared for The 
Competitive Enterprise Institute. Retrieved January 29, 2001, from, 02061.cfm.
Hoffman, D. L., Novak, T. P., & Peralta, M. (1999). Building consumer 
trust online. Communications of the ACM, 42(4), 80-85.
Kornblum, J. (July 22nd, 1997). AOL to give out phone numbers. CNET Retrieved on January 5, 2004, from
Leino-Kilpi, H., Vaelimaeki, M., Dassen, T., Gasull, M., Lemonidou, 
C., Scott, A., & Arndt, M. (2001). Privacy: A review of the 
literature. International Journal of Nursing Studies, 38(6), 663-671.
McKeown, B.F., & Thomas, B.D. (1988). Q-methodology. Newbury Park, 
CA: Sage Publications.
Mikulincer, M., & Nachson, O. (1991). Attachment styles and patterns 
of self-disclosure. Journal of Personality and Social Psychology, 61, 321-331.
Parrot, R., Burgoon, J. K., Burgoon, M., & LePoire, B. A. (1989). 
Privacy between physicians and patients: More than a matter of 
confidentiality. Social Science & Medicine. 29(12), 1381-1385.
Sheehan, K. B., & Hoy, M. G. (2000). Dimensions of privacy concern 
among online consumers. Journal of Public Policy & Marketing, 19, 62-73.
Solove, D. J., & Rotenberg, M. (2003). Information privacy law. New 
York, NY: Aspen Publishers.
Sommer, R. (1959). Studies in personal space. Sociometry, 22, 247-260.
Stephenson, W. (1935). Technique of factor analysis. Nature, 136, 297.
Stephenson, W. (1953). The study of behavior: Q-technique and its 
methodology. Chicago: University of Chicago Press.
Tavani, H. (2000). Privacy-enhancing technologies as a panacea for 
online privacy concerns. Journal of Information Ethics, Fall, 26-36.
UCLA Center for Communication Policy. (2000). The UCLA Internet 
report: Surveying the digital future. Retrieved October 7, 2001, from
Valenta, A. L., & Wigger, U. (1997). Q-methodology: Definition and 
application in health care informatics. Journal of the American 
Medical Informatics Association, 4, 501-510.
Wang, H., Lee, M., & Wang, C. (1998). Consumer privacy concerns about 
Internet marketing. Communications of the ACM, 41, 63-70.
Westin, A. (1967). Privacy and freedom. New York, NY: Atheneum. 
