End-User Privacy in Human-Computer Interaction

Giovanni Iachello, Georgia Institute of Technology
Jason Hong, Carnegie Mellon University

August 18, 2007

DRAFT, PLEASE DO NOT REDISTRIBUTE

Abstract

The purpose of this article is twofold. First, we summarize research on the topic of privacy in Human-Computer Interaction (HCI), outlining current approaches, results, and trends. Practitioners and researchers can draw upon this review when working on topics related to privacy in the context of HCI and CSCW. Second, we chart future research trends and point out areas of research that are timely but lagging. This work is based on a comprehensive analysis of published academic and industrial literature spanning three decades, and on our own experience and that of many of our colleagues.

Table of Contents

1 Introduction
1.1 Why Should HCI Researchers Care About Privacy?
1.2 Sources Used and Limitations of this Survey
2 The Privacy Landscape
2.1 Often-Cited Legal Foundations
2.2 Philosophical Perspectives on Privacy
2.2.1 Principled Views and Common Interests
2.2.2 Data Protection and Personal Privacy
2.3 An Historic Perspective on Privacy
2.3.1 Changes in Expectations of Privacy
2.3.2 Changes in Privacy Methodologies
3 Understanding, Building and Evaluating Privacy in Interactive Systems
3.1 Understanding Users' Privacy Preferences
3.1.1 Data Protection and Privacy Preferences
3.1.2 Privacy on the World Wide Web, Privacy and E-commerce
3.1.3 Instant Messaging, Environmental Privacy, and Personal Availability
3.1.4 Incidental Information Privacy
3.1.5 Media Spaces
3.1.6 Ubiquitous Computing, Sensors, and RFID
3.1.7 Mobile and Location-Enhanced Technologies
3.2 Methodological Issues
3.2.1 The Use of Surveys in Privacy Research
3.2.2 Directly Asking About Privacy versus Observation
3.2.3 Controlled Experiments and Case Studies
3.2.4 Participatory Design and Privacy
3.2.5 Ethics and Privacy
3.2.6 Conclusions on Methodology
3.3 Prototyping, Building, and Deploying Privacy-Sensitive Applications
3.3.1 Privacy Policies for Products
3.3.2 Helping End-Users Specify Their Privacy Preferences
3.3.3 Machine-Readable Privacy Preferences and Policies
3.3.4 Identity Management and Anonymization
3.3.5 End-User Awareness of Personal Disclosures
3.3.6 Interpersonal Awareness
3.3.7 Shared Displays: Incidental Information and Blinding
3.3.8 Plausible Deniability, Ambiguity, and Social Translucency
3.3.9 Fostering Trust in Deployed Systems
3.3.10 Personalization and Adaptation
3.4 Evaluation
3.4.1 Evaluation of User Interfaces
3.4.2 Holistic Evaluation
3.4.3 The Tension between Transparency and Privacy
3.5 Privacy Frameworks
3.5.1 Privacy Guidelines
3.5.2 Process Frameworks
3.5.3 Modeling Frameworks
4 Trends and Challenges in Privacy HCI Research
4.1 Better Ways of Helping End-Users Manage Their Personal Privacy
4.2 A Deeper Understanding of People's Attitudes and Behaviors towards Privacy
4.3 Developing a Privacy HCI Toolbox
4.4 Better Organizational Practices
4.5 Understanding Adoption
4.5.1 A Story of Rejection and Acceptance: The Importance of Value Propositions
4.5.2 Models of Privacy Factors Affecting Acceptance
5 Conclusions

1 Introduction

Privacy is emerging as a critical design element for interactive systems in areas as diverse as e-commerce [69], health care [289], office work [160], and personal communications. These systems face the same fundamental tension. On the one hand, personal information can be used to streamline interactions, facilitate communication, and improve services. On the other hand, this same information introduces risks, ranging from mere distractions to extreme threats. Government reports [244, 288], essays [228], books [23, 97, 200, 306], and media coverage [257, 297, 314] testify to people's concerns regarding the potential for abuse, and to a general unease over the lack of control over a variety of computer systems. Similarly, application developers worry that privacy concerns can impair the acceptance and adoption of their systems.

No end-to-end solutions exist for designing privacy-respecting systems that cater to user concerns. Lessig provided a high-level framework for structuring the protection of individuals' privacy, which leverages four forces: laws, social norms, the market, and technical mechanisms [199]. The challenge, however, lies in turning these broad guidelines into actionable design solutions. Our thesis is that HCI (and CSCW) researchers can greatly improve the protection of individuals' personal information, because many of the threats and vulnerabilities associated with privacy originate from the interactions between the people using information systems, rather than from the systems themselves.

Approaching the topic of privacy can be daunting for the HCI practitioner, because the research literature on privacy is dispersed across multiple communities, including computer networking, systems, human-computer interaction, requirements engineering, management information systems (MIS), marketing, jurisprudence, and the social sciences. Even within HCI, the privacy literature is fairly spread out. Furthermore, many IT professionals hold common-sense notions about privacy that can turn out to be inaccurate. Hence, the goal of this article is to provide a unified overview of privacy research in HCI, focusing specifically on issues related to the design and evaluation of end-user systems that have privacy implications. Section 3 presents this material, structured along an ideal inquiry-build-evaluate development cycle.
In addition to a literature review, in Section 2 we present two philosophical outlooks on privacy that will help the practitioner frame research questions and design issues. We also show how privacy research has evolved in parallel with HCI over the past 30 years. Finally, in Section 4, we outline key research challenges, where we think that HCI methods and research approaches can make a significant impact in furthering our knowledge about information privacy and personal data protection. In the remainder of this section, we explain why we think privacy research is challenging and interesting for HCI, and map out relevant literature published in HCI conferences and journals, and in neighboring fields such as MIS and CSCW.

1.1 Why Should HCI Researchers Care About Privacy?

Human-computer interaction is uniquely suited to help design teams manage the challenges brought about by the need to protect privacy and personal information. First, HCI can help us understand the many notions of privacy that people have. Westin describes four states of privacy: solitude, intimacy, anonymity, and reserve [307]. As practical examples, Murphy lists the following as expressions of privacy: to be free from physical invasion of one's home or person, the right to make certain personal and intimate decisions free from government interference, the right to prevent commercial publicity of one's own name and image, and the control of information concerning an individual's person [216]. These perspectives represent different and sometimes conflicting worldviews on privacy.
For example, while some scholars argue that privacy is a fundamental right, Moor claims that privacy is not a core value on par with life, security, and freedom, and asserts that privacy is merely instrumental for protecting personal security [213].

Second, a concept of tradeoff is implicit in most discussions of privacy. In 1890, Warren and Brandeis pointed out that privacy should be limited by the public interest, a position that has been supported by a long history of court rulings and legal analysis [298]. Tradeoffs must also be made between competing interests in system design. For example, the developer of a retail web site may have security or business requirements that compete with end-user privacy requirements, creating a tension that must be resolved through tradeoffs. Because HCI practitioners possess a holistic view of users' interaction with technology, they are ideally positioned to work through and resolve these tradeoffs.

Third, privacy interacts with other social concerns, such as control, authority, appropriateness, and appearance. For example, while parents may view location-tracking phones as a way of ensuring safety and maintaining peace of mind, their children may perceive the same technology as smothering and as an obstacle to establishing their own identity. These relationships are compellingly exemplified in Goffman's description of the behavior of individuals in small social groups [122].
For instance, closing one's office door not only protects an individual's privacy, but also asserts his or her ability to do so, and emphasizes the difference from colleagues who do not have an individual office. Here, the discriminating application of HCI tools can vastly improve the accuracy and quality of the assumptions and requirements feeding into system design.

Fourth, privacy can be hard to rationalize. Multiple studies have demonstrated that there is a difference between stated privacy preferences and actual behavior [14, 44]. Many people are also unable to accurately evaluate low-probability but high-impact risks [260], especially those related to events that may be far removed in time and place from the initial cause [132]. For example, a hastily written blog entry or an impulsive photograph on MySpace may cause unintended embarrassment several years down the road. Furthermore, privacy is fraught with exceptions, due to contingent situations and historical context.
The need for flexibility in these constructs is reflected in the many exceptions present in data protection legislation, and in the social science literature that describes privacy as a continuous interpersonal boundary-definition process rather than a static condition [23]. The use of modern behavioral inquiry techniques in HCI can help explicate these behaviors and exceptions.

Finally, it is often difficult to evaluate the effects of technology on privacy. There are few well-defined methods for anticipating what privacy features are necessary for a system to gain wide-scale adoption by consumers. Similarly, there is little guidance for measuring what level of privacy a system effectively offers, or what its overall return on investment is. Like usability and security, privacy is a holistic property of interactive systems, which include the people using them. An entire system may be compromised by a single poorly implemented component that leaks personal information.

In our opinion, human-computer interaction is uniquely suited to help design teams manage these challenges. HCI provides a rich set of tools that can be used to probe how people perceive privacy threats, understand how people share personal information with others, and evaluate how well a given system facilitates (or inhibits) desired privacy practices. Indeed, the bulk of this paper examines past work that has shed light on these issues. As much as we have advanced our understanding of privacy within HCI over the last 30 years, we also recognize that major research challenges remain. Hence, we close this article by identifying five grand challenges in HCI and privacy:

- Developing standard privacy-enhancing interaction techniques.
- Developing analysis techniques and survey tools.
- Documenting the effectiveness of design tools, and creating a privacy toolbox.
- Furthering organizational support for managing personal data.
- Developing a theory of technological acceptance, specifically related to privacy.

These are only a few of the challenges facing the field. We believe that focusing research efforts on these issues will lead to bountiful, timely, and relevant results that will positively affect all users of information technology.

1.2 Sources Used and Limitations of this Survey

In this survey, we primarily draw on the research literature in HCI, CSCW, and other branches of computer science. However, readers should be aware that there is a great deal of literature on privacy in the MIS, advertising and marketing, human factors, and legal communities. The MIS community has focused primarily on corporate organizations, where privacy perceptions and preferences have a strong impact on the adoption of technologies by customers and on relationships between employees. The advertising and marketing communities have examined privacy issues with reference to privacy policies and the effects these have on consumers (e.g., work by Sheehan [262]). The legal community has long focused on the implications of specific technologies for existing balances, such as previous court rulings and the constitutional status quo. We did not include the legal literature in this article because much scholarly work in this area is difficult to use in practice during IT design.
However, this work has some bearing on HCI, and researchers may find some analyses inspiring, including articles on data protection [254], the relation between legislation and technology [199], identity [175], data mining [313], and employee privacy [192]. As one specific example, Strahilevitz outlines a methodology for helping courts decide whether an individual has a reasonable expectation of privacy, based on the social networking literature [277]. As another example, Murphy discusses whether the default privacy rule should allow disclosure or protection of personal information [216].

Privacy research is closely intertwined with security research. However, we will not reference HCI work in the security field; instead, we direct readers to the books Security and Usability [73] and Multilateral Security in Communications [214] for more information.

We also only tangentially mention IT management. Management is becoming increasingly important in connection with privacy, especially after the enactment of data protection legislation [182]. However, academia has largely ignored these issues, and industry does not publish on these topics because specialists perceive knowledge in this area as a strategic and confidential asset. Governments occasionally publish reports on privacy management, but the reader should be aware that there is much unpublished knowledge in the privacy management field, especially in CSCW and e-commerce contexts.

This survey also focuses primarily on end-users who employ personal applications, such as those used in telecommunications and e-commerce. We only partially consider applications in workplaces.
However, perceived control of information is one of the elements of acceptance models such as Venkatesh et al.'s extension [291] of the Technology Acceptance Model [80]. Kraut et al. discuss similar acceptance issues in a CSCW context [187], pointing out that, in addition to usefulness, critical mass and social influences affect the adoption of novel technologies.

2 The Privacy Landscape

In this section, we introduce often-cited foundations of the privacy discourse. We then discuss two perspectives on privacy that provide useful characterizations of research and design efforts, perspectives that affect how we bring the notions of law and architecture to bear on the issue of privacy. These perspectives are (1) the grounding of privacy on principled views as opposed to common interest, and (2) the difference between informational self-determination and personal privacy. Finally, we provide a historical outlook on 30 years of privacy HCI research, and on how privacy expectations co-evolved with technology.

2.1 Often-Cited Legal Foundations

In this section, we describe a set of legal resources often cited by privacy researchers.
In our opinion, HCI researchers working in the field of privacy should be familiar with these texts, because they show how to approach many privacy issues from a social and legal standpoint, while uncovering areas where legislation may be lacking.

Many authors in the privacy literature cite a renowned 1890 Harvard Law Review article by Warren and Brandeis, entitled "The Right to Privacy," as a seminal work in the US legal tradition [298]. Warren and Brandeis explicitly argued that the right of individuals to be let alone was a distinct and unique right, claiming that individuals should be protected from unwarranted publication of any details of their personal life that they might want to keep confidential. In this sense, this right to privacy relates to the modern concept of informational self-determination. It is interesting to note that Warren and Brandeis did not cite the US Constitution's Fourth Amendment, which protects the property and dwellings of individuals from unwarranted search and seizure (and, by extension, their electronic property and communications). The Fourth Amendment is often cited by privacy advocates, especially in relation to surveillance technologies and to attempts to control cryptographic tools. The Fourth Amendment also underpins much privacy legislation in the USA, such as the Electronic Communications Privacy Act (ECPA). Constitutional guarantees of privacy also exist in other legal texts, for example the EU Convention on Human Rights [67, 8].

In the United States, case law provides more material for HCI practitioners. Famous cases involving the impact of new technologies on the privacy of individuals include Olmstead v. United States (1928), which declared telephone wiretapping constitutional; Katz v. United States (1967), again on telephone wiretapping and overturning Olmstead; Kyllo v. United States (2001), on the use of advanced sensing technologies by police; and Bartnicki v. Vopper (2001), on the interception of over-the-air cell phone transmissions. Regulatory entities such as the FTC, the FCC, and European Data Protection Authorities also publish rulings and reports with which HCI professionals working in the field of privacy should be familiar. For example, the EU Article 29 Working Party has issued a series of rulings and opinions on such topics as the impact of video surveillance, the use of biometric technologies, and the need for simplified privacy policies.

Finally, HCI researchers often cite legal resources such as the European Data Protection Directive of 1995 [1] and HIPAA, the US Health Insurance Portability and Accountability Act of 1996 [4]. Many of these data protection laws were inspired by the Fair Information Practices (discussed in more detail in Section 3.5.1) and impose a complex set of data management requirements and end-user rights. HCI practitioners should be aware that different jurisdictions use legislation differently to protect privacy, and that there is much more to privacy than the constitutional rights and laws described above.

2.2 Philosophical Perspectives on Privacy

Arguments about privacy often hinge on one's specific outlook, because designers' values and priorities influence how one thinks about and designs solutions [112]. In this section, we present alternative perspectives on privacy without advocating one particular view. The reader should instead refer to the ethical principles suggested by professional organizations such as the ACM or the IFIP [31, 46]. Still, we believe that an understanding of different perspectives is useful, because it provides a framework for designers to select the most appropriate approach for solving a specific problem.

2.2.1 Principled Views and Common Interests

The first perspective contrasts a principled view with a communitarian view. The principled view sees privacy as a fundamental right of humans. This view is supported by modern constitutions, for example the US Fourth Amendment, and by texts such as the European Convention on Human Rights [67].
In contrast, the communitarian view emphasizes the common interest and espouses a utilitarian view of privacy, in which individual rights may be circumscribed to benefit society at large [97]. For an example of how this dichotomy has been translated into a framework for assessing the privacy concerns brought about by ubiquitous computing technologies, see work by Terrell, Jacobs, and Abowd [163, 283].

The tension between principled approaches and utilitarian views is reflected in debates over the use of many technologies. For example, Etzioni discusses the merits and disadvantages of mandatory HIV testing and video surveillance. In the case of information and communication technologies, the contrast between these two views can be seen in the ongoing debate between civil liberties associations (e.g., the Electronic Frontier Foundation) and governments over strong encryption technologies and surveillance systems.

These contrasting views can also help explain differences in approach within the privacy research community. For example, some privacy-enhancing technologies (PETs) have been developed more as a matter of principle than on solid commercial grounds. Some researchers in the privacy community argue that the mere existence of these PETs is more important for their impact on the policy debate than their actual widespread use or commercial viability.
Reportedly, this is the reason why organizations such as the Electronic Frontier Foundation support some of these projects.

2.2.2 Data Protection and Personal Privacy

The second perspective contrasts data protection with personal privacy. Data protection (also known as informational self-determination) refers to the management of personally identifiable information, typically by governments or commercial entities. Here, the focus is on protecting such data by regulating how, when, and for what purpose it can be collected, used, and disclosed. The modern version of this concept stems from the work of Alan Westin and others [306, 307], and came about because of concerns over how databases could be used to collect and search personal information [288]. Westin's work led to the creation of the influential Fair Information Practices (FIPS), a set of guidelines for personal information management. The FIPS include notions such as purpose specification, participation, and accountability (see Section 3.5.1).
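The FIPS notions of purpose specification and accountability lend themselves to straightforward enforcement in code. The following sketch is purely illustrative and not drawn from the FIPS text itself: the `PersonalRecord` class, `access` function, and department names are all hypothetical. It shows the core idea that data may only be used for purposes declared at collection time, and that every access is logged so the data controller remains accountable.

```python
# Hypothetical sketch of a FIPS-style purpose-limitation check.
# All names (PersonalRecord, access, department strings) are invented.
from dataclasses import dataclass, field

@dataclass
class PersonalRecord:
    subject: str                                         # the data subject
    data: dict                                           # the personal data itself
    allowed_purposes: set = field(default_factory=set)   # declared at collection time

def access(record: PersonalRecord, requester: str, purpose: str) -> dict:
    """Release data only for a purpose declared when the data was collected,
    and log the access so the data controller remains accountable."""
    if purpose not in record.allowed_purposes:
        raise PermissionError(
            f"{requester} may not use {record.subject}'s data for '{purpose}'")
    print(f"AUDIT: {requester} accessed {record.subject}'s data for '{purpose}'")
    return record.data

# Data collected for billing cannot later be reused for marketing.
rec = PersonalRecord("alice", {"email": "alice@example.org"}, {"billing"})
access(rec, "billing-dept", "billing")      # permitted and audited
# access(rec, "marketing", "advertising")   # would raise PermissionError
```

This fixed-rule style matches the centrally administered, non-discretionary systems for which the FIPS were originally formulated.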
The FIPS have greatly influenced research on privacy, including standards like P3P [72], privacy policies on web sites, and data management policies [176]. More recently, the FIPS have been reinterpreted with reference to RFID systems [116] and ubiquitous computing [191]. In contrast, personal privacy describes how people manage their privacy with respect to other individuals, as opposed to large organizations.
Drawing from Irwin Altman's research on how people manage personal space [23], Palen and Dourish argue that privacy is not simply a problem of setting rules and enforcing them, but rather an ongoing and organic boundary definition process in which disclosure and identity are fluidly negotiated [232]. The use of window blinds and doors to achieve varying levels of privacy or openness is an example of such boundary setting. Other scholars have made similar observations. Darrah et al. observed that people tend to devise strategies to restrict their own accessibility to others while simultaneously seeking to maximize their ability to reach people [79]. Westin argued that "[e]ach individual is continually engaged in a personal adjustment process in which he balances the desire for privacy with the desire for disclosure and communication" [307]. Altman's work is in part inspired by Goffman's work on social and interpersonal relations in small groups [122, 123]. One of Goffman's key insights is that we project different personas to different people in different situations.
For example, a doctor might present a professional persona while working in the hospital, but might be far more casual and open with close friends and family. The problem with respect to the design of interactive systems is that these roles cannot always be easily captured or algorithmically modeled. Personal privacy appears to be a better model for explaining people's use of IT in cases where the information requiring protection is not well defined, such as managing one's availability for interruption or minute interpersonal communication. Here, the choice of whether or not to disclose personal information to others is highly situational, depending on the social and historical context of the people involved. An example is the decision of whether or not to disclose one's location while on the go, using cell phones or other kinds of "friend finders" [162]. Current research suggests that these kinds of situations tend to be difficult to model using the rigid privacy policies that are typical of data protection guidelines [196]. In summary, data protection focuses on the relationship between individual citizens and large organizations. To use a blunt expression, the power of knowledge here lies in quantity. In contrast, personal privacy focuses more on interpersonal relationships and tight social circles, where the concern is about intimacy. This distinction is not just academic, but has direct consequences for design. Modeling privacy according to data protection guidelines will likely result in refined access control and usage policies for personal information.
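The contrast between the two models can be made concrete in code. In the hypothetical sketch below (all names, locations, and rules are invented for illustration), a data-protection-style rule is a fixed access control list evaluated the same way every time, whereas a personal-privacy-style decision for a location-sharing "friend finder" depends on the social context of the moment and supports vague answers and plausible deniability.

```python
# Data-protection style: a fixed policy, evaluated identically every time.
ACL = {"alice": {"bob", "carol"}}   # who may see alice's location (hypothetical)

def acl_allows(owner: str, requester: str) -> bool:
    return requester in ACL.get(owner, set())

# Personal-privacy style: the decision depends on the moment's social context.
def situational_disclosure(requester: str, context: dict) -> str:
    """Return what to reveal, allowing coarse answers and plausible deniability."""
    if context.get("busy"):
        return "unavailable"        # plausible deniability: no reason is given
    if requester in context.get("close_friends", set()):
        return context["exact_location"]
    return context["city_only"]     # disclose at a coarser granularity

ctx = {"busy": False, "close_friends": {"bob"},
       "exact_location": "Office 215, TSRB", "city_only": "Atlanta"}

acl_allows("alice", "dave")             # always False, regardless of situation
situational_disclosure("dave", ctx)     # a vaguer answer, not a flat denial
```

The point of the contrast is that the first function can be audited and centrally administered, while the second mirrors the fluid, negotiated boundary-setting that Palen and Dourish describe; neither subsumes the other.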
This is appropriate for many IT applications today, ranging from healthcare to e-commerce. Typical design tools based on the data protection viewpoint include privacy policies on web sites, consent checkboxes, certification programs (such as TRUSTe), and regulations that increase the trust of consumers in organizations. For applications that manage access to one's physical space, attention, or interpersonal communication (e.g., chat, email, and social networking sites, as well as some location-enhanced applications such as person finders), a data protection outlook may result in a cumbersome design. For example, imagine highly detailed policies governing when others could send instant messages to you. Instead, IM clients provide refined, moment-by-moment control of availability through "away" features and plausible deniability. For applications affecting personal privacy, negotiation needs to be dialectic and continuous, making it easy for people to project a desired persona depending on social context, pressures, and expectations of appropriate conduct. How should these different views of privacy be reconciled? Our best answer is that they should not be. Each approach to privacy has produced a wealth of tools, including analytic instruments, design guidelines, legislation, and social expectations. Furthermore, many applications see both aspects at work at the same time. For example, a social networking web site has to apply a data protection perspective to protect the data it collects from individuals, a personal privacy perspective to let individuals project a desired image of themselves, and a data protection perspective again to prevent users from crawling and data mining the site.

2.3 An Historic Perspective on Privacy

Privacy is not a static target: changes in technology, in our understanding of the specific social uses of such technologies, and in social expectations have led to shifts in the focus of privacy research in HCI.
In this section, we discuss changes in the expectation of privacy over the past three decades and summarize the consequences of these changes for HCI practice.

2.3.1 Changes in Expectations of Privacy

While the basic structures of social relations (for example, power relations and the presentation of self) have remained relatively stable through technical evolution [123], there have been large shifts in perceptions and expectations of privacy. These shifts can be seen in the gradual adoption of telecommunication technologies, electronic payment systems, and surveillance systems, notwithstanding initial privacy worries. There are two noteworthy aspects of how privacy expectations have changed. The first is that social practice and expectations co-evolve with technical development, making it difficult to establish causal effects between the two. The second is that privacy expectations evolve along multiple dimensions, and the same technology can have opposite effects on different types of privacy.

Social practice and technology co-evolve. For example, the introduction of digital cameras, and of location technology in cell phones, happened alongside the gradual introduction of legislation [2, 3, 5] and the emergence of a social etiquette regulating their use. Legislation often follows technical development, although in some cases specific legislation preempts technical development. For example, digital signature legislation in some European countries was enacted well before the technology was fully developed, which may in fact have slowed adoption by negatively affecting usability [7]. It is often difficult to tease cause and effect apart: whether social practices and expectations drive the development of technology, or vice versa. Some observers have noted that the relationship between social constructs and technology is better described as co-evolution. Latour talks of socio-technological hybrids: undividable structures encompassing technology as well as culture, that is, norms, social practices, and perceptions [193]. Latour claims that these hybrids should be studied as a whole. This viewpoint is reflected in HCI research, including by proponents of participatory design [92, 256] and researchers of social computing [85]. Iachello et al. even go as far as claiming that, in the domain of privacy, adoption patterns should be designed as part of the application and can be influenced to maximize the chances of successful acceptance [158]. The reader should note that in some cases, technologies that affect privacy are developed without much public debate. For example, Geographic Information Systems (GIS) classify geographic units based on census, credit, and consumer information. Curry and Phillips note that GIS had a strong impact on the concepts of community and individual, but were introduced almost silently, over the course of several decades, by a combination of government action, developments in IT, and private enterprise, without spurring much public debate [78].
Understanding these changes is not a straightforward task, because technical development often has contradictory effects on social practice. The same artifact may produce apparently opposite consequences in terms of privacy, strengthening some aspects of privacy and weakening others. For example, cell phones increase social connectedness, by enabling distant friends and acquaintances to talk more often and in a less scheduled way than was previously possible, but they also raise barriers between physically co-present individuals, creating bubbles of private space in very public and crowded settings such as a train compartment [29]. From this standpoint, privacy-sensitive IT design becomes an exercise in systematically reconciling the potentially conflicting effects of new devices and services. For example, interruption management systems based on sensing networks (such as those prototyped by Nagel et al. [218]) aim at increasing personal and environmental privacy by reducing unwanted phone calls, but can affect information privacy because they collect additional information through activity sensors. We highlight this issue of how expectations of privacy change over time as an ongoing research challenge in Section 4.5.

2.3.2 Changes in Privacy Methodologies

The discourse on human-computer interaction and on privacy in information technology (IT) shares a similar history over the past forty years. Reflections on the implications of IT for privacy surged in the late 1960s with the proposal of a National Data Center in the United States [88] and culminated with the publication of the 1973 report Records, Computers and the Rights of Citizens [288], which introduced the Fair Information Practices. By the early 1970s, the accumulation of large amounts of personal data had prompted several industrialized countries to enact laws regulating the collection, use, and disclosure of personal information.

The FIPS reflect the top-down, systems approach typical of IT at the time. Systems were relatively few, carefully planned, developed for a specific purpose, centrally managed, and their use was not discretionary. The terminology used to describe privacy reflects this perspective as well: "data subjects" were protected through "data protection" mechanisms, which were centrally administered and verified by a "data controller" or "data owner" (the organization managing the data). Trust originated in the government and in the accountability of data owners. HCI in the 1970s also reflected carefully planned, structured process modeling of non-discretionary applications [134]. Computer-related work tasks were modeled and evaluated to improve performance, usability, and effectiveness using techniques such as GOMS [129].

This picture began to change with advances in personal computing. Discretionary use became the predominant mode for many applications, even in office settings, and HCI started to concentrate more on ease of use, learning curves, and pleasurable interaction. Users enjoyed increasing discretion in what applications and services to employ. At the same time, the collection of personal data expanded with advances in storage and processing power, making trust a fundamental component in the provisioning of IT services. This increased choice and shift of approaches is reflected in the data protection legislation of the 1980s, where the original concept of use limitation gives way to the more far-reaching concept of informational self-determination [6]. Finally, the 1990s saw the emergence of the Internet, which enabled new kinds of applications and forms of communication. Regulators and industry started developing more flexible and comprehensive legislation to support the greatly increased amounts of personal information being shared and used. Privacy research followed these changes, acknowledging the use of IT for communication purposes and the increasing fluidity of personal information collected and used by individuals, businesses, and governments.
The development of privacy-enhancing technologies like machine-readable privacy policies [72], of concepts such as Multilateral Security [247], and of technologies supporting anonymous transactions (e.g., mail encryption tools, mix networks, and anonymizing web services) are manifestations of the complexity of the IT landscape. At the same time, HCI research and practice began to focus on the use of IT to enable interpersonal communication and to support social and work groups, first in small environments such as offices, and later in society at large. Example domains studied by HCI researchers at this time include remote collaboration, telecommunications, and organizations. Following these developments, interpersonal relations became an important domain of the privacy discourse, and research started to focus on interpersonal privacy within office environments [118, 215] and in everyday interactions and communications (e.g., instant messaging and email).

Today, the combination of wireless networking, sensors, and computing devices of all form factors has spurred the development of new kinds of mobile and ubiquitous computing applications. Many of these new applications operate in non-traditional settings, such as the home or groups of friends, which leads to new challenges for HCI and privacy [191, 267]. For example, the implicit nature of interaction with these systems requires developers to rethink both Norman's seven steps of interaction [227] and established tenets of privacy such as informed consent [11]. Furthermore, the type, quantity, and quality of information collected in ubicomp environments significantly heighten the risks of misuse. This brief historical review should have convinced the reader that privacy is a very dynamic construct, and that design for privacy is a function of social and technological contexts, which vary over time.
Against this backdrop, we next survey the research landscape of privacy in HCI.

3 Understanding, Building and Evaluating Privacy in Interactive Systems

In this section, we survey the HCI privacy literature, organized according to threads of research on specific topics, such as mobile computing or identity management. Privacy research in HCI surged in the early 1990s and is now booming. The increased interest in privacy within HCI is also attested by numerous workshops at HCI conferences and by the recent creation of conferences such as SOUPS (Symposium On Usable Privacy and Security). Figure 1 depicts our view of the evolution of HCI privacy research between 1970 and 2006. Each line represents a particular subfield, defined as a timeline of related work (e.g., privacy in location-enhanced technologies). Beneath each line, we provide a sample of salient studies (which are referenced in the bibliography). Note that the intent is not to provide an exhaustive listing of references, but to illustrate the scope of each line of research with select references. The figure clearly shows the dichotomy between personal privacy research and data protection research, described above in Section 2.2.2. The figure also shows three shaded regions (see Section 2.3): the non-discretionary era of centralized personal data management (1960-1980); the period of informational self-determination (1980-2000); and the more recent developments toward implicit interaction and behavioral analysis of users with respect to privacy concerns (2000 to present).

Figure 1. Timeline of HCI privacy research.

In the following sections, we describe the main research efforts and results in each of the subfields identified in Figure 1. The material is organized according to an ideal application development cycle, from understanding user needs, to designing the application, to evaluating it.
3.1 Understanding Users' Privacy Preferences

We start by describing work on understanding the privacy preferences of individuals. As noted above, privacy preferences are determined by social context and are sometimes difficult to articulate. For example, the need for plausible deniability is evident in social relations [83], but participants in a survey may not admit to it, or may not be consciously aware of dynamics that are ingrained in their daily behavior. Consequently, privacy preferences and concerns can be difficult to generalize and should be probed with reference to a specific circumstance. One implication is that it can be misleading to take privacy preferences from one domain (e.g., attitudes towards the use of loyalty cards or internet shopping) and extrapolate them to another (e.g., social relations such as family and colleagues). Notwithstanding these difficulties, a wide array of techniques has been developed to gather data about users' preferences and attitudes. These techniques include both quantitative tools, such as surveys to probe mass-market applications, and qualitative techniques to probe personal privacy dynamics. Table 1 provides an overview of the research space, with a sampling of the most used techniques and a few representative studies for each, along with an indication of their scope, advantages, and limitations. We first show how these techniques have been used in several application domains. In Section 3.2, we discuss the drawbacks and advantages of specific techniques, specifically in relation to privacy. In Section 4.3, we argue that there is still a great need for improving these techniques.
3.1.1 Data Protection and Privacy Preferences

The development of data collection practices during the 1970s and 1980s led governments to enact data protection legislation. At the same time, a number of studies were conducted to probe public opinion regarding these practices. Many of these studies were commissioned or conducted by governments, large IT companies, or research institutions. In the United States, a well-known series of surveys was developed by the Pew Research Center, a nonprofit organization that provides information on the attitudes and trends shaping American public opinion [238]. One of the most cited series of surveys was conducted by Privacy & American Business [243], a research consultancy founded by Alan Westin (who also worked on the initial version of the FIPS). Westin's surveys have been used to segment people into three categories based on their privacy preferences towards commercial entities [305]. Fundamentalists are the individuals who are most concerned about privacy, believe that personal information is not handled securely and responsibly by commercial organizations, and consider existing legislative protection to be insufficient. Unconcerned individuals are not worried about the handling of their personal data and believe that sufficient safeguards are in place. Pragmatists, who make up the majority of the sampled population, lie somewhere in the middle.
They acknowledge risks to personal information but believe that sufficient safeguards are in place. Temporal trends over the past ten years show that the distribution across the three categories varies over time [303]; in general, the percentages hover around 15-25% fundamentalists, 15-25% unconcerned, and 40-60% pragmatists. Similar figures are reported by the Eurobarometer survey in the EU [102]. This distribution has also been observed in a scenario-based survey by Ackerman et al. [9] and in a controlled experiment [169].

Table 1. Summary of techniques for understanding users' privacy preferences, with example studies.
Technique / Study       | Scope                                   | Data Prot. / Personal | Principled / Comm. | Sample sizes           | Pros                                | Cons
------------------------|-----------------------------------------|-----------------------|--------------------|------------------------|-------------------------------------|------------------------------------
Surveys                 |                                         | Data Protection       | Neutral            | 1,000-10,000           | Statistically significant           | Probes opinions only; superficial
  Westin                | Segmentation                            | Data Protection       | Principled         | 1,000-10,000           | Simple                              |
  GVU                   | General preferences                     | Data Protection       | Neutral            | 10,000                 | Historic sequence of studies        |
  Smith et al.          | Data protection in organizations        | Data Protection       | Neutral            | <1,000                 | Validated                           | Not adequate for new technologies
Scenario-based surveys  | Individuals' decisions                  |                       | Neutral            | ~100                   | Realism; control                    | Bias; probes opinions only
  Spiekermann           | Control in ubicomp                      | Data Protection       | Communitarian      | 128                    | Validated                           |
  Olson et al.          | Two-phased (identify items, then probe preferences) | Personal  | Neutral            | 30-80                  | Efficient use of participants       |
  Hawkey and Inkpen     | Incidental privacy                      | Personal              | Principled         | 155                    |                                     |
ESM / Simulations       |                                         | Neutral               | Neutral            |                        | Realism; immediacy                  | Cost; intrusiveness
  Consolvo et al.       | Location privacy                        | Personal              | Principled         | 16                     |                                     | Implausibility
  Ammenwerth et al.     | Mobile computing                        | Personal              | Neutral            | 31                     | Expert feedback                     | Extensive training; requires experts
  Iachello et al.       | Mobile computing                        | Personal              | Communitarian      | 41                     | Realism; immediacy                  | Cost; intrusiveness
Focus groups            |                                         | Neutral               | Neutral            |                        | Rich data; efficient                | Requires experts; crosstalk
  Kaasinen              | Relation of user with telecoms          | Data Protection       | Neutral            | 13 groups, 3-7 people  |                                     | Requires experts
  Hayes                 | School-based surveillance               | Personal              | Neutral            | 4 groups, 4-5 people   | Rich data                           | Requires experts
Interviews              |                                         | Personal              |                    | 10-50                  |                                     |
  March et al.          | Mobile phones                           | Personal              | Neutral            | 10-20                  | Rich data; probes sensitive topics  | Cost
  Melenhorst            | Ubiquitous computing                    | Personal              | Neutral            | 44                     | Rich analysis                       | Requires demonstration; cost
Experiments             |                                         |                       |                    |                        | Scientifically sound; control       | Cost; difficult to reproduce realistic situations
  Kindberg              | Mobile payment systems trust            | Personal              | Neutral            | 24                     |                                     |
  Jensen                | E-commerce                              | Data Protection       | Neutral            | 175                    | Statistical significance            |
Case studies            |                                         |                       |                    | 1-3                    | Reference to real systems           | Insider access or extensive public literature search; anecdotal
  Anton                 | Airlines and government                 | Data Protection       | Principled         | 2                      |                                     |
  Esslinger             | PKI in banks                            | Personal              | Neutral            | 1                      |                                     |
Participatory design (Muller et al.) | Project management groupware | Personal            | Principled         | 1                      | Buy-in of users; rich analysis      | Costly; embedded in development

This kind of segmentation allows service providers to devise service improvements or marketing strategies. For example, both Ackerman et al. and Jensen et al. have attempted to characterize individual behavior on retail web sites based on Westin's privacy classifications. Specifically, Jensen et al. found that while the purchasing decisions of those classified as pragmatists and unconcerned were affected by the presence of trust marks and privacy policies on web sites, fundamentalists' decisions were not [169]. Culnan and Armstrong's scenario-based survey also examined the propensity of people to disclose personal information in e-commerce settings [77].
The repeated-measures survey was administered by phone to one thousand individuals using two scenarios that involved the collection of personal information. In the first scenario, the researchers did not indicate that fair information practices would be employed, while in the second, they specified that the data collector would apply control and notification measures. In the first condition, people with a high degree of concern for privacy would disclose information less often than the others, while in the second condition, there was no difference. Interestingly, these results on the effect of privacy assurances differ from Jensen et al.'s conclusions. While the privacy segmentation model is stable and identifies similar trends in different countries, it is much harder to associate particular demographics with privacy preferences. Westin only found weak correlations between gender and concern [304]. Ackerman did not find any correlation [9]. The Eurobarometer survey showed that differences in privacy perceptions are attributable to different national contexts rather than demographics, presumably influenced by local legislative situations and media coverage [102]. Westin's survey has been employed to classify participants of experimental studies, to support the interpretation of results.
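The classification step is mechanical once the survey responses are in hand. The sketch below follows the scoring rule for the Westin Privacy Segmentation Index as documented by Kumaraguru and Cranor [188]; the statement wordings are paraphrased and, as noted below, both statements and rule varied across survey years, so this is one illustrative variant rather than a definitive instrument.

```python
# Sketch of one variant of the Westin Privacy Segmentation Index scoring rule
# (per Kumaraguru and Cranor's survey of Westin's studies [188]). The exact
# statements and cutoffs changed across years; this is illustrative only.
#
# Each respondent rates agreement with three statements (paraphrased):
#   s1: "Consumers have lost all control over how personal information
#        is collected and used by companies." (privacy-protective)
#   s2: "Most businesses handle the personal information they collect
#        in a proper and confidential way." (trusting)
#   s3: "Existing laws and organizational practices provide a reasonable
#        level of protection for consumer privacy today." (trusting)

def classify_westin(s1_agree: bool, s2_agree: bool, s3_agree: bool) -> str:
    """Classify a respondent as fundamentalist, unconcerned, or pragmatist."""
    if s1_agree and not s2_agree and not s3_agree:
        return "fundamentalist"   # privacy-protective on all three items
    if not s1_agree and s2_agree and s3_agree:
        return "unconcerned"      # trusting on all three items
    return "pragmatist"           # mixed answers fall in the middle segment

# Example: agrees control is lost, but also trusts current law -> pragmatist
segment = classify_westin(True, False, True)
```

Applied to a sample of respondents, the rule reproduces the three-way split (fundamentalist / pragmatist / unconcerned) whose proportions the surveys above track over time.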
However, the segmentation should be interpreted carefully, for two reasons. First, the Westin classification only probes opinions on the use of personal information by commercial entities, and can thus be described as examining people's attitudes towards data protection. It would be misleading to infer that views on data protection correspond to views on personal privacy with family, friends, and co-workers. In fact, Consolvo et al. found that there was no strong correlation between how participants responded to Westin's survey and how willing they were to disclose their current location to others with a person finder device [65]. Second, Kumaraguru and Cranor point out that the questions in the Westin surveys have changed over the years, based on the goals of the commercial entities commissioning the studies [188]. Thus, it is not immediately clear how well the results of past surveys can be combined with more recent surveys to establish trends. Smith et al. developed a privacy attitudes questionnaire that is more elaborate than the Westin segmentation survey [268].
Like Westin's, Smith et al.'s questionnaire assesses concerns about privacy in data protection settings, and its validation procedure has been accurately documented. Based on an analysis of the responses of a large sample, Smith et al. identified four subscales that constitute overall privacy concerns: concerns about the collection of personal information, processing errors, further use of personal data (control), and improper access to the information. The advantage of this questionnaire is that it decomposes privacy concerns into meaningful subscales, thus providing more information than Westin's survey. However, this tool does not take into account new technologies such as the Internet and ubiquitous computing, nor does it consider issues of personal privacy. Smith et al.'s survey would thus require additions to be useful in these new research areas.

Privacy on the World Wide Web, Privacy and E-commerce

In the mid-1990s, privacy and security concerns were considered to be significant limiting factors to the development of e-commerce over the World Wide Web. For this reason, several surveys were conducted to assess the privacy preferences of web users. One such survey was Georgia Tech's World Wide Web User Survey, which was executed ten times between 1994 and 1998 [137]. The Fifth GVU survey (April 1996) asked three general questions about privacy notices and information. Over the following years, the range of questions about privacy and security grew, with the last survey containing 37 detailed questions on topics ranging from reporting security breaches to clearinghouse organizations, to children's online privacy.
Results of the Tenth GVU survey (October 1998) show that the majority of surveyed Internet users were very concerned about privacy and security in e-commerce, and that most favored the availability of FIPS-inspired data protection mechanisms such as collection notification and disclosure control. Participants in the GVU surveys were also grouped in three geographic regions (USA, Europe, and the rest of the world), but responses were similar across geographical areas. The 1999 IBM Multi-National Consumer Privacy Study also probed consumers' perceptions across three large industrialized economies: the US, UK, and Germany [141]. IBM's survey is interesting because, in a joint project, the manufacturer also surveyed executives in high-privacy-risk industries, including the health care, financial services, insurance, and retail industries. This survey showed that executives generally underestimated consumers' privacy concerns. The survey also indicated that more tech-savvy and educated respondents were more aware of and more concerned about potential privacy violations online. Finally, respondents indicated the desire for notification mechanisms and an overall concern for privacy. Subsequent research has however shown that privacy notices only partially assuage user concerns; well-known and reputable brands remain the most effective communication tools for this purpose. In 2003, Baumer et al. surveyed 415 individuals via email, probing their likelihood of disclosing information on e-commerce web sites as a function of the availability of privacy seals, privacy notices, and the demographics of the respondents [36]. They found that respondents were more willing to reveal personal information in several categories to well-known web sites as compared to less well-known web sites. The presence of privacy policies and privacy seals only provided a marginal benefit, possibly due to skepticism regarding compliance. Baumer et al. argue that it is important to situate privacy questions with sufficient context to elicit reasonably accurate answers. Their survey included a scenario before the actual questions to help situate the responses, rather than leaving the decision context to the imagination of the user. Since the late 1990s, many of the best practices indicated by these surveys have been widely adopted by e-commerce operators. IT manufacturers, such as IBM and Microsoft, still claim that privacy concerns are limiting the growth of online business, especially after several high-profile scandals [159, 209]. These manufacturers advocate stronger and uniform privacy protection legislation in countries that lack it, such as the United States.

Instant Messaging, Environmental Privacy, and Personal Availability

One aspect of online personal privacy relates to one's availability to communicate with others. New communication media alter the way individuals offer themselves to communication, based on the affordances of the medium.
Two such media that have enjoyed widespread adoption in recent years are SMS and Instant Messaging (IM). Patil and Kobsa interviewed seven participants on the privacy issues involved in IM [233]. Häkkilä and Chatfield surveyed people in two different locales (Finland and Australia) about SMS messaging practices and privacy expectations of the medium [138]. In both studies, the interviewees were very familiar with the domain being probed and were able to reflect on their behaviors and expectations, thus making them expert informants. Results showed that the mobile device was perceived as a private object and that a strong etiquette protecting the confidentiality of voice and especially text communication existed within the social group (e.g., interviewees would not pick up others' phone calls, and expected the recipients of their text messages to preserve confidentiality). Häkkilä and Chatfield note that the selection of communication medium (SMS over voice) was influenced by confidentiality considerations. For example, SMS was considered more discreet than voice. Grinter and Palen also studied teens' use of IM and SMS [131].
Like Häkkilä and Chatfield, Grinter and Palen found that the selection of the communication medium was based on privacy considerations (e.g., leaving no written trace) as well as convenience and availability. Specifically, Grinter and Palen showed how interviewees used the different features of IM to control access to themselves. At the same time, IM allowed users to keep a connection with their social group and to carve out a private space in the household where they were unlikely to be overheard [162]. Grinter and Palen asked questions about privacy as part of a broad interview about usage patterns and social context, which we believe is conducive to balanced and realistic results. Grinter and Palen noticed that different members of an outwardly homogeneous demographic (teens) reported very different behaviors in terms of privacy, which warns against standard common-sense assumptions about privacy expectations and preferences. A similar observation was made by Iachello et al. [157] in relation to inter-family use of mobile person finders. Privacy also emerged as a fundamental component in two ethnographic studies of teens' use of SMS, by Ito and Ling respectively [162, 201]. While these studies were not specifically designed to probe privacy, they exposed the relationship between privacy, group communication, accessibility, and familial power structures. Similar to Grinter and Palen, both Ito and Ling reported that the unobtrusive qualities of text messaging allowed teenagers to stay connected with their social milieu even in situations where an open phone conversation would be inappropriate, such as a family dinner. They also discovered that environmental privacy (e.g., not interrupting or disturbing the physical environment) is an important aspect of communication for these teens. The issues of environmental privacy and availability to communication can be extended to the sharing of other types of personal information with immediate relations. For example, Olson et al. probed information sharing practices in interpersonal settings [229]. They surveyed people's propensity to share information such as availability to communicate, contact information, and personal communication preferences with other people. Olson et al. identified clusters based on the type of information respondents would share and the recipient of the information (i.e., family and friends, close colleagues, remote colleagues, and others). Expectedly, Olson et al.'s study showed that individuals indicated that they would share more sensitive information with closer acquaintances. It should be noted that Olson et al.'s study design was hypothetical.
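The kind of clustering Olson et al. performed can be illustrated with a small sketch: recipients are described by vectors of willingness-to-share ratings over information types, and recipients with similar profiles are merged into clusters. The ratings, distance cutoff, and clustering procedure below are hypothetical illustrations, not Olson et al.'s actual data or analysis.

```python
# Illustrative sketch (NOT Olson et al.'s published analysis): group
# recipients whose willingness-to-share profiles are similar. Ratings are
# hypothetical, on a 1 (never share) to 5 (always share) scale.
from itertools import combinations

ratings = {
    # recipient: (availability, home phone, work email, health info)
    "spouse":           (5, 5, 5, 5),
    "close colleague":  (4, 2, 5, 2),
    "remote colleague": (3, 1, 4, 1),
    "public":           (1, 1, 1, 1),
}

def distance(a, b):
    """Euclidean distance between two rating vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def cluster(profiles, threshold=3.0):
    """Single-linkage agglomerative clustering with a distance cutoff."""
    clusters = [{name} for name in profiles]
    merged = True
    while merged:
        merged = False
        for c1, c2 in combinations(clusters, 2):
            # merge two clusters if any cross-pair is within the threshold
            if any(distance(profiles[a], profiles[b]) <= threshold
                   for a in c1 for b in c2):
                clusters.remove(c1)
                clusters.remove(c2)
                clusters.append(c1 | c2)
                merged = True
                break
    return clusters

groups = cluster(ratings)
```

With these made-up ratings, the two colleague profiles merge while "spouse" and "public" remain apart, mirroring the observation that sharing willingness tracks closeness of the relationship.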
In a study using Experience Sampling, Consolvo et al. showed that the disclosure of location information is heavily influenced by additional factors, including the purpose of the disclosure [65]. These differences suggest that personal privacy dynamics should be investigated with studies that closely simulate the experience of the users, rather than on a hypothetical basis.

Incidental Information Privacy

A common problem encountered when several individuals are viewing the same computer screen is that potentially private information, such as bookmarks or financial information, may be accidentally disclosed. These accidental disclosures can happen, for example, when projecting onto a shared display or when a bystander happens to see someone else's screen (i.e., shoulder surfing). In a scenario-based survey, Hawkey and Inkpen confirmed that incidental eavesdropping is a concern for a majority of the surveyed participants [142]. Incidental eavesdropping relates to information that can be gleaned from casually viewing the screen of a user or overhearing a conversation. Hawkey and Inkpen also investigated what kinds of information individuals may be comfortable having others see, specifically focusing on web browsers, past search engine queries, and browser bookmarks.
They showed that the comfort level of the user in displaying personal information in the presence of onlookers is impacted not just by the sensitivity of the information being displayed and by the identity of the viewer (e.g., spouse, friend/relative, work colleague), but also by the amount of control over the input devices (mouse, keyboard) that the onlooker has. Managing incidental information disclosures is an example of the interpersonal boundary definition process described by Palen and Dourish [232]. Drawing from this approach, Grinter et al. [130] analyzed everyday security and privacy practices in an organizational setting, examining the problem of incidental privacy with respect to its physical and informational aspects. Through interviews, Grinter et al. observed that their interviewees employed subtle practices to achieve privacy and security goals, such as positioning a computer screen so that visitors to an office could not see it, or stacking papers according to a secret rationale. The increasing use of IT in mobile and casual situations suggests that the potential for incidental information privacy breaches is likely to become more relevant in the future. It is likely that an increasing amount of research in HCI will focus on privacy with respect to incidental information, shared displays, and related topics.
Media Spaces

We next examine privacy preferences in the context of media spaces, which are physical spaces enhanced with multimedia communication or recording technologies such as videoconferencing and always-on multimedia links between remote locations. Privacy concerns were recognized early on in this domain. For example, Root discusses the design of Cruiser, a multimedia communication tool developed at Bell Communications Research (Bellcore) in the late 1980s [251]. Through observational research in office environments, Root noted that the activity of observing other people is typically symmetric, meaning that it is not possible to observe others without being seen. This principle was applied to the design of the Cruiser system. In addition, a "busy" feature was added to the design, allowing users to block communication at will [108]. Jancke et al. also studied the social effects of a multimedia communication system linking public spaces together [165]. In their work, Jancke et al. noted that symmetry and the ability to opt out were important design components of a privacy-respecting system. Subsequent research, however, has shown that other concerns and design features are needed for successful implementations of media spaces.
In a preliminary study of the organizational impact of a multimedia recording technology in special education classrooms, Hayes and Abowd led focus groups with professionals who would experience both the benefits and the potential downsides of the technology. Hayes and Abowd discovered that in addition to control, purposefulness was a fundamental aspect of the privacy balance of their design [143]. That is, users accepted potential privacy risks if they perceived the application to provide value either to them or to some other stakeholder. We believe that during the development of novel technologies, such as media spaces, sensing systems, or location technologies, it is important to emphasize the value proposition of the technology. Users can thus express their privacy concerns and preferences with reference to the actual needs that are satisfied by the technology.

Ubiquitous Computing, Sensors, and RFID

One way of conveying the value proposition of a technology is to show a working example to the intended users. This may be problematic for technologies that are still at the conceptual stage, as is the case with many ubiquitous computing applications. Spiekermann proposed and partially validated a survey to probe privacy attitudes toward ubiquitous computing technologies [271]. She presented a short video demonstrating an application of RFID technology to participants, who then responded to a privacy survey.
The video scenario provided people with an experience of how the application would work without actually having to build it. Spiekermann's survey included questions on control, choice, and ease of use. Analysis identified three main concerns among respondents, namely concerns about further use of collected data, perceived helplessness, and ease of use of the technology. In particular, participants were concerned about a loss of control over the technology and uncertainties regarding the technology's utility and effective operation. More realistic demonstrations may help users imagine the everyday operation of a new technology. Melenhorst et al. combined live demonstrations of sensing technologies with interviews probing the perceived usefulness and privacy concerns of the intended users [208]. Elderly interviewees were shown several home-based ubiquitous computing applications, for example, an activity monitor that distant relatives could use to track the elderly person's activity throughout the day. Interviewees were then asked questions about privacy perceptions and opinions. The results suggested that participants were likely to accept potentially invasive technology given an adequate level of trust in the people managing the technology and given safety benefits. According to Spiekermann and Ziekow, a fundamental difficulty in probing privacy through scenarios lies in avoiding bias in participants' responses [273], particularly for applications that do not yet exist.
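Once concern dimensions such as Spiekermann's (further use of data, perceived helplessness, ease of use) have been identified, per-respondent scores on each dimension are typically computed by averaging the associated questionnaire items, reverse-coding items where agreement indicates less concern. The item-to-dimension mapping and reverse coding below are hypothetical; the published instrument defines the real ones.

```python
# Sketch of scoring Likert responses on three concern dimensions. The item
# assignments and reverse coding are HYPOTHETICAL illustrations, not those
# of Spiekermann's validated instrument.

LIKERT_MAX = 7  # assume a 7-point agreement scale

# A leading "-" marks a reverse-coded item (agreement = less concern).
DIMENSIONS = {
    "further_use":  ["q1", "q4", "-q7"],
    "helplessness": ["q2", "q5"],
    "ease_of_use":  ["-q3", "q6"],
}

def score(responses: dict) -> dict:
    """Average the (reverse-coded where marked) items of each dimension."""
    scores = {}
    for dim, items in DIMENSIONS.items():
        vals = []
        for item in items:
            if item.startswith("-"):
                # flip a reverse-coded item: 1 <-> 7, 2 <-> 6, etc.
                vals.append(LIKERT_MAX + 1 - responses[item[1:]])
            else:
                vals.append(responses[item])
        scores[dim] = sum(vals) / len(vals)
    return scores

respondent = {"q1": 7, "q2": 4, "q3": 1, "q4": 7, "q5": 4, "q6": 7, "q7": 1}
profile = score(respondent)
```

Such per-dimension scores, rather than a single overall index, are what make subscale instruments (Smith et al.'s included) more informative than a one-number segmentation.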
Mobile and Location-Enhanced Technologies

We finally explore the problem of understanding user preferences in the domain of mobile and location-enhanced applications. In particular, location-enhanced applications have been widely discussed in the media and have been the topic of much research in the fields of security, privacy, systems, and computer networking. Kindberg et al. conducted evaluations to assess people's perceptions of trust, privacy, and security with respect to electronic payments using wireless point-of-sale terminals in a simulated restaurant setting [178]. Their experiment included demonstrations of different payment methods followed by interviews, sorting exercises, and questionnaires devised to elicit privacy and security perceptions and preferences. Their results show that, in the users' view, privacy is mixed with concerns about convenience and social appropriateness [85]. Kindberg et al.'s analysis is interesting because they positioned each participant within a privacy perception space defined by three dimensions: privacy concerns, desire for convenience, and desire to be socially appropriate. Location technologies have been a hot topic because of the numerous privacy implications and economic interests involved.
In many cases, researchers have employed scenario-based questionnaires or experience sampling to probe location disclosure preferences. One study, conducted by Lederer et al., found that people were more likely to make a decision about a location disclosure based on who was asking rather than where the person currently was [198]. Barkhuus and Dey employed a diary to perform an interval-contingent study of disclosure preferences in location-based applications [35]. This study was based in part on the Active Campus technology developed at UCSD, which includes a location-aware mobile terminal usable within the university campus. In Barkhuus and Dey's study, participants were asked to fill out, every evening, a diary entry detailing the perceived usefulness and perceived invasiveness of one of two kinds of location-based applications, with reference to the participants' activities during that day. Results showed that an application that tracked the location of the user to send recommendations or inform friends was perceived as more invasive than an application that only reacted to the location of the user to set interface operating parameters, such as ringtone volume. In general, however, users trust the mobile service provider to provide adequate privacy protection for location information.
Kaasinen [174] conducted user focus groups, interviews, and demonstrations of location-based services to probe their usability and privacy concerns. Kaasinen's results show that privacy concerns are often allayed by the trusted relationship between customer and mobile operator, as well as by the oversight of regulatory agencies. These findings suggest that sophisticated technologies devised for protecting location privacy may be unnecessary in the view of most users. It should be noted, though, that Kaasinen's participants were all Finnish, and there may be cultural differences when trying to generalize these findings (for example, to cultures that do not have as much trust in governments and corporations). Until recently, many researchers had assumed that a fundamental parameter in the disclosure of location information is the degree of precision of the disclosure (i.e., whether the device discloses complete geographical coordinates or only an approximation, such as the city name). Consolvo et al.'s experience sampling study of a location-enhanced person finder found, however, that in most cases participants did not blur their location to avoid telling others where they were [65]. Instead, participants would either not respond at all, or provide the other person with the location information that they thought would be most meaningful to the recipient. The findings of Kaasinen and Consolvo et al. diverge from common wisdom in the privacy community.
We believe that these studies are compelling examples of why HCI research is important for furthering understanding of end-user privacy concerns.

3.2 Methodological Issues

In this section, we sketch out some of the methodological issues that arise when studying privacy preferences and concerns.

The Use of Surveys in Privacy Research

Surveys are typically used to probe general opinions about well-known applications (e.g., e-commerce), issues (e.g., identity theft), and concerns (e.g., loss of control). Surveys can be used to efficiently probe the preferences and opinions of large numbers of people, and can provide statistically significant and credible results. However, surveying privacy concerns presents the problem of conveying sufficient and unbiased information to non-expert users so that they can express reasonable and informed preferences. Risk analysis is hard even for experts, let alone individuals unfamiliar with a given domain or application. To address this problem, scenarios have been used to convey contextual information; they can greatly increase the effectiveness and credibility of survey responses, but at the risk of introducing significant bias. A second limitation of privacy surveys, even those employing scenarios, is that they only collect participants' attitudes, which may be quite different from actual behavior and thus not as useful for furthering understanding and aiding system design. To increase realism, Experience Sampling Method (ESM) studies can be used to probe individuals' feelings, preferences, and opinions in a specific setting [309].
Experience Sampling techniques are defined as interval-, signal-, and event-contingent, depending on what initiates the self-report procedure (respectively, the elapsing of a predefined time interval, a signal provided by the researchers, or a specific event involving the participant). Diaries are often used in conjunction with ESM for studying mobile technologies. For example, Barkhuus and Dey employed a diary to perform an interval-contingent study regarding the location disclosure preferences of possible location-based applications [35]. Colbert notes that in diary studies, the participant is asked a hypothetical question about how they would react were their position information obtained, albeit contextualized in an actual rendezvous [63]. However, without a working reference technology, recall errors and the hypothetical nature of questions may bias the results. For example, usefulness may be underrated. Consolvo et al. increased the realism of their ESM study using Palm PDAs that simulated location requests from the participants' friends, family, and colleagues at random times [65].
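Random, signal-contingent prompting of this kind can be sketched in a few lines of code. The scheduler below is a hypothetical illustration; the parameters (number of prompts, waking-hours window) are invented, not those of the cited study:

```python
import random

def daily_prompt_times(n_prompts=10, start_hour=9, end_hour=21, seed=None):
    """Draw `n_prompts` distinct random prompt times (in minutes since
    midnight) within waking hours, for a signal-contingent ESM protocol.
    A hypothetical sketch; actual study schedules differ."""
    rng = random.Random(seed)
    waking_minutes = range(start_hour * 60, end_hour * 60)
    return sorted(rng.sample(waking_minutes, n_prompts))

# Ten simulated location requests between 09:00 and 21:00.
for minutes in daily_prompt_times(seed=42):
    print(f"{minutes // 60:02d}:{minutes % 60:02d}  simulated request")
```

As the surrounding discussion notes, purely random schedules like this one risk generating requests that are socially implausible for the participant's actual context.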
The participants were asked to respond to each request assuming that it had actually been made by the specific individual. However, Consolvo et al. noted that the random simulated requests were often implausible from a social standpoint. To add even more context, Iachello et al. combined event-contingent ESM with experience prototyping [56], calling this technique "paratyping" [158]. A technique similar to paratyping was developed by Roßnagel et al. in the context of IT end-user security evaluation [252]. In related work, Ammenwerth et al. point out that there are inherent tensions in the formative evaluation of IT security mechanisms [24]. When testing IT end-user security, users' reactions and performance must be evaluated on technology that does not exist, and yet the user must be familiar with the technology. Further, tests should include breakdowns that would be unacceptable if they happened in reality. Ammenwerth et al. describe how they used a simulation study to conduct this kind of evaluation.
In simulation studies, a working prototype is tested by real users "[performing] realistic tasks in a real social context [and subject to] real attacks and breakdowns" [24]. Simulation studies are more complicated and expensive than Iachello et al.'s paratypes, because they require careful selection of expert participants, extensive briefing to familiarize them with the technology, and complex data collection procedures. For this reason, they are best used at later stages of design.

Directly Asking About Privacy versus Observation

An important issue that needs to be considered in all techniques for understanding and evaluating privacy is that there is often a difference between what people say they want and what they actually do in practice. For example, in the first part of a controlled experiment by Berendt et al. [44], participants indicated their privacy preferences on a questionnaire. Later, the same participants went through a web-based shopping tour and were much more likely to disclose personal information than previously stated. Berendt et al.'s explanation is that participants were enticed into disclosing information in view of potential benefits they would receive. Focus groups can also be used to gather privacy preferences [143, 174]. The advantages and drawbacks of focus groups are well known in the HCI and Software Engineering communities and are similar in this context [185]. We have found that focus groups on privacy have unique drawbacks, including susceptibility to cross-talk between informants and the fact that conventions of social appropriateness may bias responses to questions that an informant may consider sensitive or inappropriate. The latter is especially relevant in the context of privacy. For example, when investigating personal privacy issues between different generations of a family, a focus group with both parents and children will provide poor data. Individual interviews, especially those taking appropriate precautions to strengthen informants' trust in the researcher, will result in better information [206]. Still, interviews have other weaknesses. First, the information that can be gained from an interview is limited by people's familiarity with a given system. Second, interviews do not scale well. Third, interviews tend to gather what people say, but not always what they do.
Fourth, interviews can be subject to interviewer bias, for example if there is a large difference in age or socio-economic status between interviewee and interviewer.

Controlled Experiments and Case Studies

Controlled experiments can be very useful for understanding privacy behavior and trust determinants. However, it can be difficult to design experiments that are both plausible and able to elicit realistic responses to credible privacy threats or concerns. One precaution taken by Kindberg et al. was to avoid explicitly mentioning privacy and security to the participants at the outset of the study [178]. The rationale was to avoid leading participants into specific privacy concerns, and rather to probe the natural concerns of the users. We are not aware of any research proving that participants in privacy studies should not be explicitly primed about privacy or security. However, we believe that this is good precautionary practice, and the topic of privacy can always be brought up after the experiment. While conducting user studies, it is important to ensure that the tasks used are as realistic as possible, to give greater confidence in the validity of the results. In particular, participants need to be properly motivated to protect their personal information. Participants should also be put in settings that match expected usage. In their evaluation of PGP, Whitten and Tygar asked people to role-play, acting in a situation that would require secure email [310].
While it is clear that PGP had serious usability defects, it is also possible that participants would have been more motivated if they had had a more personal stake in the matter, or could have performed better if they had been placed in an environment with multiple active users of PGP. As another example, in Egelman et al.'s evaluation of Privacy Finder, the researchers discovered that individuals were willing to spend a little more money for privacy, by having participants purchase potentially embarrassing items [91, 121]. To make the purchase as realistic as possible, they had participants use their own credit cards (though participants also had the option of shipping the purchased items to the people running the study). This tradeoff in running realistic yet ethical user studies of privacy and security is an ongoing topic of research. The most realistic observations can be obtained from case studies [105]. Many case studies focus on a specific market or on an organization's use or introduction of a specific technology with privacy implications.
For example, case studies have been used to discuss widespread privacy policy violations by US airlines [26], the introduction of PKI-based systems in banks [96], and the introduction of electronic patient records in healthcare IT systems [34]. Some researchers have advocated using ethnographic methods, including contextual inquiry [148], to address the weaknesses of interviews. The basic idea is to observe actual users in situ, to understand their current practices and to experience their social and organizational context firsthand. Ethnographic methods have been successfully used to study privacy in the context of everyday life [130, 162, 201]. However, committing to this methodological approach requires the researcher to take an exploratory stance, which may be incompatible with the tight process requirements of typical IT development. Nevertheless, we believe that this type of exploratory research is important because many privacy issues are still not well understood, and many of our analytical tools still depend on inaccurate and unverified models of individuals' behavior. We return to this point in the conclusion.

Participatory Design and Privacy

Privacy issues can take on a very different meaning within a workplace, where issues of trust, authority, and competition may arise in a way quite different than with family and friends. Participatory design has been used as a way of understanding user needs in such environments, helping to address privacy concerns up front and increasing overall user acceptance of systems. For example, Muller et al. investigated privacy and interpersonal competition issues in a collaborative project management system using participatory design [215]. They discovered that specific system features could have contradictory effects on privacy.
For example, an alert feature could increase vulnerability to colleagues by letting colleagues set alerts based on one's progress, while simultaneously protecting one from potential embarrassment by letting individuals add alerts based on other people's alerts (e.g., "remind me about this project five days before the earliest alert set on it by anyone else"). This observation is consistent with current sociological thinking, as mentioned earlier in Section 2.3.1 [29, 120]. Participatory design can help uncover and analyze privacy tensions which might go unnoticed at first glance, because representatives of the end-users are involved throughout the design process and can influence technical choices with their values and needs. Clearly, participatory design also carries ethical and political assumptions that may not be appropriate or applicable in all design contexts [274]. Perhaps for this reason, we did not find many accounts of the use of participatory design for privacy-affecting applications. Consequently, practitioners should evaluate whether or not this approach can be carried out in their specific context.

Ethics and Privacy

Finally, we discuss ethical issues arising during the design and development of IT that may impact the privacy of stakeholders, including research participants and users of future technologies.
Specifically, we focus on the problems inherent in the representation of users' opinions, on informed consent of research subjects, and on the issue of deception of subjects. Many organizations conducting R&D on IT have developed guidelines and procedures to preserve the privacy of research participants and users of prototypes. These guidelines respond to legislation or organizational policy and originate from a long-standing discussion on research ethics. For example, the US Federal Government has issued regulations requiring the protection of research participants' privacy, including the confidentiality of collected data, informed consent procedures, and confidentiality of attribution [82]. Mackay discussed the ethical issues related to the use of videotaping techniques for usability studies and prototype evaluation [202]. Drawing on other fields such as medicine and psychology, Mackay suggests specific guidelines for how videos should be captured and used. With respect to research participants' privacy, these guidelines cover issues such as informed consent, purposefulness, confidentiality, further use of the video, misrepresentation, and fairness. Many of Mackay's suggestions overlap with IRB requirements and constitute a commonly accepted baseline practice for the protection of participants' privacy.
In the past few years, however, researchers have voiced concerns about the application of IRB requirements to social, behavioral, and economic research [62]. In the HCI community, researchers face similar challenges. For example, in a study investigating privacy preferences of a ubicomp application, Iachello et al. encountered problems related to consent requirements set by the IRB. In that case, it was essential that the survey procedure be as minimally invasive as possible. However, the information notice required by the IRB disrupted the experience even more than filling out the survey did [158]. Iachello et al. noted that more concise consent notices would be helpful, though changing standard wording requires extensive collaboration with IRB officials. Further ethical issues are raised by Hudson and Bruckman [151], who report on a study of privacy in web-based chat rooms. They note that obtaining informed consent from research participants may skew the observations by destroying the very expectations of privacy that are the object of study. Another ethical issue relates to studies involving participant deception. One remarkable study was conducted by Jagatic et al. at Indiana University to study the behavior of victims of phishing schemes.
In this IRB-approved study, the researchers harvested freely available data about users of a departmental email system by crawling social network web sites; this allowed the researchers to construct a network of acquaintances for each user. They then sent these users emails, apparently originating from friends and acquaintances, asking them to input departmental authentication data on a specially set-up web page [164], in effect a sophisticated phishing scheme. Their results showed remarkable rates of successful deception. Participants were informed of the deception immediately after the study ended and were given the option to withdraw from the study per IRB requirements; a small percentage of participants did withdraw. However, some participants complained vehemently to the researchers because they felt their privacy had been invaded and believed that their email accounts had been hacked.

Conclusions on Methodology

In summary, methodological issues in HCI research relate to privacy in multiple ways. One salient question is whether surveys, focus groups, and interviews should be structured to present both benefits and losses to participants. Clearly, a balanced presentation could elicit very different responses than a partial description. A second ethical question relates to whether uninformed attitudes and preferences should drive design, or whether researchers should only consider actual behavior. These questions are but instances of similar issues identified in user-centered design over the past two decades, but they are raised time and again in the context of privacy [76, 272]. Stated preferences versus actual behavior is another important methodological issue. As Acquisti and Großklags point out, individual decision making is not always rational, full information is seldom available, and the topic is often too complex for the typical user to understand [14]. For these reasons, basing system design on the results of surveys may be misleading. Because of the difficulty of probing behavior, techniques that only probe attitudes toward privacy should be used with great care, and the results should be interpreted accordingly. Privacy can also be a difficult topic to investigate from a procedural standpoint. Iachello et al.'s and Hudson and Bruckman's experiences show that IRB informed consent requirements may impede the immediacy required for the authentic collection of privacy preferences. Moreover, participant privacy may be violated when following certain protocol designs, even when these protocols are approved by the IRB. We believe that an open discussion of the IRB's role in HCI research on privacy would help evolve current guidelines, often developed for medical-type research, toward the dynamic and short-term participant-based research in our field.

Prototyping, Building, and Deploying Privacy-Sensitive Applications

In this section, we focus on privacy with respect to prototyping, building, and deploying applications.
We consider both research on methods (i.e., what processes to use to uncover privacy issues during design) and practical solutions (i.e., what design solutions help protect privacy). Cranor, Hong, and Reiter have sketched out three general approaches to improve user interfaces for usable privacy and security [74]: make it invisible; make it understandable, through better awareness, usability, and metaphors; and train users. These three themes come up repeatedly in the subsections below. It is also worth pointing out user interface advice from Chris Nodder, who was responsible for the user experience of Windows XP Service Pack 2: "Present choices, not dilemmas." User interfaces should help people make good choices rather than leaving them confused about what their options are and obfuscating the implications of those decisions. Work on privacy-enhancing interaction techniques is quite extensive, and we present it here in several subsections. Early Privacy Enhancing Technologies (PETs) were developed with the intent of empowering users, giving them the ability to determine their own preferences [312]. More recent work has taken a holistic and more nuanced approach encompassing architectural and cognitive constraints as well as the user interface.
For example, work on identity management and plausible deniability demands that the whole system architecture and user interface be designed with those end-user concerns in mind [236]. The reader will note that the literature on interaction techniques for privacy is intertwined with that of usable security. This is because security mechanisms are the basic tools of privacy protection. We limit our discussion to interaction techniques specifically targeted at privacy, ignoring work on topics such as biometrics and authentication where it is not directly connected with privacy. Finally, we note that there is still a strong need for better tools and techniques for designing, implementing, and deploying privacy-sensitive systems. We discuss these issues as key research challenges in Sections 4.2.2 through 4.2.5.

Privacy Policies for Products

Publishing a privacy policy is one of the simplest ways of improving the privacy properties of an IT product, such as a web site. Privacy policies provide the information end-users need to express informed consent, and help products comply with the Openness and Transparency principles of the FIPS. Privacy policies are very popular on the World Wide Web, both in nations that mandate them whenever personal data is collected (e.g., the EU) and where they are used because of market pressure (e.g., in certain industries in the USA). The specific content and format of privacy policies varies greatly between national contexts, markets, and industries. Under many legal regimes, the content of privacy notices is specified by law, and web site publishers have little leeway in writing them.
The objective of these laws is to inform users of their rights and to provide notices that enable informed consent. In other cases, privacy policies are written with the goal of increasing user trust and have a reassuring, rather than objective, tone. Certification programs such as TRUSTe and BBBOnline also mandate certain minimal requirements for privacy policies. These programs also verify that participating web sites comply with their stated policy, although such verification is shallow because the certification programs do not assess the internal processes of the organizations running the web sites.

Helping End-Users Understand Privacy Policies

There have been extensive efforts to make policies more understandable by consumers, especially for Business-to-Consumer (B2C) e-commerce web sites. However, the results thus far have not been encouraging. Controlled experiments by Good et al. on end-user licensing agreements [127] and by Jensen et al. on web site privacy policies [169] strongly suggest that users tend not to read policies. These studies also indicate that policies are often written in technical and legal language, are tedious to read, and stand in the way of the primary goal of the user (i.e., concluding the transaction). Evidence external to the HCI field confirms this finding.
A 2003 report by the EU Commission showed that, eight years after the introduction of EU data protection directive 95/46, the public is still not knowledgeable about its rights under data protection legislation [64]. This is remarkable, considering that these rights must be repeated to users in a mandatory privacy policy every time personal information is collected, and that the user must agree with the policy before the collection can take place. Indeed, the general consensus in the research community is that privacy policies are designed more to shield the operators of IT services from liability than to inform users. Furthermore, Jensen and Potts's evaluation of the readability and usability of privacy policies suggests that current policies are unfit as decision-making tools due to their location, content, language, and complexity [168]. Users instead tend to receive information about privacy-related topics such as identity theft from the media and from trusted sources like expert friends. Multi-level policies have been proposed as one way to increase comprehensibility and the percentage of users reading policies.
In 2004, the European Union's committee of data privacy commissioners, also known as the Article 29 Working Party, published a plan calling for EU member states to adopt common rules for privacy policies that are easy for consumers to understand [100]. This plan also called for displaying privacy policies in three layers: short, condensed, and complete. The short privacy policy, only a few sentences long, is meant to be printed on a warranty card or sent via a mobile phone message. It might contain a link to the condensed privacy notice. The condensed privacy policy is a half-page summary of the complete privacy policy, highlighting the most important points, whereas the complete privacy policy is comprehensive and might span multiple pages. Experimental evidence suggests that two-level policies are somewhat more successful at influencing users' behavior [126]. To systematize the wide range of claims contained in privacy policies, Anton and Earp produced a dictionary of privacy claims contained in the privacy policies of 25 major US retailers' web sites [27]. Similar to Dourish et al. [86], Anton and Earp used Grounded Theory and goal-mining techniques to extract these claims and produced a list of 124 privacy goals. They categorized claims in privacy policies as protection goals (i.e., assertions with the intent of protecting users' data privacy) and vulnerabilities (i.e., assertions that describe management practices that may harm user privacy, such as sharing of personal information). The privacy goals taxonomy reflects the usual categories of notification, consent, redress, etc., while the vulnerabilities taxonomy includes such issues as data monitoring, aggregation, storage, transfer, collection, personalization, and contact. The emergent picture is that end-user privacy policies are complex instruments that need careful planning, constant updates, and careful drafting to ensure that users read them, understand them, and use them. Obviously, they must also reflect actual organizational practices, which can be a problem especially in rapidly-evolving organizations.

Deploying, Managing, and Enforcing Privacy Policies

The mere presence of a privacy policy does not mean that it will be enforced. A full treatment of policy enforcement is outside the scope of this article, but it has wide-reaching implications for information systems design and management. Furthermore, different kinds of enforcement procedures exist depending on the data protection legislation and institutions in place. For example, some companies have a Chief Privacy Officer, whose responsibilities may range from public relations to actual involvement in spelling out and enforcing privacy policies.
As another example, in the United States, the Federal Trade Commission has been tasked with enforcing the Children's Online Privacy Protection Act (COPPA), and has actively pursued remedies against businesses that are in violation. Although the management of personal information has not traditionally been the topic of public research, there have recently been several efforts in this field, specifically in two areas: tools for privacy policy creation, enforcement, and management, and certification of information management practices. The most significant project in the first area is SPARCLE. The vision of SPARCLE is to provide a bridge between natural language and automatic enforcement systems, such as Tivoli [30]. SPARCLE is currently implemented as a web-based tool for translating privacy policies stated in natural language into machine-readable formats akin to P3P [176]. The request for this tool came from professionals of IBM's IT services division, suggesting that even expert consultants may find it difficult to write consistent and complete privacy policies. While the difficulties of professionals drafting privacy policies are not documented in academic literature, our own experience coupled with press coverage suggests that the implementation and enforcement of privacy policies within organizations is a pressing and very challenging issue.
See, for example, the recent leaks of personal information at CardSystems [104] and ChoicePoint [153, 300]. SPARCLE has recently undergone tests to evaluate what type of policy statement input modality is most effective, i.e., free-text, where the user types the policy directly into the system, or guided, through menu selections. These tests were aimed at an expert user population and measured the time necessary to write a policy and the quality of the resulting statement sets [176]. The second aspect of privacy management relates to the IT and human systems that process and secure personal data within organizations. Unfortunately, public information on this topic is scarce. Furthermore, except for checklists such as the Canadian Privacy Impact Assessment [284], general standards are lacking. For example, Iachello analyzed IS17799, a popular information security best practice standard, vis-à-vis data protection legislation. He found that IS17799 lacks support for several common data protection requirements found in legislation, such as limitation of use or the development of a privacy policy. As a result, Iachello proposed augmenting the standard with additional requirements specifically aimed at privacy [155]. In general, we still see little attention to the problem of managing personal information at the organizational level.
Given the attention that the HCI and CSCW communities have devoted to issues such as collaboration and groupware systems, and the progress that has been made in these fields since the 1980s, we believe that HCI research could greatly improve the organizational aspects of personal information management. We believe that the challenge in this field lies in aligning the interests of the research community with the needs of practitioners and corporations. We discuss this point further as an ongoing research challenge in Section 4.4.

Helping End-Users Specify Their Privacy Preferences

Many applications let people specify privacy preferences. For example, most social networking web sites let people specify who can see what information about them. There are three design parameters for such applications, namely when users should specify preferences, what the granularity of control is, and what the defaults should be. The first question can be reframed as deciding when pessimistic, optimistic, and interactive style user interfaces should be used [135, 241]. The goal of a pessimistic style is to prevent security or privacy breakdowns, e.g., by denying access to data. For example, some applications ask users to specify privacy preferences immediately after installation. However, defining configurations and policies upfront, before starting to use a product, may be difficult for users because the definition process is taken out of context, when the user does not have sufficient information to make a reasoned decision.
The goal of the optimistic style is to help end-users detect misuses and then fix them afterwards. An employee might allow everyone in her work group to see her location, but may add security and privacy rules if she feels a specific individual is abusing such permissions. This kind of interaction style relies on social translucency to prevent abuses. For example, Alice is less likely to repeatedly query Bob's location if she knows that Bob can see each of her requests. Section 3.3.8 discusses social translucency in more detail. The goal of the interactive style is to provide enough information for end-users to make better choices, helping them avoid security and privacy violations as well as overly permissive security policies. An example is choosing whether to answer a phone call given the identity of the caller. Here, people would be interrupted for each request and would make an immediate decision. One refinement of this idea is to let end-users defer making privacy choices until they are more familiar with the system, similar to the notion of safe staging introduced by Whitten and Tygar [310]. Another refinement of this concept is the Just-In-Time Click-Through Agreement (JITCTA), adopted by the EU PISA project [235] and later by the EU PRIME (PRivacy and Identity Management for Europe) project [236].
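The trade-offs among the pessimistic, optimistic, and interactive styles can be made concrete with a minimal sketch. The function, rule sets, and audit log below are our own illustration, not drawn from any of the cited systems:

```python
from enum import Enum

class Decision(Enum):
    ALLOW = "allow"
    DENY = "deny"
    ASK = "ask"   # defer to the user at request time

def decide(style, requester, allowed, blocked, audit_log):
    """Return an access decision for a disclosure request under one of
    the three interaction styles (hypothetical rule structure)."""
    if style == "pessimistic":
        # Prevent breakdowns up front: deny unless explicitly authorized.
        return Decision.ALLOW if requester in allowed else Decision.DENY
    if style == "optimistic":
        # Allow by default, but log every request so the data owner can
        # detect misuse afterwards and add a blocking rule.
        audit_log.append(requester)
        return Decision.DENY if requester in blocked else Decision.ALLOW
    # Interactive style: interrupt the user for an immediate decision.
    return Decision.ASK
```

The visible audit log in the optimistic branch is what supports social translucency: the requester knows her queries are recorded and visible to the data owner.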
JITCTAs are presented to the user at a time when he or she can make an informed decision about his or her privacy preferences. However, Pettersson et al. note that users may be induced to automate their consent clicks when presented with multiple instances of click-through agreements over time, without really reading their contents [236]. It is likely that all three styles are needed in practice, but the optimal mix that balances control, security, and ease of use is currently unclear. Furthermore, some domains may have constraints that favor one style over another. With respect to the granularity of control, Lederer et al. argue that applications should focus more on providing simple coarse-grained controls rather than fine-grained ones, because coarse-grained controls are simpler to understand and use [196]. For example, providing simple ways of turning a system on and off may be more useful than complex controls that provide flexibility at the expense of usability. Lau et al. take a different path, distinguishing between extensional and intensional privacy interfaces [194]. In the context of sharing web browser histories in a collaborative setting, they defined extensional interfaces as those where individual data items (i.e., each URL) are labeled as private or public. In their prototype, this was done by toggling a traffic light widget on the browser.
In contrast, intensional privacy interfaces allow the user to define an entire set of objects that should be governed by a single privacy policy. In their prototype, this was accomplished with access control rules indicating public or private pages, based on specific keywords or URLs, with optional wildcards. The third design choice is specifying the default privacy policies. For example, Palen found that 81% of corporate users of a shared calendar kept the default access settings, and that these defaults had a strong influence on the social practices that evolved around the application [231]. Agre and Rotenberg note a similar issue with Caller ID [19]. They note that "if CNID [i.e., Caller ID] is blocked by default then most subscribers may never turn it on, thus lessening the value of CNID capture systems to marketing organizations; if CNID is unblocked by default and the blocking option is inconvenient or little-known, callers' privacy may not be adequately protected." In short, while default settings may seem like a trivial design decision, they can have a significant impact on whether people adopt a technology and how they use it. There is currently no consensus in the research community as to when coarse-grained versus fine-grained controls are more appropriate and for which situations, and what the defaults should be. It is likely that users will need a mixture of controls, ones that provide the right level of flexibility with the right level of simplicity for the application at hand.
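Lau et al.'s intensional rules, described above, amount to pattern-based labeling of whole sets of URLs. A minimal sketch, with invented patterns and a conservative default of our own choosing, might look like:

```python
import fnmatch

# Hypothetical intensional rules: each wildcard pattern maps an entire
# set of URLs to a single privacy label, checked in order.
RULES = [
    ("https://intranet.example.com/*", "private"),
    ("*medical*", "private"),
    ("https://www.example.com/*", "public"),
]

def classify(url, rules=RULES, default="private"):
    """Label a URL with the first matching rule; fall back to a
    conservative default when no rule matches."""
    for pattern, label in rules:
        if fnmatch.fnmatchcase(url, pattern):
            return label
    return default
```

A private-by-default fallback errs on the side of non-disclosure, in line with the pessimistic style discussed earlier.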
Machine-Readable Privacy Preferences and Policies

Given that most users may not be interested in specifying their privacy preferences, another line of research has attempted to automate the delivery and verification of policies for web sites. The most prominent work in this area is the Platform for Privacy Preferences Protocol (P3P). P3P lets web sites transmit policy information to web browsers in a machine-readable format. Users can then view policies in a standard format and decide whether to share personal information [72]. Users can also set up their web browser to automate this process of sharing. It is worth noting that the idea of a machine-readable privacy policy has been extended to other domains. For example, both Ackerman and Langheinrich proposed using labeling protocols similar to P3P for data collected in ubiquitous computing environments, to communicate such things as what location data about individuals is available, what kinds of things the environment would record, etc. [8, 190]. Although P3P was developed with feedback from various industrial stakeholders, it has been a hotly contested technology (see Hochheiser for an extensive discussion of the history of P3P [147]).
One principled criticism is that automating privacy negotiations may work against users' interests and lead to loss of control. Ackerman notes that "most users do not want complete automaticity of any private data exchange. Users want to okay any transfer of private data." [9] In practice, P3P has not yet been widely adopted. Egelman et al. indicate that, out of a sample of e-commerce web sites obtained through Google's Froogle web site in 2006 (froogle.google.com), only 21% contained a P3P policy [90]. Reasons may include lack of enforcement [93], lack of motivation among commercial players to adopt stringent policy automation [147], and the lack of appropriate user interfaces for delivering the P3P policy to users and involving them in the decision processes [10]. In our view, there are three main roadblocks to the adoption of P3P.
The first issue relates to the ability of users to define and control their preferences intuitively. This difficulty could be addressed through enhancements to the user interface of web browsers. For example, Microsoft Internet Explorer 6.0 has only rudimentary support for P3P privacy preferences, letting end-users simply manage how cookies are sent. Some solutions to this roadblock are discussed in the following section. The second roadblock is that users may not be sufficiently motivated to use these technologies. Many users do not understand the issues involved in disclosing personal information, and may simply decide to use a service based on factors such as the benefit the service offers, branding, and social navigation. We believe that there are many research opportunities here in the area of understanding user motivation with respect to privacy. The third roadblock is that many web site owners may not have strong economic, market, or legal incentives for deploying these technologies. For example, they may feel that a standard text-based privacy policy is sufficient for their needs. Web site owners may also not desire a machine-readable privacy policy, because it eliminates ambiguity and thus potential flexibility in how user data may be used.

Privacy Agents

From a data protection viewpoint, a privacy decision is made every time a user, or a device under her control, discloses personal information. The increasing ubiquity and frequency of information exchanges has made attending to all such decisions unmanageable. User interfaces for privacy were developed in part to cater to users' inability to handle the complexity and sheer volume of these disclosures. Early work focused on storing user privacy preferences and automating exchanges of personal data, excluding the user from the loop. An example of this is APPEL, a privacy preferences specification language developed by Cranor et al.
which can be used to describe and exchange personal privacy preferences [75]. When this model was not widely adopted, researchers started investigating the causes. Ackerman et al. noted that users want to be in control of every data exchange of relevance [9]. The concept of Privacy Critics brings the user back into the loop. Critics are agents that help guide the user in making good privacy choices [10]; they were introduced by Fischer et al. in the context of software design [107]. Rather than automating decisions, Privacy Critics warn the user when an exchange of personal data is about to happen. It should be noted that modern web browsers have incorporated the concept of critic for other kinds of data transactions, e.g., displaying non-secure pages and accepting dubious PKI certificates. However, it is also worth pointing out that this type of dialog tends to be ignored by users. This issue is discussed in Section 4.2 as an open challenge for future work. Following this line of research, Cranor et al.
developed an agent called Privacy Bird [71]. Privacy Bird compares a web site's P3P policy with a user's privacy preferences and alerts the user to any mismatches. In designing Privacy Bird, precautions were taken to increase the comprehensibility of the privacy preferences user interface, keeping only the relevant elements of P3P, removing jargon, and grouping items based on end-user categories rather than on P3P structure. Cranor et al. evaluated Privacy Bird according to Bellotti and Sellen's feedback and control criteria [43], and found that users of Internet Explorer with Privacy Bird were more aware of the privacy policies of web sites than those without it. In related work, Cranor et al. also developed a search engine that prioritizes search results based on their conformance to the policy defined by the user [57].
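The core comparison underlying both Privacy Bird and the privacy-sensitive search engine can be sketched as a simple set difference. The purpose vocabulary below is invented and far simpler than actual P3P policies:

```python
# Simplified policy matching in the spirit of Privacy Bird: flag the
# data-use purposes a site declares that the user has not accepted.
def mismatches(site_purposes, accepted_purposes):
    """Return declared purposes the user has not accepted; an empty
    set means the site's policy conforms to the user's preferences."""
    return set(site_purposes) - set(accepted_purposes)

def warn(site_purposes, accepted_purposes):
    """Privacy Bird-style alert: a short warning string, or None if
    the policy matches the user's preferences."""
    bad = mismatches(site_purposes, accepted_purposes)
    if bad:
        return "Policy mismatch: " + ", ".join(sorted(bad))
    return None
```

The same conformance test can rank search results: sites whose mismatch set is empty are listed first.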
An evaluation of this privacy-sensitive search engine showed that when privacy policy information is readily available and can be easily compared, individuals may be willing to spend a little more for increased privacy protection, depending on the nature of the items to be purchased [91, 121].

Identity Management and Anonymization

The concept of privacy assistants is also central to work by Rannenberg et al. and Jendricke and Gerd tom Markotten on reachability managers [166, 246]. Jendricke and Gerd tom Markotten claim that PETs can help people negotiate their privacy boundary by associating different privacy profiles with several digital identities. In this model, users can dynamically define and select privacy profiles, for example based on the current activity of the user, the web site visited, or the current desktop application used. The interface provides an unobtrusive cue of the currently selected identity so that the user can continuously adjust her status. However, it is not clear whether a profile-based approach can simplify privacy preferences.
Users may forget to switch profiles, as happens with profiles on cell phones and away messages on IM. Studying user interfaces for managing profiles in ubiquitous computing environments, Lederer et al. found that participants had difficulty predicting what information would actually be disclosed [196]. Furthermore, Cadiz and Gupta, in their analysis of sharing preferences in collaborative settings, discovered that sharing personal information is a nuanced activity [58]. The concept of profiles has been further developed into the more general idea of identity management. Here, users have several identities, or personas, which can be used to perform different online transactions. For example, users could have an anonymous persona to surf general web sites, a domestic persona for accessing retail web sites, and an office persona for accessing corporate intranets. Decoupling personas from individuals can reduce the information collected about a single individual. However, identity management technologies are rather complex. So far, allowing easy definition of policies and simple awareness of active personas has proven to be a difficult task. Various designs for identity management have been developed. For example, boyd's Faceted Id/entity system uses a technique similar to Venn diagrams to explicitly specify different groups and the people within those groups [48].
The EU PRIME project has also explored various user interfaces for identity management, including menu-based approaches, textual/graphic interfaces, and more sophisticated animated representations that leverage a town map metaphor [236]. Graphical metaphors are often used with other PETs, e.g., using images of keys, seals, and envelopes for email encryption. However, researchers agree that representing security and privacy concepts often fails due to their abstract nature. For example, Pettersson et al. evaluated alternative user interfaces for identity management, and concluded that it is difficult to develop a uniform and understandable vocabulary and set of icons that support the complex transactions involved in identity management and privacy management.

The Challenges of Complex PET UIs

The problem of developing appropriate interfaces for configuration and action is common to other advanced PETs, such as anonymization tools like JAP, ZeroKnowledge, Anonymizer, and Freenet. Freenet, an anonymizing web browsing and publishing network based on a Mix architecture [60], was hampered by the lack of a simple interface. Recently, the developers of Tor, another anonymizing network based on onion routing [125], acknowledged this problem and issued a grand challenge to develop a usable interface for the system.
Whatever their technical merits, anonymization systems, both free and commercial, have not been widely adopted, meeting commercial failure in the case of ZeroKnowledge [124] and government resistance in other cases (e.g., JAP). Repeated failures in developing effective user interfaces for advanced PETs may be a sign that these technologies are best embedded in the architecture of the network or product and operated automatically. They should not require installation, maintenance, and configuration. As an example, consider the success of SSL in HTTP protocols versus the failure of email encryption. The underlying technology is quite similar, though email encryption is not automatic and has not seen widespread adoption. Ubiquitous computing technologies present further challenges for the protection of users' privacy. Location privacy has been a hot topic in the media and the research community following the development of mobile phone networks and the E911 location requirements. There have been several technological solutions for protecting users' privacy in mobile networks. For example, Beresford and Stajano propose the idea of Mix zones, where users are tracked not with their real identity but with a one-time pseudonym [45]. Gruteser and Grunwald also proposed location-based services that guarantee k-anonymity [136].
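The intuition behind k-anonymous location disclosure can be illustrated with a toy one-dimensional sketch (our own construction, not the algorithm of the cited paper): instead of reporting a user's exact position, the system reports a region wide enough to contain at least k known users, so the reported location could plausibly belong to any of them.

```python
def cloak(user_pos, all_positions, k, step=1.0):
    """Grow a symmetric interval around user_pos until it covers at
    least k of the known positions, then report that interval instead
    of the exact location. Assumes len(all_positions) >= k."""
    radius = 0.0
    while True:
        covered = [p for p in all_positions if abs(p - user_pos) <= radius]
        if len(covered) >= k:
            return (user_pos - radius, user_pos + radius)
        radius += step
```

Real proposals cloak in two dimensions (and sometimes in time as well), but the trade-off is the same: larger k gives stronger anonymity and coarser, less useful location reports.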
Beresford and Stajano claim that using Mix technology for cloaking location information enables interesting applications without disclosing the identity or the movement patterns of the user. Tang et al. suggest that many location-based applications can still work in a system where the identities of the disclosing parties are anonymous, e.g., simply to compute how busy a place is, such as a stretch of highway or a café [281]. Yet, it is not clear whether anonymization technologies will ever be widely adopted. On the one hand, network service providers act as trusted third parties and are bound by contractual and legislative requirements to protect the location information of users, reducing the commercial motivation for strong PETs. In other words, location privacy may already be "good enough." On the other hand, location-based services are not in widespread use, and privacy frictions could arise as more people use these services. In general, we see good potential for HCI research in this area.

End-User Awareness of Personal Disclosures

Initially focused on network applications (e.g., the World Wide Web and instant messaging), work on disclosure awareness has expanded into areas such as identity management systems, privacy agents, and other advanced PETs. Browser manufacturers have developed artifacts such as the lock icon, specially colored address bars, and security warnings to provide security awareness in browsing sessions. Friedman et al. developed user interfaces to show end-users what cookies are used by different web sites [113]. However, there are few published studies on the effectiveness of these mechanisms. Notable exceptions include Friedman et al.'s study showing the low recognition rate of secure connections by diverse sets of users [114], and Whalen and Inkpen's experiments on the effectiveness of security cues (the lock icon) in web surfing sessions [308]. Whalen and Inkpen used eye-tracking techniques to follow users' focus of view when interacting with web sites. The results indicate that users do not look at, or interact with, the lock icon to verify certificate information. Furthermore, they showed that even when viewed, certificate information was not helpful to users in understanding whether a web page is authentic. Recently, interaction techniques for awareness have been developed in the context of ubiquitous computing, because the lack of appropriate feedback is exacerbated by the often-invisible nature of these technologies [302].
Nguyen and Mynatt observed that in the physical world, people can use mirrors to see how others would see them. Drawing on this analogy, they introduced the idea of Privacy Mirrors, artifacts that can help people see what information might be shared with others. According to Nguyen and Mynatt, technology must provide a history of relevant events, feedback about privacy-affecting data exchanges, awareness of ongoing transactions, accountability for those transactions, and the ability to change privacy state and preferences. This framework was used to critique a multi-user web-based application and to develop original design ideas for it [225]. However, the Privacy Mirrors concept itself was not formally evaluated. An interesting variant of the Privacy Mirror concept is the peripheral privacy notification device developed by Kowitz and Cranor [186]. In this system, a display located in a shared workplace shows words taken from unencrypted chats, web browsing sessions, and emails transiting on the local wireless network. Kowitz and Cranor carefully designed this awareness device so that only generic words are anonymously projected on the display (i.e., no personal names), and words are selected out of context so that the meaning of the phrase is likely not intelligible to others. Kowitz and Cranor assessed the reactions of users through interviews and questionnaires before and after the deployment of the device.
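The word-selection step of such a peripheral display can be sketched roughly as follows; the whitelist, tokenization, and thresholds here are invented for illustration and are not Kowitz and Cranor's implementation:

```python
import random
import re

# whitelist of everyday, non-identifying words (hypothetical)
COMMON = {"meeting", "project", "tomorrow", "lunch", "report", "budget"}

def sample_display_words(sniffed_text, n=3):
    """Pick a few generic, out-of-context words from sniffed plaintext.
    Capitalized tokens (a crude proxy for personal names) are discarded,
    and only whitelisted everyday words are ever shown."""
    tokens = re.findall(r"[A-Za-z]+", sniffed_text)
    safe = [t for t in tokens if t.islower() and t in COMMON]
    random.shuffle(safe)      # break the original word order / context
    return safe[:n]

words = sample_display_words(
    "Alice: the budget report for Project X is due tomorrow")
```

Because names and out-of-vocabulary words never pass the filter and the surviving words are shuffled, a passerby sees that plaintext is flowing but cannot reconstruct who said what.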
The self-reported results indicate that the users of the wireless network became more aware of the unencrypted wireless network, but did not change their usage behavior. Kowitz and Cranor note that the change in perception was likely due to the awareness display, since participants already knew that wireless traffic was visible to eavesdroppers. However, awareness was not tied to any actionable items, as the system did not suggest what steps one could take to protect oneself. A key design issue in awareness user interfaces is how to provide meaningful notifications that are neither overwhelming nor annoying. Good et al. showed that end-users typically skip over end-user license agreements [126]. Many users also ignore alert boxes in their web browsers, having become inured to them. Currently, there is no strong consensus in the research community or in industry as to how these kinds of user interfaces for awareness should be built. This issue is discussed as a key challenge for future work in Section 4.1.
For further reading, we suggest Brunk's overview of privacy and security awareness systems [54] and Lederer's examples of feedback systems for privacy events in the context of ubiquitous computing [195].

Interpersonal Awareness

An alternate use of the term "awareness" relates to the sharing of information about individuals in social groups to facilitate communication or collaboration. This type of sharing occurs, for example, in communication media, including videoconferencing [118, 269], group calendars [39, 287], and synchronous communications [40, 233]. One example of an awareness system is RAVE, developed in the late 1980s at EuroPARC [118]. RAVE was an always-on audio/video teleconferencing and awareness system. Based on the RAVE experience, Bellotti and Sellen wrote an influential paper presenting a framework for personal privacy in audio-video media spaces [43] (see Section 3.5.2). RAVE provided visible signals of the operation of the video camera to the people being observed, to compensate for the disembodiment of the observer-observed relationship. Moreover, Bellotti and Sellen also suggested leveraging symmetric communication to overcome privacy concerns. Symmetric communication is defined as the concurrent exchange of the same information in both directions between two individuals (e.g., both are observers and observed). Providing feedback about information flows and allowing their control is a complex problem.
Neustaedter and Greenberg's media space is a showcase of a variety of interaction techniques. To minimize potential privacy risks, they used motion sensors near a doorway to detect other people, weight sensors in chairs to detect the primary user, physical sliders to control volume, and a large physical button to easily turn the system on and off [222]. Hudson and Smith proposed obfuscating media feeds by using filters on the video and audio [152]. These filters include artificial shadows in the video image as well as muffled audio. While they did not evaluate these privacy-enhancing techniques, Hudson and Smith posited that privacy and usefulness had to be traded off to achieve an optimal balance. Boyle et al. also proposed video obfuscation to protect privacy for webcams in homes [49, 223]. However, evaluation by Neustaedter et al.
showed that obfuscation neither increased users' confidence in the technology nor their comfort level [224]. It is thus not clear whether obfuscation techniques, which are based on an information-theoretic view (i.e., disclosing less information increases privacy), actually succeed in assuring users that their privacy is better protected. The idea of blurring information was also proposed in the domain of location information [87, 242]. However, the results of Neustaedter et al. for video are paralleled by results by Consolvo et al. in location systems [65]. Consolvo et al. discovered that users disclosing location seldom make use of blurring (i.e., disclosing an imprecise location, such as the city instead of a street address), in part for lack of need and in part because of the increased burden on usability. Tang et al.
suggest using "Hitchhiking" as an alternative approach: rather than modulating the precision of location disclosures, the identity of the disclosing party, along with any sensed data, is anonymized [281]. This approach can still support a useful class of location-based applications, namely those that focus on places rather than on individuals. For example, a count of the number of wireless devices in a space could indicate how busy a coffee shop is. More recent work has investigated how a caller can assess the receiving party's availability to communicate, by providing information about the context of the called party. See, for example, Schmandt et al.'s Garblephone [259], Nagel's Family Intercom [219], and Avrahami et al.'s context cell phone protocol [32].
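The location "blurring" discussed above amounts to trading coordinate precision for privacy. As a crude, hypothetical sketch of the idea (deployed systems disclose named places such as a city, not rounded degrees, and the level names below are invented):

```python
def blur_location(lat, lon, level="city"):
    """Disclose coordinates at reduced precision; fewer decimal digits of
    latitude/longitude correspond to coarser, more private disclosures."""
    digits = {"exact": 5, "street": 3, "neighborhood": 2, "city": 1}
    return round(lat, digits[level]), round(lon, digits[level])

print(blur_location(48.85837, 2.29448, "street"))
print(blur_location(48.85837, 2.29448, "city"))
```

One decimal degree of latitude spans roughly 11 km, so the "city" setting hides which street or block the user is on while still supporting place-level services.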
Milewski and Smith included availability information in shared address books [210]. Schilit provides a survey of these kinds of context-aware communication, observing that increased awareness can be useful, though at the cost of privacy [258]. In fact, these systems have contradictory effects on privacy perceptions (Section 2.3.1). On the one hand, they can increase environmental privacy because the caller can choose not to disturb the recipient if she is busy. On the other hand, these awareness systems cause information about individuals to be communicated automatically and reduce plausible deniability. More recently, Davis and Gutwin surveyed disclosure preferences for awareness information. They asked individuals what types of awareness information they would disclose to seven different relationship types and found that most individuals would allow decreasing amounts of information to weaker relationships [81]. Yet, Nagel observed, based on extensive user studies, that individuals may not want to share availability information due to a perceived lack of usefulness of the caller having such information [217]. Nagel's results suggest that the utility and privacy costs of these systems are as yet unclear.

Shared Displays: Incidental Information and Blinding

A common problem encountered when several individuals are viewing the same computer screen is that potentially private information, such as bookmarks or financial information, may be accidentally disclosed. This issue may arise when multiple people use the same computer, when a laptop is projected onto a larger screen, or through "shoulder surfing," in which a bystander happens to see someone else's screen. Some on-the-road professionals apply a physical filter to their laptop screens. Blinding may help in these cases. Blinders are GUI artifacts that hide parts of the user interface to block view of sensitive information. Tarasewich and Campbell proposed using automatic blinders to protect personal data in web browsers [282]. Sensitive information is first identified using pattern recognition. This information can then be redacted with black rectangular blocks or encoded using a set of secret colors. Experimental results suggest that these techniques are surprisingly usable in everyday tasks. Similarly, Miller and Stasko used coded displays for sensitive information shown on semi-public peripheral displays [275].
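The recognize-then-redact step behind automatic blinders of the kind Tarasewich and Campbell describe can be sketched with simple pattern matching; the patterns below are illustrative placeholders, not their actual recognizers:

```python
import re

# hypothetical recognizers; a real blinder would use broader pattern sets
SENSITIVE = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),       # US-SSN-like numbers
    re.compile(r"\b(?:\d[ -]?){13,16}\b"),      # card-number-like digit runs
    re.compile(r"\b[\w.]+@[\w.]+\.\w+\b"),      # email addresses
]

def blind(text, block="\u2588"):
    """Replace each sensitive match with solid blocks of equal length,
    mimicking the black rectangles drawn by on-screen blinders."""
    for pat in SENSITIVE:
        text = pat.sub(lambda m: block * len(m.group()), text)
    return text

print(blind("Mail bob@example.com, SSN 123-45-6789"))
```

Keeping the redaction the same length as the original match preserves the page layout, which matters when the blinder is drawn over a live browser window rather than rewriting the text.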
In their InfoCanvas system, sensitive information such as stock quotes is depicted in a graphical, artful way (e.g., by a cloud hovering over a landscape), using a secret code. While not strong from a security standpoint, this technique may be acceptable for many deployment settings. Shoemaker and Inkpen developed an alternative approach for displaying private information on shared displays, using "blinding" goggles of the kind typically used for achieving stereoscopic 3D vision on traditional computer screens [264]. The display shows public data to all viewers and private data only to the users whose goggles are currently transparent. Ideally, a system would be able to quickly multiplex all these views on the same display. Shoemaker and Inkpen evaluated the system using a collaborative game and found it to be usable by the participants. They also claim that mixed shared/public displays could provide opportunities for enhanced collaboration, supporting both shared data and individual exploration and elaboration of the data. The proliferation of personal, semi-public, and public displays suggests that blinding and coding may become common techniques in the HCI privacy landscape.

Plausible Deniability, Ambiguity, and Social Translucency

Past work by Hindus et al. in the home [146] and by Hong for location-based services [149] suggested a social need to avoid potentially embarrassing situations, undesired intrusions, and unwanted social obligations. Plausible deniability has been recognized as a way of achieving a desired level of environmental and personal privacy in a socially acceptable way [295]. Ambiguity is the basis for plausible deniability in many communication systems. For example, Nardi et al. observed that people could ignore incoming instant messages without offending the sender, because the sender does not know for certain whether the intended recipient is there or not [220]. Consequently, failing to respond is not interpreted as rude or unresponsive. Traditionally, ambiguity has been considered a negative side effect of the interaction between humans and computers. Recently, however, researchers have recognized that ambiguity can be a resource for design instead of a roadblock. Gaver et al. claim that ambiguity not only provides a concrete framework for achieving plausible deniability, but also a way of enriching interpersonal communications and even games [117].
Several accounts of ambiguity in voice-based communication systems have been documented [28]. For example, the affordances of cell phones enable a social protocol that allows individuals sufficient leeway to claim not having heard the phone ring. Successful communication tools often incorporate features that support plausible deniability practices [139]. Researchers have attempted to build on the privacy features of traditional instant messaging by adding explicit controls on the availability status of the user, though with varying success. For example, Fogarty et al. [109] examined the use of contextual information, such as sound and location information, to provide availability cues in MyVine, a client that integrates phone, instant messaging, and email. Fogarty et al. discovered that users sent instant messages to their communication partners even when the system sensed those partners as busy. Fogarty attributes this behavior to a lack of accountability, in that telling senders that they should not have sent the message may be considered more impolite than the interruption itself. When plausible deniability mechanisms become explicit, they can lose much of their value. For example, the Lilsys system by Begole et al.
uses a traffic sign metaphor to warn others of one's unavailability for communication [40]. Begole et al. report that the traffic sign metaphor was not well liked by participants in a user study. More importantly, users expressed discomfort at being portrayed as unavailable. Begole et al. believe this discomfort was due to a social desire to appear approachable. Overall, this result suggests that people prefer the flexibility of ambiguity over a clear message that offers no such latitude. It is also worth noting that plausible deniability is at odds with a traditional view of security, defined as confidentiality, integrity, and availability [98]. Integrity and availability contrast with the idea that individuals should be granted a certain amount of unaccountability within information systems. Social science suggests, however, that plausible deniability is a fundamental element of social relations. Thus, plausible deniability should be viewed as a possible requirement for information technology, especially for artifacts meant to support communication between individuals and organizations. A related issue is that plausible deniability may inhibit social translucency, which has been touted as one of the characteristics that makes computer-mediated communication effective and efficient. Erickson and Kellogg define socially translucent systems as IT that "supports coherent behavior by making participants and their activities visible to one another" [94]. Plausible deniability may make it hard to hold other people accountable for their actions in such systems. A similar tension is explicitly acknowledged in the context of CSCW research by Kling [180] and was debated as early as 1992 at the CSCW conference [21]. It is currently not clear what the best way of balancing these two issues is. Social translucency is also discussed with respect to evaluation in Section 3.3.3. Finally, one must take into consideration the fact that users of computer-mediated communication systems often perceive more privacy than the technology really provides. For example, Hudson and Bruckman show that people have a far greater expectation of privacy in Internet Relay Chat than can realistically be provided given the design and implementation of IRC [151]. Thus, in addition to balancing plausible deniability with social translucency, designers must also consider users' expectations of those properties. We concur with Hudson and Bruckman that more research is necessary in this field. This point is raised again in the final part of this article.
Fostering Trust in Deployed Systems

The issue of trust in IT is a complex and vast topic, involving credibility, acceptance, and adoption patterns. Clearly, respecting the privacy of the user can increase trust in the system. The relationship also works in the opposite direction: if an application or web site is trusted by the user (e.g., due to a reputable brand), privacy concerns may be assuaged. In this section, we provide a brief overview of HCI research on technology and trust with respect to information privacy, both as a social construct and as a technical feature. Trust is a fundamental component of any privacy-affecting technology. Many PETs have been developed with the assumption that, once they were adopted, users would use IT services with increased trust [239]. One particularly interesting concept is that of trust distribution, where information processing is split up among independent, non-colluding parties [60]. Trust distribution can also be adapted to human systems, e.g., assigning two keys for a safe to two different managers. Social context is another factor impacting trust and privacy. Shneiderman discusses the generation of trust in CSCW systems [263], claiming that just as a handshake is a trust-building protocol in the real world, it is necessary to create new traditions and methods for computer-mediated communication.
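Trust distribution can be sketched with a simple XOR secret-splitting scheme, the digital analogue of the "two keys for a safe" example above (this is only an illustration of the splitting idea, not the Mix construction of [60]):

```python
import secrets

def split(secret: bytes, n: int = 2) -> list:
    """Split a secret into n XOR shares. Any subset smaller than all n
    parties learns nothing; only collusion of everyone reveals the secret."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]   # XOR of all shares equals the secret

def combine(shares: list) -> bytes:
    """Recover the secret by XOR-ing every share together."""
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

shares = split(b"location log key", 3)   # three non-colluding parties
assert combine(shares) == b"location log key"
```

Because each share in isolation is statistically indistinguishable from random bytes, no single party has to be trusted with the whole secret, which is precisely the property trust-distribution designs aim for.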
Management science has also explored the differences between meetings held face-to-face and those using some form of telepresence, such as a phone call or videoconference [41, 294]. These studies have generally concluded that for initial or intensive exchanges, in-person meetings are more effective at generating trust. An interesting example of how social context affects the operation of IT can be seen in an experimental "office memory" project at an Électricité de France research lab [189]. The employees developed and used an audio-video recording system that continuously archived everything that was said and done in the lab. Access to the recordings was open to all researchers of the lab. The application was used sparingly and generally perceived as useful. An interesting privacy control was that every access to the recordings was tracked, similar to the optimistic security protocol [241], and each individual was informed of the identity of the person looking up the recordings of her workstation.
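That audit-and-notify control can be sketched as follows; the class and names are invented for illustration and are not the lab's actual system:

```python
import datetime

class OptimisticArchive:
    """Optimistic-security sketch: access is never blocked up front.
    Instead, every retrieval is logged and the recorded person is told
    who looked, so social pressure deters misuse."""

    def __init__(self):
        self.recordings = {}   # workstation -> list of archived clips
        self.audit_log = []    # (timestamp, viewer, workstation)

    def retrieve(self, viewer, workstation):
        # log and notify first, then hand over the data unconditionally
        self.audit_log.append(
            (datetime.datetime.now(), viewer, workstation))
        self.notify(workstation, viewer)
        return self.recordings.get(workstation, [])

    def notify(self, workstation, viewer):
        print(f"[notice to {workstation}] {viewer} viewed your recordings")

archive = OptimisticArchive()
archive.recordings["alice-desk"] = ["clip-2005-09-11-1000"]
clips = archive.retrieve("bob", "alice-desk")
```

The design choice is to rely on accountability after the fact rather than access control before it, mirroring the existing privacy practices of a workplace where people notice who is looking over their shoulder.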
This feature reduced misuse by leveraging existing privacy practices. Leveraging the social context of the office memory application was essential for its acceptance. Acceptance would likely have been very different if the technology had been introduced from the outside, or to people who did not trust its management and operation. In fact, Want et al. reported resistance to the deployment of the Active Badge system roughly fifteen years earlier at Olivetti Research Labs [296]. It is also worth noting that many criticisms of the original work on ubiquitous computing at PARC came from researchers in a different lab than the one actually developing the systems [140]. Two explanations are possible. First, in some regards, the lab developing the ubiquitous computing systems was engaging in a form of participatory design with its own lab members, increasing adoption and overall acceptance. Second, some members of the other lab felt that the system was being imposed on them. Persuasiveness is an important factor influencing user perceptions of a technology's trustworthiness [110]. Given the power of perceptions in influencing decisions about privacy preferences, it should not be surprising that relatively weak signals, such as the mere presence of a privacy policy or a well-designed web site, can increase users' confidence with respect to privacy.
Privacy certification programs can increase user trust. There are various types of certification programs for privacy, targeting organizations as well as products (e.g., TRUSTe and BBBOnline). A good summary of these programs is provided by Antón and Earp [27]. Rannenberg proposed a more stringent form of certification [248]. The idea behind these efforts is that IT systems could be evaluated by independent underwriters and granted a certificate, which would promote the products in the marketplace and increase user confidence. This certification focuses on IT products. However, the management of IT is much more to blame for privacy infringements than the actual technical properties of the technology [155]. Iachello claims that sound personal information management practices should be included in security management standards such as IS17799 [161]. In summary, a variety of factors influence end-users' trust in a system.
In our opinion, however, strong brands and a positive direct experience remain the most effective ways of assuring users that sound organizational information privacy practices are being followed.

Personalization and Adaptation

Personalization and adaptation technologies can have strong effects on privacy. The tension here is between improving the user experience (e.g., recommendations) and collecting large amounts of data about user behavior (e.g., online navigation patterns). For example, Kobsa points out that personalization technologies may conflict with the privacy concerns of computer users, and with privacy laws that are in effect in many countries [183]. Furthermore, Kobsa and Schreck note that users with strong privacy concerns often take actions that can undermine personalization, such as providing false registration information on web sites [184]. Trewin even claims that control of privacy should take precedence over the use of personal information for personalization purposes, but acknowledges that such control may increase the complexity of the user interface [286]. Several solutions have been developed to protect users while offering personalized services. For example, Kobsa and Schreck propose anonymous personalization services [184].
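The gist of pseudonymous personalization can be sketched as follows (an illustration under our own assumptions, not Kobsa and Schreck's actual architecture): preferences are keyed to a random pseudonym rather than a real identity, so the service can adapt without knowing who the user is.

```python
import uuid

class PseudonymousProfile:
    """Sketch of pseudonymous personalization: the service keys observed
    preferences to a random pseudonym, never to a real identity."""
    def __init__(self):
        self.pseudonym = uuid.uuid4().hex    # random, unlinkable identifier
        self.interests = set()

    def observe(self, topic: str) -> None:
        self.interests.add(topic)            # stored under the pseudonym only

    def recommend(self, catalog: dict) -> list:
        # recommend items whose topic matches an observed interest
        return [item for item, topic in catalog.items()
                if topic in self.interests]

profile = PseudonymousProfile()
profile.observe("privacy")
recs = profile.recommend({"PETs survey": "privacy", "GPU guide": "hardware"})
assert recs == ["PETs survey"]
```

The privacy benefit depends entirely on the pseudonym remaining unlinked to the real identity, which is exactly the property that is hard to guarantee in practice.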
However, Cranor points out that these strong anonymization techniques may be too complex for commercial adoption [69]. Cranor also observes that privacy risks can be reduced by employing pseudonyms (i.e., associating the interaction with a persona that is indirectly bound to a real identity), client-side data stores (i.e., leveraging users' increased control over local data), and task-based personalization (i.e., personalization for one single session or work task). Notwithstanding Kobsa and Schreck's and Cranor's work, real-world experience tells us that many users are willing to give up privacy for the benefits of personalization. One need only look at the success of Amazon.com's recommender system as an example. Awad and Krishnan provide another perspective on this argument. Their survey probed users' views on the benefits of personalization and their preferences regarding data transparency (i.e., providing users access to the data that organizations store about them and to how it is processed) [33]. Awad and Krishnan concluded that users with the highest privacy concerns ("fundamentalists") would be unwilling to use personalization functions even with increased data transparency. They suggested focusing instead on providing personalization benefits to those users who are unconcerned or pragmatists, and ignoring concerned individuals.
Awad and Krishnan's article also includes a brief overview of the privacy literature in the MIS community. Trevor et al. discuss the issue of personalization in ubicomp environments [285]. They note that in these environments an increasing number of devices are shared between multiple users, which can cause incidental privacy issues. In their evaluation, Trevor et al. probed the personalization preferences of users of a ubiquitous document sharing system in an office setting. They discovered that privacy preferences depend not only on whether the personalized interface runs on a fixed terminal or a portable device, but also on its location and on its purpose of use. In summary, research in this area suggests that the issue of personalization and privacy is highly contextual and depends heavily on trust, interpersonal relations, and organizational setting. The evidence also suggests that users and marketers alike appreciate customized services. Finally, it is also not clear whether sophisticated PETs are commercially viable. Consequently, a normative approach to preventing misuse of personal information might be better advised.

Evaluation

In this section, we outline significant work either evaluating PETs or specifically probing the privacy characteristics of applications. Most PETs require advanced knowledge to use, are complex to configure and operate correctly, and ultimately fail to meet end-user needs. However, it is worth pointing out that there are also many examples of IT applications that successfully integrate privacy-enhancing functions, for example instant messaging clients and mobile person finders. While some researchers had pointed out the importance of user-centered design in security technology [317], only recently have the security and privacy communities started moving down this path. Unfortunately, since many security applications are developed commercially, the results of in-house usability tests, interviews, and heuristic evaluations are not available. User testing of the privacy-related aspects of applications is difficult for various reasons, including their non-functional nature and their prolonged appropriation curves. As a result, there are few reports available describing summative evaluation work on PETs and privacy-sensitive technologies.

Evaluation of User Interfaces

One of the earliest and most renowned papers discussing HCI issues and PETs was Whitten and Tygar's "Why Johnny Can't Encrypt" [310]. Whitten and Tygar reported on the usability of Pretty Good Privacy (PGP), a popular email encryption application [315]. They conducted a cognitive walkthrough and a lab-based usability test of PGP. In the usability test, experienced email users were asked to perform basic tasks, for example, generating keys and encrypting and decrypting emails. Results showed that a majority of users did not form a correct mental model of the public-key encryption process. Some users also made significant mistakes, such as sending unencrypted email, while others did not manage to send mail at all within the time limit. Friedman et al. have studied the user interfaces for web browsers' cookie handling in depth. Millett, Friedman, and Felten, for example, studied how the notification interfaces for cookies changed between 1995 and 2000 in both Netscape's and Microsoft's web browsers [211]. Expert analysis of UI metrics, including the depth of menu items for configuration and the richness of configuration options, showed that significant changes ensued over this five-year period. Configuration options were expanded, which Millett et al. consider a positive development. Further enhancements include better wording for configuration options and more refined cookie management (e.g., allowing users to delete individual cookies). Providing users more choice and better tools to express informed consent clearly comports with Value Sensitive Design [113]. However, the evaluation of PGP, discussed above, suggests that UI complexity is a fundamental drawback of these technologies and that PETs might be more effective with fewer, rather than more, choices. As noted in Section 3.2, systems should present meaningful choices rather than dilemmas. In related research, Whalen and Inkpen analyzed the usage of security user interfaces in web browsers, including the padlock icon that signals an HTTPS connection with a valid certificate [308]. Using eyetracker data, they found that while the lock icon was viewed by participants, the corresponding certificate data was not. In fact, participants rarely pulled up certificate information and stopped looking for security cues after they had signed into a site.
Complexity may again be a culprit here, considering that web browser certificate information dialogs are typically difficult to interpret for all but the most security-savvy users. The same theme of configuration complexity emerges from Good et al.'s work on the privacy implications of KaZaA, a popular file-sharing network [127]. Good et al. performed a cognitive walkthrough of the KaZaA client as well as a laboratory study of its user interface. Results showed that a majority of participants were unable to tell what files they were sharing, and some even thought that they were not sharing any files while in fact all files on their hard drive were shared. Good et al. also probed the KaZaA network, finding that a large number of users appeared to be unwittingly sharing personal and private files, and that some users were "[...] downloading files containing ostensibly private information." In summary, Whitten and Tygar's, Whalen and Inkpen's, and Good et al.'s findings all indicate that privacy-affecting technologies are easily misunderstood and that their safe use is not obvious. Difficulties in comprehension affect not only PETs but also privacy policies. Jensen and Potts analyzed sixty-four privacy policies of both high-traffic web sites and web sites of American health-related organizations (thus subject to HIPAA) [168]. They analyzed policy features including accessibility, writing, content, and evolution over time. The results portray a rather dismal situation. While policies are generally easy to find, they are difficult to understand.
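Readability of this kind is commonly quantified with formulas such as the Flesch Reading Ease score, which penalizes long sentences and polysyllabic words. A rough sketch follows (the syllable counter is a crude heuristic, and we do not claim this is the exact method used in [168]):

```python
import re

def count_syllables(word: str) -> int:
    # crude heuristic: count groups of consecutive vowels
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease: 206.835 - 1.015*(words/sentence)
    - 84.6*(syllables/word). Higher scores mean easier reading."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n = max(1, len(words))
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

simple = flesch_reading_ease("We keep your data safe. We do not sell it.")
legal = flesch_reading_ease(
    "Notwithstanding the aforementioned stipulations, personally "
    "identifiable information may be disseminated to affiliated entities.")
assert simple > legal   # plain language scores far higher than legalese
```

Scores below roughly 30 are usually read as "very difficult," the range into which dense legal prose typically falls.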
The surveyed policies were in general too complex, from a readability standpoint, to be usable by a large part of the population, which Jensen and Potts note also calls their legal validity into question. Furthermore, the user herself is typically responsible for tracking any changes to policies, thus curtailing effective notification. The policies of some web sites were very old, exposing both users and site operators to potential risks (respectively, unauthorized uses of personal information and legal liability). Finally, Jensen and Potts note that users typically do not have the choice of declining the terms of the policy if they want to use the service. In short, the resulting picture is not encouraging. Users may well be responsible for not reading privacy policies [126], but even if they did read them, they would find it difficult to understand them, track them over time, and resist accepting them. The evaluation of privacy-sensitive IT applications has also extended to off-the-desktop interaction. For example, Beckwith discusses the challenges of evaluating ubiquitous sensing technologies in assisted living facilities [38]. Beckwith deployed an activity sensing and location tracking system in a facility for elderly care, and evaluated it using semiformal observation and unstructured interviews with caregivers, patients, and their relatives. One question that arose was how users can express informed consent when they do not understand the operation of the technology or are not aware of it.
Beckwith's observations highlight the users' lack of understanding with respect to the recipient of the data and its purpose of use. Beckwith proposed renewing informed consent on a regular basis through "jack-in-the-box" procedures, an approach that resembles the Just-In-Time Click-Through Agreements of Patrick and Kenny [235]. In conclusion, existing work evaluating privacy-affecting technologies shows that these technologies are too demanding on users [95]. Besides establishing common practices and safe defaults, we need to define appropriate metrics of user understanding and ability to express consent, and consistently try to improve on them over time.

Holistic Evaluation

In addition to basic usability, applications must also be evaluated in their overall context of use. One key aspect of holistic evaluation is understanding the social and organizational context in which an application is deployed, because it can affect acceptance and skew the results of an evaluation (e.g., Keller et al.'s analysis of the privacy issues of electronic voting machines [177]). This kind of analysis is often done with retrospective case studies and controlled deployments of prototypes [53], but it is challenging due to the long timeframe of the evaluation and the complexity of the data collection methods. One interesting example of how social context affects the acceptance of privacy-sensitive IT is provided by the office memory project developed at the Laboratory of Design for Cognition at Électricité de France [189], discussed in Section 3.2.9. Here, the social context was essential for acceptance: the users were by and large the builders of the application. It is likely that acceptance would have been much lower in another setting. For example, as noted in Section 3.3.9, there was much resistance to the deployment of the Active Badge system [296] outside of the group that developed it [140]. Perceptions of individual autonomy, political structures, and group tensions all contributed to the rejection of a technology that was perceived as invasive. Similarly, in hospitals, locator badges are used to facilitate coordination and protect nurses from spurious patient claims.
However, in many cases these locator badges have led to increased friction between workers and employers, because they were perceived by nurses as a surreptitious surveillance system [22]. In at least two separate cases, nurses outright refused to wear the locator badges [22, 59]. In cases where the value proposition was clear to the nurses using the system, and where management respected the nurses, the system was accepted. In cases where the value proposition was not clear, or where the system was seen as not directly helping the nurses, it tended to exacerbate existing tensions between staff and management. A second contentious social issue with respect to privacy-invasive systems is adjudication, that is, whose preferences should prevail in situations where part of the user base favors a technology and part opposes it. Although a general discussion is beyond the scope of this article, one interesting comment is made by Jancke et al. in the context of a video awareness system [165]. Jancke et al. note that what is commonly considered a public space is not one-dimensionally so. A vocal minority of their users were unsettled by an always-on system linking two public spaces.
These users felt that many private activities took place in that public space, such as personal phone calls, eating lunch, and occasional meetings, and that the private nature of this public space was being subverted. Before the video awareness system was deployed, there was a degree of privacy based on the proxemics of the space. When computer-mediated communication technologies were introduced, however, such privacy was destroyed, because individuals could not easily see who was present at the other end of the system. This shows that a legal or technical definition of public space often does not align with people's expectations. A third key aspect of holistic evaluation stems from the observation that privacy and security features are often appropriated late in the learning curve of an application [157], often after some unexpected security or privacy incident. Forcing participants to use privacy-related features can speed up the evaluation, but may be detrimental because the participants' attention is focused on a specific feature instead of the whole application. Thus, the evaluation of privacy and security through test deployments requires researchers to engage in the observation of prolonged and continued use. For example, Ackerman et al. performed a field study of an audio media space over the course of two months [12]. Their system provided an always-on audio communication link between remote co-workers.
Users' experiences were studied through interviews, transcripts of communications, usage logs, and direct observation [145]. Ackerman et al. report the gradual emergence of social norms regulating the use of the space by group members. Users started ignoring disclosures by other users that were clearly personal in nature and had been transmitted through the system by mistake, perhaps because one party had forgotten to turn off the media space before a sensitive conversation. Cool et al. also discuss the long-term evaluation of a videoconferencing system developed at Bellcore during the 1990s [66]. The system started out as an always-on link between public spaces and evolved into a personal videoconferencing system on personal workstations. Cool et al. observed four issues with their videoconferencing systems: system drift (system use and norms evolve over time), conflicting social goals of individual users within the social system, concerns of social appropriateness and evaluation, and reaching a critical mass of users. Cool et al. point out that test implementations should be as complete and robust as possible, i.e., real products, if credible observations of social behavior are sought. Studies should also extend over a long timeframe to support conclusions about a system's acceptance. Finally, technology must be evaluated in the context of planned use rather than in a laboratory. Cool et al.'s work leads to a final aspect of holistic evaluation, namely that it can be difficult to gather data on the privacy-sensitive aspects of IT applications.
First, privacy and security are non-functional properties, which may not be obvious to the user and might not be visible in the UI. Second, case studies on privacy and security are often hampered by the lack of public knowledge of failures or successes. Third, concerns of social appropriateness can affect perceptions as well as cause tensions in collaborative environments, all of which can affect observations. These factors suggest that, to interpret observations correctly, researchers must take a broad view of the application and its perceived properties. Only through careful observation will user privacy concerns and perceptions emerge from product evaluations.

The Tension between Transparency and Privacy

In Section 3.2.8, we briefly touched on the tension between privacy and social transparency. One of the goals of CSCW research is to increase communication opportunities through technology. However, increased transparency, e.g., in the form of awareness of others' activities, can conflict with an individual's need for autonomy and solitude, with detrimental effects on organizational effectiveness. To a degree, these tensions have always existed, but Grudin points out that electronically collecting and distributing data about individuals significantly increases the risk of undesired uses [132]. The point of this section is to show that the tension between transparency and privacy is subtle, and that simple design features can often make the difference between acceptance and rejection of a system. Groupware calendars provide a prime example of this tension. Two obvious advantages of group calendars are more effective planning and better access to colleagues. However, these advantages also impact users' personal space and work time.
Palen describes the prolonged use of a groupware calendar system within a large organization, based on observations and expert analysis [231]. She points out that technological infrastructure can curb risks by making misuse too expensive in the face of the potential gains. She identifies three techniques to achieve this goal. First, Palen proposes limiting "calendar surfing," that is, accessing others' calendar information without a specific need and without knowledge of that person. Second, privacy controls should be reciprocal, meaning that social subgroups share the same type of information in a symmetric way. Finally, social anonymity helps prevent systematic misuse. Palen notes that calendars were retrieved based on a specific employee name. Consequently, while any employee could in theory access any other employee's calendar, this rarely happened, since an employee would only know the names of a limited number of people in the company. Tullio discusses a groupware calendar used to predict other users' availability, for purposes of initiating in-person or mediated communication [287]. In addition to a qualitative analysis, Tullio performed an expert analysis of his groupware calendaring application using Jensen's STRAP method and identified several potential privacy vulnerabilities, including prediction accuracy, consent, and notification. Tullio also notes that in these kinds of systems, concerns arise for both "[...] controlling access as well as presenting a desired impression to others."
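Palen's reciprocity principle can be sketched as a symmetric access rule (our own illustrative code, not Palen's system): a viewer sees another user's calendar only at the level of detail they themselves disclose to that person.

```python
class ReciprocalCalendar:
    """Reciprocal privacy controls: effective access is the minimum of
    what each party discloses to the other, so sharing stays symmetric."""
    LEVELS = ("none", "free-busy", "full")   # orderered from least to most

    def __init__(self):
        self.shared = {}                     # (owner, viewer) -> level

    def share(self, owner: str, viewer: str, level: str) -> None:
        assert level in self.LEVELS
        self.shared[(owner, viewer)] = level

    def visible_level(self, viewer: str, owner: str) -> str:
        granted = self.shared.get((owner, viewer), "none")
        reciprocal = self.shared.get((viewer, owner), "none")
        # you only see as much as you disclose in return
        return min(granted, reciprocal, key=self.LEVELS.index)

cal = ReciprocalCalendar()
cal.share("alice", "bob", "full")        # Alice opens up fully to Bob
cal.share("bob", "alice", "free-busy")   # Bob shares less
assert cal.visible_level("bob", "alice") == "free-busy"
assert cal.visible_level("alice", "bob") == "free-busy"
```

The symmetry removes the incentive for one-sided surveillance: raising your view of a colleague requires exposing yourself to the same degree.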
These dynamics are related to Goffman's work on the presentation of self, and to the concept of personal privacy we outlined in Section 2.2.2. An explicit analysis of varying degrees of social transparency is encompassed in Erickson et al.'s work on socially translucent systems [94]. In socially translucent systems, the overall goal is to increase awareness and communication opportunities by presenting information about others' activities. These systems are translucent because they present only select aspects of activity, as opposed to being transparent and presenting all aspects [51]. Erickson et al. developed Babble, a chat system that allows one-to-one and group communication. Babble stores a persistent, topic-threaded copy of the chats, and offers a graphical representation of users that provides awareness of their activity within the chat system. The system was used for over two years within the authors' research organization. Thus, observations of Babble's use were grounded in an extensive deployment that saw both adoption successes in some groups and failures in others. The authors report that the system was often used to initiate opportunistic interactions, and contributed to increasing group awareness while preserving a sufficient degree of privacy for the involved parties. One interesting aspect of Erickson et al.'s work is that they claim to have willfully refrained from building norms and social conventions into the UI and system architecture.
For example, Babble did not provide specific tools for protecting privacy, expecting instead that users would develop their own acceptable behaviors and norms around the system. They argue that this did indeed happen. In fact, Erickson et al. go as far as stating that building such privacy-protecting mechanisms would have prevented users from showing one another that they could be trusted in their use of the system, a process that strengthened rather than weakened the social bonds within the organization [94]. Clearly, such an approach is possible only in specific contexts, which should be carefully evaluated by the designer. In many cases, though, privacy-enhancing features cannot be avoided. However, simple privacy precautions are often sufficient. An example is provided by Grasso and Meunier's evaluation of a smart printing system deployed at Xerox R&D France [128]. Their printing system has two main functions: it stores printed jobs on the print server for future access, and it has an "affinity" function that shows, on the header page of each print job, information about similar print jobs submitted by other users. The objective of the latter function is to enable social networking among people interested in the same type of information. Grasso and Meunier claim that the simple privacy-enhancing features built into the system are sufficient for preventing abuse. First, users must intentionally use the smart printer; regular printers remain available.
Second, a "forget" function is available that removes any trace of the print history of a specific user. In conclusion, the examples above show that the interaction between social norms and technology is often subtle. Privacy by obscurity, as in Palen's case study, can effectively curtail privacy violations, even if it is not a strong mechanism. Erickson et al.'s work suggests that technology should leverage, rather than mechanically reproduce, social norms. Finally, designers should remember that simple UI features are often sufficient to curtail misuse, as Grasso and Meunier's experience shows.

Privacy Frameworks

Unlike other areas of HCI, there are few widely accepted frameworks for privacy, due to the elusiveness of privacy preferences and the technical hurdles of applying guidelines to specific cases. In this section, we discuss some of the frameworks that have been proposed to analyze and organize privacy requirements, and we note the benefits and drawbacks of each (see Table 2). Design frameworks relevant to HCI researchers and practitioners can be roughly grouped into three categories.
These include guidelines, such as the aforementioned Fair Information Practices [230]; process frameworks, such as Jensen's STRAP [170] or Bellotti and Sellen's Questions Options Criteria (QOC) process [43]; and modeling frameworks, such as Jiang et al.'s Approximate Information Flows [172]. These frameworks are meant to provide guidance for analysis and design. However, it should be noted that few of these frameworks have been validated. By validation, we mean a process that provides evidence of the framework's effectiveness in solving the design issues at hand by some metric, for example design time, quality of the overall design, or comprehensiveness of the requirements analysis. In most cases, these frameworks were derived from application practice in related fields or from the authors' experience. This lack of validation partially explains why many frameworks have not been widely adopted. Indeed, case studies have been better received.
Nevertheless, the issue of knowledge reuse in HCI is pressing [278], and accounts of single applications are not an efficient way of communicating knowledge. We believe that research on privacy can greatly benefit from general guidelines and methods, provided they are thoroughly tested and validated, and provided practitioners and researchers use them with an understanding of their performance and limitations. In fact, we suggest in the conclusion that the development of a "privacy toolbox" composed of several complementary techniques is one of the main research challenges of the field.

Privacy Guidelines

Privacy guidelines are general principles or assertions that can be used as shortcut design tools to: identify general application requirements prior to domain analysis; evaluate alternative design options; and suggest prepackaged solutions to recurrent problems.

Table 2.
Overview of HCI Privacy Frameworks.

Framework Name | Scope | Data Protection vs. Personal Privacy | Principled vs. Communitarian | Pros | Cons

Guidelines
FIPS | Basic personal data management principles | Data Protection | Principled | Simple; popular | System-centered; does not consider value proposition
Design Patterns (Chung) | Ubiquitous computing | Personal | Principled | Easy to learn | Mismatch with design

Process Frameworks
QOC (Bellotti) | Questions and criteria for evaluating designs | Personal | Principled | Simple | Limited to video media spaces
Risk Analysis (Hong) | Ubiquitous computing | Neutral | Communitarian | Clear checklists | Difficult to evaluate risk
Privacy Interface Analysis | Web applications | Data Protection | Principled | Good rationale | Complex to apply
Proportionality | Value proposition balanced with privacy risk | Neutral | Communitarian | Lightweight; used in related communities; explicit balance | Demands in-depth analysis
STRAP | Privacy analysis based on goal analysis | Neutral | Neutral | Lightweight | Goal-driven; may ignore non-functional issues

Modeling Frameworks
Economic Frameworks | Models of disclosure behaviors | Data Protection | Communitarian | Simple economic justification; compatible with risk-reduction cost metrics | Frail assumptions about user behavior
Approximate Information Flows | Model of information flows | Data Protection | Principled | Comprehensive framework | Frail assumptions; incompatible with data protection law
Multilateral Security | General model | Neutral | Communitarian | Explicit balance | Lack of process model

Fair Information Practices

Based on work by Westin in the early 1970s, the Fair Information Practices (FIPS) are among the earliest guidelines, and they were influential on almost all data protection legislation. The FIPS were developed specifically to help design large databanks of personal information, such as health records, financial databases, and government records (Table 3). The FIPS are the only framework that has been used extensively in industry and by regulatory entities.
Data Protection Authorities (DPAs) use these guidelines to analyze specific technologies [99, 101]. The Working Party bases its analyses on a case-by-case application of the FIPS, along with other principles such as legitimacy and proportionality. The FIPS have also been adapted over time to novel technologies [116, 191] and processes (Privacy Incorporated Software Agents) [235]. However, it should be noted that since the FIPS were developed in the context of large databases of personal information held by institutions, they adopt a data protection, systems-centered viewpoint that may not be appropriate for other applications. The FIPS only suggest evaluating whether data collection is commensurate with the goal of the application.
In other words, the FIPS are applicable once the general structure of the planned system has been established, but they may fail an analyst in understanding whether an application is useful, acceptable to its stakeholders, and commensurate to its perceived or actual unwanted impact. These factors hint at two situations where the FIPS may be difficult to apply. The first is cases where technology mediates relationships between individuals (i.e., personal privacy, see Section 2.2.2), as opposed to between individuals and organizations. The second is cases where the data is not structured and application purposes are ill-defined (e.g., exploratory applications).

Table 3. The Fair Information Practices (FIPS), OECD version.

Collection Limitation: There should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.
Data Quality: Personal data should be relevant to the purposes for which they are to be used, and, to the extent necessary for those purposes, should be accurate, complete and kept up-to-date.
Purpose Specification: The purposes for which personal data are collected should be specified not later than at the time of data collection and the subsequent use limited to the fulfillment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.
Use Limitation: Personal data should not be disclosed, made available or otherwise used […] except: a) with the consent of the data subject; or b) by the authority of law.
Security Safeguards: Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorized access, destruction, use, modification or disclosure of data.
Openness: There should be a general policy of openness about developments, practices and policies with respect to personal data.
Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller.
Individual Participation: An individual should have the right: a) to obtain from a data controller, or otherwise, confirmation of whether or not the data controller has data relating to him; b) to have communicated to him, data relating to him within a reasonable time; at a charge, if any, that is not excessive; in a reasonable manner; and in a form that is readily intelligible to him; c) to be given reasons if a request made under subparagraphs (a) and (b) is denied, and to be able to challenge such denial; and d) to challenge data relating to him and, if the challenge is successful, to have the data erased, rectified, completed or amended.
Accountability: A data controller should be accountable for complying with measures which give effect to the principles stated above.

Table 4. Privacy Guidelines for Social Location Disclosure Applications and Services [157].
Flexible Replies: Users should be able to choose what the system discloses as a reply to a location request.
Support Denial: Communication media should support the ability to ignore requests.
Support Simple Evasion: Designs should include the ability of signaling "busy" as a baseline evasive reply.
Don't Start With Automation: Automatic functions that communicate on behalf of the user should not be introduced by default, but only when a real need arises.
Support Deception: Communication media should support the ability to deceive in the reply.
Start with Person-to-Person Communication: Social mobile applications should support person-to-person communication before attempting group communication.
Provide Status / Away Messages: Provide a way of signaling availability status.
Operators Should Avoid Handling User Data: Social location disclosure applications should not be provided by centralized services.

Guidelines for Ubiquitous Computing and Location-Based Services

In addition to general principles, specific guidelines have also been proposed for more limited application domains. For example, Lederer et al. [196] observed that, in the context of ubiquitous computing applications, successful designs must: make both potential and actual information flows visible; provide coarse-grain control; enable social nuance; and emphasize action over configuration. These guidelines originate from qualitative reflection on the researchers' experience. Guidelines with even more limited scope are available as well. For example, Iachello et al.
proposed eight specific guidelines for the development of social location disclosure applications [157] (Table 4).

Design Patterns for Privacy

Design patterns are somewhat related to guidelines. The concept of patterns originates from work by Alexander [20], and it was later used in the context of software design [115]. One key difference between guidelines and patterns is that patterns are meant to be generative, helping designers create solutions by re-purposing existing solutions, whereas guidelines tend to be higher-level and not tied to specific examples. Both Junestrand et al. [173] and Chung et al. [61] developed design patterns to solve common privacy problems of ubicomp applications. The patterns developed by Chung et al. are listed in Table 5 and are inspired by a combination of the FIPS, HCI research, and security research.
While Chung et al.'s patterns are relatively high-level (e.g., Building Trust and Credibility, Fair Information Practices), Junestrand et al.'s are application-specific. Chung et al. evaluated their patterns using a design exercise with students and experienced designers. The authors observed that the privacy patterns were not used in any meaningful way by the participants, and expert reviewers did not judge the designs produced with the patterns to be any better than the others [61]. Several explanations are plausible, including limitations of the experimental setup and the fact that privacy is often a secondary concern for designers.

Table 5. Privacy Pre-Patterns [61].

Fair Information Practices: The Fair Information Practices are a set of privacy guidelines for companies and organizations for managing the personal information of individuals.
Respecting Social Organizations: If [members of] the organization […] [do] not trust and respect one another, then the more intimate the technology, the more problems there will likely be.
Building Trust and Credibility: Trust and credibility are the foundation for an ongoing relationship.
Reasonable Level of Control: Curtains provide a simple form of control for maintaining one's privacy while at home.
Appropriate Privacy Feedback: Appropriate feedback loops are needed to help ensure people understand what data is being collected and who can see that data.
Privacy-Sensitive Architectures: Just as the architecture of a building can influence how it is perceived and used, the architecture of a ubiquitous computing system can influence people's perceptions of privacy and, consequently, how they use the system.
Partial Identification: Rather than requiring precise identity, systems could just know that there is "a person" or "a person that has used this system before."
Physical Privacy Zones: People need places where they feel that they are free from being monitored.
Blurred Personal Data: […] Users can select the level of location information disclosed to web sites, potentially on a page-by-page basis.
Limited Access to Personal Data: One way of managing your privacy with others is by limiting who can see what about you.
Invisible Mode: Invisible mode is a simple and useful interaction for hiding from all others.
Limited Data Retention: Sensitive personal information, such as one's location and activity, should only be kept as long as needed and no longer.
Notification on Access of Personal Data: AT&T Wireless' Find Friends service notifies your friend if you ask for her location.
Privacy Mirrors: Privacy mirrors provide useful feedback to users by reflecting what the system currently knows about them.
Keeping Personal Data on Personal Devices: One way of managing privacy concerns is to store and present personal data on a personal device owned by the user.

The lack of an established design practice and knowledge is an inherent problem with applying design patterns to privacy-sensitive applications. Chung et al. acknowledged that design patterns may be premature in the ubicomp domain. An argument could be made that, in situations of exploratory and uncertain design, only thorough analysis on a case-by-case basis can provide strong arguments for an application's acceptability.
Process Frameworks

While guidelines are ready-made parcels of analysis and solutions to common problems, the process frameworks described in this section provide guidance to designers on how to approach the analysis and design of privacy-sensitive IT applications.

Questions Options Criteria

Media spaces combine audio, video, and computer networking technology to provide a rich communicative environment for collaboration (see Sections 3.1.5 and 3.2.6). Bellotti and Sellen published early work on privacy in the context of video media spaces, based in part on the experience of the RAVE media space at EuroPARC [43].

Table 6. Questions and evaluation criteria for video media spaces [43].

Capture. Feedback about: When and what information about me gets into the system. Control over: When and when not to give out what information; I can enforce my own preferences for system behaviours with respect to each type of information I convey.
Construction. Feedback about: What happens to information about me once it gets inside the system. Control over: What happens to information about me; I can set automatic default behaviours and permissions.
Accessibility. Feedback about: Which people and what software (e.g., daemons or servers) have access to information about me and what information they see or use. Control over: Who and what has access to what information about me; I can set automatic default behaviours and permissions.
Purposes. Feedback about: What people want information about me for.
Since this is outside of the system, it may only be possible to infer purpose from construction and access behaviours. Control over: It is infeasible for me to have technical control over purposes; with appropriate feedback, however, I can exercise social control to restrict intrusion and unethical and illegal usage.

Evaluation criteria:
Trustworthiness: Systems must be technically reliable and instill confidence in users.
Appropriate timing: Feedback should be provided at a time when control is most likely to be required.
Perceptibility: Feedback should be noticeable.
Unobtrusiveness: Feedback should not distract or annoy.
Minimal intrusiveness: Feedback should not involve information which compromises the privacy of others.
Fail-safety: The system should minimise information capture, construction and access by default.
Flexibility: Mechanisms of control over user and system behaviours may need to be tailorable.
Low effort: Design solutions must be lightweight to use.
Meaningfulness: Feedback and control must incorporate meaningful representations.
Learnability: Proposed designs should not require a complex model of how the system works.
Low cost: Naturally, we wish to keep costs of design solutions down.

Bellotti and Sellen developed a framework for addressing personal privacy in media spaces. According to their framework, media spaces should provide appropriate feedback and control structures to users in four areas (Table 6). Feedback and control are described by Norman as basic structures in the use of artifacts [227], and they are at the base of the Openness and Participation principles in the FIPS. Bellotti and Sellen adapted MacLean et al.'s Questions, Options, Criteria framework [203] to guide their privacy analysis process. They proposed evaluating alternative design options based on eight questions and eleven criteria, derived from their own experience and from other sources (see Table 6). Some criteria are closely related to security evaluation (such as trustworthiness), while others try to address the human cost of security mechanisms. Bellotti and Sellen's criteria are similar to those of Heuristic Evaluation [226], a well-known discount usability technique for evaluating user interfaces. The evaluation of alternatives is common to several privacy frameworks, and it is characteristic of design methods targeted at tough design problems that do not enjoy an established design practice. Bellotti and Sellen do not provide guidance on how to develop design options, acknowledging the complex nature of the design space. However, one could imagine a pattern language such as Chung et al.'s providing such design options.

Table 7. Ubicomp Privacy Risk Analysis Questions [149].

Social and Organizational Context:
- Who are the users of the system? Who are the data sharers, the people sharing personal information? Who are the data observers, the people that see that personal information?
- What kinds of personal information are shared? Under what circumstances?
- What is the value proposition for sharing personal information?
- What are the relationships between data sharers and data observers?
- What is the relevant level, nature, and symmetry of trust?
- What incentives do data observers have to protect data sharers' personal information (or not, as the case may be)?
- Is there the potential for malicious data observers (e.g., spammers and stalkers)? What kinds of personal information are they interested in?
- Are there other stakeholders or third parties that might be directly or indirectly impacted by the system?

Technology:
- How is personal information collected? Who has control over the computers and sensors used to collect information?
- How is personal information shared? Is it opt-in or is it opt-out (or do data sharers even have a choice at all)? Do data sharers push personal information to data observers? Or do data observers pull personal information from data sharers?
- How much information is shared? Is it discrete and one-time? Is it continuous?
- What is the quality of the information shared? With respect to space, is the data at the room, building, street, or neighborhood level? With respect to time, is it real-time, or is it several hours or even days old? With respect to identity, is it a specific person, a pseudonym, or anonymous?
- How long is personal data retained? Where is it stored? Who has access to it?

Table 8. Risk Management Questions [149].

Managing Privacy Risks:
- How does the unwanted disclosure take place? Is it an accident (for example, hitting the wrong button)? A misunderstanding (for example, the data sharer thinks they are doing one thing, but the system does another)? A malicious disclosure?
- How much choice, control, and awareness do data sharers have over their personal information? What kinds of control and feedback mechanisms do data sharers have to give them choice, control, and awareness?
- Are these mechanisms simple and understandable? What is the privacy policy, and how is it communicated to data sharers?
- What are the default settings? Are these defaults useful in preserving one's privacy?
- In what cases is it easier, more important, or more cost-effective to prevent unwanted disclosures and abuses? To detect disclosures and abuses?
- Are there ways for data sharers to maintain plausible deniability?
- What mechanisms for recourse or recovery are there if there is an unwanted disclosure or an abuse of personal information?

Risk Analysis

Risk management has long been used to prioritize and evaluate risks and to develop effective countermeasures. The use of risk analysis is less common in the HCI and Human Factors communities, although it has been employed to evaluate risks in systems where humans and computers interact, e.g., in aviation [221]. However, only recently have risk analysis models been developed in the HCI literature specifically to tackle privacy issues in IT. Hong et al. proposed using risk analysis to tackle privacy issues in ubicomp applications [149]. Their process enhances standard risk analysis by providing a set of social and technical questions to drive the analysis, as well as a set of heuristics to drive risk management. The analysis questions, shown in Table 7, are designed to elicit potential privacy risks for ubicomp applications. The authors propose a semi-quantitative risk evaluation framework, suggesting that the designer act upon each identified risk if the standard risk equation C < L × D is satisfied, that is, if the cost C of adequate protection is lower than the likelihood L of an unwanted disclosure multiplied by the damage D that would result.
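As a rough illustration of this heuristic, the following sketch prioritizes risks by the C < L × D rule. The risk names and all figures are invented for illustration and are not drawn from Hong et al.'s paper:

```python
# Hypothetical sketch of a semi-quantitative C < L * D check: act on a
# risk when the cost of protecting against it is lower than the expected
# damage (likelihood times damage). All values below are invented.

def risks_worth_mitigating(risks):
    """Return risks where C < L * D, sorted by expected damage."""
    actionable = [r for r in risks if r["C"] < r["L"] * r["D"]]
    return sorted(actionable, key=lambda r: r["L"] * r["D"], reverse=True)

risks = [
    # C: cost of adequate protection; L: likelihood of unwanted
    # disclosure (0..1); D: damage if the disclosure occurs
    {"name": "location leak to strangers", "C": 500, "L": 0.2, "D": 10000},
    {"name": "co-worker sees calendar",    "C": 800, "L": 0.5, "D": 200},
]

for r in risks_worth_mitigating(risks):
    print(r["name"])  # only the first risk satisfies C < L * D
```

In this invented example, the location leak (expected damage 2000, protection cost 500) is worth mitigating, while the calendar risk (expected damage 100, protection cost 800) is not.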
To evaluate the components of this formula, a set of risk management questions is used, listed in Table 8. One important point of Hong et al.'s framework is that it requires the designer to evaluate the motivation and cost of a potential attacker who would misuse personal information. This economic aspect of misuse is important because it can help in devising a credible risk evaluation strategy, and it reflects the implicit assumptions of the analyses performed by regulatory entities. Although risk analysis is a fundamental component of security engineering, many aspects of design in this domain cannot be easily framed in quantitative terms, and a qualitative approach may be necessary. Moreover, quantitative approaches may prove misleading if they fail to consider user perceptions and opinions [50]. An interesting qualitative approach to risk analysis for ubicomp is provided by Hilty et al. [144]. They suggest using a risk analysis process based on risk screening and risk filtering. In the screening phase, an expert panel identifies relevant risks for a given application (thus using the experts' experience directly, instead of checklists like Hong et al.'s). In the filtering phase, experts prioritize risks according to several criteria that respond to the precautionary principle.
According to the precautionary principle, risk management should be driven by making the social system more adaptive to surprises [181]. Hilty et al. suggest filtering risks through a qualitative prioritization based on the following criteria [144]:
- Socioeconomic irreversibility (Is it possible to restore the status before the effect of the technology has occurred?)
- Delay effect (Is the time span between the technological cause and the negative effect long?)
- Potential conflicts, including voluntariness (Is exposure to the risk voluntary?) and fairness (Are there any externalities?)
- Burden on posterity (Does the technology compromise the possibilities of future generations to meet their needs?)
The authors used this framework to analyze the social and technical risks of ubicomp technologies, including their social and environmental impact. However, while their heuristics are adequate for analyzing large-scale social risks, they may not be adequate for risks arising at the interpersonal level.
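As a minimal sketch of how such a filtering phase might be organized (the risks, boolean encoding of the criteria, and resulting ranking below are invented, not taken from Hilty et al.), screened risks can be marked against the four precautionary criteria and ranked by how many they trigger:

```python
# Hypothetical sketch of precautionary risk filtering: each risk from
# the screening phase is marked against the four criteria, and risks
# triggering more criteria are prioritized. The risks and their
# markings are invented for illustration.

CRITERIA = ["irreversibility", "delay_effect",
            "potential_conflicts", "burden_on_posterity"]

def filter_risks(risks):
    """Rank risks by how many precautionary criteria they trigger."""
    return sorted(risks, key=lambda r: sum(r[c] for c in CRITERIA),
                  reverse=True)

risks = [
    {"name": "pervasive location tracking", "irreversibility": True,
     "delay_effect": True, "potential_conflicts": True,
     "burden_on_posterity": False},
    {"name": "spam on public displays", "irreversibility": False,
     "delay_effect": False, "potential_conflicts": True,
     "burden_on_posterity": False},
]

for r in filter_risks(risks):
    print(r["name"])  # tracking (3 criteria) ranks above spam (1)
```

In practice, of course, the experts' judgment is qualitative; the point of the sketch is only the two-stage screen-then-filter structure.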
Furthermore, even qualitative risk analysis may be inadequate, because security and privacy design decisions interact with issues that cannot be modeled as risks, both internal (e.g., application usefulness) and external (e.g., regulatory requirements), as pointed out in work by Hudson and Smith [152] and Barkhuus and Dey [35].

Functionality- and Goal-Oriented Analysis

One of the difficulties in identifying privacy requirements is that they are often non-functional characteristics of a product and are difficult to enumerate exhaustively. Patrick and Kenny's Privacy Interface Analysis (PIA) is a process for systematically identifying vulnerabilities in privacy-sensitive user interfaces [235]. In PIA, designers describe the service or application using UML use case models and derive the necessary interface functionalities from them. Then, they consider each functionality with respect to the principles of transparency, finality and use limitation, legitimate processing, and legal rights. Patrick and Kenny thus combine a functionality-oriented analysis process with an evaluation of the legal and social legitimacy of a given application. However, their process is relatively time-consuming.
STRAP (Structured Analysis Framework for Privacy) also attempts to facilitate the identification of privacy vulnerabilities in interactive applications [167]. STRAP employs a goal-oriented, iterative analysis process composed of three successive steps: vulnerability analysis, design refinement, and evaluation. The analyst starts by defining the overall goals of the application and recursively subdividing these goals into subgoals in a tree-like fashion. Specific implementations are then attached to the leaves of this recursive goal definition tree, and vulnerabilities are identified for each, leading to privacy requirements. Jensen compared STRAP's performance with that of PIA [235], Bellotti and Sellen's framework [43], and Hong's risk analysis framework [150]. The results of this evaluation encouragingly suggest that designers using STRAP identified more privacy issues, and identified them more quickly, than the other groups.
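STRAP's recursive goal decomposition can be pictured as a tree whose leaves carry concrete implementations and their identified vulnerabilities, from which privacy requirements are then derived. The sketch below is a minimal illustration under that reading; the class names and the calendar example are ours, not STRAP's own notation:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Goal:
    """A node in a STRAP-style goal tree: either refined into subgoals,
    or realized by an implementation with identified vulnerabilities."""
    name: str
    subgoals: List["Goal"] = field(default_factory=list)
    implementation: str = ""
    vulnerabilities: List[str] = field(default_factory=list)

def collect_vulnerabilities(goal: Goal) -> List[str]:
    """Walk the tree and gather vulnerabilities from every node,
    yielding the raw material for privacy requirements."""
    found = list(goal.vulnerabilities)
    for sub in goal.subgoals:
        found.extend(collect_vulnerabilities(sub))
    return found

# Hypothetical example: a shared calendaring application.
calendar = Goal("share schedules", subgoals=[
    Goal("publish free/busy times",
         implementation="push events to group server",
         vulnerabilities=["server log reveals meeting patterns"]),
    Goal("show location of next meeting",
         implementation="attach room number to event",
         vulnerabilities=["location disclosed to all subscribers"]),
])
```

Each vulnerability collected from the leaves would then be refined into a privacy requirement and re-evaluated on the next design iteration.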
Jensen notes, however, that the design of the shared calendaring system used in the study did not overlap with the applicability domain of the frameworks developed by Bellotti and Sellen and by Hong et al. This underscores the importance of tightly defining the scope of design methods.

Proportionality

Iachello and Abowd proposed employing the principle of proportionality, and a related development process adapted from the legal and Data Protection Authority communities, to analyze privacy [156]. In a nutshell, the proportionality principle asserts that the burden on the stakeholders of any IT application should be compatible with the benefits of the application. Assessing legitimacy implies balancing the benefits of data collection against the interest of the data subject in controlling the collection and disclosure of personal information. This balancing of interests is, of course, not unique to the European data protection community. Court rulings in the United States, including Supreme Court rulings, employ similar assessments [283]. Iachello and Abowd further propose evaluating design alternatives at three stages of an iterative development process: at the outset of design, when application goals are defined (this part of the analysis is called the desirability judgment); during the selection of a technology to implement the application goals (appropriateness); and during the definition of local design choices impacting parameters and minor aspects of the design (adequacy).
Iachello and Abowd evaluated the proportionality method in a controlled experiment against Hong's risk analysis [150], Bellotti and Sellen's method [43], and, as a control condition, Design Rationale [204]. The results of the evaluation show that none of the participants in the four conditions identified all the privacy issues in the application. Each design method prompted participants to probe a certain set of issues, based on the questions included in the design framework. Moreover, the participants' level of experience and the amount of time spent performing the analysis correlated better with the number of privacy issues each participant identified than did the design method used [154]. These results suggest that, again, the scope of a design method strongly influences its effectiveness in analyzing specific design problems. Second-generation design methods [103] can help in privacy requirements analysis by forcing designers to think through the design as extensively as possible.

Modeling Frameworks

The third type of design method we discuss is the modeling framework. Some modeling frameworks, such as k-anonymity [279] and the Freiburg Privacy Diamond [316], are heavily influenced by information theory. They describe exchanges of information mathematically, which allows requirements to be tightly defined and verified. Given their lack of reference to the human user, however, these frameworks are not widely used in the HCI community. Instead, HCI researchers have focused on economic and behavioral models.

Economic Frameworks and Decision-Making Models

Researchers have developed economic models to describe individuals' decision making in the disclosure of personal information. Early work in this area includes Posner's and Stigler's work in the late 1970s [240, 276]. In particular, Posner argues that privacy is detrimental from an economic standpoint because it reduces the fluidity of information and thus market efficiency. Posner predicts markets for personal information, where individuals can freely trade their personal data.
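Of the modeling frameworks mentioned above, k-anonymity is concrete enough to illustrate directly: a released table is k-anonymous if every combination of quasi-identifier values appears in at least k records. A minimal check, with illustrative column names and records:

```python
from collections import Counter

def is_k_anonymous(rows, quasi_identifiers, k):
    """True if every quasi-identifier value combination occurs
    in at least k records of the released table."""
    counts = Counter(tuple(row[qi] for qi in quasi_identifiers) for row in rows)
    return all(count >= k for count in counts.values())

# Hypothetical released records: ZIP code and age bracket are the
# quasi-identifiers; the diagnosis is the sensitive attribute.
records = [
    {"zip": "30332", "age": "20-29", "diagnosis": "flu"},
    {"zip": "30332", "age": "20-29", "diagnosis": "asthma"},
    {"zip": "15213", "age": "30-39", "diagnosis": "flu"},
]
```

Here the table is 1-anonymous but not 2-anonymous, because the combination ("15213", "30-39") identifies a single record. This precision is exactly what makes such frameworks verifiable, and also what leaves the human user out of the model.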
Varian argues that, from an economic analysis standpoint, personal information could be protected by associating an economic value with it, thus increasing the cost of collecting and using it to an equitable balance [290]. In these markets, data users pay license fees to data subjects for the use of their personal information. Similar markets already exist (e.g., credit and consumer reporting agencies). However, critics of these economic models question whether increased fluidity actually provides economic benefit [216]. It should be noted that such markets are quite incompatible with the underlying assumptions of data protection legislation such as EU Directive 95/46, which treats personal information as an inalienable object and not as property. Varian takes a more pragmatic approach, suggesting that disclosure decisions should be made by balancing the costs and the subjective benefits of the disclosure [290]. Researchers have also developed economic models to describe disclosure behaviors. For example, Vila et al. have developed a sophisticated economic model to explain the low effectiveness of privacy policies on web sites [293].
Acquisti explains why Privacy-Enhancing Technologies (PETs) have not enjoyed widespread adoption by modeling the costs and expected benefits of using a PET versus not using it, treating users as rational economic agents [13]. Acquisti also argues that economics can help the design of privacy in IT by identifying situations in which all economic actors have incentives to participate in the system (e.g., systems that require the collaboration of multiple parties, such as anonymizing networks). He further contends that economics can help identify what information should be protected and what should not, for example by identifying situations in which the cost of breaching privacy is lower than the expected return (a basic risk analysis exercise). The main limitation of economic models is that their assumptions are not always verified. Individuals do not have unlimited resources (e.g., they lack sufficient information for making rational decisions), and decisions are often affected by non-rational factors such as peer pressure and social navigation [14]. One explanatory theory Acquisti and Grossklags discuss is that of bounded rationality, i.e., that individuals cannot fully process the complex set of risk assessments, economic constraints, and consequences of a disclosure of personal data.
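Acquisti's rational-agent account of PET adoption can be caricatured in a few lines: an agent adopts a PET only when the expected avoided loss (breach probability times potential loss) exceeds the adoption cost. The function and numbers below are our illustration, not Acquisti's model, and the bounded-rationality critique above explains why real users deviate from it:

```python
def adopts_pet(breach_probability, expected_loss, adoption_cost):
    """A rational agent adopts the PET when the expected avoided loss
    exceeds the cost of adoption (money, time, inconvenience)."""
    return breach_probability * expected_loss > adoption_cost

# With a small perceived breach probability, even a large potential loss
# may not justify a modest adoption cost: one account of low PET uptake.
# Illustrative values: 1% perceived breach chance, $500 loss, $20 cost.
print(adopts_pet(breach_probability=0.01, expected_loss=500, adoption_cost=20))
```

The model is transparent about its own fragility: small changes in the (subjective, poorly known) breach probability flip the decision entirely.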
Acquisti and Grossklags's research casts serious doubts on whether individuals are capable of expressing meaningful preferences in relation to data protection (i.e., the collection of data by organizations). While in interpersonal relations individuals have a refined set of expectations and norms that support decision-making and a fine-grained process of disclosing or hiding information, the same is not true for data protection disclosures. The Approximate Information Flows (AIF) framework proposed by Jiang et al. [172] combines ideas from economics and information theory. In AIF, Jiang et al. state the Principle of Minimum Asymmetry: "A privacy-aware system should minimize the asymmetry of information between data owners and data collectors and data users, by decreasing the flow of information from data owners to data collectors and users and increasing the [reverse] flow of information" [172]. To implement this principle, the authors propose a three-pronged strategy. First, personal information should be managed by modulating and enforcing limits on the persistency (retention time), accuracy (a measure of how precise the data is), and confidence (a probability measure that the data is correct) of information within an information system. Second, the personal information lifecycle should be analyzed according to the categories of collection, access, and second use. Third, at each of these stages, the system should provide ways to prevent, avoid, and detect the collection, access, and further use of personal information.
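AIF's three levers (persistency, accuracy, and confidence) can be pictured as metadata attached to each piece of personal data, with the data becoming unusable as it ages. The sketch below, including the linear confidence decay, is our illustration rather than Jiang et al.'s specification:

```python
from dataclasses import dataclass

@dataclass
class PersonalDatum:
    """Personal data annotated in the spirit of AIF: a retention limit
    (persistency), a precision measure (accuracy), and a probability
    that the value is still correct (confidence)."""
    value: str
    retention_seconds: int   # persistency: how long the datum may be kept
    accuracy_meters: float   # accuracy: e.g., spatial precision of a location
    confidence: float        # confidence: probability the value is correct

def usable(datum: PersonalDatum, age_seconds: int, min_confidence: float = 0.5) -> bool:
    """A datum is usable only within its retention window and while its
    confidence, decayed linearly with age here, stays above a threshold."""
    if age_seconds > datum.retention_seconds:
        return False
    decayed = datum.confidence * max(0.0, 1 - age_seconds / datum.retention_seconds)
    return decayed >= min_confidence

# Illustrative example: a location fix kept for at most one hour.
location = PersonalDatum("lat=33.77,lon=-84.40", retention_seconds=3600,
                         accuracy_meters=100.0, confidence=0.9)
```

This kind of deliberate data decay is precisely what the next paragraph notes can conflict with data protection law, which obliges controllers to keep stored data correct.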
The authors used AIF to analyze several technologies and applications, such as P3P and feedback and control systems, to show how these fit within the framework. However, the model has some limitations. First, the authors have used AIF as an analytic tool, but AIF has not been used as a design model. Second, all data users are expected to comply with the AIF model and respect the constraints on the use and interpretation of personal data. Finally, there is a potential conflict between this approach and data protection legislation in certain jurisdictions: data protection legislation requires data controllers to guarantee the integrity and correctness of the data they are entrusted with, which is incompatible with the idea of data decay proposed by the AIF framework.

Analytic Frameworks

Analytic frameworks attempt to answer the question "what is privacy?" in a way that is actionable for design purposes. For example, the concept of Multilateral Security is an analysis model for systems with multiple competing security and privacy requirements [214, 247]. One of the innovations of Multilateral Security is that it frames privacy requirements as a special case of security requirements. According to Multilateral Security, security and privacy are elements of the same balancing process among contrasting interests.
The aim is to develop technology that is both acceptable to users and profitable for manufacturers and service providers. Multilateral Security asserts that designers must account for all stakeholders' needs and concerns by considering and negotiating conflicting requirements, respecting individual interests, and supporting user sovereignty. Consequently, Multilateral Security highlights the role of designers in producing equitable technology, and that of users, who must be empowered to set their own security and privacy goals [312]. Multilateral Security was applied in several case studies, including the deployment of a prototype mobile application for reachability management for medical professionals (i.e., brokering availability to incoming phone calls) [246].

Table 9. Privacy Dimensions [197]

Feedback and Control: Different privacy-related systems employ different ratios, degrees, and methods of feedback about, and control over, the disclosure process.
Surveillance vs. Transaction: Surveillance relates to continuous observation and collection of personal information (e.g., surveillance cameras). Transactions are identifiable events in which personal information is exchanged (e.g., a purchase on the Internet).
Interpersonal vs. Institutional: The distinction between revealing sensitive information to another person and revealing it to industry or the state. Similar to our distinction between personal privacy and data protection in Section 2.2.2, but limited to the recipient of the personal information.
Familiarity: The degree of acquaintance of the recipient with the disclosing party, and vice versa.
Persona vs. Activity: Whether the information describes the individual (e.g., age, address) or her actions (e.g., crossing an automatic toll booth).
Primary vs. Incidental: Whether the sensitive information is the primary content of the disclosure or an incidental byproduct of it.

A different model is offered by Lederer et al.'s deconstruction of the privacy space [197]. According to Lederer et al., privacy issues can be classified along six dimensions (Table 9). These dimensions are derived from an analysis of prior literature, including Agre [18], Lessig [199], and Agre and Rotenberg [17]. Privacy issues located at different positions in this space will have different characteristics, and typical design solutions will differ accordingly.
Unfortunately, Lederer et al. do not describe what design solutions should be used for applications at various locations in this analytical space. Thus, Lederer et al.'s framework is a good candidate for a privacy vocabulary and as a descriptive model, but it does not currently help much as an aid to design. In addition to general models, constrained models exist for specific applications. Adams presents a model for analyzing perceived infringements of privacy in multimedia communication systems [15]. Through several user evaluations, she identified three factors that influence people's perceptions of these systems: information sensitivity, i.e., how private a user considers a piece of information; information receiver, i.e., who receives the information; and information usage, i.e., how the information will be used. Boyle and Greenberg define a language for privacy in video media spaces, i.e., networked teleconferencing and awareness applications using digital video and audio feeds, and provide a comprehensive summary of research on privacy in media spaces [50]. They claim that in these applications, designers must consider at least the following privacy issues:
- Deliberate privacy abuses
- Inadvertent privacy violations
- Users' and nonusers' apprehensiveness about technology
Boyle and Greenberg also propose deconstructing the far-reaching concept of privacy into three aspects: solitude (control over one's interpersonal interactions, akin to our definition of personal privacy), confidentiality (control over others' access to information about oneself, i.e., informational self-determination), and autonomy (control over the observable manifestations of the self, also related to an ontological concept of personal privacy). However, Boyle and Greenberg observe that there is still insufficient knowledge about the users of this technology to draft effective guidelines. Worse, the authors note that the very analytic tools currently employed are still inadequate for mapping system functions (e.g., opening a communication channel) to individual preferences and actions.

Conclusions on Modeling Frameworks

Patterns and guidelines are similar in many respects: both provide a standard set of typical solutions to the designer, and both are popular due to their relatively simple structure and ease of use. For well-established domains and technologies these can be very useful. However, it becomes very difficult to apply them when the scope and level of generality of the guideline do not match the design task. Process methods standardize the analysis and design process, and increase the coverage of the design space by considering as many questions and issues as possible upfront. The proponents of modeling frameworks attempt to proceed one step further, by systematizing factual knowledge about privacy into general structures that can be used for many types of applications. However, experimental evidence and our review of the literature suggest that the privacy design space may be too broad to be systematized in a single framework or model. If different methods address different parts of the design space, one option for increasing analytic and design thoroughness would be to combine methods. While this is indeed possible, we believe that a combined method would be even more difficult to validate and would not be adopted easily.
An alternative to creating a large unified analysis process would be to document a modular toolbox of privacy heuristics that can be used as needed, with a clear understanding of their limitations and contributions. This privacy toolbox should clearly indicate for which applications and social settings certain approaches are more effective, and what the designer can expect from them. We return to this subject in Section 4.3.

Trends and Challenges in Privacy HCI Research

In the previous sections, we provided an overview of the research landscape of HCI as it relates to privacy. As a conclusion to this article, we outline several trends that are changing the privacy landscape, as well as major research challenges in the field. While the research subfields reviewed in Section 3 each tackle a specific aspect of privacy in HCI, we focus here on five grand challenges that span several subfields:
- Developing more effective and efficient ways for end-users to manage their privacy
- Gaining a deeper understanding of people's attitudes and behaviors towards privacy
- Developing a privacy toolbox
- Improving organizational management of personal data
- Converging privacy research with technological adoption models
Below, we outline each of these trends, indicate where we see current research headed, and describe the challenges facing researchers and practitioners.

Better Ways of Helping End-Users Manage Their Personal Privacy

It is becoming increasingly difficult to manage personal privacy as information and communication technologies become pervasive. Personal information is fragmented across a number of devices, applications, web sites, and organizations, each with different user interfaces, notifications, and management policies. We argue that new approaches are needed to alleviate the burden of managing users' personal privacy.
Information and communication technologies increasingly retain information about the individuals using them, and surveillance systems are spreading into the workplace (in the form of email and web monitoring) and into other spheres of daily activity (e.g., broadcasting the interior of night clubs, bars, or beaches [52]). Often, these systems collect information unbeknownst to the user. Furthermore, the development of digital sensors has enabled the collection of novel types of information in everyday situations: automatic toll payment systems based on RFID and license plate recognition [111], implantable sensors monitoring the health of patients [205], and monitoring systems deployed in the homes of elderly people [38]. Technical and economic considerations suggest that sensing technologies will become a ubiquitous infrastructure, open for use by individuals as well as organizations for a wide array of purposes. A distinctive characteristic of these systems is that interaction with them is increasingly implicit, outside the scope of control described by Norman's Seven Stages of Action [227].
This kind of implicit interaction requires new mechanisms for managing the resulting risks to personal information and privacy. One possible solution is to develop more effective and less burdensome user interfaces for helping people make good decisions. A key challenge here is that there is currently no agreement on what kinds of interaction styles are best for each type of information disclosure. Rule- or policy-based mechanisms may be suboptimal for many applications, as discussed in Section 3.2.2. Other interaction styles, such as social translucency and plausible deniability, might achieve comparable effects with far less burden and a greater sense of control [28], but there are no clear guidelines on how to build plausible deniability into computing systems. Ambiguity has been discussed as a design resource in other contexts (e.g., games) [117], and we believe it will become an increasingly important design element in the context of privacy. In short, much more work is needed to determine the efficacy of these different ideas in a wider range of contexts. Another possibility is to consider a better division of labor to help shoulder the burden of managing personal privacy. A consensus is slowly building in the research community that privacy-sensitive applications cannot make all data transfers explicit, nor require users to track them all. The related UIs and interaction patterns would simply be too complex and unwieldy.
From a data protection viewpoint, experience shows that most data subjects are unable or unwilling to control all disclosures of personal information, or to keep track of all parties that process their personal data [64, 95]. Distributing the burden of managing one's personal privacy across a combination of operating systems, networking infrastructure, software applications, system administrators, organizations, and third parties could help address this problem. Ideally, these entities would provide advice to users or make trusted decisions on their behalf, with the ultimate goal of reducing the overall effort required to make good decisions. Taking email spam as an example, multiple entities, including ISPs, local system administrators, and automatic filters, all contribute to reducing the amount of spam that end-users receive. Here, it makes sense to share the costs of spam reduction, since the hardship would otherwise be borne by a large number of individuals. Trusted proxies are another example of a third-party organization that can help manage privacy. For instance, MedicAlert is a paid service that stores personal medical records and forwards them to first responders in the case of medical emergencies. Such organizations, either not-for-profit (like MedicAlert) or for-profit (regulated by a service contract), could include:
- Evaluation clearinghouses that indicate what products and services to trust.
For example, SiteAdvisor [265] evaluates web sites' spam, popup, and virus risks, and provides ratings via a web browser plug-in. They could also include services that hold users' location information and disclose it in case of emergency or subpoena, similar to current mobile telecom operators; a service that seeds suspected privacy violators with fake personal data and tracks how that data is used and shared; or a service that checks whether an individual reveals too much personal information in her resume and is at risk for identity theft [280]. In summary, privacy protection is a systemic property that requires support at all levels. However, special care should be exercised in allocating responsibility and oversight correctly, because the business goals of many organizations may not be aligned with those of their users, as suggested by recent controversies over security leaks at large personal data brokerage firms [297, 300].

A Deeper Understanding of People's Attitudes and Behaviors towards Privacy

The second challenge is gaining a deeper understanding of the behaviors of individuals towards privacy-affecting systems, at all levels of interaction. One area where research is sorely needed is developing better ways of presenting warnings and notifications to people. There are many difficult forces to balance in creating an effective warning system. Warnings must be visible, comprehensible, and plausible to end-users [70, 311]. Cranor has also argued that warnings need to be tied to clear actions, and be designed so that users keep doing the right thing (rather than ignoring the warnings or turning them off). A counterexample to almost all of the above would be standard warning dialogs, most of which are simply swatted away because they get in the way of the users' primary goals. Another needed line of research is understanding how attitudes and behaviors towards privacy-affecting systems evolve and reconcile over time. For example, recent research has shown that behavior in privacy matters often differs from stated preferences, for a variety of reasons [272]. Acquisti and Gross have also shown that on the Facebook social networking site, people perceived others as revealing too much information, despite revealing a great deal of information about themselves. A third needed line of work is understanding how to influence the behavior of users. For example, Jagatic et al.
provide a striking instance of how publicly available information gathered from social networking web sites can be used to trick people into giving up personal information, such as passwords. They showed that individuals are more likely to fall for phishing attacks if the sender appears to be from their existing social network [164]. These attacks are known as spear-phishing, or context-aware phishing. By incorporating sender information mined from a social networking site, they showed that scam emails were much more effective in deceiving their targets. Two other examples of research that would fall into this category include convincing people not to abuse other people's trust (for example, by cyber-stalking a person), and persuading people that they can do simple things to protect their privacy online. Here, one challenge is that the very behaviors under scrutiny are not stable, but evolve with the adoption of new technologies. For example, the surge of identity theft and the enactment of legislation countering it suggest that the public is becoming slowly, if painfully, aware of the risks of combining personal information from multiple data sources. On the other hand, the availability of personal information from multiple sources has transformed the previously difficult task of constructing individuals' profiles into a fairly trivial activity [232].
It is not uncommon for people to google potential dates and prospective employees and find past postings on message boards, photographs, and, with some effort, information on political affiliations, social networks, criminal records, and financial standing. Furthermore, the willingness of people to ultimately accept these technologies despite the intrinsic risks shows that HCI researchers should not trust stated preferences relative to unknown technologies, but should analyze the use of the technologies in practice. We discuss this point further below in Section 4.5 in relation to acceptance. To summarize, we see an increasing role for behavioral research in HCI relative to privacy. The cost of this kind of research is higher than that of traditional survey-based or even lab-based experiments. However, we are convinced that the nature of the issues revolving around privacy demands this additional expense if the goal is to obtain credible and generalizable results.

Developing a Privacy HCI Toolbox

A third grand challenge is providing more support to guide the development of privacy-sensitive systems. Design teams often have to grope through a design space, relying primarily on their intuition to guide them. What is needed are better methods, tools, guidelines, and design patterns to help teams iteratively design, implement, and evaluate applications. With respect to design, we believe that there would be great value in developing an organic privacy toolbox. This privacy toolbox would be a catalog of privacy design methods and models, with an indication of the applications and social settings in which each is most effective. Practitioners could then choose to use these tools with a competent understanding of their contributions and limitations.
We would like to stress that we are not proposing to develop a Software Engineering methodology [270]; our proposal is simply a coherent collection that assists practitioners. An initial catalog of design techniques for privacy and HCI would be relatively easy to devise. For example, we mentioned above that the FIPS are particularly well suited to large personal data processing enterprises and have been adapted to novel technologies, both in the technical literature [116, 191] and in the Data Protection Authority community. Similarly, privacy guidelines, patterns, and risk models could help designers in specific, well-delimited circumstances [61, 156, 235]. A precise description of method applicability is essential.
Thus, the toolbox should include a selection process based on the application domain, the deployment context, and the type of privacy and security issues involved (e.g., personal privacy, data protection, sensitive information, etc.). A credible selection process requires testing the various methods' effectiveness and usefulness, which is by far the most challenging aspect of this idea. With respect to implementation, design teams sorely lack tools, frameworks, and reusable UI components and metaphors for creating privacy-sensitive systems. Examining the evolution of the graphical user interface (GUI) may help chart a research agenda to address this need. Similar to GUI components, we could develop reusable privacy tools, services, and toolkits for building privacy-sensitive UIs. Some possibilities include specialized GUI widgets and interaction techniques for helping end-users manage their personal privacy, new visualizations and user interfaces for helping administrators set privacy policies and manage large collections of personal information, and model-based user interfaces for weaving and enforcing privacy throughout the entire UI. Developers should also pay attention to seemingly innocuous technologies that may have unintentionally negative privacy implications (e.g., cookies in web browsers). Verification techniques able to identify these issues upfront, before deployment, would be very beneficial. However, the unpredictable nature of emergent use suggests that systematic techniques for identifying these issues may be very difficult to devise. Finally, regarding evaluation, design teams need techniques specific to privacy, similar to heuristic evaluation and cognitive walkthrough. There is a general lack of understanding of how to evaluate the quality of a design with respect to privacy.
This challenge is exacerbated by the rarity of privacy breaches, by the disconnect between the time and place of the actual privacy breach and when the user becomes aware of it, and by the ever-shifting attitudes and behaviors of users becoming familiar with new technologies. Several techniques have been employed to address these challenges, such as presenting realistic previews of features (e.g., with the scenarios discussed above), and sampling people's reactions to privacy concerns through remote usability tests and remote surveys. Some work has also been done on adapting QOC and heuristic evaluation (e.g., Bellotti and Sellen's QOC technique [42]). Other promising, yet unexplored, approaches are the use of cognitive walkthroughs tailored for privacy, as well as improved methods for conducting user studies to elicit possible privacy concerns. However, work on validating these techniques to assess their effectiveness is necessary before practitioners will be willing to embrace them.

Better Organizational Practices

The fourth research challenge encompasses the development of tools for managing personal information within organizations. Several authors have pointed out that information security software often fails not due to technical causes, but because of issues of management and control of the people operating the technology [212, 261].
In his study of automatic teller machine (ATM) failures, Anderson indicated that the three main reasons for failure were program bugs, interception of mail containing ATM cards, and theft and fraud by insiders [25]. Similarly, reports of privacy breaches show that many breaches are attributable to those responsible for safeguarding the data, for example, airlines providing data to third parties [26], and consumer reporting agencies providing personal data to outsiders pretending to be legitimate customers [297]. The privacy breaches mentioned above indicate that helping organizations create and enforce effective privacy policies is a significant research challenge, one that should involve researchers in both HCI and CSCW. Corporate caretakers of personal information are becoming increasingly aware of the importance of privacy. Many companies have defined policies and procedures for handling personal information, and a few have gone so far as to create the position of Chief Privacy Officer. Some of these programs have been enacted voluntarily, under market pressure to curb privacy breaches. Other organizations have implemented these changes to comply with legislation such as EU Directive 95/46 or HIPAA. Knowledge in this area is in part hidden behind corporate walls, and the academic community has largely ignored these issues. This lack of attention in academia is worrying, because management of personal information is one of the most challenging aspects of IT security today [182]. Much more work is needed in this domain, specifically in three areas: 1) defining privacy policies, 2) implementing and enforcing them, and 3) auditing system performance. With respect to the first issue, we need better tools for defining privacy policies, both at the level of the organization and in relation to its IT systems. Industry standards and procedures could be very helpful in drafting policies
[155], but this requires an open dialogue between industry and academia, with which many commercial organizations may still be uncomfortable. Once policies are drafted, tools such as IBM's SPARCLE [176] could be used to convert the policies into machine-readable form, facilitating implementation. One fundamental open question is whether a machine-readable privacy policy language (e.g., P3P) can be comprehensive enough to model all possible requirements and organizational assumptions. Second, we need more support for implementing and enforcing privacy policies. These challenges rest both with the people and with the technology involved in personal data processing. The technical implementation of privacy policies has been the topic of systems research [30], and some of those ideas have been incorporated into commercial products (e.g., IBM's Tivoli product line). It is worth noting that the challenge of enforcement is exacerbated as we move towards mobile and ubiquitous computing environments. A single unaccounted-for mobile device can create massive problems for an organization that are difficult to remedy. For example, because most laptops are configured to tunnel through corporate firewalls, a company would have to assume that a lost or stolen laptop could be used to breach network security.
There have also been many incidents of laptops containing personal data on thousands of people being stolen or lost. Incidents like these dramatically expose organizations' vulnerability to large-scale identity theft. Technical considerations aside [47, 245], there are also considerable acceptance challenges in implementing a privacy management program within an organization. Developing the human side of the policies should be a priority for the MIS and CSCW communities, as shown by the work of Adams and Blandford, who discuss the effects of the introduction of access control systems for patient data within health care settings [16]. They studied two hospitals through in-depth interviews, focus groups, and observations. They found that in one hospital, a user-centered approach resulted in a collaborative system that was accepted and used by the organization, but that still clashed with existing working practices. In the second hospital, poor communication to workers about IT security measures resulted in their misuse by some employees, who viewed them as a tool of social control. Similarly, Gaw et al. observed that email encryption tools can fail to be adopted because of social pressure and perceptions of one's identity
[119]. Finally, the privacy community needs better tools for performing audits, probing data processing practices, and tracing information leaks. The former tools would ensure that information is not being leaked accidentally (e.g., being published on web sites, as in a case involving AOL [171]) or intentionally. The latter tools would ensure that any published information can be traced back to its original owner, so that appropriate corrective actions can be taken. Sasse reflects on the current usability disaster afflicting security technology and suggests two courses of action for recovery [255]. She suggests using HCI techniques to analyze the cognitive demands of security technologies such as password schemes. Sasse also suggests using these techniques to predict expected behaviors, such as users writing down hard-to-remember passwords. In fact, Sasse points out relevant research challenges, noting that carelessness about security and privacy depends largely on user attitudes. One possible way of fostering secure behavior is to make it the preferable option, that is, to devise technologies that are secure by default. We took a similar stance above in Section 3.3.3, when we discussed the option of motivating users to adopt more secure behaviors. In summary, now that HCI researchers have started to study how security technology is used in the real world [130], security and privacy management should be viewed as a major and promising area requiring much additional research.

Understanding Adoption

Finally, the fifth theme that we see emerging is the convergence of research on privacy with research on end-user technological acceptance and adoption.
The main evidence supporting this trend is 1) that privacy expectations and perceptions change over time as people become accustomed to using a particular technology, and 2) that privacy concerns are only one of several elements involved in the success of a particular application. In Section 3.1, we described some methods that have been employed to understand user needs; however, it is still difficult to assess the potential privacy impact before actually deploying a system. A typical process is to develop a full system (or new feature), deploy it, and then wait for negative responses from the public or the media, fixing or canceling the system in response. However, it is well known that modifying an existing system late in the design cycle is an expensive proposition. There is a strong need for better methods and tools for quickly and accurately assessing potential privacy risks as well as end-user privacy perceptions. To illustrate this argument, we consider the acceptance history of ubiquitous computing technologies, whose effects on privacy have been hotly debated for the past 15 years.

A Story of Rejection and Acceptance: The Importance of Value Propositions

Xerox PARC's initial foray into ubiquitous computing in the late 1980s provides an instructive case study on privacy. While groundbreaking research was being conducted at PARC, researchers in other labs (and even at PARC) had visceral and highly negative responses to the entire research program. Harper quotes one colleague, external to the research team that developed Active Badges, as saying: Do I wear badges? No way. I am completely against wearing badges. I don't want management to know where I am. No. I think the people who made them should be taken out and shot... it is stupid to think that they should research badges because it is technologically interesting. They (badges) will be used to track me around. They will be used to track me around in my private life.
They make me furious. [140] The media amplified the potential privacy risks posed by these technologies, publishing headlines such as Big Brother, Pinned to Your Chest [68] and Orwellian Dream Come True: A Badge That Pinpoints You [266]. Ubiquitous computing was not seen as an aid for people in their everyday lives, but as a pervasive surveillance system that would further cement existing power structures. Similar observations were also voiced in the IT community. For example, Stephen Doheny-Farina published an essay entitled Default = Offline, or Why Ubicomp Scares Me [84]. Howard Rheingold observed that ubiquitous computing technologies might lead directly to a future of safe, efficient, soulless, and merciless universal surveillance [249]. One reason for these negative reactions was that PARC's ubicomp system was all-or-nothing: users had no control over how their information was shared with others. There were no provisions for ambiguity. Furthermore, the system provided no feedback about what information was revealed to others. This resulted in concerns that a co-worker or boss could monitor a user's location by making repeated queries without that user ever knowing.
A second important reason for these reactions lies in the way the ubiquitous computing project itself was presented. The researchers often talked about the technological underpinnings, but had few compelling applications to describe. Thus, discussions often revolved around the technology rather than the value proposition for end-users. To underscore this point, once researchers at PARC started talking about their technology in terms of invisible computing and calm computing, news articles came out with more positive headlines like Visionaries See Invisible Computing [253] and Here, There and Everywhere [299]. Thinking about privacy from the perspective of the value proposition also helps to explain many of the recent protests against the proposed deployment of Radio Frequency Identification (RFID) systems in the United States and in England [37]. From a retailer's perspective, RFID reduces the costs of tracking inventory and maintaining steady supply chains. However, from a customer's perspective, RFID tags are potentially harmful, because they expose customers to the risk of surreptitious tracking without any benefit to them.

Models of Privacy Factors Affecting Acceptance

The lack of a value proposition in the privacy debate can be analyzed using Grudin's Law. Informally, it states that when those who benefit from a technology are not the same as those who bear the brunt of operating it, the technology is likely to fail or be subverted [133].
The privacy corollary is that when those who share personal information do not benefit in proportion to the perceived risks, the technology is likely to fail. However, a more nuanced view suggests that even a strong value proposition may not be sufficient to achieve acceptance of novel applications. Eventually, applications enter the hands of users and are accepted or rejected based on their actual or perceived benefits. HCI practitioners would benefit from reliable models of how privacy attitudes impact adoption. We see two aspects of understanding acceptance patterns: 1) a static view, in which an acceptance decision is made one-off based on available information, and 2) a dynamic view, in which acceptance and adoption evolve over time. We discuss two working hypotheses related to these two aspects of acceptance next.

Static Acceptance Models

In a renowned article on technology credibility, Fogg and Tseng drafted three models of credibility evaluation: the binary, threshold, and spectral evaluation models [110]. Fogg and Tseng argued that these models helped explain how different levels of interest and knowledge affect how users perceive the credibility of a product, thus impacting adoption. We advance a similar argument here, adapting these three models to privacy (see Figure 2). The binary evaluation model suggests that the acceptance or rejection of a technology is driven by its perception as trustworthy (or not) in protecting the user's privacy. This strategy is adopted by users who lack the time, interest, or knowledge for making a more nuanced decision.

Figure 2.
Three models of privacy concerns impacting adoption. A simple view of the domain leads to a binary evaluation model; an increasingly sophisticated understanding allows users to employ more refined evaluation models (threshold evaluation and spectral evaluation). Picture adapted from [110].

The threshold evaluation model is adopted by users with moderate interest in, or knowledge of, a particular technology. It suggests that a product is accepted if its perceived trustworthiness is above an upper threshold and rejected if it falls below a lower one. Between these thresholds, the user forms a more nuanced opinion, and other considerations are brought to bear, which may affect the acceptance judgment. The spectral evaluation model is adopted by users with the resources and knowledge to form a sophisticated view of a system; it does not necessarily imply a flat-out rejection or acceptance of a system, whatever its privacy qualities. While these models are only informed speculation, we believe that there is value in studying acceptance in the context of HCI and privacy. The MIS literature on technological acceptance tells us that adoption hinges on several factors, including usability, usefulness, and social influences. Social influences include social appropriateness and the user's comfort level, specifically in relation to privacy concerns [292].
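As a toy illustration, the three evaluation models can be sketched as simple decision rules. The sketch below is ours, not Fogg and Tseng's; the numeric thresholds and weights are hypothetical values chosen purely for illustration.

```python
# Hypothetical sketch of the binary, threshold, and spectral evaluation
# models, recast for privacy-driven acceptance. All constants are
# illustrative assumptions, not values from the literature.

def binary_evaluation(trust: float) -> str:
    """Accept or reject outright based on perceived trustworthiness (0..1)."""
    return "accept" if trust >= 0.5 else "reject"

def threshold_evaluation(trust: float, low: float = 0.3, high: float = 0.7) -> str:
    """Accept above an upper threshold, reject below a lower one;
    between the two, other considerations are brought to bear."""
    if trust >= high:
        return "accept"
    if trust <= low:
        return "reject"
    return "weigh other factors"

def spectral_evaluation(trust: float, perceived_benefit: float) -> float:
    """No flat-out verdict: trustworthiness is one weighted factor among
    several, yielding a graded acceptance score in [0, 1]."""
    return 0.6 * trust + 0.4 * perceived_benefit

print(binary_evaluation(0.4))                    # reject
print(threshold_evaluation(0.5))                 # weigh other factors
print(round(spectral_evaluation(0.5, 0.9), 2))   # 0.66
```

The point of the sketch is only that the three models demand increasing amounts of information from the user: a single cutoff, a pair of cutoffs with a gray zone, and a weighted trade-off across factors.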
Patrick, Briggs, and Marsh emphasize trust as an important factor in people's acceptance of systems [234]. They provide an overview of different layered kinds of trust, including dispositional trust, based on one's personality; learned trust, based on one's personal experiences; and situational trust, based on one's current circumstances. They also outline a number of models of trust, which take into account factors such as familiarity, willingness to transact, customer loyalty, uncertainty, credibility, and ease of use. There is currently not a great deal of work examining trust with respect to privacy, but the two are clearly strongly linked. One complication of these theories is that cultural context affects acceptance. Themes that are hotly debated by a nation's media can significantly impact the perception of privacy risks. For example, a 2003 poll in the European Union showed that privacy concerns vary by national context, based on media attention to the subject [102]. However, it is not clear how to reliably predict such concerns when moving from country to country. Perhaps a general survey administered prior to deployment could be useful in these situations. Finally, other factors, such as education, socio-economic status, and labor relations, can affect privacy concerns, but we are not aware of any work on these in the HCI community.
Clearly, more work focusing on cultural and social context is needed to gain a more refined understanding of how the phenomenon of acceptance unfolds within a given user base.

Figure 3. The Privacy Hump, a working hypothesis describing the acceptance of potentially intrusive technologies. Early in the life cycle of a technology, users have concerns about how the technology will be used, often couched in terms of privacy. However, if, over time, privacy violations do not occur, and a system of market, social, legal, and technical forces addresses legitimate concerns, then a community of users can overcome the hump and the technology is accepted.

The Privacy Hump

In addition to static acceptance models, HCI practitioners would benefit from reliable models to predict the evolution of privacy attitudes and behaviors over time [158]. Looking back at past technologies and understanding the drivers for acceptance or rejection can help formulate informed hypotheses going forward. Our basic assumption is that the notion of information privacy is constantly re-formulated as new technologies become widespread and accepted in everyday practice. Some technologies, initially perceived as intrusive, are now commonplace and even seen as desirable, clearly demonstrating that people's attitudes and behaviors towards a technology change over time. For example, when the telephone was first introduced, many people objected to having phones in their homes because it permitted intrusion by solicitors, purveyors of inferior music, eavesdropping operators, and even wire-transmitted germs
[106]. These concerns, expressed by people at the time, would be easily dismissed today. We hypothesize that the resistance to accepting many potentially intrusive technologies follows a curve that we call the Privacy Hump (see Figure 3). Early in the life cycle of a technology, there are many concerns about how these technologies will be used. Some of these are legitimate concerns, while others are based more on misunderstandings about the technology (for example, the concern noted above that phones could transmit germs). There are also many questions about the right way of deploying these technologies. Businesses have not worked out how to convey the right value propositions to consumers, and society has not worked out what is and is not acceptable use of these technologies. Many of these concerns are lumped together under the rubric of privacy, or invasiveness, forming a privacy hump that represents a barrier to the acceptance of a potentially intrusive technology. Over time, however, the concerns may fade, especially if the value proposition of the technology is strong enough. The worst fears do not materialize, society adapts to the technology, and laws are passed to punish violators. An example of the former is that most people understand it is appropriate to take a photo at a wedding but not at a funeral. Examples of the latter are do-not-call lists that protect individuals from telemarketers and laws punishing camera voyeurs [5]. In other words, if a large enough community of users overcomes the privacy hump, it is not because their privacy concerns disappear, but because the entire system (the market, social norms, laws, and technology [199]) adapts to make these concerns understandable and manageable. It should be noted that the privacy hump cannot always be overcome. For example, nurses have rejected the use of locator badges in more than one instance [22, 59]. The privacy hump hypothesis is an educated speculation, and it is not clear to us how to acquire empirical evidence to confirm or refute it. However, if this predictive model is correct, it would suggest many directions for future research. For example, research could investigate what factors contribute to the concerns expressed by a community of users. This might include better ways of tailoring new technologies to different categories of people, perhaps along the fundamentalist / pragmatist / unconcerned continuum (as described in Section 3.1.1) or along an innovators / early adopters / early majority / late majority / laggards spectrum, as described by Rogers [250].
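As a purely illustrative aid, the qualitative shape of the privacy hump can be sketched as a toy function of time. The functional form and every parameter below are our own hypothetical choices for illustration only; they are not part of the hypothesis itself, which remains qualitative.

```python
import math

# Toy model of the "privacy hump": expressed concern rises early in a
# technology's life cycle, then decays if the surrounding system (market,
# norms, law, technology) adapts. All parameters are hypothetical.

def privacy_concern(t: float, peak_time: float = 2.0, width: float = 1.5,
                    adaptation_rate: float = 0.5) -> float:
    """Concern at time t (years since introduction): a bell-shaped hump
    whose tail is dampened as adaptation accumulates."""
    hump = math.exp(-((t - peak_time) ** 2) / (2 * width ** 2))
    adaptation = 1 - math.exp(-adaptation_rate * t)   # grows from 0 toward 1
    return hump * (1 - 0.8 * adaptation)              # adaptation dampens concern
```

Under these assumptions, concern peaks a couple of years after introduction and falls toward a low residual level; a community that rejects the technology (as in the nurse locator-badge case) would correspond to adaptation failing to occur.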
Other work could investigate what UIs, value propositions, and policies flatten the peak of the privacy hump and accelerate the process of acceptance (assuming a positive judgment by the designer that a given technology ought to be accepted) [158]. For example, we mentioned earlier, in Section 4.5.1, when recounting PARC's ubicomp experience, how poor framing of a technology severely impacted its acceptance. Lastly, personal experience may affect an individual's conception of privacy risks. For example, a preliminary study conducted by Pew Internet & American Life suggests that when people first use the Internet, they are less likely to engage in risky activities such as buying online or chatting with strangers, but are more likely to do so after a year of experience [237]. Understanding the privacy hump from these perspectives would be useful, because it would help us to understand how to better design and deploy technologies, how to increase the likelihood of their acceptance, and what acceptance timeline to expect.

Conclusions

In the past ten years, privacy has become a mainstream topic in human-computer interaction research, as attested by the growing number of surveys, studies, and experiments in this area.
In this article, we presented a survey of this rich and diverse landscape, describing some of the legal foundations and historical aspects of privacy, sketching out an overview of the body of knowledge with respect to designing, implementing, and evaluating privacy-affecting systems, and charting many directions for future work. We believe that the strong interest in and growth of this field is a response to legitimate concerns arising from the introduction of new technologies, and is, overall, a positive development. However, understanding privacy requires HCI practitioners to expand their field of view beyond traditional HCI domains such as social psychology and cognitive science to a broader picture that includes economics and law. In Section 4, we listed five challenges facing the field today that must be tackled to advance the current state of the art:

1. The development of better interaction techniques and standard defaults that users can easily understand.
2. The development of stronger analysis techniques and survey tools.
3. The documentation of the effectiveness of design tools, and the creation of a privacy toolbox.
4. The development of organizational support for managing personal data.
5. The development of a rigorous theory of the acceptance dynamics of users, specifically related to privacy.

This review shows that work is already well underway in most of these directions, but it is still unorganized and dispersed. Our hope is that this article, summarizing thirty years of privacy research in HCI and CSCW, sheds light on many of the salient issues and will help practitioners and researchers alike explore these complex issues in a more informed and conscious way.

Acknowledgements

We thank Gregory Abowd, Alessandro Acquisti, Ben Bederson, Lorrie Cranor, Paul Dourish, Gillian Hayes, James Finlay, Heather Richter, Norman Sadeh, Karen Tang, and all the reviewers for their help.
We are also indebted to countless colleagues on three continents for stimulating intellectual exchanges. Support for this work was provided by the NSF Graduate Research Fellowship Program, the President of Georgia Tech, the Dean of the College of Computing of Georgia Tech, and the MacArthur Foundation through the Sam Nunn Security Program. This work is also supported in part by Intel Research, NSF Grants CNS-0627513 (User-Controllable Security and Privacy for Pervasive Computing) and IIS-0534406 (Next Generation Instant Messaging: Communication, Coordination, and Privacy for Mobile, Multimodal, and Location-Aware Devices), and ARO research grant DAAD19-02-1-0389 (Perpetually Available and Secure Information Systems) to Carnegie Mellon University's CyLab. Any opinions, findings, conclusions, or recommendations expressed in this publication are those of the authors and do not necessarily reflect the views of the agencies above.

References

1. Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, in Official Journal of the European Communities. 1995. p. 31-50. 2. Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), in Official Journal of the European Communities. 2002. p. 37-47. 3.
United States Electronic Communications Privacy Act of 1986, in USC. 1986. 4. United States Health Insurance Portability and Accountability Act, in USC. 1999. 5. United States Video Voyeurism Prevention Act, in USC. 2004. 6. Volkszählungsurteil vom 15. Dezember 1983, BVerfGE 65,1. 1983, German Constitutional Court (Bundesverfassungsgericht). http://www.datenschutz-berlin.de/gesetze/sonstige/volksz.htm 7. Aalberts, B. and S. van der Hof, Digital Signature Blindness: Analysis of legislative approaches toward electronic authentication. Technical Report, November 1999. 8. Ackerman, M.S., Privacy in pervasive environments: next generation labeling protocols. Pers Ubiquit Comput 2004. 8: p. 430-439. 9. Ackerman, M.S., L. Cranor, and J. Reagle. Privacy in e-commerce: examining user scenarios and privacy preferences. In Proceedings of ACM Conference on Electronic Commerce (EC'99). Denver, Colorado. pp. 1-8, November 1999. 10. Ackerman, M.S. and L.F. Cranor. Privacy Critics: Safe-guarding Users' Personal Data. In Proceedings of Human Factors in Computing Systems: CHI '99. pp. 258-259, 1999. 11. Ackerman, M.S. and S.D. Mainwaring, Privacy Issues and Human-Computer Interaction, in Security and Usability: Designing Secure Systems That People Can Use, S. Garfinkel and L. Cranor, Editors. O'Reilly: Sebastopol, CA, USA. p. 381-400, 2005. 12. Ackerman, M.S., B. Starr, D. Hindus, and S.D. Mainwaring, Hanging on the wire: a field study of an audio-only media space. ACM Transactions on Computer-Human Interaction (TOCHI) 1997. 4(1). 13. Acquisti, A. Protecting Privacy with Economics: Economic Incentives for Preventive Technologies in Ubiquitous Computing Environments. In Proceedings of Workshop on Socially-informed Design of Privacy-enhancing Solutions, 4th International Conference on Ubiquitous Computing (UBICOMP '02). Göteborg, Sweden, 2002.
http://guir.berkeley.edu/privacyworkshop2002/ 14. Acquisti, A. and J. Grossklags, Privacy and Rationality in Individual Decision Making. IEEE Security and Privacy 2005. 3(1): p. 26-33. 15. Adams, A. Multimedia Information Changes the Whole Privacy Ball Game. In Proceedings of Computers, Freedom, and Privacy. Toronto, Canada: ACM Press. pp. 25-32, 2000. 16. Adams, A. and A. Blandford, Bridging the gap between organizational and user perspectives of security in the clinical domain. Int. J. Human-Computer Studies 2005. 63: p. 175-202. 17. Agre, P.E., Changing Places: Contexts of Awareness in Computing. Human-Computer Interaction 2001. 16(2-4): p. 177-192. 18. Agre, P.E., Surveillance and Capture: Two models of privacy. The Information Society 1994. 10: p. 101-127. 19. Agre, P.E. and M. Rotenberg, Technology and Privacy: The New Landscape. Cambridge, MA: MIT Press, 1997. 20. Alexander, C., The Timeless Way of Building. New York, NY, USA: Oxford University Press, 1979. 21. Allen, J.P., Controversies about privacy and open information in CSCW: a panel report from the CSCW'92 conference. SIGOIS Bull 1993. 14(1): p. 19-20. 22. allnurses.com, New Restroom protocol per management. 2002. http://allnurses.com/t16164.html 23. Altman, I., The Environment and Social Behavior: Privacy, Personal Space, Territory, Crowding. Monterey, CA: Brooks/Cole Publishing Company, 1975. 24. Ammenwerth, E., A. Buchauer, H.-B. Bludau, and A. Roßnagel, Simulation Studies for the Evaluation of Security Technology, in Multilateral Security: Technology, Infrastructure, Economy, G. Müller and K. Rannenberg, Editors. Addison Wesley Longman Verlag GmbH. p. 547-560, 1999. 25. Anderson, R., Why Cryptosystems Fail. Communications of the ACM 1994. 37(11). 26. Anton, A., Q. He, and D. Baumer, The complexity underlying JetBlue's privacy policy violations. IEEE Security & Privacy 2004. 2(6): p. 12-18. 27. Anton, A.I.
and J.B. Earp, A requirements taxonomy for reducing Web site privacy vulnerabilities. Requirements Engineering 2004. 9: p. 169-185. 28. Aoki, P.M. and A. Woodruff. Making Space for Stories: Ambiguity in the Design of Personal Communication Systems. In Proceedings of Human Factors in Computing Systems (CHI 2005). Portland, OR, USA: ACM Press. pp. 181-190, 2005. 29. Arnold, M., On the phenomenology of technology: the "Janus-faces" of mobile phones. Information and Organization 2003. 13: p. 231-256. 30. Ashley, P., M. Schunter, and C. Powers. From Privacy Promises to Privacy Management: A New Approach for Enforcing Privacy Throughout an Enterprise. In Proceedings of New Security Paradigms Workshop. Virginia Beach, VA: ACM Press, 2002. 31. Association for Computing Machinery, ACM Code of Ethics. http://www.acm.org/serving/ethics.html 32. Avrahami, D., D. Gergle, S.E. Hudson, and S. Kiesler, Improving the accuracy of cell phone interruptions: A study on the effect of contextual information on the behaviour of callers. Behavior and Information Technology 2007. 26(3): p. 247-259. 33. Awad, N.F. and M.S. Krishnan, The Personalization Privacy Paradox: An Empirical Evaluation of Information Transparency and the Willingness to Be Profiled Online for Personalization. MIS Quarterly 2006. 30(1): p. 13-28. 34. Ball, E., D.W. Chadwick, and D. Mundy, Patient privacy in electronic prescription transfer. IEEE Security & Privacy 2003. 1(2): p. 77-80. 35. Barkhuus, L. and A. Dey. Location-Based Services for Mobile Telephony: a Study of Users' Privacy Concerns. In Proceedings of Interact 2003, 9th IFIP TC13 International Conference on Human-Computer Interaction. Zurich, Switzerland: ACM Press. pp. 709-712, 2003. 36. Baumer, D.L., J.B. Earp, and P.S. Evers, Tit for Tat in Cyberspace: Consumer and Website Responses to Anarchy in the Market for Personal Information.
North Carolina Journal of Law and Technology 2003. 4(2): p. 217-274. 37. BBC News, Radio tags spark privacy worries. 2003. http://news.bbc.co.uk/1/hi/technology/3224920.stm 38. Beckwith, R., Designing for Ubiquity: The Perception of Privacy. IEEE Pervasive 2002. 2(2): p. 40-46. 39. Begole, J., J.C. Tang, R.B. Smith, and N. Yankelovich. Work Rhythms: Analyzing Visualizations of Awareness Histories of Distributed Groups. In Proceedings of CSCW '02. New Orleans, LA, USA: ACM Press, November 16-20, 2002. 40. Begole, J.B., N.E. Matsakis, and J.C. Tang. Lilsys: Sensing Unavailability. In Proceedings of Conference on Computer Supported Cooperative Work. Chicago: ACM Press, 2004. 41. Bekkering, E. and J.P. Shim, Trust in videoconferencing. Communications of the ACM 2006. 49(7): p. 103-107. 42. Bellotti, V., Design for Privacy in Multimedia Computing and Communications Environments, in Technology and Privacy: The New Landscape, P. Agre and M. Rotenberg, Editors. MIT Press: Cambridge, MA, USA, 1997. 43. Bellotti, V. and A. Sellen. Design for Privacy in Ubiquitous Computing Environments. In Proceedings of The Third European Conference on Computer Supported Cooperative Work (ECSCW'93). Milan, Italy: Kluwer Academic Publishers, 1993. 44. Berendt, B., O. Günther, and S. Spiekermann, Privacy in e-commerce: stated preferences vs. actual behavior. Communications of the ACM 2005. 48(4): p. 101-106. 45. Beresford, A.R. and F. Stajano, Location Privacy in Pervasive Computing. IEEE Pervasive Computing 2003. 2(1): p. 46-55. 46. Berleur, J. and K. Brunnstein, Ethics of Computing: Codes, spaces for discussion and law. ed. Chapman & Hall: London, England, 1996. 47. Blackburn, M., HIPAA, Heal Thyself. Johns Hopkins Magazine 2004. 56(5). 48. boyd, d., Faceted Id/entity: Managing representation in a digital world, Unpublished Master's Thesis, MIT, Cambridge, MA, 2002. 49. Boyle, M., C. Edwards, and S. Greenberg.
The effects of filtered video on awareness and privacy. In Proceedings of ACM CSCW 2000: ACM Press. pp. 1-10, 2000. 50. Boyle, M. and S. Greenberg, The language of privacy: Learning from video media space analysis and design. ACM Transactions on Computer-Human Interaction (TOCHI) 2005. 12(2). 51. Brin, D., The Transparent Society. Reading, MA: Perseus Books, 1998. 52. British Institute of International and Comparative Law, The implementation of Directive 95/46/EC to the Processing of Sound and Image Data. Technical Report, 2003. 53. Brostoff, S. and M.A. Sasse. Safe and sound: a safety-critical design approach to security. In Proceedings of New Security Paradigms Workshop 2001: ACM Press. pp. 41-50, Sept. 10-13, Cloudcroft, NM, 2001. 54. Brunk, B., Understanding the Privacy Space. 2002. http://www.firstmonday.org/issues/issue7_10/brunk/ 55. Brusilovsky, P., A. Kobsa, and W. Nejdl, The Adaptive Web: Methods and Strategies of Web Personalization. ed. Springer Verlag: Heidelberg, Germany, 2007. 56. Buchenau, M. and J.F. Suri. Experience Prototyping. In Proceedings of DIS 2000: ACM Press. pp. 424-433, 2000. 57. Byers, S., L.F. Cranor, D. Kormann, and P. McDaniel. Searching for Privacy: Design and Implementation of a P3P-Enabled Search Engine. In Proceedings of Workshop on Privacy Enhancing Technologies (PET2004), 2004. 58. Cadiz, J. and A. Gupta, Privacy Interfaces for Collaboration. Technical Report MSR-TR-2001-82, Microsoft Research, Redmond, WA, 2001. 59. California Nurses Association, Eden RNs Protest Electronic Tracking Devices: Mass Turn-in of Nurse Locator Buttons. 2002. http://www.calnurse.org/cna/press/90402a.html 60. Chaum, D., Untraceable Electronic Mail, Return Addresses, and Digital Pseudonyms. Communications of the ACM 1981. 24(2): p. 84-88. 61. Chung, E., J. Hong, J. Lin, M. Prabaker, J. Landay, and A. Liu.
Development and Evaluation of Emerging Design Patterns for Ubiquitous Computing. In Proceedings of DIS 2004: Designing Interactive Systems. Boston, MA, USA: ACM Press. pp. 233-242, 2004. 62. Citro, C.F., D.R. Iglen, and C.B. Marrett, Protecting Participants and Facilitating Social and Behavioral Sciences Research. ed. National Academies Press: Washington, DC, USA, 2003. 63. Colbert, M. A diary study of rendezvousing: implications for position-aware computing and communications for the general public. In Proceedings of 2001 International ACM SIGGROUP Conference on Supporting Group Work. Boulder, Colorado, USA: ACM Press. pp. 15-23, 2001. http://doi.acm.org/10.1145/500286.500292 64. Commission of the European Communities, First report on the implementation of the Data Protection Directive (95/46/EC). Technical Report COM(2003) 265 final, Commission of the European Communities, Brussels, Belgium, 15/5/2003. 65. Consolvo, S., I. Smith, T. Matthews, A. LaMarca, J. Tabert, and P. Powledge. Location Disclosure to Social Relations: Why, When, & What People Want to Share. In Proceedings of CHI 2005, Conference on Human Factors in Computing Systems: ACM Press. pp. 82-90, 2005. 66. Cool, C., R.S. Fish, R. Kraut, and C.M. Lowery. Iterative Design of Video Communication Systems. In Proceedings of CSCW '92: ACM Press. pp. 25-32, November 1992. 67. Council of Europe, The European Convention on Human Rights. Technical Report, Rome, Italy, 1950. 68. Coy, P., Big Brother, Pinned to your chest, Business Week, 1992. 69. Cranor, L. I Didn't Buy It for Myself: Privacy and Ecommerce Personalization. In Proceedings of Workshop on Privacy in the Electronic Society. Washington, DC, USA: ACM Press, 2003. 70. Cranor, L., What Do They "Indicate?" Evaluating Security and Privacy Indicators, Interactions, vol. 13(3): pp. 45-57, 2006. 71. Cranor, L., P. Guduru, and M. Arjula, User Interfaces for Privacy Agents.
To appear in ACM Transactions on Computer-Human Interaction 2006. 72. Cranor, L., M. Langheinrich, M. Marchiori, M. Presler-Marshall, and J. Reagle, The Platform for Privacy Preferences 1.0 (P3P1.0) specification. 2000, W3C. http://www.w3.org/TR/P3P/ 73. Cranor, L.F. and S. Garfinkel, Security and Usability: Designing Secure Systems That People Can Use. ed. O'Reilly: Sebastopol, CA, USA, 2005. 74. Cranor, L.F., J.I. Hong, and M. Reiter, Usable Privacy and Security: Course Overview Lecture Notes. 2006. http://cups.cs.cmu.edu/courses/ups-sp06/slides/060117-overview.ppt 75. Cranor, L.F., M. Langheinrich, and M. Marchiori, A P3P Preference Exchange Language 1.0 (APPEL1.0). 2002, World Wide Web Consortium Working Draft. http://www.w3.org/TR/WD-P3P-Preferences 76. Cranor, L.F., J. Reagle, and M.S. Ackerman, Beyond Concern: Understanding Net Users' Attitudes About Online Privacy, in The Internet Upheaval: Raising Questions, Seeking Answers in Communications Policy, I. Vogelsang and B.M. Compaine, Editors. MIT Press: Cambridge, MA. p. 47-70, 2000. 77. Culnan, M.J. and P.K. Armstrong, Information privacy concerns, procedural fairness, and impersonal trust: an empirical investigation. Organization Science 1999. 10(1): p. 104-115. 78. Curry, M.R., D.J. Phillips, and P.M. Regan, Emergency Response Systems and the Creeping Legibility of People and Places. The Information Society 2004. 20(5): p. 357-369. 79. Darrah, C., J. English-Lueck, and J. Freeman, Families and Work: An Ethnography of Dual Career Families. 2001. http://www2.sjsu.edu/depts/anthropology/svcp/SVCPslnr.html 80. Davis, F.D., Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly 1989. 13(3): p. 319-340. 81. Davis, S. and C.
Gutwin. Using relationship to control disclosure in Awareness servers. In Proceedings of 2005 Conference on Graphics Interface. Victoria, British Columbia, May 9-11, 2005: Canadian Human-Computer Communications Society, School of Computer Science, University of Waterloo, Waterloo, Ontario. pp. 145-152, 2005. 82. Department of Health and Human Services, National Institutes of Health, and Office for Protection from Research Risks, Protection of Human Subjects. 2001. 83. DePaulo, B.M. and D.A. Kashy, Everyday Lies in Close and Casual Relationships. Journal of Personality and Social Psychology 1998. 74(1): p. 63-79. 84. Doheny-Farina, S., The Last Link: Default = Offline, Or Why Ubicomp Scares Me, Computer-mediated Communication, vol. 1(6): pp. 18-20, 1994. 85. Dourish, P. and K. Anderson, Privacy, Security and Risk and Danger and Secrecy and Trust and Morality and Identity and Power: Understanding Collective Information Practices. Technical Report UCI-ISR-05-1, Institute for Software Research, University of California at Irvine, Irvine, CA, USA, 2005. 86. Dourish, P., B.E. Grinter, J.D.d.l. Flor, and M. Joseph, Security in the wild: user strategies for managing security as an everyday, practical problem. Personal and Ubiquitous Computing 2004. 8(6). 87. Duckham, M. and L. Kulik. A Formal Model of Obfuscation and Negotiation for Location Privacy. In Proceedings of Pervasive 2005. Munich, Germany: Springer Verlag. pp. 152-170, 2005. http://www.springerlink.com/openurl.asp?genre=article&id=doi:10.1007/11428572_10 88. Dunn, E.S., The Idea of a National Data Center and the Issue of Personal Privacy. The American Statistician 1967. 21(1): p. 21-27. 89. Ebling, M.R., B.E. John, and M. Satyanarayanan, The Importance of Translucence in Mobile Computing Systems. ACM Transactions on Computer-Human Interaction (TOCHI) 2002. 9(1): p. 42-67. 90. Egelman, S., L.F.
Cranor, and A. Chowdhury. An Analysis of P3P-Enabled Web Sites among Top-20 Search Results. In Proceedings of Eighth International Conference on Electronic Commerce. Fredericton, New Brunswick, Canada, August 14-16, 2006. 91. Egelman, S., J. Tsai, L.F. Cranor, and A. Acquisti. Studying the Impact of Privacy Information on Online Purchase Decisions. In Proceedings of 2006 CHI Privacy Methods Workshop. Montreal, Quebec, Canada, 2006. 92. Ehn, P. and M. Kyng, The Collective Approach to Systems Design, in Computers and Democracy: A Scandinavian Challenge, G. Bjerkes, P. Ehn, and M. Kyng, Editors. Avebury: Aldershot, Great Britain. p. 17-58, 1987. 93. Electronic Privacy Information Center (EPIC) and Junkbusters, Pretty Poor Privacy: An Assessment of P3P and Internet Privacy. 2000. http://www.epic.org/reports/prettypoorprivacy.html 94. Erickson, T. and W.A. Kellogg, Social translucence: an approach to designing systems that support social processes. ACM Transactions on Computer-Human Interaction (TOCHI) 2000. 7(1): p. 59-83. 95. Espey, J., G. Rudinger, and H. Neuf, Excessive Demands on Users of Technology, in Multilateral Security: Technology, Infrastructure, Economy, G. Müller and K. Rannenberg, Editors. Addison Wesley Longman Verlag GmbH. p. 439-449, 1999. 96. Esslinger, B. and D. Fox, Public Key Infrastructures in Banks: Enterprise-wide PKIs, in Multilateral Security in Communications: Technology, Infrastructure, Economy, G. Müller and K. Rannenberg, Editors. Addison Wesley Longman Verlag GmbH. p. 283-300, 1999. 97. Etzioni, A., The Limits of Privacy. New York: Basic Books, 1999. 98. European Commission, Information Technology Security Evaluation Criteria. Technical Report, Version 1.2, June 1991. 99.
European Commission Article 29 Working Party, Opinion 4/2004 on the Processing of Personal Data by means of Video Surveillance. Technical Report 11750/02/EN WP 89, 2004. 100. European Commission Article 29 Working Party, Opinion on More Harmonised Information Provisions. Technical Report 11987/04/EN WP 100, European Commission, November 25, 2004. 101. European Commission Article 29 Working Party, Working Document on Biometrics. Technical Report 12168/02/EN WP80, 2004. 102. European Opinion Research Group EEIG, Special Eurobarometer Data Protection Executive Summary. Technical Report Special Eurobarometer 196 Wave 60.0, European Commission, Bruxelles, Belgium, December 2003. 103. Fallman, D. Design-oriented Human-Computer Interaction. In Proceedings of CHI 2003. Ft. Lauderdale, Florida, USA: ACM Press. pp. 225-232, April 5-10, 2003. 104. Federal Trade Commission, In the Matter of CardSystems Solutions, Inc., and Solidus Networks, Inc., Doing Business as Pay By Touch Solutions: Complaint. Technical Report File No. 052 3148, Federal Trade Commission, February 23, 2006. 105. Feng, J., A Brief Review of Research Methods in Privacy Studies, in Privacy Methods Workshop at CHI 2006. 2006: Montreal, Quebec, Canada. 106. Fischer, C.S., America Calling: A Social History of the Telephone to 1940: University of California Press. 424, 1994. 107. Fischer, G., A.C. Lemke, T. Mastaglio, and A.I. Morch. Using critics to empower users. In Proceedings of Conference on Human Factors in Computing Systems. Seattle, WA, USA: ACM Press, New York, NY. pp. 337-347, 1990. http://doi.acm.org/10.1145/97243.97305 108. Fish, R., R.E. Kraut, R.W. Root, and R.E. Rice. Evaluating video as a technology for informal communication. In Proceedings of Human Factors in Computing Systems (CHI '92), 1992. http://doi.acm.org/10.1145/142750.142755 109. Fogarty, J., J. Lai, and J.
Christensen, Presence versus Availability: The Design and Evaluation of a Context-Aware Communication Client. International Journal of Human-Computer Studies 2004. 61(3): p. 299-317. 110. Fogg, B.J. and H. Tseng. The elements of computer credibility. In Proceedings of SIGCHI Conference on Human Factors in Computing Systems: the CHI Is the Limit. Pittsburgh, Pennsylvania, United States: ACM Press. pp. 80-87, May 15-20, 1999. http://doi.acm.org/10.1145/302979.303001 111. Foresti, G., P. Mähönen, and C. Regazzoni, Multimedia video-based surveillance systems: Requirements, Issues and Solutions. ed. Kluwer Academic Publishers: Norwell, MA, USA, 2000. 112. Friedman, B., Value-Sensitive Design. Interactions: New Visions of Human-Computer Interaction 1996. 3(6): p. 17-23. 113. Friedman, B., D.C. Howe, and E. Felten. Informed Consent in the Mozilla Browser: Implementing Value-Sensitive Design. In Proceedings of The Thirty-Fifth Annual Hawai'i International Conference on System Sciences: IEEE Computer Society. pp. CD-ROM of full paper, OSPE101, 2002. 114. Friedman, B., D. Hurley, D.C. Howe, E. Felten, and H. Nissenbaum. Users' conceptions of web security: a comparative study. In Proceedings of CHI '02 extended abstracts on Human factors in computing systems. Minneapolis, Minnesota, USA, April 20-25, 2002. 115. Gamma, E., R. Helm, R. Johnson, and J. Vlissides, Design Patterns: Elements of Reusable Object-Oriented Software: Addison-Wesley. 395, 1995. 116. Garfinkel, S. Adopting Fair Information Practices to Low Cost RFID Systems. In Proceedings of Ubiquitous Computing 2002 Privacy Workshop, 2002. http://www.teco.edu/~philip/ubicomp2002ws/ 117. Gaver, W., J. Beaver, and S. Benford. Ambiguity as a Resource for Design. In Proceedings of CHI 2003. Ft. Lauderdale, FL, USA: ACM Press. pp. 233-240, 2003. 118. Gaver, W., T. Moran, A. MacLean, L. Lövstrand, P.
Dourish, K. Carter, and W. Buxton. Realizing a Video Environment: EuroPARC's RAVE System. In Proceedings of CHI '92. Monterey, CA, USA: ACM Press. pp. 27–35, May 1992. 119. Gaw, S., E.W. Felten, and P. Fernandez-Kelly. Secrecy, flagging, and paranoia: adoption criteria in encrypted email. In Proceedings of SIGCHI Conference on Human Factors in Computing Systems. Montréal, Québec, Canada: ACM Press. pp. 591–600, April 22–27, 2006. http://doi.acm.org/10.1145/1124772.1124862 120. Giddens, A., Modernity and Self-Identity: Self and Society in the Late Modern Age. Stanford, CA, USA: Stanford University Press, 1991. 121. Gideon, J., S. Egelman, L.F. Cranor, and A. Acquisti. Power Strips, Prophylactics, and Privacy, Oh My! In Proceedings of The 2006 Symposium On Usable Privacy and Security (SOUPS 2006). Pittsburgh, PA, 2006. 122. Goffman, E., Behavior in Public Places: Free Press, 1966. 123. Goffman, E., The Presentation of Self in Everyday Life. New York: Anchor, Doubleday, 1959. 124. Goldberg, I. Privacy-enhancing technologies for the Internet, II: Five years later. In Proceedings of Workshop on Privacy Enhancing Technologies (PET 2002), 2002. 125. Goldschlag, D.M., M.G. Reed, and P.F. Syverson, Onion routing for anonymous and private Internet connections. Commun. ACM 1999. 42(2): p. 39–41. 126. Good, N.S., R. Dhamija, J. Grossklags, D. Thaw, S. Aronowitz, D. Mulligan, and J. Konstan. Stopping Spyware at the Gate: A User Study of Privacy, Notice and Spyware. In Proceedings of Symposium On Usable Privacy and Security (SOUPS 2005). Pittsburgh, PA, USA: ACM Press, July 6–8, 2005. 127. Good, N.S. and A. Krekelberg. Usability and Privacy: A Study of Kazaa P2P File-Sharing. In Proceedings of CHI 2003: ACM Press. pp. 137–144, 2003. http://portal.acm.org/citation.cfm?id=1073001.1073006 128. Grasso, A. and J.-L. Meunier. Who Can Claim Complete Abstinence from Peeking at Print Jobs?
In Proceedings of CSCW '02: ACM Press. pp. 296–305, 2002. 129. Gray, W.D., B.E. John, and M.E. Atwood, Project Ernestine: Validating a GOMS analysis for predicting and explaining real-world performance. Human-Computer Interaction 1993. 8(3): p. 237–309. 130. Grinter, R., P. Dourish, J. Delgado de la Flor, and M. Joseph, Security in the wild: user strategies for managing security as an everyday, practical problem. Personal and Ubiquitous Computing 2004. 8(6): p. 391–401. 131. Grinter, R.E. and L. Palen. Instant Messaging in Teenage Life. In Proceedings of ACM Conference on Computer Supported Cooperative Work (CSCW 2002): ACM Press. pp. 21–30, 2002. http://doi.acm.org/10.1145/587078.587082 132. Grudin, J., Desituating Action: Digital Representation of Context. Human-Computer Interaction (HCI) Journal 2001. 16(2-4). 133. Grudin, J., Groupware and Social Dynamics: Eight Challenges for Developers. Communications of the ACM 1994. 37(1): p. 92–105. 134. Grudin, J., Three faces of human-computer interaction. Annals of the History of Computing 2005. 27(4): p. 46–62. 135. Grudin, J. and E. Horvitz, Presenting choices in context: approaches to information sharing. 2003: Workshop on Ubicomp communities: Privacy as Boundary Negotiation. http://guir.berkeley.edu/pubs/ubicomp2003/privacyworkshop/papers.htm 136. Gruteser, M. and D. Grunwald. A Methodological Assessment of Location Privacy Risks in Wireless Hotspot Networks. In Proceedings of Security in Pervasive Computing Conference. Boppard, Germany: Springer Verlag. pp. 10–24, 2004. http://www.springerlink.com/openurl.asp?genre=article&id=0H4KDUY2FXYPA3JC 137. GVU Center, 10th WWW User Survey: Online Privacy and Security. Technical Report, GVU Center, Georgia Institute of Technology, 1999. 138. Häkkilä, J. and C. Chatfield.
Toward social mobility: 'It's like if you opened someone else's letter': user perceived privacy and social practices with SMS communication. In Proceedings of Human Computer Interaction with Mobile Devices & Services (MobileHCI '05). Salzburg, Austria: ACM Press. pp. 219–222, September 2005. http://doi.acm.org/10.1145/1085777.1085814 139. Hancock, J.T., J. Thom-Santelli, and T. Ritchie. Deception and Design: The Impact of Communication Technology on Lying Behavior. In Proceedings of CHI 2004. Vienna, Austria: ACM Press. pp. 129–134, April 24–29, 2004. 140. Harper, R.H. Why People Do and Don't Wear Active Badges: A Case Study. In Proceedings of Computer Supported Cooperative Work (CSCW '96). pp. 297–318, 1996. 141. Harris Interactive, IBM Multi-National Consumer Privacy Survey. Technical Report, 1999. 142. Hawkey, K. and K.M. Inkpen. Keeping up appearances: understanding the dimensions of incidental information privacy. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Montréal, Québec, Canada: ACM Press. pp. 821–830, 2006. 143. Hayes, G.R. and G.D. Abowd. Tensions in designing capture technologies for an evidence-based care community. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Montréal, Québec, Canada: ACM Press. pp. 937–946, 2006. 144. Hilty, L.M., C. Som, and A. Köhler, Assessing the Human, Social and Environmental Risks of Pervasive Computing. Human and Ecological Risk Assessment 2004. 10: p. 853–874. 145. Hindus, D., M. Ackerman, S.D. Mainwaring, and B. Starr. Thunderwire: A Field Study of an Audio-Only Media Space. In Proceedings of Computer Supported Cooperative Work '96. Cambridge, MA, USA: ACM Press. pp. 238–247, 1996. 146. Hindus, D., S.D. Mainwaring, N. Leduc, A.E. Hagström, and O. Bayley, Casablanca: Designing Social Communication Devices for the Home.
CHI Letters (Human Factors in Computing Systems: CHI 2001) 2001. 3(1): p. 325–332. 147. Hochheiser, H., The platform for privacy preference as a social protocol: an examination within the US policy context. ACM Trans. Internet Technol. 2002. 2(4): p. 276–306. 148. Holtzblatt, K. and H. Beyer, Contextual Design: A Customer-Centered Approach to Systems Design: Morgan Kaufmann, 1997. 149. Hong, J., J.D. Ng, S. Lederer, and J.A. Landay. Privacy Risk Models for Designing Privacy-Sensitive Ubiquitous Computing Systems. In Proceedings of DIS 2004: ACM Press. pp. 91–100, 2004. 150. Hong, J.I., J. Ng, and J.A. Landay. Privacy Risk Models for Designing Privacy-Sensitive Ubiquitous Computing Systems. In Proceedings of Designing Interactive Systems (DIS 2004). Boston, MA. pp. 91–100, 2004. 151. Hudson, J.M. and A. Bruckman. Using Empirical Data to Reason about Internet Research Ethics. In Proceedings of ECSCW '05. Paris, France: Springer Verlag. pp. 287–306, September 18–22, 2005. 152. Hudson, S. and I. Smith. Techniques for Addressing Fundamental Privacy and Disruption Tradeoffs in Awareness Support Systems. In Proceedings of CSCW '96: ACM Press. pp. 248–257, 1996. http://doi.acm.org/10.1145/240080.240295 153. Husted, B., ChoicePoint's fine sets record, Atlanta Journal-Constitution, 2006. 154. Iachello, G., Privacy and Proportionality, Unpublished Doctoral Dissertation, Georgia Institute of Technology, Atlanta, GA, USA, 2006. http://etd.gatech.edu 155. Iachello, G. Protecting Personal Data: Can IT Security Management Standards Help? In Proceedings of ACSAC. Las Vegas, Nevada, USA: IEEE Press. pp. 266–275, December 2003. 156. Iachello, G. and G.D. Abowd. Privacy and Proportionality: Adapting Legal Evaluation Techniques to Inform Design in Ubiquitous Computing. In Proceedings of CHI 2005. Portland, OR, USA: ACM Press. pp. 91–100, 2005. 157. Iachello, G., I. Smith, S. Consolvo, M. Chen, and G.D. Abowd.
Developing Privacy Guidelines for Social Location Disclosure Applications and Services. In Proceedings of Symposium On Usable Privacy and Security (SOUPS). Pittsburgh, PA, USA: ACM Press. pp. 65–76, July 6–8, 2005. 158. Iachello, G., K.N. Truong, G.D. Abowd, G.R. Hayes, and M. Stevens. Experience Prototyping and Sampling to Evaluate Ubicomp Privacy in the Real World. In Proceedings of CHI 2006. Montreal, Canada: ACM Press, 2006. 159. IBM Corporation, Privacy is good for business. 2007. http://www.ibm.com/innovation/us/customerloyalty/harriet_pearson_interview.shtml 160. International Labor Organization, Workers' Privacy Part II: Monitoring and Surveillance in the Workplace. Conditions of Work Digest, Special Series on Workers' Privacy, Vol. 12(1), 1993. 161. International Organization for Standardization / International Electrotechnical Commission, ISO/IEC 17799:2000, Information technology – Code of practice for information security management. Technical standard, 2000. 162. Ito, M. and D. Okabe, Mobile Phones, Japanese Youth and the Replacement of Social Contact, in Front Stage/Back Stage: Mobile Communication and the Renegotiation of the Social Sphere, R. Ling and P. Pedersen, Editors. Grimstad, Norway, 2003. 163. Jacobs, A.R. and G.D. Abowd, A framework for comparing perspectives on privacy and pervasive technologies. IEEE Pervasive Computing 2003. 2(4): p. 78–84. 164. Jagatic, T., N. Johnson, M. Jakobsson, and F. Menczer, Social Phishing. To appear in Commun. ACM, 2005. 165. Jancke, G., G.D. Venolia, J. Grudin, J. Cadiz, and A. Gupta, Linking Public Spaces: Technical and Social Issues. CHI Letters (Human Factors in Computing Systems: CHI 2001) 2001. 3(1): p. 530–537. 166. Jendricke, U. and D.
Gerd tom Markotten, Usability Meets Security: The Identity-Manager as Your Personal Security Assistant for the Internet, in 16th Annual Computer Security Applications Conference (ACSAC '00). 2000: New Orleans, LA, USA. 167. Jensen, C., Designing For Privacy in Interactive Systems, Unpublished Doctoral Dissertation, Georgia Institute of Technology, Atlanta, GA, USA, 2005. http://etd.gatech.edu 168. Jensen, C. and C. Potts. Privacy policies as decision-making tools: an evaluation of online privacy notices. In Proceedings of CHI 2004: ACM Press, 2004. 169. Jensen, C., C. Potts, and C. Jensen, Privacy practices of Internet users: Self-reports versus observed behavior. Int. J. Human-Computer Studies 2005. 63: p. 203–227. 170. Jensen, C., J. Tullio, C. Potts, and E.D. Mynatt, STRAP: A Structured Analysis Framework for Privacy. GVU Technical Report 05-02, GVU Center, Georgia Institute of Technology, Atlanta, GA, USA, January 2005. 171. Jesdanun, A., AOL: Breach of Privacy Was a Mistake, Associated Press Financial Wire, 2006. 172. Jiang, X., J.I. Hong, and J.A. Landay. Approximate Information Flows: Socially-based Modeling of Privacy in Ubiquitous Computing. In Proceedings of Ubicomp 2002. Göteborg, Sweden. pp. 176–193, 2002. 173. Junestrand, S., U. Keijer, and K. Tollmar, Private and public digital domestic spaces. International Journal of Human-Computer Studies 2001. 54(5): p. 753–778. 174. Kaasinen, E., User Needs for Location-aware Mobile Services. Personal and Ubiquitous Computing 2003. 7(1): p. 70–79. 175. Karas, S., Privacy, Identity, Databases. 52 American University Law Review 393, 2003. 176. Karat, C.-M., J. Karat, C. Brodie, and J. Feng. Evaluating interfaces for privacy policy rule authoring. In Proceedings of SIGCHI Conference on Human Factors in Computing Systems. Montréal, Québec, Canada: ACM Press. pp. 83–92, April 22–27, 2006.
http://doi.acm.org/10.1145/1124772.1124787 177. Keller, A.M., D. Mertz, J.L. Hall, and A. Urken. Privacy issues in an electronic voting machine. In Proceedings of 2004 ACM Workshop on Privacy in the Electronic Society: ACM Press, October 2004. 178. Kindberg, T., A. Sellen, and E. Geelhoed. Security and Trust in Mobile Interactions: A Study of Users' Perceptions and Reasoning. In Proceedings of Ubicomp 2004. Nottingham, UK: Springer Verlag. pp. 196–213, September 7–10, 2004. 179. Kinzie, S. and Y. Noguchi, In Online Social Club, Sharing Is the Point, Until It Goes Too Far, Washington Post, p. A01, 2006. 180. Kling, R., Fair Information Practices with Computer Supported Cooperative Work (CSCW). Computer-Mediated Communication Magazine 1994. 1(2): p. 5. 181. Klinke, A. and O. Renn, A New Approach to Risk Evaluation and Management: Risk-Based, Precaution-Based, and Discourse-Based Strategies. Risk Analysis 2002. 22(6): p. 1071–1094. 182. Knapp, K.J., T.E. Marshall, R.K. Rainer, and D.W. Morrow, Top Ranked Information Security Issues: The 2004 International Information Systems Security Certification Consortium (ISC)2 Survey Results. Technical Report, Auburn University, Auburn, AL, 2004. 183. Kobsa, A., Personalized hypermedia and international privacy. Communications of the ACM 2002. 45(5): p. 64–67. 184. Kobsa, A. and J. Schreck, Privacy through pseudonymity in user-adaptive systems. ACM Transactions on Internet Technology (TOIT) 2003. 3(2). 185. Kontio, J., L. Lehtola, and J. Bragge. Using the Focus Group Method in Software Engineering: Obtaining Practitioner and User Experiences. In Proceedings of International Symposium on Empirical Software Engineering (ISESE). Redondo Beach, CA, USA: IEEE Computer Society, August 19–20, 2004. 186. Kowitz, B. and L. Cranor. Peripheral Privacy Notifications for Wireless Networks. In Proceedings of Workshop On Privacy In The Electronic Society '05. Alexandria, VA, USA. pp. 90–96, 2005.
187. Kraut, R.E., C. Cool, R.E. Rice, and R.S. Fish. Life and Death of New Technology: Task, Utility and Social Influences on the Use of a Communication Medium. In Proceedings of CSCW '94. Chapel Hill, NC, USA: ACM Press. pp. 13–21, 1994. 188. Kumaraguru, P. and L.F. Cranor, Privacy Indexes: A Survey of Westin's Studies. Technical Report CMU-ISRI-05-138, Institute for Software Research International, School of Computer Science, Carnegie Mellon University, December 2005. 189. Lahlou, S. Living in a goldfish bowl: lessons learned about privacy issues in a privacy-challenged environment. In Proceedings of Privacy Workshop at Ubicomp Conference. Tokyo, Japan, September 11, 2005. 190. Langheinrich, M. A Privacy Awareness System for Ubiquitous Computing Environments. In Proceedings of Ubicomp 2002. Göteborg, Sweden. pp. 237–245, 2002. 191. Langheinrich, M. Privacy by Design – Principles of Privacy-Aware Ubiquitous Systems. In Proceedings of Ubicomp 2001: Springer Verlag. pp. 273–291, 2001. 192. Lasprogata, G., N.J. King, and S. Pillay, Regulation of Electronic Employee Monitoring: Identifying Fundamental Principles of Employee Privacy through a Comparative Study of Data Privacy Legislation in the European Union, United States and Canada. Stanford Technology Law Review 2004. 2004(4). 193. Latour, B., We Have Never Been Modern. Cambridge, MA, USA: Harvard University Press, 1991. 194. Lau, T., O. Etzioni, and D.S. Weld, Privacy interfaces for information management. Communications of the ACM 1999. 42(10): p. 88–94. 195. Lederer, S., Designing Disclosure: Interactive Personal Privacy at the Dawn of Ubiquitous Computing, Unpublished Master's Thesis, University of California, Berkeley, Berkeley, CA, 2003. http://www.cs.berkeley.edu/projects/io/publications/privacy-lederer-msreport-1.01-no-appendicies.pdf 196. Lederer, S., J.I. Hong, A. Dey, and J.A.
Landay, Five Pitfalls in the Design for Privacy, in Security and Usability, S. Garfinkel and L.F. Cranor, Editors. p. 421–445, 2005. 197. Lederer, S., J. Mankoff, and A. Dey. Towards a Deconstruction of the Privacy Space. In Proceedings of Workshop on Privacy in Ubicomp 2003: Ubicomp communities: privacy as boundary negotiation. Seattle, WA, USA, 2003. http://guir.berkeley.edu/pubs/ubicomp2003/privacyworkshop/papers/lederer-privacyspace.pdf 198. Lederer, S., J. Mankoff, and A.K. Dey. Who Wants to Know What When? Privacy Preference Determinants in Ubiquitous Computing. In Proceedings of Extended Abstracts of CHI 2003, ACM Conference on Human Factors in Computing Systems. Fort Lauderdale, FL. pp. 724–725, 2003. 199. Lessig, L., The Architecture of Privacy. Vanderbilt Journal of Entertainment Law & Practice 1999. 1: p. 56. 200. Lessig, L., Code and Other Laws of Cyberspace. New York, NY: Basic Books, 1999. 201. Ling, R., The Mobile Connection: The Cell Phone's Impact on Society. 3rd ed: Morgan Kaufmann, 2004. 202. Mackay, W.E. Ethics, lies and videotape. In Proceedings of SIGCHI Conference on Human Factors in Computing Systems, 1995. 203. MacLean, A., R.M. Young, V. Bellotti, and T.P. Moran, Questions, Options, and Criteria: Elements of Design Space Analysis. Human-Computer Interaction (HCI) Journal 1991. 6(3&4): p. 201–250. 204. MacLean, A., R.M. Young, and T.P. Moran. Design Rationale: The Argument Behind The Artifact. In Proceedings of CHI 1989: ACM Press. pp. 247–252, 1989. 205. Maheu, M.M., P. Whitten, and A. Allen, E-Health, Telehealth, and Telemedicine: A Guide to Start-up and Success. Jossey-Bass Health Series. Jossey-Bass: San Francisco, 2001. 206. March, W. and C. Fleuriot. Girls, technology and privacy: "is my mother listening?" In Proceedings of SIGCHI Conference on Human Factors in Computing Systems. Montréal, Québec, Canada: ACM Press. pp.
107–110, April 22–27, 2006. http://doi.acm.org/10.1145/1124772.1124790 207. McCullagh, D., Intel Nixes Chip-Tracking ID. 2000. http://www.wired.com/news/politics/0,1283,35950,00.html 208. Melenhorst, A.S., A.D. Fisk, E.D. Mynatt, and W.A. Rogers. Potential intrusiveness of aware home technology: Perceptions of older adults. In Proceedings of Human Factors and Ergonomics Society 48th Annual Meeting: HFES Press. pp. 266–270, 2004. 209. Microsoft Corporation, Protecting Americans' Privacy. Issued March 20, 2007. http://www.microsoft.com/issues/essays/2007/03-20ProtectingPrivacy.mspx 210. Milewski, A.E. and T.M. Smith. Providing presence cues to telephone users. In Proceedings of The 2000 ACM Conference on Computer Supported Cooperative Work (CSCW 2000): ACM Press. pp. 89–96, 2000. 211. Millett, L.I., B. Friedman, and E. Felten, Cookies and Web Browser Design: Toward Realizing Informed Consent Online. CHI Letters 2001. 3(1): p. 46–52. 212. Mitnick, K. and W. Simon, The Art of Deception: Controlling the Human Element of Security. 1st ed: Wiley. 304, 2002. 213. Moor, J.H., Towards a Theory of Privacy in the Information Age. Computers and Society 1997. 27(3): p. 27–32. 214. Müller, G. and K. Rannenberg, Multilateral Security in Communications, Volume 3: Technology, Infrastructure, Economy. Addison Wesley: München, 1999. 215. Muller, M.J., J.G. Smith, J.Z. Shoher, and H. Goldberg, Privacy, anonymity and interpersonal competition issues identified during participatory design of project management groupware. ACM SIGCHI Bulletin 1990. 23(1). 216. Murphy, R.S., Property Rights in Personal Information: An Economic Defense of Privacy. Georgetown Law Journal 1996. 84 Geo. L.J. 2381. 217.
Nagel, K., Using Availability Indicators to Enhance Context-Aware Family Communication Applications, Unpublished Doctoral Dissertation, Georgia Institute of Technology, Atlanta, GA, USA, 2006. 218. Nagel, K., J. Hudson, and G.D. Abowd. Predictors of availability in home life context-mediated communication. In Proceedings of CSCW '04: ACM Press. pp. 497–506, 2004. 219. Nagel, K., C.D. Kidd, T. O'Connell, A. Dey, and G.D. Abowd. The Family Intercom: Developing a Context-Aware Audio Communication System. In Proceedings of Ubicomp 2001. Atlanta, GA. pp. 176–183, 2001. 220. Nardi, B., S. Whittaker, and E. Bradner. Interaction and Outeraction: Instant Messaging in Action. In Proceedings of ACM Conference on Computer Supported Cooperative Work (CSCW 2000): ACM Press. pp. 79–88, 2000. 221. Neal, A., M. Humphreys, D. Leadbetter, and P. Lindsay, Development of hazard analysis techniques for human-computer systems, in Innovation and Consolidation in Aviation, G. Edkins and P. Pfister, Editors. Ashgate: Aldershot, UK. p. 255–262, 2003. 222. Neustaedter, C. and S. Greenberg. The Design of a Context-Aware Home Media Space. In Proceedings of Fifth International Conference on Ubiquitous Computing (UbiComp 2003): Springer-Verlag. pp. 297–314, 2003. 223. Neustaedter, C. and S. Greenberg. The Design of a Context-Aware Home Media Space for Balancing Privacy and Awareness. In Proceedings of UbiComp 2003: Springer Verlag. pp. 297–314, 2003. 224. Neustaedter, C., S. Greenberg, and M. Boyle, Blur Filtration Fails to Preserve Privacy for Home-Based Video Conferencing. To appear in ACM Transactions on Computer-Human Interaction (TOCHI), 2005, in press. 225. Nguyen, D.H. and E.D. Mynatt. Privacy Mirrors: Making Ubicomp Visible. In Proceedings of Human Factors in Computing Systems: CHI 2001 (Workshop on Building the User Experience in Ubiquitous Computing). Seattle, WA: ACM Press, 2001. 226. Nielsen, J. and R.L. Mack, Usability Inspection Methods. John Wiley & Sons: New York, NY, USA, 1994. 227.
Norman, D.A., The Design of Everyday Things. New York, NY: Basic Books, 2002. 228. Norris, C. and G. Armstrong, The Maximum Surveillance Society: The Rise of CCTV. Oxford, England: Berg, 1999. 229. Olson, J.S., J. Grudin, and E. Horvitz. A study of preferences for sharing and privacy. In Proceedings of CHI '05 Extended Abstracts on Human Factors in Computing Systems, April 2005. 230. Organization for Economic Co-operation and Development, Guidelines on the Protection of Privacy and Transborder Flows of Personal Data. Technical Report, 1980. 231. Palen, L., Social, Individual and Technological Issues for Groupware Calendar Systems. CHI Letters (Human Factors in Computing Systems: CHI 99) 1999. 2(1): p. 17–24. 232. Palen, L. and P. Dourish, Unpacking "Privacy" for a Networked World. CHI Letters (Human Factors in Computing Systems: CHI 2003) 2003. 5(1): p. 129–136. 233. Patil, S. and A. Kobsa. Instant Messaging and Privacy. In Proceedings of HCI 2004. Leeds, UK. pp. 85–88, 2004. 234. Patrick, A., P. Briggs, and S. Marsh, Designing Systems that People Will Trust, in Security and Usability, L. Cranor and S. Garfinkel, Editors. p. 75–99, 2005. 235. Patrick, A.S. and S. Kenny. From Privacy Legislation to Interface Design: Implementing Information Privacy in Human-Computer Interactions. In Proceedings of PET 2003: Springer Verlag. pp. 107–124, 2003. 236. Pettersson, J.S., S. Fischer-Hübner, N. Danielsson, J. Nilsson, M. Bergmann, S. Clauss, T. Kriegelstein, and H. Krasemann. Making PRIME Usable. In Proceedings of SOUPS '05. Pittsburgh, PA, USA: ACM Press, 2005. 237. Pew Internet & American Life, Testimony of Lee Rainie: Director, Pew Internet & American Life Project. 2001. http://www.pewinternet.org/reports/toc.asp?Report=34 238. Pew Internet & American Life, Trust and Privacy Online: Why Americans Want to Rewrite the Rules. 2000.
http://www.pewinternet.org/reports/toc.asp?Report=19 239. Pfitzmann, A., Technologies for Multilateral Security, in Technology, Infrastructure, Economy, G. Müller and K. Rannenberg, Editors. Addison Wesley Longman Verlag GmbH. p. 85–91, 1999. 240. Posner, R.A., An Economic Theory of Privacy. Regulation 1978: p. 19–26. 241. Povey, D. Optimistic Security: A New Access Control Paradigm. In Proceedings of 1999 New Security Paradigms Workshop, 1999. http://security.dstc.edu.au/staff/povey/papers/optimistic.pdf 242. Price, B.A., K. Adam, and B. Nuseibeh, Keeping ubiquitous computing to yourself: A practical model for user control of privacy. Int. J. Human-Computer Studies 2005. 63: p. 228–253. 243. Privacy & American Business, Consumer Privacy Attitudes: A Major Shift Since 2000 and Why. Privacy & American Business Newsletter 2003. 10(6). 244. Privacy Protection Study Commission, Personal Privacy in an Information Society. Technical Report, Government Printing Office, Washington, D.C., USA, 1977. 245. Rambøll Management Denmark, Economic Evaluation of the Data Protection Directive 95/46/EC – Final Report. Technical Report, May 2005. 246. Rannenberg, K. Multilateral Security: A Concept and Examples for Balanced Security. In Proceedings of New Security Paradigms Workshop. Ballycotton, Ireland: ACM Press. pp. 151–162, 2000. 247. Rannenberg, K. Recent Development in Information Technology Security Evaluation – The Need for Evaluation Criteria for Multilateral Security. In Proceedings of Security and Control of Information Technology in Society, Proceedings of the IFIP TC9/WG 9.6 Working Conference. Onboard M/S Ilich and ashore at St. Petersburg, Russia: North-Holland, Amsterdam. pp. 113–128, August 12–17, 1993. 248. Rannenberg, K., What can IT Security Certification do for Multilateral Security? in Technology, Infrastructure, Economy, G. Müller and K.
Rannenberg, Editors. Addison Wesley Longman Verlag GmbH. p. 515–530, 1999. 249. Rheingold, H., PARC is Back! Wired 1994. 2(2). 250. Rogers, E., Diffusion of Innovations. 5th ed: Free Press, 2003. 251. Root, R.W. Design of a multi-media vehicle for social browsing. In Proceedings of The 1988 ACM Conference on Computer-Supported Cooperative Work (CSCW '88), 1988. http://doi.acm.org/10.1145/62266.62269 252. Roßnagel, A., R. Haux, and W. Herzog, Mobile und sichere Kommunikation im Gesundheitswesen [Mobile and Secure Communication in Health Care]. Vieweg: Braunschweig, Germany, 1999. 253. Rowan, G., Visionaries See Invisible Computing, The Globe and Mail, 1997. 254. Samuelson, P., Privacy As Intellectual Property? 52 Stanford Law Review 1125, 2000. 255. Sasse, A. Computer Security: Anatomy of a Usability Disaster, and a Plan for Recovery. In Proceedings of 2003 Workshop on Human-Computer Interaction and Security Systems at CHI 2003, 2003. http://www.andrewpatrick.ca/CHI2003/HCISEC/ 256. Scacchi, W., Socio-Technical Design, in The Encyclopedia of Human-Computer Interaction, W.S. Bainbridge, Editor. Berkshire Publishing Group, 2004. 257. Scalet, S.D., The Five Most Shocking Things About the ChoicePoint Debacle, CSO Magazine, 2005. 258. Schilit, B.N., D.M. Hilbert, and J. Trevor, Context-aware communication. IEEE Wireless Communications 2002. 9(5): p. 46–54. 259. Schmandt, C., J. Kim, K. Lee, G. Vallejo, and M. Ackerman. Mediated voice communication via mobile IP. In Proceedings of 15th Annual ACM Symposium on User Interface Software and Technology (UIST '02). Paris, France: ACM Press. pp. 141–150, October 27–30, 2002. http://doi.acm.org/10.1145/571985.572005 260. Schneier, B., Beyond Fear. New York, NY, USA: Springer, 2006. 261. Schneier, B., Secrets and Lies: Digital Security in a Networked World. New York, NY, USA: Wiley Computer Publishing. xv+412, 2000.
262. Sheehan, K.B., In poor health: An assessment of privacy policies at direct-to-consumer web sites. Journal of Public Policy & Marketing 2005. 24(2). 263. Shneiderman, B., Designing trust into online experiences. Commun. ACM 2000. 43(12): p. 57–59. 264. Shoemaker, G.B. and K.M. Inkpen. Single display privacyware: augmenting public displays with private information. In Proceedings of SIGCHI Conference on Human Factors in Computing Systems. Seattle, Washington, United States: ACM Press. pp. 522–529, 2001. http://doi.acm.org/10.1145/365024.365349 265. SiteAdvisor, McAfee SiteAdvisor. 2007. http://www.siteadvisor.com/ 266. Sloane, L., Orwellian Dream Come True: A Badge That Pinpoints You, New York Times, p. 14, 1992. 267. Smith, A.D. and F. Offodile, Information management of automatic data capture: an overview of technical developments. Information Management & Computer Security 2002. 10(3): p. 109–118. 268. Smith, H.J., S.J. Milberg, and S.J. Burke, Information privacy: measuring individuals' concerns about organizational practices. MIS Quarterly 1996. 20(2): p. 167–196. 269. Sohlenkamp, M. and G. Chwelos. Integrating Communication, Cooperation, and Awareness: The DIVA Virtual Office Environment. In Proceedings of CSCW '94. Chapel Hill, NC, USA: ACM Press, October 1994. 270. Song, X. and L.J. Osterweil, Toward Objective, Systematic Design-Method Comparisons. IEEE Software 1992. 9(3): p. 43–53. 271. Spiekermann, S. Perceived Control: Scales for Privacy in Ubiquitous Computing. In Proceedings of Conference on User Modeling (UM 2005). Edinburgh, UK, July 24–29, 2005. 272. Spiekermann, S., J. Grossklags, and B. Berendt. E-privacy in 2nd generation e-commerce: privacy preferences versus actual behavior. In Proceedings of ACM Conference on Electronic Commerce (EC 2001). Tampa, Florida. pp. 38–46, October 2001. 273. Spiekermann, S. and H. Ziekow.
RFID: a 7-point plan to ensure privacy. In Proceedings of 13th European Conference on Information Systems (ECIS 2005). Regensburg, Germany, May 2005. 274. Spinuzzi, C. A Scandinavian Challenge, a US Response: Methodological Assumptions in Scandinavian and US Prototyping Approaches. In Proceedings of SIGDOC '02. Toronto, Ontario, Canada: ACM Press, October 20–23, 2002. 275. Stasko, J., D. McColgin, T. Miller, C. Plaue, and Z. Pousman, Evaluating the InfoCanvas Peripheral Awareness System: A Longitudinal, In Situ Study. Technical Report GIT-GVU-05-08, GVU Center, Georgia Institute of Technology, Atlanta, GA, USA, March 2005. 276. Stigler, G.J., An Introduction to Privacy in Economics and Politics. Journal of Legal Studies 1980. 9. 277. Strahilevitz, L.J., A Social Networks Theory of Privacy. University of Chicago Law Review 2005. 72 U. Chi. L. Rev. 919. 278. Sutcliffe, A., On the Effective Use and Reuse of HCI Knowledge, in Human-Computer Interaction in the New Millennium, J.M. Carroll, Editor. ACM Press. p. 3–29, 2000. 279. Sweeney, L., k-anonymity: a model for protecting privacy. International Journal on Uncertainty, Fuzziness and Knowledge-based Systems 2002. 10(5): p. 557–570. 280. Sweeney, L., Protecting Job Seekers from Identity Theft. IEEE Internet Computing 2006. 10(2). 281. Tang, K.P., P. Keyani, J. Fogarty, and J.I. Hong. Putting people in their place: an anonymous and privacy-sensitive approach to collecting sensed data in location-based applications. In Proceedings of Conference on Human Factors in Computing Systems. Montréal, Québec, Canada: ACM Press, New York, NY. pp. 93–102, 2006. http://doi.acm.org/10.1145/1124772.1124788 282. Tarasewich, P. and C. Campbell. What Are You Looking At? In Proceedings of SOUPS 2005. Pittsburgh, PA, USA: ACM Press, 2005. 283. Terrell, T. and A.
Jacobs, Privacy, technology, and terrorism: Bartnicki, Kyllo, and the normative struggle behind competing claims to solitude and security. Emory Law Journal 2002. 51(4): p. 1469–1511. 284. Treasury Board of the Government of Canada, Privacy Impact Assessment Policy. 2002. http://www.tbs-sct.gc.ca/pubs_pol/ciopubs/pia-pefr/siglist_e.asp 285. Trevor, J., D.M. Hilbert, and B.N. Schilit. Issues in Personalizing Shared Ubiquitous Devices. In Proceedings of Ubicomp 2002. Göteborg, Sweden. pp. 56–72, 2002. 286. Trewin, S. Configuration agents, control and privacy. In Proceedings of 2000 Conference on Universal Usability. Arlington, VA, USA: ACM Press. pp. 9–16, 2000. 287. Tullio, J.C., Exploring the Design and Use of Forecasting Groupware Applications with an Augmented Shared Calendar, Unpublished Doctoral Thesis, Georgia Institute of Technology, Atlanta, GA, USA, 2005. http://etd.gatech.edu 288. United States Department of Health, Education and Welfare, Records, Computers and the Rights of Citizens. Report of the Secretary's Advisory Committee on Automated Personal Data Systems. Technical Report, 1973. 289. US Department of Health and Human Services, Health Insurance Reform: Security Standards; Final Rule. 2003. 290. Varian, H.R., Economic Aspects of Personal Privacy, in Privacy and Self-Regulation in the Information Age, NTIA, Editor. US Department of Commerce, 1997. 291. Venkatesh, V., Determinants of Perceived Ease of Use: Integrating Control, Intrinsic Motivation, and Emotion into the Technology Acceptance Model. Information Systems Research 2000. 11(4): p. 342–365. 292. Venkatesh, V., M.G. Morris, G.B. Davis, and F.D. Davis, User Acceptance of Information Technology: Toward a Unified View. MIS Quarterly 2003. 27(3): p. 425–478. 293. Vila, T., R. Greenstadt, and D. Molnar. Why we can't be bothered to read privacy policies: Models of privacy economics as a lemons market.
In Proceedings of the 5th International Conference on Electronic Commerce. Pittsburgh, PA, USA: ACM Press. pp. 403-407, 2003. 294. Wainfan, L. and P.K. Davis, Virtual collaboration: face-to-face versus videoconference, audioconference, and computer-mediated communications, in Enabling Technologies for Simulation Science VIII, Proceedings of the SPIE, D.A. Trevisani and A.F. Sisti, Editors. SPIE--The International Society for Optical Engineering. p. 384-398, 2004. 295. Walton, D., Plausible Deniability and Evasion of Burden of Proof. Argumentation 1996. 10: p. 47-58. 296. Want, R., A. Hopper, V. Falcão, and J. Gibbons, The Active Badge Location System. ACM Transactions on Information Systems 1992. 10(1): p. 91-102. 297. Wardell, J., LexisNexis Breach May Be Worse Than Thought, AP Financial Wire, 2005. 298. Warren, S.D. and L.D. Brandeis, The Right to Privacy. Harvard Law Review 1890. IV(5). 299. Wasserman, E., Here, There and Everywhere, San Jose Mercury News pp. 1F, 1998. 300. Weber, H., U.S. trade commission fines data warehouser ChoicePoint over data breach, Associated Press Worldstream, 2006. 301. Weirich, D. and M.A. Sasse. Pretty Good Persuasion: A first step towards effective password security for the Real World. In Proceedings of the New Security Paradigms Workshop 2001. Cloudcroft, NM: ACM Press. pp. 137-143, Sept. 10-13, 2001. 302. Weiser, M. and J.S. Brown, The Coming Age of Calm Technology, in Beyond Calculation: The Next Fifty Years of Computing. Springer-Verlag: New York, 1997. 303. Westin, A., Opinion Surveys: What Consumers Have To Say About Information Privacy, Subcommittee on Commerce, Trade, and Consumer Protection, House Committee on Energy and Commerce, Editor. 2001. http://energycommerce.house.gov/107/hearings/05082001Hearing209/Westin309.htm 304. Westin, A.F., E-commerce & Privacy: What Net Users Want. Technical Report, Privacy & American Business, Hackensack, NJ, 1998. 305. 
Westin, A.F., Harris-Equifax Consumer Privacy Survey 1991. Technical Report, Equifax Inc., Atlanta, Georgia, 1991. 306. Westin, A.F., Information Technology in a Democracy. Harvard University Press: Cambridge, MA, USA, 1971. 307. Westin, A.F., Privacy and Freedom. New York, NY: Atheneum, 1967. 308. Whalen, T. and K.M. Inkpen. Privacy and security awareness: Gathering evidence: use of visual security cues in web browsers. In Proceedings of the 2005 Conference on Graphics Interface (GI '05): Canadian Human-Computer Communications Society, May 2005. 309. Wheeler, L. and H.T. Reis, Self-Recording of Everyday Life Events: Origins, Types, and Uses. Journal of Personality 1991. 59(3): p. 339-355. 310. Whitten, A. and J.D. Tygar. Why Johnny Can't Encrypt: A Usability Evaluation of PGP 5.0. In Proceedings of the 8th USENIX Security Symposium, 1999. 311. Wogalter, M.S., Handbook of Warnings: Lawrence Erlbaum Associates, 2006. 312. Wolf, G. and A. Pfitzmann, Empowering Users to Set Their Security Goals, in Technology, Infrastructure, Economy, G. Müller and K. Rannenberg, Editors. Addison Wesley Longman Verlag GmbH. p. 113-135, 1999. 313. Zarsky, T.Z., Mine Your Own Business! Making the Case for the Implications of the Data Mining of Personal Information in the Forum of Public Opinion. Yale Journal of Law & Technology 2002. 5(2002/2003): p. 154. 314. Zetter, K., CardSystems Data Left Unsecured, Wired News, 2005. http://www.wired.com/news/technology/0,1282,67980,00.html 315. Zimmermann, P., PGP User's Guide: MIT Press, 1994. 316. Zugenmaier, A. The Freiburg Privacy Diamond. In Proceedings of the Global Telecommunications Conference, 2003 (GLOBECOM '03): IEEE. pp. 1501-1505, 1-5 Dec. 2003. 10.1109/GLOCOM.2003.1258488 317. Zurko, M.E. and R.T. Simon. User-Centered Security. In Proceedings of the New Security Paradigms Workshop: IEEE Press. pp. 27-33, 1996. 
Warren and Brandeis claimed that the right to privacy is unique because the object of privacy (e.g., personal writings) cannot be characterized as intellectual property nor as a property granting future profits.

"The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated […]."

The ECPA regulates the recording of telecommunications and personal communications at the US Federal level, including wiretapping by government agencies. It generally outlaws any recording of which at least one recorded party is not aware, and it requires various types of warrants for wiretapping or for recording other telecommunication data for law enforcement purposes.

Google Desktop's privacy policy brings this structure to the extreme, prompting the user with the following notice upon installation: "Read This Carefully. It's Not the Usual Yada-Yada."

"Privacy policy" here refers to the policy internal to the organization, which describes roles and responsibilities and is used for process definition. This is not the policy written for the data subject and posted on the web site.

J. Karat, personal communication, March 2006.

R. Dingledine, personal communication, 7/8/2005. See also http://tor.eff.org/gui.

See also the Privacy Enhancing Technology Testing & Evaluation Project, http://www.ipc.on.ca/scripts/index_.asp?action=31&P_ID=15495 (last visited 7/4/2006).

For an overview of work in this area, we refer to edited publications such as Brusilovsky, Kobsa, and Nejdl, The Adaptive Web: Methods and Strategies of Web Personalization [55].

We are aware that the distinction between design and evaluation is, to a certain degree, artificial in an iterative development model. 
However, we feel that the techniques discussed here apply specifically to already-developed products, i.e., they are more appropriate for summative evaluation.

While low usability certainly contributed to PGP's lackluster adoption, it is also likely that a reverse network effect (few people could decrypt email), coupled with a perceived lack of need, was also responsible. For example, it is worth noting that the competing S/MIME standard, already integrated in popular email applications like Outlook and Thunderbird, has also not yet been widely adopted, notwithstanding the fact that it is arguably simpler to use (although not necessarily to configure). Generally speaking, email encryption systems have been most successful when a service organization was present to configure and set up the clients. However, Gaw et al. found that even in organizations where email encryption technology is used, decisions about encrypting emails were driven not just by technical merit, but also by social factors [119]. They found that users saw universal, routine use of encryption as paranoid. Encryption flagged a message not only as confidential but also as urgent, so users found the encryption of mundane messages annoying. 
Interestingly, this result is paralleled by research by Weirich and Sasse on compliance with security rules: users who follow them are viewed as paranoid and exceedingly strict [301].

The concept of translucency has also been used in other HCI domains with different meanings, for example in the design of user interfaces for mobile systems [89].

C = the cost of adequate protection; L = the likelihood that an unwanted disclosure of personal information occurs; D = the damage resulting from such a disclosure.

Quotes from Boyle and Greenberg [50].

For example, personal video recorders capture a person's television viewing habits; mobile phones contain photos, call history, instant messages, contacts, etc. 
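The C, L, and D quantities defined in the footnote above lend themselves to a simple expected-cost reading: protection is economically worthwhile when its cost is lower than the expected loss from disclosure, i.e. when C < L × D. The sketch below is our own illustration of that decision rule, not a formula taken verbatim from the literature; the function name and example figures are hypothetical.

```python
def protection_is_rational(cost_c: float, likelihood_l: float, damage_d: float) -> bool:
    """Illustrative decision rule (our own sketch): adopt a privacy protection
    when its cost C is less than the expected loss, i.e. the likelihood L of an
    unwanted disclosure multiplied by the damage D such a disclosure causes."""
    expected_loss = likelihood_l * damage_d
    return cost_c < expected_loss

# Hypothetical example: a protection costing 50 units against a 10% chance
# of a disclosure causing 1000 units of damage (expected loss = 100).
worth_it = protection_is_rational(50, 0.10, 1000)
```

On this reading, the same protection becomes irrational once its cost exceeds the expected loss, which is why estimates of L and D, however rough, matter for design decisions.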
Echoing the UI design advice in Section 3.2: "Present choices, not dilemmas."

Part of the reason for this casual approach is that many developers do not expect such negative reactions to their work. For example, in September 2006, Facebook, a social networking site targeted at college students, added two new features to its site, News Feed and Mini-Feed [179]. News Feed was a content module that showed what recent changes had occurred with friends and when. For example, News Feed would show that a friend had recently joined a group or had added another person as a friend. Similarly, Mini-Feed was a separate content module that let others see what recent changes an individual had made to their profile. What is interesting is that, although all of this information was already publicly available through a person's Facebook profile, these fairly innocuous features generated a tremendous amount of resentment from Facebook users, over concerns of being stalked and a lack of appropriate privacy controls over one's joining or leaving a certain social group. Facebook's experience is far from exceptional. Many other projects have faced similar concerns. For example, in 1990, Lotus proposed to sell a "Housing Marketplace" CD which provided directory information on the buying habits of 120 million people in the US [19]. That project was cancelled due to privacy concerns. 
In 1999, Intel proposed to add unique IDs to each of its processors, to facilitate asset management and provide hardware-based certificates [207]. Intel quickly reverted to disabling this feature by default.

[Figure text: "Binary Evaluation", "Threshold Evaluation", and "Spectral Evaluation" diagrams, each relating "User Accepts" / "User Rejects" to "Product protects privacy" / "Product does not protect privacy".]

[Figure text: "Optimistic Phase" — reactions: users become comfortable with technology, concerns assuaged by positive experience; main research questions: how to cement the architecture of privacy (market, legal, social, technical elements), how to optimize application value and minimize risks going forward. "Pessimistic Phase" — reactions: many legitimate concerns, alarmist reactions; main research questions: right way to deploy / facilitate adoption, value proposition, rules of fair use.]
hJ?NHh h%FNH h6hm hhhzshJ?h=[jh1 U h NHh%FhgBm hygNHhygh h?w=PT[emosxz()pqrhinoqú9ESTablmĻ KS hXh 1hXhXNH hgBmNHhgBmh=[h{jh1U hlNHhl hXhXh hJ?aJh1h%FhD!;=>78<=B]LM"*abhڽگکککډxph;oNHaJ hgBm6aJhXhgBm6aJ h;oaJ h{aJjhkU0JUaJhkUNHaJ hkUaJhXNHaJ h=[aJjh1 UaJhJh$FDaJhXhX6aJ hXaJh 1hG_h# jh1UjSh@U*hox&)*+8>JKno06Habij  \]{@Fu{ hZaJhLNHaJ hLaJ hX&aJh;oh;o6aJh;oNHaJhJh>~aJhJhJNHaJhJhJaJ hJaJ h;oaJ h{+aJA-.6;FH[]{|_`devy*EOu !ܚ haJ haJ h{aJ hC$aJ hAaJhlh aJ h aJ hwaJ hVxaJhJh;oaJ hMoaJ h=[aJjh1 UaJh;oNHaJ hgBmaJ hZaJ h;oaJ8-67IJK)*789:;ABFG`fikqr{"#*9;<uvh1hoShoS6hm hoSNHhoShnmhkh aJ h:4aJ hoSaJh^e&NHaJhlh^e&aJ h^e&aJ haJhNHaJ h aJ haJ=RS56 IJOPRd}~RSXYPQ\bc~12 h S5hU*hW^hU*6h=[h{jhU*UhLhU*6 hU*NHhU*hm hoSNHhoSh1JO[fnFEƀ ..gdU*gdU*FEƀ .gdU*)8RS #+./3>AHIZ[fg̿ԻԷ{hGh(aJhGh(5\hGhY`aJh*khY`NH hakNHhak hY`NH h*khY`hY`hZhIh# mHnHujh(Uh(h$ihU*6 hU*6h1hU*6 hU*NHhU*jhU*U0Y~k!I & FEƀ  gdU*I & FEƀ  gdU*I & FEƀ  gdU* 4PUZ ,d$IfgdG&gd(gd( Z[kdз$$Iflֈ  t"||TT  t<"44 lagFp<ytG[fglOkd$$Iflt""  t "44 lap ytG/$d($IfgdG gklrstu{|   !)*45BCXYZmn|}hGh(aJhGh(5\hGhY`aJ hiNHhi hY`NHhL h!5hY`hY`N !*5CY[KKKKKK/$d($IfgdGkd$$Iflֈ  t"||TT t"44 lagFytGYZm[K/$d($IfgdGkdA$$Iflֈ  t"||TT t"44 lagFytGmn}/$d($IfgdGOkd$$Iflt""  t "44 lap ytG/I`PPPPPP/$d($IfgdGkd$$Iflֈ  t"||TT t"44 laytG()./?@HIJVWdeopuv}~  %&89=>STWXY^_kl hLNHhL hiNHhihGhY`aJ hY`NH h!5hY`hY`QIJev[KKKKKK/$d($IfgdGkd)$$Iflֈ  t"||TT t"44 lagFytG->[KKKKKKK/$d($IfgdGkdڼ$$Iflֈ  t"||TT t"44 lagFytG>XY_Kkd$$Iflֈ  t"||TT t"44 lagFytG/$d($IfgdG[K/$d($IfgdGkd<$$Iflֈ  t"||TT t"44 lagFytG!")*78ABPQ\]rsDEMNOdersz{hGhY`aJ hiNHhi hY`NH h!5hY`hY`hGh(aJhGh(5\O*8V/$d($IfgdGOkd$$Iflt""  t "44 lap ytG(N`PPPPPPP/$d($IfgdGkd}$$Iflֈ  t"||TT t"44 laytGNOes{[KKKKKK/$d($IfgdGkd$$$Iflֈ  t"||TT t"44 lagFytG. 
[VVQLLC gdgd'gd(2gd(kd$$Iflֈ  t"||TT t"44 lagFytGAz{ !"#*,.267\./8967 ʹʫʤʹʹʹʝ hsh' h6h' h'NHh=[h{jh'Uhfxh'NHh' hfxh'hI h(NHh/G h1NHh1hKsh( hv"h(hU*<h0>?O\'+89   )*+,.2Tbcz{:;UV &'ż͵͵ͮh,#h }*hdFlNHh= h }*hdFlhGhdFlaJ hhdFl hh|Gh# mHnHujh|GUh|Ghmhtyk h'6hRth'6 h'NHh' hsh'hsh'NH5   cmz{Kkd$$Ifl0F"  t0t"644 lapytG,d$IfgdG&gd|G gd{UVcd}kdW$$Ifl0F" t0t"644 laytG /$d$IfgdG/$d($IfgdG.^rd /$d$IfgdG/$d($IfgdG}kd$$Ifl0F" t0t"644 laytGWX^_hi  78  ! " $ %         * + 1 2 G H I J L   ɱɚ hjlh(h=[h{jh1 U h=h|Gh# mHnHujh|GUh|Gh`/ h }*h h=hdFlhGhdFlaJh }*hdFlNH h }*hdFl h }*h5;^_n rdrr /$d$IfgdG/$d($IfgdG}kd$$Ifl0F" t0t"644 laytG  "rd /$d$IfgdG/$d($IfgdG}kd+$$Ifl0F" t0t"644 laytG rd /$d$IfgdG/$d($IfgdG}kd$$Ifl0F" t0t"644 laytG  0 V    rdLLL/$Ud($If^`UgdG /$d$IfgdG/$d($IfgdG}kdc$$Ifl0F" t0t"644 laytG    * hXJ /$d$IfgdG/$d($IfgdG}kd$$Ifl0F" t0t"644 laytG/$Sd(($If^`SgdG* + }qq ,d$IfgdG&gd|G}kd$$Ifl0F" t0t"644 laytGKi[[ /$d$IfgdGkdS$$Ifl0 !D 9  t0}!644 lapytG<=KL %&  QTtu?EFGKMRl'(üмЬмммммммhbkhU*NHh=[jhU*UhbkhU*6 hbkhU* hv"hU* hU*NHhU*hmh-?xh ` hjlh(h4hGh(CJaJ h(NHh(>KL[vv /$d$IfgdG{kd$$Ifl0 !D 9 t0}!644 laytG vv /$d$IfgdG{kd$$Ifl0 !D 9 t0}!644 laytG  'vv /$d$IfgdG{kdA$$Ifl0 !D 9 t0}!644 laytGvv /$d$IfgdG{kd$$Ifl0 !D 9 t0}!644 laytG/vv /$d$IfgdG{kdg$$Ifl0 !D 9 t0}!644 laytGvv /$d$IfgdG{kd$$Ifl0 !D 9 t0}!644 laytGsvv /$d$IfgdG{kd$$Ifl0 !D 9 t0}!644 laytGstuMzu/F-Eƀ  gdU*gdU*2gdU*gdm{kd $$Ifl0 !D 9 t0}!644 laytG7s-(gdU*F-Eƀ  gdU*F-Eƀ  gdU*F-Eƀ  gdU*$%*+,-467RSh(!)!.!/!1!!!!!'"7"=">"?"'%(%-%.%/%9%?%@%A%%(&(*(+((((Ľ۵׵۵׵׵׵۵׵ h %Rh(h-?xh-Ihh(6 heUNHheUhjh1 U h-Ihh(h' hv"h(h(hU*h=[h{jhU*UhbkhU*NH hbkhU*hjlhU*687S'").1111Fkd$$IfTl0 ,"D B  t0644 lapytGT,d`$IfYD(gdG&gd|Ggd(2gd((((((((()#)I)M)P)n))))))))))***#*2*3*c*********--------.#.$.5.8.l.t.u.|.~....ƿƿƮƠh|Gh -]htyk h'NHh=[h{jh1 Uh/7h(NH h/7h(h' h(6 hTVh(hrh(6hh(6hKshIh5>h(6h( h(NH:..........11111111A2B22223E3F3333333"4#444n5o577T7U7888888&9'99999):*:L:M:Q::::::żżżżżżżżżżżżżżŸżżŸżżŴ hp hU*hU*hdh9hp h(NH hp h( h9h(h=[h{jh1 U hp h|Gh# 
mHnHuh|Gjh|GU?11o2p22-3pkd$$IfTl0 ,"D B t0644 laytGT /$d$IfgdG-3.3N33~pp /$d$IfgdGkd$$$IfTl0 ,"D B t0644 laytGT333 4~pp /$d$IfgdGkd$$IfTl0 ,"D B t0644 laytGT 4 4)44~pp /$d$IfgdGkdd$$IfTl0 ,"D B t0644 laytGT4445~pp /$d$IfgdGkd$$IfTl0 ,"D B t0644 laytGT555N6~pp /$d$IfgdGkd$$IfTl0 ,"D B t0644 laytGTN6O6g66~pp /$d$IfgdGkdD$$IfTl0 ,"D B t0644 laytGT666?7~pp /$d$IfgdGkd$$IfTl0 ,"D B t0644 laytGT?7@7a77~pp /$d$IfgdGkd$$IfTl0 ,"D B t0644 laytGT7778~pp /$d$IfgdGkd$$$IfTl0 ,"D B t0644 laytGT88288~pp /$d$IfgdGkd$$IfTl0 ,"D B t0644 laytGT888-9~pp /$d$IfgdGkdd$$IfTl0 ,"D B t0644 laytGT-9.9?99~pp /$d$IfgdGkd$$IfTl0 ,"D B t0644 laytGT999K:~pp /$d$IfgdGkd$$IfTl0 ,"D B t0644 laytGTK:L:M:; <~yt.FEƀ ..gddgdU*gddkdD$$IfTl0 ,"D B t0644 laytGT:::;; ;f;g;y;z;;;;;;====== >:>;>d>e>@@@AAAA A AA A!A"A#AJA]A^ACCCCC_D`DDDDD#E$EE־ҫҖƖ–ҐҐҐҐ h(NHjhUh h# mHnHujh(U h&h(hBbhmh=[h{jhU*Uh( h %RhU*hp hU*NH hU*NHhU* hp hU*hp hU*68 <="=ACDDDD#D .$Ifgd9Pkd$$Ifl,"  t 644 lap ytG ,d$IfgdG&gd9gdU*2gdU*gd( #D$D,DeDD~~ /$d$IfgdG ,d$IfgdGgkdM$$IflF," t6    44 laytGDE EREE~~ /$d$IfgdG ,d$IfgdGgkd$$IflF," t6    44 laytGEEEE,F-FiFjF;GH?HtHuHHHHHHII'I(IMINIOI_I`IdIeIIIIIIIIIIIIJ J J[J\J]JgJhJJJJJJJJJ?KIKKŷ hoNHhGho5NHhohGho5 h]NHh] hh]hGh]5 hh(hGh]CJ h( h(NHFEEECFF~~ /$d$IfgdG ,d$IfgdGgkdq$$IflF," t6    44 laytGFFF]GH~~ /$d$IfgdG ,d$IfgdGgkd$$IflF," t6    44 laytGHHHHH||/$d($IfgdG ,d$IfgdGgkdy$$IflF," t6    44 laytGHH.H/H?H5' /$d$IfgdGPkdw$$Ifl,"  t 644 lap ytG, d$IfgdGikd$$IflF," t6    44 laytG?HHHHHHHIITkdj$$Ifl0," t644 laytGTkd$$Ifl0," t644 laytG /$d$IfgdGII(INIOIeIIITkdF$$Ifl0," t644 laytG /$d$IfgdGTkd$$Ifl0," t644 laytGIIIIJ J\JITkd"$$Ifl0," t644 laytG /$d$IfgdGTkd$$Ifl0," t644 laytG\J]JhJJJJJITkd$$Ifl0," t644 laytG /$d$IfgdGTkd$$Ifl0," t644 laytGJJJ?K@KIKKITkd$$Ifl0," t644 laytG /$d$IfgdGTkdl$$Ifl0," t644 laytGKKKKNVWZZ ,d$IfgdG&gd(gdtykgdU*gd9TkdH$$Ifl0," t644 laytGKKKKKKKKK L/LiLjLLLtNuNzN{NNNNN O!OQQQQQRVR\R_R`RjRqRRRRS#S4S5SNSTSqSSSUUUUVVVUVVVVV/W1W:W;WʺҺҺʶʶʰʶʩʩʩʩʺֺҺʰʩʩhD~htykNH hD~htyk 
htykNHh/GjhtykUh0htyk6htykhdh=[h{jhU*U hU*NH h %RhU*hU*h(hGho5?;WeWoWWWWWWWWWWWWWWWWWWXXZZZZZZS[T[[["\#\]]]]&_'_~````aa aaa%a&a'aٸͥͥэтh# mHnHsH uh`h(mH sH hGh(CJNHaJhGh(CJaJh=[h{jhlDUh h# mHnHuh(jh(U h&h(h hD~htykNHhD~htyk6htyk hD~htyk3ZZ_[SZ?$ d(($Eƀ If^gdGPkd$$Ifl,""  t 644 lap ytG_[[[IZ?$ d(($Eƀ If^gdGZ?$ d(($Eƀ If^gdG[\h]IZ?$ d(($Eƀ If^gdGZ?$ d(($Eƀ If^gdGh]]]]cW ,d$IfgdGAkd;$$Ifl,"" t644 laytGZ?$ d(($Eƀ If^gdG]]S^SZ?$ d(($Eƀ If^gdGPkd$$Ifl,""  t 644 lap ytGS^D__IZ?$ d(($Eƀ  If^gdGZ?$ d(($Eƀ If^gdG_`aIZ?$ d(($Eƀ  If^gdGZ?$ d(($Eƀ  If^gdGaa acccWPkdb$$Ifl,""  t 644 lap ytG ,d$IfgdG&gd(gd(Akd $$Ifl,"" t644 laytG'a(a)aDaEacccccccEdFddd2e3eeeffffffff+g,gggggqhrhhhhhhh l lllOlPllÿ~zh=[h{jhtykU htyk6 htykNHhtykhGh(6CJ]aJhGh(CJNHaJhGh(CJaJh(hGh(aJh=[mH sH h{mH sH jh)Uh`h(mH sH h mH sH jh(U0cd fIZ?$ d(($Eƀ  If^gdGZ?$ d(($Eƀ  If^gdG f`ffIZ?$ d(($Eƀ If^gdGZ?$ d(($Eƀ If^gdGf1ggIZ?$ d(($Eƀ If^gdGZ?$ d(($Eƀ If^gdGgggglqv{,eI & FEƀ  gd(gdtyk2gdtykgd(Akd$$Ifl,"" t644 laytG llllllllllllaobogohojo-p.ppp.q/qDqEqqqqqVrWrrrssssusvsssss8t9tOtPtvvvvvvww&w,w-w.wyyƾưưưưיjh1 U h(6 h(NHh(hmhtykNH h/G6h1;htyk6 hmhtykjhtyk0JUh=[h{jhtykU h %Rhtykh htyk6htyk htykNHah(NHjh1 Uh411 hI>ah( h(NHh( hxhxh=[h{jhxhxU;ы@TdX ;(6pFEƀ ..gd(2gd(gd(&-.PQVWY^_g|}ܕ78GOXop349:KLR\]$46=ABҠӠMN!"ΤϤԤդ h(%h(hjvh(6h411h %Rh(NHh=[h{jh1 U h(NH h %Rh(h( hYh(Jiovw}ץإyzª%&߫xy~459:efijk)*NU`|NOTUWX hOh(hJah{h=[jh1 Uhh(NH hh( h(NHh411h(NXz{5DƾǾ۾ܾ&3AMUWu{|    @A{|KĹhvww hLh(hLhquh(NH hquh( h/GNHh/Gh=[h{jh1 U h(NHh411h(HKL"#()+ !-.UV~ !cdijlGH'(wxy⽹޹ޫ櫹hjh(6jhLU h411NHh411 hNjh(jh1 U hh( h(NH h(6hLh(h=[h{jhvwwUhvww hvwwNH>  #&{| nVW^_st*.KL5tz⽴⥝ h(6hLh(6 h %Rh(hZ h(6hZ h(NH hZ h(h-Kh(6hF{h(6hX/h(6 h(NHh(h=[h{jh1 UhL hLNH<17?Ehi  VWstuMNg!/JK^_hi?Nmnr/1FINPQĻ˪h %Rh(NHhrh(6hMNPh(NH hMNPh( h %Rh( hjiQh( h(NHhMNPh(6hLh(h=[h{jh1 UCpkdF-Eƀ  gd(F-Eƀ  gd(2gd(gd($gd(  4;<ABoST]&'2$%  fglmnosxͿۿ׿ͺͿۿ׿ͯh|GmH sH hU* h %Rhm hm6jhmU hmNHhm hLNHh=[h{jh1 U hjiQh(hLh*h( h(NHB2opqrs ,d$IfgdG&gd|GgdtykgdmF-Eƀ  gd( xyz{|FJNOfg78<Mz{37?@W[ξhUp 
hzK8h(hGh(6 h(NHh(hGh(mH sH  hg/h(htZ.h|GmH sH h=[mH sH h{mH sH jhaUh# mHnHsH uh|GmH sH jh|GUmH sH h`h|GmH sH 4cd.Tkd$$Ifl0$ ,"  t644 laytG /$d$IfgdGnkd1$$Ifl0$ ,"   t644 lapytGgh9Tkd$$Ifl0$ ,"  t644 laytG/$d($IfgdGTkdB$$Ifl0$ ,"  t644 laytG /$d$IfgdGITkd$$Ifl0$ ,"  t644 laytGTkd$$Ifl0$ ,"  t644 laytG /$d$IfgdGYWI & FEƀ  gd(gdLgd(Tkd$$Ifl0$ ,"  t644 laytGEKrsn o t u x y     ) *  $%./YZ_`yz5<67YVW[\>BD|}̺̲̲̾h/Gh/G6h/GjhLU hLNHhL h(NHh( hU*NHh=[h{jhU*Uh-KhU*6hU*h9F ij./34 #'./35,/05      !i!j!p!!!!!!!!!䭦 h}[h(h@ hqbh(jh(0JUh*- h h(h=[jh1 U h(NH h(6h( hahLhLh/Gh/Gh/G6@ 4 W !$ 'kfaff\gdEh2gd(gd(I & FEƀ  gd(I & FEƀ  gd(!""""4"A"]"a"i"l"m"x"y""""""l#m####$+$-$.$$$$$$$w%x%%%&$&+&&&&&&&''' ' ' ' '#':';'1(8(?(R(](^(e(r((Ѿɺ hmNHhmhhh#h# jhhZZUjhZZUhZZhuh(6 h@6 h(NHhkth(NHh@h( hkth(@ ';'(8))n(F-Eƀ  gd%yF-Eƀ  gd%ygdFEƀ  gd#(((((((((((((((() ))))")6)7)8)A)J)W)c)q)y)))))))))))***0*5*y*}*****S+T+Y+y+++++++++,&,0,`,ÿڿڿڵڱhRh 5 hh/qNHhh/qh2 hhhQh4xhOhnKhthm h]NHh]hz`2hhhC)))**s-(gdF-Eƀ  gd%yF-Eƀ  gd%yF-Eƀ  gd%y**,b<'E\PQTdF-Eƀ  gd2gdh/qgd2gdmFEƀ .gd2`,a,s,|,,,,,,,,,,--- ----!-'-*-=-?-f-g-o-------------000000Q0S0`0a0o0v0w000000ʾⷮʷ⦢ΚΖhGhmheh=[h{jh2UhRUHh2NH hRUHh2hRUHh26hVOGh5h @jh @0JU h2NHh2h9h~V^hz`2hAGhR hRNH900001111314133333333333M6N6S6T6W6Y6Z6668888888888i99999::;;;;;; <6<Z<[<a<b<z<<<úúبԨhhPLhz`2jh,MUhG hVOGNHh=?h2NH h=?h2 h2NHh @h=[h{jh2UhRUHh2NHhVOGhMQ hRUHh2h2=<<<<<<<<<===#=&===>>>>2A3A7A8AAAAAAA@DADFDGDxDyDE E'EEEEEEEEEEEFFF,FCFKF|FFFFFFFFFFF¾¬¾¨h2h @ho}hm hA NHhA hMQh9h hNHh{h=[jhh/qU hh/qNHhh/q hPLNHhPLhhz`2AFFyGzG^M_MgMhMjMMMMMMMNNN(N6N9NYN{N|NNNNNNNOOOOO(O6O:O;OOOOOOOO P&P*P+P0PRPYP[PpPwPxPPPPPQQhBJh hNHh hz`2NH hlaNHhz`2hlah#;hF+h", hNHhh9]hT7uhh=[h{jh2Uh2 h2NH>Q!Q'QQQ:R;RTT$T%TTTTTTTUUUUUUUUUU V VXXXXXXXXXYY'Y(Y7Y8YUYYYYYYZZ Z3Z4Z^^^^^^^^^¾¸´hthb+hh{h^? 
hCnNHhmhCnhlahz`2 hNHhhF h pNHh ph=[jh2U h2NHhBJh2?TUUX^s-(gdF-Eƀ   gd2F-Eƀ  gd2F-Eƀ  gd2^^^_ ____________`````ddddddddde9eAeBeNeOeYeZeeeeeeff&f*f 0JUh=[h-jh-U h> NH h&NHh> hz`2h&hQh[h^?hmhb+ hb+NH8^^_ejpwaxyyg{idgd4xFEƀ .gd#gd2gd^?FEƀ .gdb+ g giiiiii,j-jjjjjkkkk k)kkkkTlUlnnnnoo$o)o?oHoIoooooooppppppppppVqqվվղﲮզ՞՚㈌hlh' hSNHhShhhmhV(6hChV(6hz`2jhV(UhBhKa{ hV(6h*hV(6 hV(NHhV(h^? h.NHh.hhhh=[h{jh,MU5qqqqqqKrkr{rrrrrsvv v vOvavevkvvvvvvvwww wrwswww`xax yy y!yyyyyyyyyyyyyyyyzzz(z9zLzNzOzgztz̾꺶򲶮 h4xNHh+h]+h4xhZZhN hvNHh~hvhQhh=[jh2U h2NHh2hlhmh' h'NHBtzzzz{{{&{f{g{{{{{{{{{{{{{{{{{| |||||||||Q}R}@mn޿޿ڿڿړ޿ h,ahZZh-h=[jhZZU hXhZZ hZZNHh1hUhZZ h6hg8th4x6NHhg8th4x6h]hh/h=Gh,Mh+ h]+NHh]+h4x7g{p:?ŜAժH9 gd-gd2FEƀ .gd#gdgd4xgdZZȀɅӅԅ  '(MN]^lmpqЎюQU:XZgkz%*uv֒ג¾ºº hNHhh-h=Gh8=hZZ6 hZZNHh%h1h=[h{jhZZUhZZ h,ahZZh,ahZZNHGHIW\st~•ޕ͖ΖkoϗЗ!"05IJNOXhh=[h{jh=GUh# jh~Ujh~UhJdh~6 h~NH h9NHh9h~ h=GNHhQh=Gh- hNHh=XYŜ  $%)>ΤϤӤԤ;?no%&*+jk̪ͪҪӪ23ijثܫ%2ABƾƾƾߺƾƾƾ߶߶߶߶hDh.h=[h{jh2U h/.h2 h2NHhh2h-hht hZZhh hNHHBw  ;>Y\de@JLMSTjuֶ׶"#()qr޺<=GHVW  ]ahg h-6 h26jh2Uhh=[h{jh-U h-NHh- h2NHhDh2G  jk4 -.NOv{|  =>h=[jh-Uhh2 h-NHh- h=[aJjhzh-UaJhzh-aJhzhB*aJph# hVj=B*aJph# hzB*aJph# hzhzB*aJph# hzhzaJhzhgaJ2%&MN()./$%rsPQVWHI]hij̹۹׹̹۹׹̲̪̣̹ h!h2h!h26 ho*#h2jh2U h26 h2NHh2 h-h-h=[h{jh-Uh[;h-6 h-NHh- h@h-< '(tu %18A[r~BCPQT^ghop䵱ϭϭӵϵӛӑjh0JUh^\ hNHhhmhOhhh=[h{jhU hNHhk"h hh h2NHh2h# jh2Ujbh2U9%61Qnidiiii$gdgdFEƀ ..gdgdFEƀ .gd  ()T_}56?29:LV]^  VWRĻѰѰѰѰѰĨh=[h{jhU h#Kahhk"h/hNH h/h hNHhh>h^\ hhhB hNHh hNHhh?JK$%)*+45ghstyz{x#$()+<JOP4NRlslmghȿhx0hNH hx0h h"h hNHh{h=[jhUh/hNHh h/hK5   !#$()./pq-.23TXZm#$67Q >\ci hUNHhUhhk"hLh:hAhNH\hAh\h=[h{jhUhFh\ hNHhjMhDQI ,I/%% $$a$gdr!kgdk"&gd4x$gd4x2gd>gd4xgdFEƀ ..gd./HIxklbcqs                    " 1 5 P R l p      Ž׹堫׋h@h4x6h# mHnHuh# jh$-UjhU hNHhk"h=[h{jh4xUh@ h4xNHh4x h>NHh>hhU hNHhjhU6  "#+,-DEFGHPQghij&'nlm'(3L(/8LOz{}%&ȿȱȭ h>NHh>hbr}hh@hk"6h=[h{ hk"NHhk"h# mHnHujh4xUj\haUjhaUmHnHuhajhaUh@h4x h4xNH:&PU\]:;/7CD  < = w#x#}#~#;$<$C$R$T$s$%% %!%*%-%/%0%IJ 
hNHh>h{hr]hh(h,(ehk"jhZrU hZrNHhZrh=[jh4xU h4xNHhbr}h4xF0%I%m%u%%%%%%%%%%%%%%%%%%%%%% &F&\&&&&&&&'''' '"'_'a''''''''F(P(e(ļļȬȬȬȬȝȬȬȬȬȬȖhIhk"6hhk"6hk" hdh# h#NHh&h#NH h&h#h# mHnHujhIUhIh#jhBzUjhr!kUmHnHuhr!kjhr!kU hNHhr]h4xh3%'''+~047x?C;HMM%OPPQgd FEƀ  gd,)9gd,qgdk"2gdk"gd#&gdIe(s(|(}(~(6+7+<+=+++++K,L,,,'-(----. .*.0.1.#0$0)0*0K0L0~0000000 1 1"1#1$1+1,1-1S1T12222/3ּh# mHnHuh# j|h$-UjhUh h,qNHhh,qhu'hv=hk"NH hv=hk" hk"NHhk"h=[h{jhk"Uhhk"6hIhk"6 hk"66/303P3Y3g3{3334444;5<5q5r5555577777t8u8v8::::::=;B;;;k?l?t?u?zAAAAAAAAAABB2B3B4BCCCCCCDh:Lh# jh$-UjhUhhM h#NHjh1 Uhlh=[h{jhU hNHh#hh,q h,qNH?DDDDDDGGGGGGGGGGHHH HHH'H)H*H.HCHLHWH[HbHeHoH|HHHIIkIlIIILLLLLLLLLLLLLMMMM_MyM{M|MMMMMMMMMMMŻŷŷŷŷhL hNHh]3h h]h#h:> h#NHhg8th^h=[h{jh1 Uh# hY]NHhY]HMMMMNNN"N/N;NQNpNyN~NNNNO O O$O%O8O9OeOOOOOOOOOOOOOOOO9P>PGPHPTPUPPPPPPPPQ QQQQ#Q@QNQPQQQQQQQQQ RԿԹ hk"NH hn%hk"hth hNHhk" hn%NHh;]=h1Chn% h#NHhFKch#hLhFPQQQ\Rs-F-Eƀ # gdk"F-Eƀ " gdF-Eƀ ! gdk" R RRRRRRRRRS'S,SASOSiSjSSSSSSSSSSSTTTTTTT U)U6U7UDUEUUUUUU$V%V~VVVVVVVVVVVWWƾƾưưƬh0 hxNHhc5hX h#NHh,#h=[hxh{P|h# hhhhh;]= hNHh"I hn%NHh1Chn%hhk" hk"hk";\RR STTTUYZZsnic^^^^gdX "$gd#gd#gdF-Eƀ % gdF-Eƀ $ gdk" WWWWWWWWWWWXXYXZXaXbXeXfXXXXYYhYiYkYmYqYvYwYYYYYYYYYYYYZZ"Z#ZcZdZZZZZZZjh1 Uh{hA mHsHh1 he @ h NHh h hX NHh{P|hX jh Ujh Uhc50JNH hc50Jjvh Uhc5jhc5U4ZZZ[[\\N]Q]]]]]j^k^^^^^^_z____ `````paa bNb$c%c&c]cccdcdddddddeeiejekee°ʣ…ʣ#jh8\h8\UmHsHh8\h8\5mHsHh8\h8\0JmHsH#jh8\h8\UmHsHh8\mHsHjh8\UmHsHh8\h8\6mHsHh8\h8\mHsHjh1 Uhtykh1 mHsH4Z[ ]Z]]]^_`aabiceeEffwggQhhNiivjkkslm0^`0gd8\o]}feeeefffffff?g@gAg\gbgdgggggggh0hdhh)i*i+i7i=i?iiiiiiiii@jj1kkkkkkkClDlEl]lcldlllllllmmDnEnFnnɷѪh8\h8\0JmHsH#jh8\h8\UmHsHh8\mHsHjh8\UmHsHh8\h8\6mHsHh8\h8\5mHsHh8\h8\mHsHCmmno&pqqrsstuupvwwcxOyynzzD{{j|}e}~~0^`0gd8\o]}fnnnDocoooooo$p%ppppq q qqqqqqq[r\r]rur{r|r s_s,t-t.t[tatbttttttuuaubucuruxuyu*v1vvwZw[w\wvw|w~wɷѪɘѪ#jQh8\h8\UmHsHh8\h8\0JmHsH#jzh8\h8\UmHsHh8\mHsHjh8\UmHsHh8\h8\6mHsHh8\h8\5mHsHh8\h8\mHsH<~wwxxyyyyyyy=z>z?zXz^z_zzz{{ {7{={?{R{{@|M||||}}}*}A}}}x~~ 
NOPmu1etuɷѪɘ#j-h8\h8\UmHsHh8\h8\0JmHsH#j>h8\h8\UmHsHh8\mHsHjh8\UmHsHh8\h8\5mHsHh8\h8\6mHsHh8\h8\mHsH<#m?U3Ë:?7ZN0^`0gd8\o]}f߂567ltv,SօNO݆+R!(l]bcˌ缪缘缆#jh8\h8\UmHsH#jh8\h8\UmHsH#jh8\h8\UmHsHh8\mHsHh8\h8\5mHsHh8\h8\6mHsHh8\h8\mHsHjh8\UmHsHh8\h8\0JmHsH4=>֎׎  5623{|}%&'5;=•ĕ&E<=>`fg$%۵۵۵۩۵۵۩ӗ۵۵۩۵۵۵۩۵۵۵۵۩۵#jh8\h8\UmHsHh8\h8\5mHsHh8\h8\6mHsH#jh8\h8\UmHsHh8\mHsHh8\h8\mHsHh8\h8\0JmHsHjh8\UmHsH;<ӕdlؙ_Ýž9kDq(B0^`0gd8\o]}f%ՙ֙,-.HNP͚ΚϚ  ֛ʜ3dKL89:QZʡߡJl#Aͤ *N¶¶ª¶¶ª¶¶¶¶¶ª¶¶¶¶¶¶¶¶¶¶¶¶#jh8\h8\UmHsHh8\h8\5mHsHh8\h8\6mHsHh8\h8\mHsHh8\h8\0JmHsHjh8\UmHsH#jh8\h8\UmHsHh8\mHsH<Zɨʨ&'ةߩ@AϪЪѪ rOPQyz^_`;Z Gǰ1ݶդݶՆݶ#j]h8\h8\UmHsHh8\h8\5mHsH#jh8\h8\UmHsHh8\h8\0JmHsH#jh8\h8\UmHsHh8\mHsHjh8\UmHsHh8\h8\6mHsHh8\h8\mHsH4B{5Ȯүd_]\̶`8ٻop0^`0gd8\o]}f123]^ȱq/01[\ogϵ7<=>IOQR !defǺȺɺɽɽɽᵣɽɽɽɽɽɽɗɽɽᵅɽɽɽɗɽ#jh8\h8\UmHsHh8\h8\5mHsH#jh8\h8\UmHsHh8\mHsHh8\h8\6mHsHh8\h8\mHsHh8\h8\0JmHsHjh8\UmHsH#j8h8\h8\UmHsH4ǻȻ/oüļż567`fhٽ456Y_amno>h@ATHɷѪɘѪɆѪ#jh8\h8\UmHsH#jh8\h8\UmHsHh8\h8\0JmHsH#jh8\h8\UmHsHh8\mHsHjh8\UmHsHh8\h8\5mHsHh8\h8\6mHsHh8\h8\mHsH2CvTIXn fR,L&$0^`0gd8\o]}fHIJtu 6&XYZu{|W~"#YZ[ cdɽɽɽɽɽɽɽɱɽɽɽɱɽɽɱɽɽɽɽɽᩗɽɽ#jh8\h8\UmHsHh8\mHsHh8\h8\5mHsHh8\h8\6mHsHh8\h8\mHsHh8\h8\0JmHsHjh8\UmHsH#jh8\h8\UmHsH:  qr"#KAj067&'(cijQ  /01FGɽɽɽɽᵣɽɽɽɽɽɗɽɽɽɽɗɽɽᵅ#jh8\h8\UmHsHh8\h8\5mHsH#jgh8\h8\UmHsHh8\mHsHh8\h8\6mHsHh8\h8\mHsHh8\h8\0JmHsHjh8\UmHsH#jh8\h8\UmHsH4F{pH|N"VGa?0^`0gd8\o]}fG[\]|v5;=%&^_`:$%&OUVVɷѪh8\h8\6H*mHsHh8\h8\0JmHsH#jIh8\h8\UmHsHh8\mHsHjh8\UmHsHh8\h8\5mHsHh8\h8\6mHsHh8\h8\mHsH=RSTntv Ag# %<d012()ɷѪɘѪ#j{h8\h8\UmHsHh8\h8\0JmHsH#j(h8\h8\UmHsHh8\mHsHjh8\UmHsHh8\h8\5mHsHh8\h8\6mHsHh8\h8\mHsH<JrKp-p + 0^`0gd8\o]}f+,-`fg n+,-V\]9.eRSt%&{|}(qɷѪɘѪɆѪ#j h8\h8\UmHsH#j h8\h8\UmHsHh8\h8\0JmHsH#j h8\h8\UmHsHh8\mHsHjh8\UmHsHh8\h8\5mHsHh8\h8\6mHsHh8\h8\mHsH4 J;qrs 0Z a  ,         2 3 4 y  V     Z   Z   f g h : B C 6   jh8\UmHsHh8\h8\5mHsHh8\h8\mHsHh8\h8\6mHsHO  N & "    v  8  n T m =        P 0^`0gd8\o]}f   n o      ( K     J n u v               C I K z   S    E  <    ¤¤¤¤¤¤†¤¤†¤¤¤¤¤¤h8\h8\5mHsH#jh8\h8\UmHsHh8\h8\6mHsH#j 
h8\h8\UmHsHh8\h8\mHsHh8\h8\0JmHsHjh8\UmHsH#j h8\h8\UmHsHh8\mHsH4P  o /        L v  t  c  .  * ! o! ." " # $ 0^`0gd8\o]}f    k         E y   * F     G H I t u   _ k       o    7 8 9 a b v      ȶЩȗЩȅЩ#jLh8\h8\UmHsH#jkh8\h8\UmHsHh8\h8\0JmHsH#jh8\h8\UmHsHh8\mHsHjh8\UmHsHh8\h8\6mHsHh8\h8\mHsHh8\h8\5mHsH4             - . d e f ! ! Q! _! ! ! ! " " " " " " " " " h# o# # # # $ $ $ $ $ o% % -& \& ' )' ' ( ( ( ɷѪɘѪ#jh8\h8\UmHsHh8\h8\0JmHsH#j'h8\h8\UmHsHh8\mHsHjh8\UmHsHh8\h8\5mHsHh8\h8\6mHsHh8\h8\mHsH<$ $ % & k' ( ( o) * * !+ , N- . / / U0 a1 12 2 A3 4 4 5 7 7 8 o8 0^`0gd8\o]}f( ( ( ( ( .) /) 0) Q) W) m) ) ) W* X* Y* * * * * * * + + + + , e, f, , , , , , - &- - - - . . . w. x. . . . / / / / / $0 h0 0 $1 %1 H1 ɷѪɘѪ#jh8\h8\UmHsHh8\h8\0JmHsH#jh8\h8\UmHsHh8\mHsHjh8\UmHsHh8\h8\5mHsHh8\h8\6mHsHh8\h8\mHsH9H1 I1 J1 _1 `1 1 2 2 3 3 3 3 3 3 3 4 4 4 4 4 4 X5 5 W6 6 [7 \7 ]7 k7 q7 s7 7 7 7 7 8 8 U8 f8 8 8 8 8 8 8 8 9 y9 9 .: R: : ; ; ; H< I< J< < < ɽɽɽɽɱɽɽɱɽɽɽɽɱɽɽɱɽɽɽɱɽɽɽɽᩗ#jdh8\h8\UmHsHh8\mHsHh8\h8\5mHsHh8\h8\6mHsHh8\h8\mHsHh8\h8\0JmHsHjh8\UmHsH#jh8\h8\UmHsH;o8 8 9 9 : 3; < = = = <> >? ? e@ @ A `B *C bC %D D D D E )F 0^`0gd8\gd8\0^`0gd8\o]}f< < < '= R= = = > > > > ? ? ? ? ? ? 
@@ ]@ z@ @ A 'A B !B "B CB IB JB B B B B B B B (C )C ?C OC C C pD D D D D D E E E E ɷѪ h*6NHh-C h*66h*6jh*60JUh}jh1 Uh8\h8\0JmHsH#jh8\h8\UmHsHh8\mHsHjh8\UmHsHh8\h8\5mHsHh8\h8\6mHsHh8\h8\mHsH3E hE iE E E E "F %F &F (F )F *F F F NG OG G G H H _H `H aH H H FI GI HI wI xI I I I I I I J J QJ ZJ [J jJ kJ lJ J J L L M CM DM PM M M M M XN YN aN bN |N h=[h*66NHh=[h*66jh*6U hh*6 hTP'h*6h*h*60J=CJNHh*h*60J=CJ hh*6 h&jh*6jh*60JU h*6NHh*6<)F G _H GI wI I kJ M N P @\ :` ` @d d 6e k wr xr zr {r }r ~r r gdxgdgd(gdtykgdkUgd gdO.gdqgd|8|N N N N N O O OP PP P P Q Q PQ VQ Q Q R R U U KV V V V V V 7W 8W X X oX pX sX tX X X E[ F[ [ [ [ [ =\ >\ ?\ @\ A\ B\ \ \ \ \ y_ z_ _ ྸྸ󠘠jh*6UaJh*6NHaJ h*6aJh*6CJNHh=[h*66CJ h*6CJjh h*6CJUh h*66CJh h*6CJNHh h*6CJjh*60JUh*6hY+h*667_ _ _ "` (` )` 7` 8` 9` :` ;` ` ` ` ` ` ` ` a a c c c c c 1d 7d 9d =d >d @d Ad Cd d d d d d d 6e 7e Rf Sf h h h h ׾אאwh=[h*66jh*6Uh @h*6NH h @h*6h=[h*65CJh=[h*66CJ h*6CJjhXh*6CJUhXh*6CJ h*6NHjh*60JUh*6jh*6UaJh=[h*65aJ h*6aJh=[h*66aJ.h i i i i j j k k 2l 3l fl gl l l n n n n o o o o o o q q 7r 8r Vr Wr wr xr yr {r |r ~r r r r r r r r r r r r r r r r r r r r r r r r r r r r r r r r r r hlmQmHnHuh_mHnHuhhbjhN]\UhN]\h=[h*66 h*6NHjh*6Uh*6Er r r r r r r r r r r r r r r r r s s s s s (s )s Js Ks ds es 7$8$H$gdagd@vQr s s s s (s )s Js Ks ds es zs {s s s s s s s s s s s s s t t )t *t Ct Dt Et Ft Wt t t t t t t u 篚h4hBzh*6B*CJOJQJ^JaJmHnHphu-hBzh*6B*CJNHOJQJ^JaJph)hBzh*6B*CJOJQJ^JaJph+hBzh*65B*OJQJ\^JaJphhvh*6B*CJ!aJ0ph%hvh*6B*CJOJQJ^Jphh*6(hr!kh*65B*OJQJ^JaJ$ph(es zs {s s s s s s s s s s s s s s s t t )t *t Ct Dt Et Ft Wt T7$8$H$^`Tgdr!k $7$8$H$a$gdr!k 7$8$H$gdaWt ct t SV & FG7$8$Eƀ{|f-H$^`Ggdr!kV & FT7$8$Eƀ{|f" H$^`Tgdr!kt t t SV & FT7$8$Eƀ{|f" H$^`Tgdr!kV & FG7$8$Eƀ{|f-H$^`Ggdr!ku u fu gu hu iu {u u u v Jx Kx Lx ϪϦh}h$-+hBzh*65B*OJQJ\^JaJphhvh*6B*CJ!aJ0phh*6)hBzh*6B*CJOJQJ^JaJph-hBzh*6B*CJNHOJQJ^JaJph t "u fu gu hu iu SQEQ $7$8$H$a$gdr!kV & FG7$8$Eƀ{|f-H$^`Ggdr!kV & FG7$8$Eƀ{|f-H$^`Ggdr!kiu {u u u BV & FxB7$8$Eƀ{|f-H$^x`BgdBzV & 
FT7$8$Eƀ{|f" H$^`TgdBzT7$8$H$^`TgdBzu u u SV & FT7$8$Eƀ{|f " H$^`TgdBzV & FxB7$8$Eƀ{|f-H$^x`BgdBzu u v SV & FxB7$8$Eƀ{|f -H$^x`BgdBzV & FxB7$8$Eƀ{|f -H$^x`BgdBz v v v v v v !v "v #v $v %v &v 'v (v )v *v +v ,v -v .v V & FxB7$8$Eƀ{|f -H$^x`BgdBz.v /v 0v 1v 2v 3v 4v 5v 6v 7v 8v 9v :v ;v v ?v @v Av Bv Cv Dv Ev Fv Gv Hv Iv Jv Kv Kv Lv Mv Nv Ov Pv Qv Rv Sv Tv Uv Vv Wv Xv Yv Zv [v \v ]v ^v _v `v av bv cv dv ev fv gv hv hv iv jv kv lv mv nv ov pv qv rv sv tv uv vv wv xv yv zv {v |v }v ~v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v v w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w !w "w #w $w %w &w 'w (w )w *w +w ,w -w .w /w 0w 1w 2w 3w 3w 4w 5w 6w 7w 8w 9w :w ;w w ?w @w Aw Bw Cw Dw Ew Fw Gw Hw Iw Jw Kw Lw Mw Nw Ow Pw Pw Qw Rw Sw Tw Uw Vw Ww Xw Yw Zw [w \w ]w ^w _w `w aw bw cw dw ew fw gw hw iw jw kw lw mw mw nw ow pw qw rw sw tw uw vw ww xw yw zw {w |w }w ~w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w w x x x x x x x x x x x x x x x x x x x x x x x x x x x x x x x x x x !x "x #x $x %x &x 'x (x )x *x +x ,x -x .x /x 0x 1x 2x 3x 4x 5x 6x 7x 8x 8x 9x :x ;x x ?x @x Ax Bx Cx Dx Ex Fx Gx Hx Ix Jx Kx Lx 0^`0gd8\,1h/ =!"#$% n<c[z[d"ؚ-PNG  IHDR4sRGBPLTE """)))UUUMMMBBB999|PP֭3f333f3333f3ffffff3f̙3ff333f333333333f33333333f33f3ff3f3f3f3333f33̙33333f333333f3333f3ffffff3f33ff3f3f3f3fff3ffffffffff3ffff̙fff3fffff3fff333f3f3ff3ff33f̙̙3̙ff̙̙̙3f̙3f̙333f3̙333f3ffffff3f̙̙3f̙3f3f333f3333f3fff̙fff3f̙3f3f̙fffffffff!___www˲𠠤X"N pHYs+ IDATx^]Y:?$ eΊ6sד%D AJh3vA9^KZԩcjrr v95'2;vv!p!cg2;vv!p!cg2;vv!p!cg2;vv!p!cg2;vv!p!cg2;vv!H˦v{ 
#MMgBa|!p8t7Pg?WW䷊._@Hl?G8ٱyl?G8ٱyl???</(r/`>̏caþ^%^[:Ӿ@H|Gh՘B3><%ߙNwGՂ/jN |iE#Tb o9?Hn0SC@'-"T F`O{z% #WgxPC%ŋd$PW-_{Q:,53LҶEKix.̄OēRrGt(mYh_J J|@^zءWsѦ1( ħ$F08<P⋚r߽A'_Rqɖ& lB-1YO(%I?բu[j:BTC*y4KlNK Lvv$Z"DL/N)ez-f]?h&:?H\Ͻjm+6G'Y蝔8+ޚ[S _?;8|ʷu_#'H\T ߩU?C1/eGOf+p&g+@Z"rT3h˷w ~A$i_>l7Lڗ- DEU-2i Kbإ~Ů뻧v,fOoͮOq5AVբ!̟ynu]o]h&>=+n SYIo]o21K1xX8@+er:M &ڿGakLox17)7\ ĭJ|[_̳Pgiq+hyp|]8yx-lЪB8?Y4iJ-P 5! w`Uryw ~kHouf.U9R4SaǷ Gn:z`r|Mpej[ `KiӶ[Y}";&(MaoWW|[~%H.} 3l?kcbSasl!YlRN1<0fFOΥ??Jojo=#,&,$4z8h>BKkQL[F3z񓭋"}1*Ȍg<dz쓻F7irOO xϡw!kme?.V+FS'O2 U:geY+޸Q,IcsvlHwMC #Q>֒D}aD祦|2cYwoueFiI׃1U\)kZ%4 MExzsszwhy#9jW+~ӼC /wӠp߃orKsqt. 7]5v:ߛupK:z _%ߧ#i8w뱓}/擎7ϯ>_Xſo&~[TߚP?3Bސk1DK+/EzC^ZzD}K5>[n#4?Gx>?_ưݽaץ _Rw_3[.)m}DlsԟR<XK|b6c̠K[J<7^o)M )IѥW~Gh3'vc?]¡f\_T;e5i>oc?7~E^aZO]KW]$ߐ`k^AO/,8}/@^Hop%mn<LA4AICDs<LA4AICDs<LA4AICDs<LA4AICDs<LA4AICDs<LA4AIC~ӌ.hIENDB`}DyK _Toc175210961}DyK _Toc175210961}DyK _Toc175210962}DyK _Toc175210962}DyK _Toc175210963}DyK _Toc175210963}DyK _Toc175210964}DyK _Toc175210964}DyK _Toc175210965}DyK _Toc175210965}DyK _Toc175210966}DyK _Toc175210966}DyK _Toc175210967}DyK _Toc175210967}DyK _Toc175210968}DyK _Toc175210968}DyK _Toc175210969}DyK _Toc175210969}DyK _Toc175210970}DyK _Toc175210970}DyK _Toc175210971}DyK _Toc175210971}DyK _Toc175210972}DyK _Toc175210972}DyK _Toc175210973}DyK _Toc175210973}DyK _Toc175210974}DyK _Toc175210974}DyK _Toc175210975}DyK _Toc175210975}DyK _Toc175210976}DyK _Toc175210976}DyK _Toc175210977}DyK _Toc175210977}DyK _Toc175210978}DyK _Toc175210978}DyK _Toc175210979}DyK _Toc175210979}DyK _Toc175210980}DyK _Toc175210980}DyK _Toc175210981}DyK _Toc175210981}DyK _Toc175210982}DyK _Toc175210982}DyK _Toc175210983}DyK _Toc175210983}DyK _Toc175210984}DyK _Toc175210984}DyK _Toc175210985}DyK _Toc175210985}DyK _Toc175210986}DyK _Toc175210986}DyK _Toc175210987}DyK _Toc175210987}DyK _Toc175210988}DyK _Toc175210988}DyK _Toc175210989}DyK _Toc175210989}DyK _Toc175210990}DyK _Toc175210990}DyK _Toc175210991}DyK _Toc175210991}DyK _Toc175210992}DyK 
_Toc175210992}DyK _Toc175210993}DyK _Toc175210993}DyK _Toc175210994}DyK _Toc175210994}DyK _Toc175210995}DyK _Toc175210995}DyK _Toc175210996}DyK _Toc175210996}DyK _Toc175210997}DyK _Toc175210997}DyK _Toc175210998}DyK _Toc175210998}DyK _Toc175210999}DyK _Toc175210999}DyK _Toc175211000}DyK _Toc175211000}DyK _Toc175211001}DyK _Toc175211001}DyK _Toc175211002}DyK _Toc175211002}DyK _Toc175211003}DyK _Toc175211003}DyK _Toc175211004}DyK _Toc175211004}DyK _Toc175211005}DyK _Toc175211005}DyK _Toc175211006}DyK _Toc175211006}DyK _Toc175211007}DyK _Toc175211007}DyK _Toc175211008}DyK _Toc175211008}DyK _Toc175211009}DyK _Toc175211009}DyK _Toc175211010}DyK _Toc175211010}DyK _Toc175211011}DyK _Toc175211011}DyK _Toc175211012}DyK _Toc175211012}DyK _Toc175211013}DyK _Toc175211013}DyK _Toc175211014}DyK _Toc175211014}DyK _Toc175211015}DyK _Toc175211015}DyK _Ref163490554}DyK _Ref163700007}DyK _Ref172888347}DyK _Ref171596579LDd 8W8PPD  # A#" `2HL̠*[#$L7`!L̠*[#д 0QKx} |TwwI2,H[@ b]&d`2g&ԢJťVꂭZU b** ZWf22~0ޙL=s=K㚆^4m? 
\L#5⩛{SòwO~X|^ M[._ \;h(po?|a6m]ITncIF-K}u8qu,_t/^?V4R-WV{}ZM9.iM;ITzǴPvmoџ.g?VGxF?WmG^_ޯ>Z[c5, \%U< =;%ͩS|z+&FqL> ΚR>pR^8vO7pkBˇ3'̪>l\ebԊB%S=Э(TO&NYY&ꯞ0u"|[  kĹX+fnlqXŮ?aք$.b$ոM/ӡqOYʪr.,}0S.g֔5Y!\k:a#o}sĜㅽ%[HUh &BP@sk^OZgc:Hl70OUV,€qhq TUA`Y`=H+a)=ܩ~97gI* 0 *V̊;# c$ii<v88.8$-ȑprVKRbt0qp88 `4F"pGu)~|C]O]=`+䛀ʰKI OUCit`J܊$Ϧ`cT*1{KHsx"$vǦ]r~_q7(xUy4obQrI='f@q-EP[l5ţ^GbqlV+QȢYUrx(puB?ުc-8^|{ߖHM5䃓'WKi>y8KQu] ؉&VTU ۑ2` , 'jJ.\:7z.gY šX4R_`c蕫͐./ CB Dj |iJYx^ª;ܸETE+&WHvrp(VÄ`4,&RCrmʖS ( y+PSȊF'Pn;N9wPgSh9 Yh}]ٔҳOJBj6-)ύ{!M 4D#yZ+=х#) כV-qp$*&”Z繎q*\pXC]8ϖ};rJ=mb&,T[K 8մKkν,xnc$5lչ]3PFؙA>( IVu8x_]Ne:\syj sv <.䏙%otչ) +vȱ& E:w^Y7cOhR^]>GkR_*m}X OCZyh -s\; `;&&דwRc͉+NpTuNԩ3M m=V@Њ̾* ޝkz`-a+2+y2sE<}wǾWV_yd׸68eՎ{M!qK$_-KS3Ӎ33.HXJ>egUx6D"&nؓQQrM@B$kjH󧶎Ci=Zs:'5W{R6czafA,<$e]+IYӞL [ۯ^KR>W0뗚bO,3SsG M@c.ksofSsڎ㤶çCh>\ 5觤Oѥ+ E!sF{Ҭ&뷑-N򰾆ܣ}LOc=Sf2VI7I:B&[ȩ i2G@&ZҠO{eɯJNa9}mDp(b%RwåID[HKHj~ Vn{[Rt^_nhn ֿП?__ҷo;']t5>}ޤ/ҿu#4XǨNPN@;\Wݪ]VkKsBFAF梵Z5zDDOjѳZ9ڢAjNm(ڭ>ӎAHvGŮ>'*E'q!˕n2]C<S#$N U/t^ uSzD_R:f{z8fDU`ҏ2DT~ G5q-{):G/MqL %ͅ.d΂]5~KSCŢKVLFZ$Otn7 ;>dhKt{a!VBA::XVյPjT@,X'W߭S͞ Zļi7 }ԺFVض)\wIɢ ⡸_) *smEJGH3_sᭌ-Y6od#b~Vf3l׳5݋dzvȮœؕx [+Ebx. Y-aAؙx1;_⥝(s"ea1ީѠ,?s2v#m& uHЪϲuh8Z 2OSKIFڷHRhSϩ%e,dIoɁX5̗(Y! Ʊ,-5`b=|\ 1 >Y'aFNl|RUy!FËd[Qh!{-k"3:q<>Ƙ寈$UgZayx| 'i>Ty)?Vޭ>9gؽ]:ntsJmeҾcII{s[VCu<OLs5Q7, <>;r| -|<ߍn7|ԇC=|Rl/7|IoI!AzHO/$>[xv@e[͖D?Ww mWB|, >jfO,WçLB~v 1#'X'c7x9{d4.!Cp!d&BQd > ½d{0>G&GQxǓcp9WTO$qg#\90g>x?ב"0 0 a~< .#%. I\Er p;0no}Cd?{~oo^ν!7|֎ +D]&b-͈tWhm ȷs2&!o'IlũȡNHܙǨ (ESYu1Ţ?czyPH2cǧȞҫJcĊ'b xҢaL/cjgZk\yH$X$m+{Sv]䥶4ۜ5nGI'Bqpt7H-eEKA_  Di:dVr%H7W#o &ׂ4#)d H4w6[h쿏Z:5ct,$8 +d u3~"3#Gy1.6>%ƻ/hnNh7cMwON9} M[}xE_oӗ{?۠2Aoҿ6tAޏ_?DFz~މFo? .ٿހgUx^Rw.~]bC=P?IK}|9D?%=)/fΠ;]ʡywxb_eá2ۋ{ӝ.8$͎iփ8Pl?e:2OmA~DJӝ>S+fBy)[ۋ3[c/whUNBV %A4*bqo*ِڼ乯&-^Igz^VoeJsv\)lI%#=̦<.ЩHmc,ΓddTgg^nFƼC"㩬C+1:(U3@?' 
00^0`o(hH........8^`CJOJQJo(hH8^`OJQJ^Jo(hHo8pp^p`OJQJo(hH8@ @ ^@ `OJQJo(hH8^`OJQJ^Jo(hHo8^`OJQJo(hH8^`OJQJo(hH8^`OJQJ^Jo(hHo8PP^P`OJQJo(hHhRm//1Tz/  i3LR1S1`VokG}(]=YdRy 1`=p`#x; Y9+c}F &Vg\'1k:sj  8{= @CJOJQJ^Jo(" @CJOJQJ^Jo(-dS                                           `        h        hhp-                آh                                 p9h3℀bd66B]Txx%-D E]^bezfhkyN{E}  )Rmu@ M 8  /$ F> g o o pt   5 B VD 0L gM ^ {g v     B' d' &> @ ,A O S i p ! F D   ! # .3 4 z6 > o@ &C F vG N P nd A C b a4 5 16 < -C x] ] ` c l q BBD gk7{| #-depDw LUWY8\lnr ,05<@DrMTT\yljsy{|kz%8:<CDHmInPZdqyBz|#)?EU;W[/bZcr y!'z,,5569lDUhYno w !O*U^phq~ +0+7@FHJ^\kk!rH  \.126CR\e,(j<O\r]_pRuxw\G/G1:IWfntv$)j++0^7f(h jmRtv]w; !$-3&ADNcpc|gjy*KPVmn *8+3/4589B DN&\ohQsqu4+*8=??GISiruv})D,/ABITbUjkKmqxP v n  l 0 ; > D J Z [ Qm Fv z L!8%!&!:!=!@!D!Q!4Y!r[!`!i!y! """" "PA"\D"Z".i"k"n"q"t"v"Tx"#0#8K#ts#!$#$($5$ >$nA$B$C$J$@N$oV$Y$Z$9]$`$% %%(%V%|\% j%n%Xp%r%v&%&3&;A&N&V&X&Y&5Z&]&a&c&^e&xi&w&' 'h']0'C'*N'TP'`S'W']'g'i'p'vq'zy'((!(9,(3(8(9(A(C(C(N(zO(V( Z(q(w(R)) )%)<+)A)yI)M)hW)\)i)K**|**7*K*U*^*;a* }**+q++H+W+'+.+V;+D[+][+]+^+5c+k+v,Q,",.%,',',9,G,K,L,M,S,`m,X-O--+(-*-*-*-m3-C4-q4-";-(?-A-A-C]-2g-gt-w . 
._ .....'.,./C.H.tZ.c.j.r.@/|/?/@/X/^/`/g/n/ 0%0vA03I08P0hP0\0_0`0k0nl0t01)1 1J1P#1(1r,1,1|.14113141391";15Y1Y1Cv1H{122 2R.24292<2S2Y2z`2`2i2p2{2$333|33,353493b93,\3_3/`3y444$4?4B4#G4.o4aq49x4z4c5 5 555H$50585HJ5%P5Q5 S5U5a5j5ap5w56 66636+6,63666:96~B6BC6 I6kL6sS6S6$T6W6'7s7'17w<7YQ7UY7&^7b77e7,g7Ui7H88~8A,8u>8B8 D8zK8Q8S8^8f8l8|899)9!9%)9,)9Z9]9n9+t9::: 5:BA:D:)T:T:dZ:!]:it:|:{;;;;9 ;s;#;*;2;:;];7n;n;ds;w;y;E~;~;<%<"<*<2<<></F< k<x<z<){<= === ="=%=*=1=7=<=D=X=;]=Vj=}{={=H > >>> #>G2>:>BD>RG>M>O>S>xX>e>t>z>{>D}>z?9?J??9?6?>9?8G?^?%v?:w?~?e @ @=#@L'@(@0@8@_:@]V@Qf@AA"Ag AAAAa%A)A%/A3A7A:A?AMAVA3cAmAqAsAsAK|AVBB Bi/ByIBpB,wBwBCCCCoC!C.C1CQ]Q|]QhQlmQ@vQRdR4*RI9RHHRjRsRtRCSS0S2S4S35Sy7SJ9SHSHSJS]Sf_S`SonSTTqT T:T@@TaETHTOIT.UTiTUUUU$UQ,U,Uu0U0>UOURU]UdgUkUpUxU+V4V^BVcEVvFVQV WVRXVgVqV[{VWWu WUWQ3WFGW/VW[Wt]WdWqWXX XXX$X'X@AXWXrXBYHY;YY)Y Yo.Y20Yp2Y?Y?Y"RYfTYjYlY@ZZ ZZZ=ZuDZZZhZ0sZZ[[['[)[.7[?[A[O[#W[n][ ^[~[\-\?1\T;\;\;\]@\@\N]\^\`\v\<}\]] ]]] ])] -]0/]0]>]L]QY]Y]2^]^]b]x]y]G|]U]^ ^_.^1^R^~V^V^k^4v^z^_?_c _]__e__i<_<_>_X_Y_a_(c_h_1i_o_/v_Hv_z}_`!`%`&`4`A`M`N`Y`^`'o`x`TalaJa"a}/a1a9a:axTaXacaciaiaJjajauaNwaBbzbbbbFbGbGbJbObZbV{bM}bL~bccgc c'c(c(,c1cFKc(_cbcEwcxcxc.d-d1d0d0d4dL=dJdLdARdRd8nd ~dAeeeekee,(e1eEeSeJfeMoeyefFfffY f%f(+f-2fEfFfPfASfXfYfhfH~f gxggg7g7g@gSgyTgag"ngh hhNhh$h**h8hO;h?h@hEhbFh-IhWPhahbhehhgii$iD@i5BiJibbieipij'j/j1j2j6j>j%EjHjKjNje[jd`jvjr!kK(k*kDkIkJk]k:^kbkfknktyk:l lllldFlwVlXldljlplk mmmmmx-m0m\3m~J~|j~Xq~3${&'-]ey -22 5}6LeO D%&&f,AzEFMQjX_lu 0,z89B@CeDSs{ R .[9FJkNgQQ#R]Kaalmru}}UC#',BdHHbjknmpUu a)O.112EMcT[_a`befsuw z "2>BXJI^eknLKrh*-%FV~~H 4=88rGG_`lZ  MK+!)1T456<>JB4RZ\ow 1 % (.g/5RT~Z^Dcffs|x8(4>@@TijO{q~?#$$$-6>F6K4]bkn>~<{ wo5;EYG[^lp(q)>PGGI`JLOkS2Ujps}v !-024 `evVk0h19IJcik|M;4L(XZ_cOiyz *IKUtsuxA l) ,s<FKLR:T[ygbjjGui. | ()N]bjngsuz{ p %#.0:? 
@P^ d4hlqqy'%},X4Dckrzs3#/.3>Z?@JLMOYP]Z]corrG~J #=MNJRtnuy 6 *BoHMIvN* ="F$3mGLOU]_)as.!-B5:=A/SoSyS1hhfw!0569<ESZ"i2th~n c Z  'X2>H OX ZJZ/`Qep `Gv09dJJQ_i tt*#,69CWac*r(,- 9%9=?sHNRj1nTt| a+/.18HIJ!LaS_`k`nlpLLLGg&X+{+P8DwKKJiZxncb&*56=dCEajpMy.{vF :! ?6SZ^dGlru:  ",#i+&1z46B,MN[lrjl pd41<BKTbhjOn3pj{E|!*<n=B?pGs-l3N99@NObmmu ?&.(>FrGsfggiwx!|u~Fb/3U]NcHoo~W 8(.)6 7|;q=GgLRd`jjkk`n0rCy4{ %g45oMYdHpK| -(.8<KRSTbpzg#])KTLL OeUW)`LhWb![il[&689IS^wf@qors:w()9RDKGKTZ^hkGnqjv~~_ =w@EIKNLNSlbn,0?L'^k`nRtJX o!Y"1/;/A(SXZ_-npxM%1|8d>v??@OiZ_u7}uP#4NORTL`jfpZ479M7]donu3|/4>IC!\l?quy279;PTUa%y [k'CORT[_lRqqrx 8-59v98<xMQ*^bm@uz@6:F^IQR;T+U7sA @ s%(4)5@FW^fy{;Ic :qF '5<$MdMSPPRX&X0\t|vG2GdivH m &1lBIKLjMN^e P%S&T1BHM^e}L G'7 HI_hoqsfKQ*>2229 ^*ddl}z#((N*400d2F8 <qH^ips  V$),7.OCQFzTa}t} D5#:>"IEbkn(~~ u/4 5BxEH{Jjov  *d4X`bp>uzA [8]"x(*790U[`kp h(HMvO*TQWZtaab"m/L+$.CXZZlFx7!%+6HPGUHloxyGCv?! '18;ZD-KOSU\dfh8\oS^ "):;;JKVYv|<(0BQn\NeZqRtD~  <>$&12P RU}nw"r#n6:q<>z@JL4R[h]KawcnsuY++2//7FABEJTTab ptQa5A9Q XZ 6(U*0BUgLkn$3*5>@JH?`bqVvI$+d,904R:KKK:OPms {%~x+/1]3\48<w<Q?dLO\]akmt5uv}~ X&)G,1%RYZ\,al6}@Vj)4/>CEKbkpxy b"w5H_.(AtFtOSX^tpw}} ( ,C5=V^qs}!u'-2188HHLNR^bclppga;J^aclNp0)CDKXpZr.~At8U:H!R>GSUC`Cn[r{{R),t/BDORfmo|pmvx+-0OXqZM[bm**/7<dDIKgYalY ^%AP.8;Y=EuK` a@e$&/5Jimx$y{j}j q (5H ]!kwzSb+G,5>FNQ=[[sbdqssNknd@ [).@ O2RoYd]lMt}FSH\p$w|< mj2 m#9\`acfxy &\,0N> @L[qb f<V '61;Gpa2U:9 F+r+0*6]6?DQ0Tzf=lA|u *-t.37<>$SakKsnunvvy{} +5<FABJ=MQnk*i14[;;xAB\DDmF,LM PyU|(YDK[ j&jjWv_{~W$ J$84*>oe}e[w*++>FHGVd &+9MZjop:r},~A!8W99FOfkrpCxWRA"44PoyA~o &S)67nEGODPS>Vgsyt 8'.:458+AzGb+yyN{`} p$[AK?R`bakq*'-0=YAZHQSWfgpQt^zъ&17Zv՘ '29:;?Tdlrϙיݙ  !89AFVqr~̚ /01CW`kopqr͛ޛ*3;>OuvƜٜڜ()2Qaiȝ#Fbotyzƞ̞'*8UVbcdefܟ"<=>KLMNRlˠ۠ =ajuwF@$>r 4PUZ[fgl !*5CYZmn}/IJev>XY_*8NOes{ cmz{UVc.^_n  "0*+KL[  '      /       s t &)))))o*p**-+.+N++++ , ,),,,,---N.O.g....?/@/a////0020000-1.1?1111K2L29;<<<<#<$<,<e<<= =R====C>>>>]?@@@@@@.@/@?@@@@@@@AA(ANAOAeAAAAAB B\B]BhBBBBBBB?C@CICCCORRRUUUUYY Y[[[__scdgh< Mp +03c!){; 
R;[+EN.InstantFormat EN.Layout EN.LibrariesH w<ENInstantFormat><Enabled>0</Enabled><ScanUnformatted>1</ScanUnformatted><ScanChanges>1</ScanChanges></ENInstantFormat>T<ENLayout><Style>jason-cv</Style><LeftDelim>{</LeftDelim><RightDelim>}</RightDelim><FontName>Times New Roman</FontName><FontSize>11</FontSize><ReflistTitle></ReflistTitle><StartingRefnum>1</StartingRefnum><FirstLineIndent>0</FirstLineIndent><HangingIndent>720</HangingIndent><LineSpacing>0</LineSpacing><SpaceAfter>0</SpaceAfter></ENLayout>U<ENLibraries><Libraries><item>hci_privacy_review.enl</item></Libraries></ENLibraries>@u Lp `@Unknown Jason I Hongibm userGz Times New Roman5Symbol3& z ArialCTimesNewRoman5& zaTahoma?5 z Courier New7Tms Rmn;Wingdings"1h "`U`Uq4d7 7 2qHX?< 21 Jason Hong Jason I Hong                        Oh+'0  < H T `lt|1 Jason Hong Normal.dotJason I Hong15Microsoft Office Word@L@BZ @@~ U`՜.+,D՜.+,D hp  Carnegie Mellon University7  1 Title/ 8@ _PID_HLINKSA/v=:http://www.wired.com/news/technology/0,1282,67980,00.htmlO:Nhttp://energycommerce.house.gov/107/hearings/05082001Hearing209/Westin309.htm/?7http://etd.gatech.edu/1t4Ahttp://www.tbs-sct.gc.ca/pubs_pol/ciopubs/pia-pefr/siglist_e.aspc31+http://doi.acm.org/10.1145/1124772.11247884.http://www.siteadvisor.com/B+)http://doi.acm.org/10.1145/365024.365349N()http://doi.acm.org/10.1145/571985.572005i7%,http://www.andrewpatrick.ca/CHI2003/HCISEC/l6"'http://doi.acm.org/10.1145/62266.62269I>http://security.dstc.edu.au/staff/povey/papers/optimistic.pdf_D5http://www.pewinternet.org/reports/toc.asp?Report=19]I5http://www.pewinternet.org/reports/toc.asp?Report=34~>Hhttp://www.microsoft.com/issues/essays/2007/03-20ProtectingPrivacy.mspx}b8http://www.wired.com/news/politics/0,1283,35950,00.htmlb;+http://doi.acm.org/10.1145/1124772.1124790  Zhttp://guir.berkeley.edu/pubs/ubicomp2003/privacyworkshop/papers/lederer-privacyspace.pdfRG 
ehttp://www.cs.berkeley.edu/projects/io/publications/privacy-lederer-msreport-1.01-no-appendicies.pdfc<+http://doi.acm.org/10.1145/1124772.1124787/?http://etd.gatech.edu/Qhttp://www.ibm.com/innovation/us/customerloyalty/harriet_pearson_interview.shtml/?http://etd.gatech.edu/M)http://doi.acm.org/10.1145/240080.240295j5+http://doi.acm.org/10.1145/1085777.1085814QJhttp://www.springerlink.com/openurl.asp?genre=article&id=0H4KDUY2FXYPA3JCH_Ehttp://guir.berkeley.edu/pubs/ubicomp2003/privacyworkshop/papers.htmH)http://doi.acm.org/10.1145/587078.587082T 6http://portal.acm.org/citation.cfm?id=1073001.1073006m6+http://doi.acm.org/10.1145/1124772.112486220+http://www.teco.edu/~philip/ubicomp2002ws/B)http://doi.acm.org/10.1145/302979.303001G)http://doi.acm.org/10.1145/142750.142755h>'http://doi.acm.org/10.1145/97243.97305263http://www.epic.org/reports/prettypoorprivacy.html,Qhttp://www.springerlink.com/openurl.asp?genre=article&id=doi:10.1007/11428572_10%c;http://www2.sjsu.edu/depts/anthropology/svcp/SVCPslnr.html"t(http://www.w3.org/TR/WD-P3P-Preferences!fChttp://cups.cs.cmu.edu/courses/ups-sp06/slides/060117-overview.ppt]http://www.w3.org/TR/P3P/J)http://doi.acm.org/10.1145/500286.500292AZ.http://www.calnurse.org/cna/press/90402a.htmlB}3http://www.firstmonday.org/issues/issue7_10/brunk/A2http://news.bbc.co.uk/1/hi/technology/3224920.stmyx'http://www.acm.org/serving/ethics.htmlC!http://allnurses.com/t16164.html.http://guir.berkeley.edu/privacyworkshop2002/LT=http://www.datenschutz-berlin.de/gesetze/sonstige/volksz.htm{e@http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=0534406{e@http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=0534406{d@http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=06275135I_Toc1752110155C_Toc1752110145=_Toc17521101357_Toc17521101251_Toc1752110115+_Toc1752110105%_Toc1752110095_Toc1752110085_Toc1752110075_Toc1752110065 
_Toc1752110055_Toc1752110045_Toc1752110035_Toc1752110025_Toc1752110015_Toc175211000<_Toc175210999<_Toc175210998<_Toc175210997<_Toc175210996<_Toc175210995<_Toc175210994<_Toc175210993<_Toc175210992<_Toc175210991<_Toc175210990<_Toc175210989<_Toc175210988<_Toc175210987<_Toc175210986<_Toc175210985<_Toc175210984<_Toc175210983<_Toc175210982<}_Toc175210981<w_Toc175210980<q_Toc175210979<k_Toc175210978<e_Toc175210977<__Toc175210976<Y_Toc175210975<S_Toc175210974<M_Toc175210973<G_Toc175210972<A_Toc175210971<;_Toc175210970<5_Toc175210969</_Toc175210968<)_Toc175210967<#_Toc175210966<_Toc175210965<_Toc175210964<_Toc175210963< _Toc175210962<_Toc175210961  !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~      !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~      !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~      !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~      !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~      !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~      !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~      !"#$%&'()*+,-./012456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~      !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~Root Entry Fp, Data 1Table3`'WordDocumentrK SummaryInformation(DocumentSummaryInformation8@1CompObjq  FMicrosoft Office Word Document MSWordDocWord.Document.89q