CUPS (CMU)
The Carnegie Mellon University Usable Privacy and Security Laboratory (CUPS) was established in the spring of 2004 to bring together Carnegie Mellon researchers working on a diverse set of projects related to understanding and improving the usability of privacy and security software and systems. The privacy and security research community has become increasingly aware that usability problems severely limit the effectiveness of mechanisms designed to provide security and privacy in software systems. Indeed, one of the four grand research challenges in information security and assurance identified by the Computing Research Association in 2003 is: "Give end-users security controls they can understand and privacy they can control for the dynamic, pervasive computing environments of the future." This is the challenge that CUPS strives to address. CUPS is affiliated with Carnegie Mellon CyLab and has members from the Department of Engineering and Public Policy, the School of Computer Science, the Department of Electrical and Computer Engineering, the Heinz College, and the Department of Social and Decision Sciences.
Projects
- P3P and computer-readable privacy policies
  - Two members of the CUPS Lab serve on the W3C P3P Working Group, which is developing the P3P 1.1 specification.
  - In the fall of 2005, AT&T transferred to the lab the rights to the source code and trademarks for Privacy Bird, its P3P user agent. Privacy Bird is currently maintained and distributed by the lab.
  - In the summer of 2005, the lab made a "P3P-enabled search engine", known as Privacy Finder, available to the public. It allowed a user to reorder search results based on whether each site complied with his or her privacy preferences, as determined from the P3P policies found on those sites (a simplified sketch of this kind of check appears after this list). Since 2012, Privacy Finder has been "temporarily out of service", with no indication of when service will be restored.
  - The lab also archives web sites' privacy policies and has been building a toolkit to aid in the automated analysis of both P3P policies and natural-language privacy policies.
- Supporting trust decisions
  - More recently, the lab has been examining trends in phishing attacks, as well as users' perceptions of these attacks, in order to develop better methods of detecting and reporting phishing messages.
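For illustration, the following Python sketch shows, under simplifying assumptions, the kind of preference check a P3P-based service such as Privacy Finder could apply when reordering search results: fetch a site's P3P file from the well-known location /w3c/p3p.xml, compare the purposes it declares against a set the user accepts, and move matching sites to the top. The function names, the accepted-purpose set, and the shortcut of reading purposes directly from the well-known-location file (rather than following its POLICY-REF entries to the full policy) are illustrative assumptions, not the lab's actual implementation.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Illustrative user preference: P3P purpose elements the user accepts.
ACCEPTED_PURPOSES = {"current", "admin", "develop"}

def local_name(tag):
    """Strip any XML namespace prefix from an element tag."""
    return tag.rsplit("}", 1)[-1]

def fetch_p3p_xml(site_url):
    """Fetch and parse whatever XML a site serves at the P3P
    well-known location /w3c/p3p.xml; return None on any failure."""
    try:
        url = site_url.rstrip("/") + "/w3c/p3p.xml"
        with urllib.request.urlopen(url, timeout=5) as resp:
            return ET.fromstring(resp.read())
    except Exception:
        return None

def matches_preferences(root):
    """Return True if the document declares at least one PURPOSE element
    and every declared purpose is in the accepted set."""
    if root is None:
        return False
    found = False
    for elem in root.iter():
        if local_name(elem.tag) == "PURPOSE":
            found = True
            for child in elem:
                if local_name(child.tag) not in ACCEPTED_PURPOSES:
                    return False
    return found

def reorder_results(result_urls):
    """Stable sort: sites whose declared purposes match the user's
    preferences float to the top; everything else keeps its order."""
    return sorted(result_urls,
                  key=lambda u: not matches_preferences(fetch_p3p_xml(u)))
```

Because the sort is stable, results within each group keep the search engine's original ranking; only sites whose declared purposes fall entirely within the accepted set are promoted.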
People
- Lorrie Cranor, Director
- Alessandro Acquisti
- Julie Downs
- Serge Egelman
- Mandy Holbrook
- Jason Hong
- Patrick Gage Kelley
- Ponnurangam Kumaraguru
- Cynthia Kuo
- Adrian Perrig
- Robert Reeder
- Sasha Romanosky
- Norman Sadeh
- Steve Sheng
- Janice Tsai
- Kami Vaniea