Talks, Panels & Workshops
UPCOMING (July 29th): Quant & Qual on Common Ground: Collaboration among Researchers of All Stripes
This webinar focuses on the core concepts and concerns shared by all researchers and how this common ground establishes a basis for closer collaboration among researchers of all stripes. We’ll discuss the constraints we experience as qualitative and mixed methods researchers, a vocabulary for communicating the value of ethnographic work to quantitative colleagues, and strategies for more fully and effectively integrating ethnographic work into research and business cycles.
The coronavirus pandemic has given rise to a new ecosystem of mobile apps for checking symptoms, tracing contacts, monitoring quarantine, and more. These apps may have access to users’ sensitive personal health information, but don’t always have the right policies in place to ensure the privacy of that data. Two experts from an independent watchdog agency will share their analysis of over 100 coronavirus mobile apps and a privacy enforcement professional will join for a discussion of how to protect personal data, including through accountability mechanisms.
This webinar reflects our experiences with motivated intruder tests. Building on recently published work with clinical trial data, it will describe the business drivers for these types of tests, how to conduct them, and lessons learned across multiple studies on different types of data over the last couple of years.
2020 Vision on Privacy for Telematics Panel: Privacy Engineering, Compliance & Technical Innovation Trends
This panel, held at Geotab Connect 2020, focused on privacy engineering and technology trends related to IoT, with special attention to connected vehicles.
This EPIC2018 panel addresses questions of fairness and justice in data-centric systems. While the many social problems caused by data-centric systems are well known, what options are available to us to make things better?
Current mobile platforms provide privacy management interfaces to regulate how applications access sensitive data. Prior research has shown that these interfaces are insufficient from a usability standpoint: they do not account for context. In allowing for more contextual decisions, machine-learning techniques have shown great promise for designing systems that automatically make privacy decisions on behalf of the user. However, if such decisions are made automatically, then feedback mechanisms are needed to empower users to both audit those decisions and correct any errors.
In this paper, we describe our user-centered approach to designing a fully functional privacy feedback interface for the Android platform. We performed two large-scale user studies to evaluate the usability of our design. Our second, 580-person validation study showed that users of our new interface were significantly more likely to both understand and control the selected set of circumstances under which applications could access sensitive data, compared to the default Android privacy settings interface.
Evaluating the re-identification risk of a clinical study report anonymized under EMA Policy 0070 and Health Canada Regulations
Regulatory agencies, such as the European Medicines Agency and Health Canada, are requiring the public sharing of clinical trial reports that are used to make drug approval decisions. Both agencies have provided guidance for the quantitative anonymization of these clinical reports before they are shared. There is limited empirical information on the effectiveness of this approach in protecting patient privacy for clinical trial data.
In this paper we empirically test the hypothesis that these guidelines, when implemented in practice, provide adequate privacy protection to patients. An anonymized clinical study report for a trial of a non-steroidal anti-inflammatory drug sold as a prescription eye drop was subjected to a re-identification attack. The attack targeted 500 patients in the USA, and only suspected matches to real identities were reported.
Selection of Research and Authoritative Reports
An independent investigation of worldwide COVID-19 mobile apps, performed with our partners at the International Digital Accountability Council (IDAC), found that several widely used apps pose privacy risks to users around the world.
In this report, we present our findings from analyzing how these apps collect personal data, what data they collect, and which third parties receive data from them, in order to identify practices that conflict with app users’ reasonable expectations, privacy laws, and platform policies. While we did not find egregious or willful developer misconduct, the investigation revealed several instances in which apps fell short of best privacy practices and posed potential risks to users. Download the full report.