February 26, 2021
In a world in which decision-makers increasingly want numbers to back their decisions, how can user researchers take a larger role in partnership with data science? In this talk, Jenny and Will discuss aspects of the data science workflow, identify commonalities between the two disciplines, and present a taxonomy of the constraints standing in the way of fruitful collaboration. We hope to equip attendees with concrete strategies for tackling these situations and maximizing the overall value of research.
January 28, 2021
Happy Data Privacy Day!
December 17, 2020
De-identification and Differential Privacy.
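As a minimal sketch of what differential privacy looks like in practice, the snippet below implements the Laplace mechanism for a counting query. This is a standard building block of differential privacy, not code from the talk; the function names and parameters are illustrative.

```python
import math
import random

def laplace_sample(scale, rng=random):
    """Draw one sample from the Laplace(0, scale) distribution
    via inverse-CDF sampling."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count, epsilon, rng=random):
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one
    person changes the count by at most 1), so Laplace noise with
    scale 1/epsilon suffices.
    """
    return true_count + laplace_sample(1.0 / epsilon, rng)
```

Smaller values of epsilon add more noise and give stronger privacy guarantees, at the cost of accuracy in the released count.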
October 19/21, 2020
This tutorial's goal is to empower ethnographers to develop more holistic, interdisciplinary programs of inquiry for their projects, teams, and organizations. It focuses on the core principles underlying research and inquiry of all kinds, establishing frameworks that unite rather than divide the current research “camps.”
October 6, 2020
Two years ago, the New York Times lifted the lid on the dangers of location tracking, but the tech industry shrugged. Now authorities are fighting back.
July 29, 2020
This webinar focuses on the core concepts and concerns shared by all researchers and how this common ground establishes a basis for closer collaboration among researchers of all stripes. We’ll discuss the constraints we experience as qualitative and mixed methods researchers, a vocabulary for communicating the value of ethnographic work to quantitative colleagues, and strategies for more fully and effectively integrating ethnographic work into research and business cycles.
March 25, 2020
This webinar reflects our experiences with motivated intruder tests. Building on recently published work with clinical trial data, it describes the business drivers for these types of tests, how to conduct them, and lessons learned across multiple studies on different types of data over the past few years.
February 18, 2020
Regulatory agencies, such as the European Medicines Agency and Health Canada, are requiring the public sharing of the clinical trial reports that are used to make drug approval decisions. Both agencies have provided guidance for the quantitative anonymization of these clinical reports before they are shared. There is limited empirical information on the effectiveness of this approach in protecting patient privacy for clinical trial data. In this paper, we empirically test the hypothesis that when these guidelines are implemented in practice, they provide adequate privacy protection to patients. An anonymized clinical study report for a trial of a non-steroidal anti-inflammatory drug sold as a prescription eye drop was subjected to a re-identification attack. The target was 500 patients in the USA. Only suspected matches to real identities were reported.
January 14, 2020
This panel, held at Geotab Connect 2020, focused on privacy engineering and technology trends related to IoT, paying special attention to connected vehicles.
October 12, 2018
This EPIC2018 panel addresses questions of fairness and justice in data-centric systems. While the many social problems caused by data-centric systems are well known, what options are available to us to make things better?
July 12-14, 2017
Current mobile platforms provide privacy management interfaces to regulate how applications access sensitive data. Prior research has shown that these interfaces are insufficient from a usability standpoint: they do not account for context. Machine-learning techniques have shown great promise for designing systems that make more contextual privacy decisions automatically on behalf of the user. However, if such decisions are made automatically, then feedback mechanisms are needed to empower users to both audit those decisions and correct any errors. In this paper, we describe our user-centered approach to designing a fully functional privacy feedback interface for the Android platform. We performed two large-scale user studies to evaluate the usability of our design. Our second, 580-person validation study showed that users of our new interface were significantly more likely to both understand and control the selected set of circumstances under which applications could access sensitive data, compared to the default Android privacy settings interface.