How can design cause harm and how can we safeguard against it?

A series of case studies on how design can be used to protect against misuse, privacy intrusions, and the erosion of agency
product design, privacy, civil liberties, dark patterns, design ethics, design justice

Versions of this content were developed during my tenure at Palantir, and this particular presentation was created for a session I hosted at AIGA's DC Design Week (2021), titled "Defense Against the Dark Arts: Design for Civil Liberties and Agency."
Why design? What about engineering ethics?

I've been gathering and developing this content while working at Palantir, where I served as a Privacy and Civil Liberties engineer, spotting the ways in which products could lead to unanticipated harms to civil liberties, privacy, and human rights. To counter those risks, I worked on policy, engineering, and design decisions that guide products away from such harms.

I developed this content with product designers in mind, aiming to cover, broadly, how design can intentionally (or unintentionally) cause harm, while carving out a framework for thinking about design's dark patterns. Design is a crucial stage in product development, and designers have tremendous potential to shape how ethics are considered along the way (though they obviously do not shoulder this burden alone). Because designers' roles center on problem-solving and systems thinking, the design stage is where a product organization naturally reflects on whether a technology solution is even the right way to approach an issue, and it affords a team a clear opportunity to conduct research and question underlying assumptions.

Designers not only shape the aesthetics of products; they direct the user experience of a technical solution. They must ask not only whether they are solving problems, but also whether they are creating new ones. They hold the critical context needed to educate their teams, identify potential issues, and imagine unintended harms. They frame the art of the possible and inspire users (often influencing users' work environments and cultures, too). They learn how users think. And their design decisions indirectly set engineering constraints (e.g., how should data be modeled? what app states are possible? who has access to which features and permissions?).

Rooting these case studies in historical defensive design techniques like poka-yoke, and expanding on Sasha Costanza-Chock's extensive work on design justice, I offer categories for evaluating design -- surveillance, misrepresentation, addiction, misdirection, manipulation, and friction -- to give designers clear anchors for considering the impacts of their design choices within the contexts of their products/platforms and broader systems.

These iframes are a little annoying -- if you'd prefer to read in a separate tab, this presentation is available here as well.



Web version

Check it out here!

I've begun adapting this content into a more interactive web version, though that project has a long way to go! The static version above is more complete and thorough (though the slides were not designed to be consumed in a web format).