On Weaponised Design

Cade Diehm

16 February 2018

The lives of digital platform users are at odds with how these systems are conceived and built. Weaponised design – a process that allows for harm of users within the defined bounds of a designed system – is facilitated by practitioners who are oblivious to the politics of digital infrastructure or consider their design practice output to be apolitical. Although users find themselves subject to traumatic events with increasing regularity, weaponised design is yet to be addressed by the multi-faceted field of interface and infrastructure design.

Originally commissioned by Tactical Tech.

The Design of Everyday Things

Donald Norman, Basic Books

2013 Edition

User Experience Design has blossomed from a niche industry in the halls of Silicon Valley’s early startup darlings to a force that architects our digital lives. Today, user experience design is wholly responsible for modelling human expression and self identity, enabling interaction and participation online. Design has as much in common with information security research as it does with behavioural science or aesthetics. By failing to criticise common design practices or form cooperative relationships with other technology fields, user experience designers are effectively refusing to recognise and respond to traumatic cases of their own work being used to harm the users they claim to serve.

Every major technology company has grappled with examples of their platform being leveraged to maliciously harm users whilst performing entirely within expected behaviour. This is weaponised design – electronic systems whose designs either do not account for abusive application or whose user experiences directly empower attackers.

Examples of weaponised design are numerous and horrific: Using a smartphone, a technology reporter demonstrates Snapchat’s location-sharing Snap Map feature to a friend. After viewing her friend’s posts, the journalist spends a few moments pinpointing her friend’s apartment via Google Maps street view and asks her to confirm the address. Her friend is ‘creeped out’ as she realises that, wholly unexpectedly, her phone has violated her privacy and her trust by exposing her home address to all of her Snapchat followers without her consent. The experience makes international headlines and authorities issue another warning over the app’s potential for targeted misuse.

Meanwhile, in an effort to address targeted harassment of users, the design team at Twitter ships a change that silences notifications sent to trolling victims when they are added to malicious Twitter lists. The result is the opposite of what was intended: removing list notifications allows harassers to compile and share lists of targets undetected. Victims are unaware they are shared targets and can’t fight back. The violation is so obvious that Twitter reverses the change within hours. Harassment across Twitter continues.

Twitter first announced the change via its own platform. Responses from prominent critics are threaded contextually with the original tweet.

Elsewhere, two Facebook designers write about the data-driven tweaks their team has made to News Feed. They conclude that the changes make Facebook meaningfully easier to use, engage with and navigate, but fail to discuss the political implications of their work, which reach far beyond usability. The article proves popular with the design community. A month later, the US Justice Department turns its attention to the platform, looking for evidence of Facebook’s ability to exert major influence on voting behaviour through psychology research enabled by interface and systems design. These examples happened in 2017 alone and are all dramatic scenarios of weaponised design.

The most common way design is weaponised is through poorly considered tradeoffs, where the ideal user is prioritised over empathetic threat modelling, resulting in features that are at high risk of exploitation. Alongside this year’s Snapchat example, another example of this famously occurred in 2014, when attackers stole and leaked intimate photos of celebrities from Apple’s iCloud backup service. In that case, the critical issue is not the actions of the attackers, but one of informed consent. One can assume Apple’s designers took a puritan, anti-sex approach to the problem of centralised photo backup: users don’t sext, and if they do, they wouldn’t object to intimate personal photos being synced to the cloud. As the company transferred millions of users into an opt-out automatic backup service, it failed to articulate the personal implications to its user base.

At time of writing, the US Justice Department had only just begun its investigation into electioneering via social media platforms and ad buys.

Reporting, Reviewing, and Responding to Harassment on Twitter

J. Nathan Matias et al., Women, Action and the Media

13 May 2015

Design can also be weaponised through team apathy or inertia, where user feedback is ignored or invalidated by an arrogant, culturally homogenous or inexperienced team designing a platform. This is a notable criticism of Twitter’s product team, whose perceived lack of design-led response is seen as a core factor in enabling the targeted, serious harassment of women and minority users by coordinated mobs, from at least 2014 to the present day.

Sexism in the circuitry: female participation in male-dominated popular computer culture

Michael James Heron, Pauline Belford and Ayşe Göker, ACM SIGCAS Computers and Society

December 2014

Finally, design can be directly weaponised by the design team itself. Examples of this include Facebook’s designers, who in 2014 manipulated users’ emotions without consent in a massive-scale experiment, and Target, who in 2012, through surveillance ad tech and careful communications design, revealed a teenager’s pregnancy to her family before she had disclosed it. In these examples, designers collaborate with other teams within an organisation, facilitating problematic outcomes whose impact scales exponentially in correlation with the quality of the design input.

Experimental evidence of massive-scale emotional contagion through social networks

Adam D. I. Kramer, Jamie E. Guillory and Jeffrey T. Hancock, Proceedings of the National Academy of Sciences of the United States of America

2 June 2014

How Companies Learn Your Secrets

Charles Duhigg, The New York Times

16 February 2012

These three situations have one thing in common: they are the result of designers failing their users through designed systems that behave more or less as the user expected. This reflects a decade-long misrepresentation of the relationship and responsibilities of designers to their users, a dangerous lack of professionalism, ethics and self-regulation, and a lack of understanding of how multi-disciplinary design is leveraged to both exploit and harm the community. As the appointed advocates for users, designers are yet to embrace the new tools and practices needed to continue working in an increasingly user-hostile digital world.

Corrupting Design

In 1993, Apple Computer’s Human Computer Interface research team hired cognitive psychologist Donald Norman to work with the company’s early-stage interaction team. While they were not the first to argue for the need for so-called user-centred design, Norman’s role as a ‘User Experience Architect’ marked a subtle but fundamental shift in the practice behind the design of commodified personal computer systems. By drawing from the mid-twentieth century capitalist practice of product design and applying psychology and behavioural science to information architecture, the foundations for user-centric software design were laid, contributing to the company’s early successes in the West.

There were several other key players responsible for pioneering graphical user interfaces and user experience, including Susan Kare, Alan Kay, Larry Tesler, Robert Taylor, and many others.

Today, user experience design no longer requires a sociological or behavioural science background, but these origins linger. The field encompasses everything from aesthetics, visual communication and branding to deep system and information architecture, all working in concert to define and anticipate the activities of a user through applied user research. As platforms have become more commodified – especially through mobile touch mediums – UX designers have progressively become more reliant on existing work, creating a feedback loop that promotes playfulness, obviousness and assumed trust at the expense of user safety.

The focus on details and delight can be traced not just through the history of the field, but to manifestos like Steve Krug’s Don’t Make Me Think, which proposes a dogmatic adherence to cognitive obviousness and celebrates frictionless interaction as the ultimate design accomplishment. Users should never have to question an interface. Instead, designers should anticipate and cater to their needs, offering what Krug calls a ‘mindless, unambiguous choice’.

Don’t Make Me Think

Steve Krug, New Riders

2014 Edition

A “mindless, unambiguous choice” is not without cultural, social and political context. In Universal UX Design: Building Multicultural User Experience, Alberto Ferreira explores the cultural assumptions embedded in interface design. Amongst many cross-cultural examples, his most striking are the failings of Western designers as they adapt their work to users in China or Japan. These hyper-connected middle-class audiences share comparable statuses for wealth, connectivity and education. But the cultural and aesthetic variances between these three societies are pronounced. Designers who produce or adapt work for these populations without a sturdy conceptual framework repeatedly fail their users. ‘Mindless and unambiguous’ is only true for those who have both the cultural context to effortlessly decode an interface, and the confidence that their comprehension is solid. Not only is this dogma an unreasonable constraint, it also frequently fails.

Four screenshots of Facebook’s Year In Review app. Clockwise from top left: Eric Meyer’s deceased daughter, an urn containing the remains of a Facebook user’s parent, a house fire and a recently-deceased beloved pet dog. Each image in the Year in Review app is picked algorithmically and placed against a celebratory new-year themed design.

In 2014, Eric Meyer, an influential web developer, was shown a celebratory, algorithmically-generated “Facebook Year in Review” featuring images of his daughter, who had died from cancer that same year. In a widely-read response, he described the experience as ‘inadvertent algorithmic cruelty’.

Eric’s experience was not unique, nor was he the only person to write about his experience. Facebook’s Year in Review has generated annual criticism since it first debuted.

That the “distinguished” design team at Facebook chose to deploy an application capable of such an insensitive intrusion betrays a lack of diverse life experiences within its ranks, but this is somehow a tame example of the homogenous foundations of user experience design. At its extreme, Twitter is often singled out for its initial ideal-user design; by designing with such optimism, it has become optimised for abuse. Now that the platform is an incumbent in the social media landscape, investor demand for user attention – measured as ‘platform stickiness’ – has created business incentives against modifying the weaponised effects of its homogenous ideal design, preventing the implementation of subsequent protections. In direct contradiction to public claims by the company’s leadership, Twitter employees describe an organisation-wide disinterest in creating new models to moderate abuse on their platform or even deploying already-developed solutions.

As design has become commodified and weaponised by both platform operators and attackers, the response from designers has largely been to rearrange deckchairs on the Titanic. As designer David Rudnick wrote in a series of tweets:

“Its not just that 20th [century] design laid the groundwork for the centralisation of global power, it actively belittled strategies for resistance. It sneered & recast radical voices, voices of colour, of poverty, as aesthetics of failure. Then told us to trust a system it helped rig.”

Today’s designers continue to maintain the status quo without industry or external criticism or pressure. Whereas other industries have ethical review boards or independent investigative bodies, design has no equivalent oversight. Because of this, the industry is yet to fully examine its contribution to the current techno-political climate. In 2017 – as wealthy democracies destabilise and technology is embraced in conflict-stricken societies – the ethical and empathetic failings of user experience design harm users with increasing regularity and intensity.

This 2016 collection of popular design writing is an example of the industry’s fixation on more trivial issues as larger societal problems were beginning to converge.

Responding to a user-hostile world

Design is inherently political, but it is not inherently good. With few exceptions, the motivations of a design project are constrained by the encompassing platform or system first, and the experiences and values of its designers second. The result is designers working in a user-hostile world, where even seemingly harmless platforms or features are exploited for state or interpersonal surveillance and violence.

As people living in societies, we cannot be separated from our political contexts. However, design practitioners research and implement systems based on a process of abstracting their audience through user stories. A user story is an informal description of a feature or requirement, written from the perspective of a generalised end user. In most cases, users are grouped through shared financial or biographical data, by their chosen devices, or by their technical or cognitive abilities.

See more: A deeper introduction to user stories and the Agile software development methodology.

When designing for the digital world, user stories ultimately determine what is or is not an acceptable area of human variation. The practice empowers designers and engineers to communicate via a common problem-focused language. But practicing design that views users through a politically-naive lens leaves practitioners blind to the potential weaponisation of their design. User-storied design abstracts an individual user from a person of rich social and political agency to a collection of designer-defined generalisations. In this approach, their political and interpersonal experiences are also generalised or discarded, creating a shaky foundation that allows for assumptions to form from the biases of the design team. This is at odds with the personal affairs of each user, and the complex interpersonal interactions that occur within a designed digital platform.
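To make this flattening concrete, here is a hypothetical sketch in Python – the persona fields and names are invented for illustration, not drawn from any real methodology – of how a user-storied persona reduces a person to designer-defined generalisations:

```python
from dataclasses import dataclass


@dataclass
class Persona:
    """A user-story persona: a person reduced to designer-defined fields."""
    name: str
    device: str
    goal: str
    # Note what a typical persona template has no slot for:
    # threat actors, power asymmetries, consequences of exposure.


def user_story(p: Persona) -> str:
    """Render the canonical 'As a..., wants to...' user-story template."""
    return f"As a {p.device} user, {p.name} wants to {p.goal}."


alex = Persona(name="Alex", device="smartphone",
               goal="share their location with friends")
print(user_story(alex))
# The template captures the ideal interaction, but cannot express the
# political question: who else sees Alex's location, and to what end?
```

The point of the sketch is structural: whatever does not fit the template’s fields – the user’s adversaries, their social context, the cost of being exposed – is silently discarded before design work begins.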

When a design transitions from theoretical to tangible, individual user problems and motivations become part of a larger interpersonal and highly political human network, affecting communities in ways that we do not yet fully understand. In Infrastructural Games and Societal Play, Eleanor Saitta writes of the rolling anticipated and unanticipated consequences of systems design. These are no doubt challenges in an empathetically designed system, but in the context of design homogeny, these problems cascade.

Infrastructural Games and Societal Play

Eleanor Saitta

30 January 2016

In a talk entitled From User Focus to Participation Design, Andie Nordgren argues that participatory design is a step towards developing empathy for users.

From User Focus To Participation Design

Andie Nordgren, Alibis for Interaction

3 October 2014

Through inclusion, participatory design extends a design team’s focus beyond the hypothetical or ideal user, considering the interactions between users and other stakeholders over user stories. When implemented with the aim of engaging a diverse range of users during a project, participatory design becomes more political by forcing teams to address weaponised design opportunities during all stages of the process.

Beyond better design paradigms, designers must look outside the field, toward practices that directly criticise or oppose their work. In particular, security research and user experience design overlap significantly in practice and goals, yet the relationship between the two is often antagonistic. Both fields focus primarily on the systems of wide-scale interactions between users and technology, but their goals are diametrically opposed: design seeks to create the best possible experience for a user, while security seeks to create the worst possible experience for an attacker. Viewed through its outcomes, security research is a form of user experience design. Design should reciprocate, and become a form of security research.

At-risk users are already educating each other in operational security and threat modelling techniques. In his essay Proactive Paranoia, Robert Gehl describes how, increasingly, the threats faced by dark web communities mirror those experienced by the general public, acting as predictors for the future of the internet.

Proactive Paranoia

Robert Gehl, Real Life Magazine

24 August 2017

There is tremendous opportunity for designers and information security researchers to cooperatively apply operational security and threat modelling practices to design work. If adopting participatory design enables greater focus on interpersonal interactions within a network, then threat modelling and operational security practices offer concrete foundations for addressing socio-political threats within a platform.
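As an illustration of what such borrowing might look like, the sketch below – hypothetical; the feature, adversaries and harms are invented for the example – enumerates adversaries alongside the ideal user, the kind of pass a lightweight threat-modelling exercise makes explicit before a feature ships:

```python
# Hypothetical, minimal threat-modelling pass over one design feature.
# The feature, adversaries and harms are invented for illustration only.

FEATURE = "publish user location to followers by default"

# Each adversary is paired with the concrete harm the feature enables.
ADVERSARIES = {
    "abusive ex-partner": "locates and physically confronts the user",
    "stalker among followers": "builds a movement profile over time",
    "state actor": "tracks activists via scraped location history",
}


def threat_review(feature: str, threats: dict) -> list:
    """Produce review items a design team must resolve before shipping:
    one line per (adversary, harm) pair the feature makes possible."""
    return [f"{feature}: {actor} -> {harm}"
            for actor, harm in threats.items()]


for item in threat_review(FEATURE, ADVERSARIES):
    print(item)
```

The value is not in the code but in the discipline it represents: the feature is described from the attacker’s side first, so a ‘mindless, unambiguous’ default must justify itself against each enumerated harm rather than against the ideal user alone.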

Countering the apolitical designer

User experience design must begin to deconstruct the outcomes of its collective body of work, especially as tech becomes more embedded and less visible or more easily ignored. Saitta writes, “All infrastructure is political; indeed, one might better say that all politics is infrastructural; we ignore it at our peril.” Developing new design practices helps to reduce cases of weaponisation through trade-offs in a system’s design, but practice alone is not enough.

Despite the many problematic elements of contemporary design, the community is somewhat capable of self-examination and cultural change. Counterintuitively, within user experience and its related fields, workforce diversity compares favourably to the broader technology industry. Self-reported research suggests that entry-level roles are both well-paid and tend towards gender pay equality. This is not to say that the industry is an inclusive field, and furthermore it is unclear whether these trends extend into senior or managerial positions.

Artificial Intelligence’s White Guy Problem

Kate Crawford, The New York Times

25 June 2016

The majority of employment and gender reporting is presented by industry bodies. In this case, the cited research was published by the User Experience Professionals Association, a US organisation.

The introduction of the code of conduct is one example of industry self-reflection and regulation. Designed to hold the contributors of a project to account, the commitment of members to a minimum level of expected behaviour as defined in a code of conduct has its roots in responses to harassment at conferences and in open source projects.

For an introduction to codes of conduct within the open source software community, see the Open Source Code of Conduct guide.

In 2015, GitHub adopted the Code of Conduct as an official feature and policy, influencing the further spread of the concept.

But addressing problematic internal culture of design teams is not enough. As an industry we must also confront the real-world socio-political outcomes of our practice. If we accept a code of conduct as necessary, we must also accept a code of outcomes as necessary. We must create ethical frameworks to evaluate our work at all stages, especially once it is alive in the world. Our lack of ongoing critical evaluation of our profession means that design continues to reinforce a harmful status quo, creating exploitable systems at the expense of societies.

User experience is more than just aesthetics and interfaces. It is a form of cooperative authorship and bound deeply to the infrastructure; each platform and its designers work together to represent a piece of an individual’s digital self and self expression within a wider online community. This is a responsibility on par with publishing literature that transports the reader so fully as to transform one’s self-understanding or producing cinema that ensconces the viewer so deeply that at the end there is a moment of pause to remember which understanding is reality. The online lives we collectively live are inherently mediated by technology, but how we experience this is also mediated by design.

Technology inherits the politics of its authors, but almost all technology can be harnessed in ways that transcend these frameworks. Even in its harmful and chaotic state, it is still possible to marvel at the opportunities afforded to us through design. But arrogant practices that produce brutally naive outcomes must be transcended to facilitate empathetic products and broader platforms. We must stop making decisions inside a rose-coloured echo-chamber. As designers, it is time to collectively address the political, social and human costs of weaponised design.

Cade Diehm

Winter 2018

Revised Fall 2020.

Commissioned by Tactical Tech.

Edited by Maya Ganesh & Arikia Millikan. Special thanks to Rose Regina Lawrence, Sema Karaman, Dalia Othman, Louis Center and Ignatius Gilfedder.

Reproduced under a Creative Commons license.