It’s a common occurrence: Your phone or computer’s operating system runs an automatic update, and things suddenly look a little different.
Most of us understand that it happens occasionally and is no big deal. But for those who have experienced digital stalking or harassment at the hands of a current or former close partner, these seemingly innocuous changes can be terrifying.
That and other types of computing-related stress can be reduced or avoided in a number of simple, low-cost ways, says Nicola Dell, associate professor of information science at the Jacobs Technion-Cornell Institute at Cornell Tech and at the Cornell Ann S. Bowers College of Computing and Information Science.
She and her colleague Tom Ristenpart, associate professor of computer science at Cornell Tech and Cornell Bowers CIS, led a research team focused on “trauma-informed computing” – an approach that acknowledges the impact of trauma and seeks to make technology safer for all users, not just those who have experienced trauma.
Janet X. Chen, a doctoral student in information science, is a lead author of “Trauma-Informed Computing: Towards Safer Technology Experiences for All,” which the team presented at CHI ’22, the Conference on Human Factors in Computing Systems, held April 29 to May 5 in New Orleans. The other lead authors are Allison McDonald and Yixin Zou, doctoral students at the University of Michigan.
Dell and her colleagues define trauma-informed computing as “a commitment to continuously improving the design, development, deployment, and support of digital technology by: explicitly acknowledging trauma and its impact; recognizing that digital technology can both cause and exacerbate trauma; and actively seeking out ways to avoid technology-related trauma and retraumatization.”
Some of the paper’s co-authors have experience working with communities that have experienced trauma, including survivors of intimate partner violence (IPV).
“Over time, we’ve found that a lot of survivors actually develop a fear of technology,” Dell said. “They react to what you or I might consider trivial tech stuff – a website crashes, software updates, or their email looks different because Google changed something – with a response that is disproportionate to the event itself.
“And often, they will assume that means they have been hacked or are being abused,” she said. “We started to realize that what they were describing, and many of the responses we were seeing, correlated very well with well-known trauma or stress responses – things like hypervigilance, paralysis, or despair.”
The group’s framework includes six principles, adapted from the Substance Abuse and Mental Health Services Administration, for the design, development, deployment, and evaluation of computing systems. Those principles are safety; trust; collaboration; peer support; enablement (empowerment); and intersectionality (attention to cultural, historical, and gender issues).
The paper – which illustrates trauma in computing through three fictional personas, based on publicly available accounts as well as the authors’ own experience – explores the application of these principles in the areas of user experience research and design; security and privacy; artificial intelligence and machine learning; and organizational culture in technology companies.
“We know from our work with IPV survivors that many advocacy organizations, social work organizations, hospitals, and schools have actually worked to incorporate trauma-informed approaches,” Dell said. “For us, it was about bringing this idea to the computing community and saying, ‘What would it take to make your products and technology trauma-informed?’”
Dell says one possible approach is to allow users to maintain a list of topics that could trigger their trauma.
“Everybody knows that Facebook will show you ads,” she said, “but maybe you could just say, ‘Don’t show me ads for baby products, because I just went through a pregnancy loss.’ Giving people control over what they see, and the ability to explain why they don’t want to see a certain thing, can help enable and empower people.”
The authors made 22 such recommendations on ways to make computing safer for all users, including: conducting user studies in safe, trusted settings; providing clear information when software updates are pending, with options for if and when to install them; creating content policies with input from affected communities; and providing training and resources to help tech workers better interact with trauma survivors.
One thing the researchers encourage tech companies not to do: seek out survivors and ask them questions about their traumatic experiences. Doing so, they say, can cause unnecessary re-traumatization.
Getting support from the tech community “can certainly be a challenge,” says Dell, but some simple steps are achievable.
“We’ve talked quite a bit with different tech companies, and overall the response has been very enthusiastic,” she said. “I think they are very interested in trying to do some of this. We certainly hope that tech companies don’t want to hurt or re-traumatize people.”
Other collaborators include Emily Tseng, a doctoral student in information science; Florian Schaub, assistant professor of information at Michigan; and Kevin Roundy and Acar Tamersoy of the NortonLifeLock Research Group.
This research was supported by the National Science Foundation, Google, and the Defense Advanced Research Projects Agency.