Terrible people use technology to do terrible things. This is a universal truth that applies to almost any technology that facilitates communication and interaction, no matter how well intentioned it may be. Something as innocuous as Google Drive can be a vector for harassment. As we recently discovered, video conferencing platforms like Zoom can be, too. In just the past few weeks, high school classes in North Carolina and Texas, along with an NAACP meeting in California, have been interrupted by racist and misogynist videos, pictures, and text. With classes resuming across the country, we can only expect more damage – but how much is Zoom to blame?
In April of last year, "Zoombombings" hit our university, and a colleague described the jarring disruption in her online classroom when trolls managed to circumvent Zoom's privacy settings.
Big tech is all about speed, especially when an opportunity presents itself, like a pandemic forcing greater reliance on communications technology. But a "move fast and break things" mentality leads to limited testing and the shipping of software that isn't ready yet. This is such a well-known problem that there is even a term for it: "technical debt," the unpaid cost of shipping software that will ultimately need to be fixed after it becomes clear what the faults are.
Technical debt accrues when known issues are not addressed during the design process. But when the faults cause societal harm, the result is not just bad technology – it is unethical technology. "We never thought about abuse" is the precursor to another kind of debt: ethical debt.
Zoom's "terrible people" problem is not a typical mistake, after all. When the "we'll fix bad things after they happen" approach is applied to potential harm, whether individual or societal, ethical problems go unanticipated. And the trouble with ethical debt is that the metaphorical debt collector comes only after harm has been done. You can't go back in time and improve privacy features so that unsuspecting marginalized students never heard racial slurs in the middle of class. You cannot rerun an election after the spread of disinformation has undermined democracy. You cannot undo the interrogation and wrongful arrest of a Black man based on a biased facial recognition match. You cannot make people unwatch the conspiracy theory videos that a recommendation algorithm put in front of them. The damage has already been done.
Technologists can't see the future, but they can predict and speculate. They know that there are terrible people. They can easily envision those who might deliberately spread conspiracy theories, who might rely on facial recognition as definitive evidence even when warned not to, who might try to manipulate elections with disinformation, and who might think it's fun to terrorize unsuspecting students and professors. Not all of these harms make splashy headlines; many are micro-instances of individual harm that build up over time. As part of the design process, you should envision possible abuses of your technology. And then you should design to make those abuses more difficult.
Ironically, some of the people best able to imagine how technology could be used for harassment are those who are harassed a lot. This means marginalized and vulnerable people, such as women and people of color – people who are underrepresented in technology. We guarantee that with any of these people in the room during a session speculating about abuse, someone would have suggested "random people jumping into Zoom meetings and screen-sharing pornography." Because so much technology-facilitated harm disproportionately affects already marginalized people, these are important voices to include in the design process as part of the fight against ethical debt.
Technologists often create "user personas" during the design process to imagine how different types of people might use their technology. If those personas don't include "user stalking an ex," "user who wants to traumatize vulnerable people," and "user who thinks it's funny to show everyone their genitals," you are missing an important design step. And if your answer is, "Yes, there are likely to be problems like this, but we'll fix them as soon as we know what they are," then start keeping a ledger of your ethical debt.