Author:
Deborah Tatar
At a family holiday meal in my youth, over the lacy tablecloth and between the candlesticks, I heard a parable about technology decisions and the people who make them. The story—possibly a myth—was that the first person who wrote a banking program, using "those IBM machines" in the 1960s, did not know what to do with all the decimal places, so he simply truncated the calculations and deposited the leftover fractions of a cent into his own account. Eventually there was a lawsuit that involved as many pennies as there are stars in the sky. Or perhaps the gods decided how many stars to put in the sky based on the number of pennies that programmer stole.
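For readers curious about the arithmetic, here is a minimal sketch of the scheme the parable describes, sometimes called salami slicing. Everything here is hypothetical: the function name, the account balances, and the interest rate are invented for illustration, not drawn from any real banking system.

```python
from decimal import Decimal, ROUND_DOWN

def post_interest(balances, rate):
    # Apply one interest period, truncating each result to whole cents.
    # Returns the new balances and the sum of the discarded fractions:
    # the "skim" that the parable's programmer pocketed.
    skim = Decimal("0.00")
    new_balances = []
    for balance in balances:
        exact = balance * (Decimal("1") + rate)
        kept = exact.quantize(Decimal("0.01"), rounding=ROUND_DOWN)
        skim += exact - kept
        new_balances.append(kept)
    return new_balances, skim

# Hypothetical figures: a million identical accounts at 3% interest.
accounts = [Decimal("123.45")] * 1_000_000
_, pocketed = post_interest(accounts, Decimal("0.03"))
print(pocketed)  # 3500.0000 -- thousands of dollars from a single pass
```

Each account loses at most a third of a cent per run, invisible to any individual customer; only at scale does the theft become astronomical.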
Opacity, scale, and damage are the three components of Weapons of Math Destruction (WMDs), as Cathy O'Neil clearly, succinctly, and alarmingly describes them in her book, subtitled How Big Data Increases Inequality and Threatens Democracy.
As in the banking parable, computer scientists break new ground with confidence. They promise revolution! But these automated systems use big data to make cavalier guesses about people's behavior and circumstances, reducing the complexities of real life to stereotypes. The scale and authority of these systems make us operate as if those stereotypes were real. Stats 101 tells us that we cannot make predictions about individual values from group tendencies, but O'Neil describes areas in which exactly that is done, with no recourse, no regulation, no policy, and no grounds for lawsuits. Often, the technologies act indirectly, so that their influence is undetectable by those affected. WMDs increasingly run our society, in domains ranging from finance to education to healthcare.
I have also been rereading several older influential papers and thinking about how the ideas they promulgate mask the effects of WMDs, allowing our research community to remain untroubled. Both Donna Haraway's A Cyborg Manifesto and Asle Kiran and Peter-Paul Verbeek's Trusting Our Selves to Technology are revolutionary in some senses and not without nuance. But they do not perceive the danger of systems that feel trustworthy and make us feel good yet have the potential for betrayal built in. To draw on an analogy, consider that human beings are made up largely of water. Haraway, Kiran, and Verbeek draw attention to the technological equivalent of metabolic processes.
In this sense, water is our friend and we rely upon it. But water in other forms—water adulterated with pollutants or the ocean on a dark, cold night—is not a reliable friend.
O'Neil draws our attention to the power of the technological sea around us. Are we on the boat? Is it sinking? I hope not. But designers have to bear this danger in mind. We've designed our way into trouble; I hope we can design our way out.
Deborah Tatar is a professor of computer science at Virginia Tech. Previous titles include senior software engineer at DEC, member of the research staff at Xerox PARC, and cognitive scientist at SRI International. She is best known for critical work and middle-school educational research. She dyed her hair purple after the November 2016 elections. [email protected]
Copyright held by author