Algorithms: How Blind Faith in Math and Data Can Exacerbate Social Ills
Tue, 18 Oct 2016

A panel of experts at New America recently discussed this quandary.

"math" courtesy of [Akash Katakura via Flickr]

People trust math and data. Unlike political debate and many other ambiguous, subjective fields, math presents an objective truth. But should we trust math blindly? Can a math-based system, a data-crunching mechanism supposedly free from human error and bias, be fully trusted?

These are just a few of the questions addressed during a recent panel at New America, a think tank in Washington, D.C. Algorithms, the data-processing formulas carried out by computer systems, can have hidden consequences, and their potential for solving some of society's deepest problems merits a closer look.

Why are we talking about algorithms now?

Cathy O'Neil's new book, "Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy," was the launchpad for the discussion. O'Neil called math "beautiful, clear, and true," but added that "there is no such thing as objectivity in algorithms."

O'Neil said that algorithms require two ingredients: loads of data points and a "definition of success," and the latter is inherently subjective. Whoever builds an algorithm, O'Neil said, brings his or her own definition of success to the model, and by extension, the model's conclusions will reflect that definition. "Algorithms are not inherently fair," said O'Neil, who earned a Ph.D. in math from Harvard and was an assistant professor at Barnard College. "The person building the model is in charge of the definition of success."

To illustrate this point, O'Neil took the audience to her home kitchen. Her algorithm for cooking dinner for her children takes in data points such as available time and ingredients. Her definition of success: "Having my kids eat vegetables," she said. But if the same data points were run through her son's definition of success, "eating a lot of Nutella," the outcome would be quite different.
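Her point translates directly into code. The minimal sketch below (every dish, score, and field name is invented for illustration) runs the same data through two different definitions of success; the mechanics never change, but the winning dinner does.

```python
# A minimal sketch of O'Neil's kitchen example: the data points are
# fixed, but swapping the "definition of success" (the objective
# function) changes which dinner the algorithm picks. All dishes,
# scores, and field names here are hypothetical.

# The same "data points" fed to both models.
dinners = [
    {"name": "stir-fry",        "vegetables": 3, "nutella": 0, "prep_minutes": 25},
    {"name": "nutella crepes",  "vegetables": 0, "nutella": 2, "prep_minutes": 15},
    {"name": "pasta with peas", "vegetables": 1, "nutella": 0, "prep_minutes": 20},
]

def parent_success(dinner):
    """O'Neil's definition of success: the kids eat vegetables."""
    return dinner["vegetables"]

def kid_success(dinner):
    """Her son's definition of success: eat a lot of Nutella."""
    return dinner["nutella"]

# Identical data, identical procedure, different objectives.
print(max(dinners, key=parent_success)["name"])  # -> stir-fry
print(max(dinners, key=kid_success)["name"])     # -> nutella crepes
```

Nothing about the data changed between the two runs; only the builder's objective did, which is exactly O'Neil's point about who is "in charge of the definition of success."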

Our world is awash in algorithms. From teacher evaluations to Amazon's checkout process, society increasingly relies on computer systems to translate data points into decisions, including in areas as fraught as inequality, criminal justice, and surveillance. And while these systems are built on objective math, the outcomes are not always ideal.

How do algorithms affect our daily lives?

For Rachel Levinson-Waldman, an expert on surveillance technology and national security who also sat on the New America panel, mass surveillance relies on algorithms that are deeply opaque, mammoth in scale, and potentially damaging to certain groups of people. In these models, harm can follow if "you find yourself in a group that is more likely to be targeted with surveillance," she said, pointing to people of color and Muslims in particular.

Levinson-Waldman echoed a sentiment shared throughout the panel. If a model builder's definition of success is turning a profit, which she said sometimes happens in the private security field, the resulting algorithm will be built on "skewed incentives." She said: "If you think the purpose of something is to make money, you'll do something very different than if the purpose is to help people."

Not every panelist agreed. Daniel Castro, vice president at the Information Technology and Innovation Foundation, argued that blaming an algorithm for perpetuating a social ill shifts focus away from the ill's actual causes. Pinning societal problems on faulty algorithms "distracts us from going after real solutions," Castro said.

What can be done to make algorithms function more fairly? The answer might look like "some kind of regulation oversight or audit mechanism that would check whether an algorithm is being used in a discriminatory way," said K. Sabeel Rahman, a panelist at the event and an assistant professor of law at Brooklyn Law School.
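Rahman did not spell out how such an audit would work, but one familiar yardstick from U.S. employment-discrimination guidance, the "four-fifths rule," hints at what a simple version could look like. The sketch below is purely illustrative; the groups, decisions, and function names are invented, not drawn from the panel.

```python
# A hypothetical sketch of one simple audit an oversight body could run:
# the "four-fifths rule" from U.S. employment-discrimination guidance,
# which treats a group's selection rate below 80% of the most-favored
# group's rate as evidence of adverse impact. All data here is invented.

from collections import defaultdict

def disparate_impact_audit(decisions, threshold=0.8):
    """decisions: list of (group, selected) pairs, where selected is a bool."""
    selected = defaultdict(int)
    total = defaultdict(int)
    for group, was_selected in decisions:
        total[group] += 1
        selected[group] += was_selected
    rates = {g: selected[g] / total[g] for g in total}
    best = max(rates.values())
    # Flag any group whose selection rate falls below 80% of the best rate.
    return {g: rate / best < threshold for g, rate in rates.items()}

# Toy example: group B is selected far less often than group A.
audit = disparate_impact_audit(
    [("A", True)] * 8 + [("A", False)] * 2 +   # group A: 80% selected
    [("B", True)] * 4 + [("B", False)] * 6     # group B: 40% selected
)
print(audit)  # -> {'A': False, 'B': True}  (group B is flagged)
```

A real audit regime would of course be far more involved, but even a check this simple requires access to an algorithm's decisions broken down by group, which is part of why panelists emphasized the opacity of these systems.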

O'Neil insisted her book was meant to start a conversation about algorithms and to raise questions, not to offer solutions. And while the panel spent a good deal of time airing concerns, "I think algorithms are potentially wonderful," O'Neil said.

Alec Siegel
Alec Siegel is a staff writer at Law Street Media. When he's not working at Law Street he's either cooking a mediocre tofu dish or enjoying a run in the woods. His passions include gooey chocolate chips, black coffee, mountains, the Animal Kingdom in general, and John Lennon. Baklava is his Achilles' heel. Contact Alec at ASiegel@LawStreetMedia.com.
