MONTCLAIR, NJ - Who to stop and frisk? Which college ranks best? How long should a prisoner be in jail? How to evaluate a teacher? These are some of the questions answered by models and algorithms.

During a lecture at Montclair State University (MSU) on Thursday evening, Cathy O'Neil, a Harvard PhD, mathematician, data scientist, writer and social justice activist, challenged the assumption that the data used to answer these questions is objective. She led a compelling discussion at the university's Creative Research Center, at its Fifth Annual Symposium on the Imagination.

Tap into Montclair met with O'Neil to talk about her upcoming book "Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy," which will be released on Sept. 6, 2016 by Crown Books.  


O'Neil stated that we as human beings constantly create models. We model our future husband, our careers; we also build models that define success in different ways.

She began by explaining that we externalize these internal models and turn them into code; that, in essence, is what algorithms are. It's a natural process, but when corporations that are not held accountable for the goals they pursue use these algorithms, the results can be deeply detrimental to society.
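To make that idea concrete, here is a small, invented sketch of what turning an internal model into code can look like. The successful_day function, its weights and its threshold are illustrative assumptions, not anything O'Neil presented:

```python
# A tiny, invented example of turning an informal mental model into code.
# The inputs, weights and threshold are made up; the point is that the
# choices baked into them (what counts, and how much) travel with the code.

def successful_day(hours_slept, tasks_done, hours_with_family):
    """Score one day against a personal definition of 'success'."""
    score = 0.3 * hours_slept + 0.5 * tasks_done + 0.2 * hours_with_family
    return score >= 5  # the threshold is as subjective as the weights

print(successful_day(hours_slept=7, tasks_done=6, hours_with_family=2))  # True
```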

More than offering a simple explanation or definition of big data, what O'Neil tried to convey this evening was that the population at large can and should argue with an algorithm the same way they can argue with someone's decision-making process.

Asked whether these biased algorithms had anything to do with why she left Wall Street, where she worked in finance before moving to the tech sector, O'Neil said, "I got out of Wall Street to become a data scientist and get away from corruption and the uneasy feeling that I was making the world worse instead of better." Sadly, she has not been able to shake that feeling.

She added, "I realized that all these algorithms we are using on people are actually even less accountable and more opaque. We use them everywhere."

O'Neil specifically talked about algorithms that assess teachers based on student test scores, and she shared the story of a D.C. teacher who was erroneously fired because her performance rating was tied to how her students' current-year test scores compared with their previous-year scores. Unfortunately, the students' original scores from the prior year had been incorrect to begin with (something the algorithm could not have accounted for).
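A rough sketch of the arithmetic shows how that kind of failure can happen. The growth_score function and the numbers below are invented for illustration and are not the actual D.C. evaluation formula; they only show how an erroneously inflated prior-year baseline makes a teacher's measured growth collapse:

```python
# A hypothetical sketch of a "growth model" score, NOT the actual D.C.
# evaluation formula. It only illustrates how an erroneously inflated
# prior-year baseline turns flat student performance into negative "growth".

def growth_score(prior_scores, current_scores):
    """Average per-student change in test score from one year to the next."""
    gains = [cur - prev for prev, cur in zip(prior_scores, current_scores)]
    return sum(gains) / len(gains)

current        = [71, 70, 72, 70, 72]   # this year's scores
true_prior     = [70, 68, 72, 71, 69]   # what students actually scored last year
recorded_prior = [85, 83, 88, 86, 84]   # erroneously high scores on record

print(growth_score(true_prior, current))      # 1.0   -> modest real growth
print(growth_score(recorded_prior, current))  # -14.2 -> the teacher looks terrible
```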

She went on to explain that New Jersey Governor Chris Christie had approved a growth model for assessing teachers, and she added more examples of what algorithms do in society, such as "algorithms that help a judge decide how long to sentence a defendant."

When asked whether things were better in the good old days before algorithms, O'Neil did not defend past processes.

She said, "our prior processes were often biased and racist to be clear." She said algorithms are intended to curb that but, "we are in fact using biased data to train models and instead of improving the status quo we are sort of doubling down in continuing the status quo... And we are calling it objective."

Many data scientists are designing models with the best intentions, but according to O'Neil, they aren't always thinking of the outcomes or effects.

Her call to action was threefold. First, she would like the general population to stop being intimidated by algorithms and start asking questions as well as demanding accountability.

When asked to whom an average citizen should direct such questions, or from whom they should demand accountability, she said there is always an institution responsible for creating a model. "Someone has the source code."

Second, she would like modelers to think about the ethical implications of how they build their models rather than simply following the money.

Third, she would like policymakers to update civil rights era anti-discrimination laws and privacy laws to make it illegal to do certain things.

Neil Baldwin, the professor in charge of the virtual center for the study of imagination at MSU, spoke about how he came to invite O'Neil to the university.

He said, "I run the creative research center and I'm always looking for visionary people and I called my agent to find out about interesting books that were coming out".

Following the discussion, there was a Q&A session.

A few people in the audience shared their thoughts on the evening's event. Local business owner Eric Pearson, who runs a mobile oil-change service that comes to customers' locations, was very interested in the topic.

His opinion was that the problem isn't necessarily the data but the business strategy. Pearson said, "Whoever is making the algorithm is ultimately responsible." Pearson himself is looking to use data to make his business more accommodating to his customers.

Audience member Jay said, “This woman is extraordinary based on all she has done in the past. She talks about how she bailed out of Wall Street because they are unethical and immoral. She realized hedge funds constituted stealing from people’s retirement.”

The evening's takeaway was that algorithms can be challenged, but also that the most important choices people make when modeling are what the goals of the model are, how to define success, and exactly what data is being used. According to O'Neil, if those issues are assessed correctly and fairly, algorithms work.

More information:

www.mathbabe.org

http://www.montclair.edu/arts/creative-research-center/