"Math Is Racist" -- CNN Goes Full Retard

Chris Menahan
InformationLiberation
Sep. 08, 2016

This is not from a satire site, nor is the article a joke: CNN has now declared that "math is racist."

From CNN:
It's no surprise that inequality in the U.S. is on the rise. But what you might not know is that math is partly to blame. In a new book, "Weapons of Math Destruction," Cathy O'Neil details all the ways that math is essentially being used for evil (my word, not hers).

From targeted advertising and insurance to education and policing, O'Neil looks at how algorithms and big data are targeting the poor, reinforcing racism and amplifying inequality.

These "WMDs," as she calls them, have three key features: They are opaque, scalable and unfair.

Pictured: Cathy O'Neil
Denied a job because of a personality test? Too bad -- the algorithm said you wouldn't be a good fit. Charged a higher rate for a loan? Well, people in your zip code tend to be riskier borrowers. Received a harsher prison sentence? Here's the thing: Your friends and family have criminal records too, so you're likely to be a repeat offender. (Spoiler: The people on the receiving end of these messages don't actually get an explanation.)

The models O'Neil writes about all use proxies for what they're actually trying to measure. The police analyze zip codes to deploy officers, employers use credit scores to gauge responsibility, payday lenders assess grammar to determine credit worthiness. But zip codes are also a stand-in for race, credit scores for wealth, and poor grammar for immigrants.

These are basic tools for risk management.

The feelings of weird feminists are not taken into account when creating risk models; the models look at raw data to determine risk and judge people accordingly.
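
As a minimal sketch of what such a risk model does, assuming entirely hypothetical features and weights (this is no real lender's model), a proxy-based loan score might look like this:

```python
import math

# Hypothetical proxy features for a loan applicant. None of them measures
# repayment ability directly; each is a stand-in of the kind CNN describes.
applicant = {
    "zip_default_rate": 0.12,  # historical default rate in the applicant's zip code
    "credit_score": 640,       # the proxy O'Neil says stands in for wealth
    "grammar_errors": 3,       # errors per 100 words, as some payday lenders score
}

# Made-up weights for illustration only; a real lender would fit them to
# historical repayment data.
WEIGHTS = {"zip_default_rate": 8.0, "credit_score": -0.01, "grammar_errors": 0.2}
BIAS = 4.0

def default_risk(features):
    """Logistic score: estimated probability that the applicant defaults."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

print(f"Estimated default risk: {default_risk(applicant):.1%}")  # ~30% for this applicant
```

The model never sees race or anyone's feelings; it sees only the proxy numbers. That is the sense in which these are basic risk-management tools.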



Who is more likely to pay back a loan: a middle-class white person in a nice neighborhood, or a poor African-American living in Section 8 housing?

In the 1990s, the government declared it racist to deny housing loans to low-income minorities simply because they were unlikely to pay them back; that policy eventually led to the 2008 housing crisis.

CNN continues:
... One of the book's most compelling sections is on "recidivism models." For years, criminal sentencing was inconsistent and biased against minorities. So some states started using recidivism models to guide sentencing. These take into account things like prior convictions, where you live, drug and alcohol use, previous police encounters, and criminal records of friends and family.

These scores are then used to determine sentencing.

"This is unjust," O'Neil writes. "Indeed, if a prosecutor attempted to tar a defendant by mentioning his brother's criminal record or the high crime rate in his neighborhood, a decent defense attorney would roar, 'Objection, Your Honor!'"

This is completely debunked nonsense.
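
Before getting to the debunking, it is worth seeing what such a score actually computes. As a hypothetical sketch, with invented factors and weights (the actual tools states use are proprietary), it amounts to a weighted tally of the inputs CNN lists:

```python
# Invented weights for the inputs CNN lists; real sentencing-risk tools are
# proprietary, so this only sketches the general shape of such a score.
FACTORS = {
    "prior_convictions": 2.0,        # per prior conviction
    "high_crime_neighborhood": 1.5,  # lives in a high-crime zip code
    "substance_abuse": 1.0,          # documented drug or alcohol use
    "police_encounters": 0.5,        # per previous police encounter
    "family_criminal_records": 1.0,  # per relative with a record
}

def recidivism_score(defendant):
    """Weighted sum of risk factors; a higher score means higher predicted risk."""
    return sum(weight * defendant.get(factor, 0) for factor, weight in FACTORS.items())

defendant = {
    "prior_convictions": 2,
    "high_crime_neighborhood": 1,
    "substance_abuse": 0,
    "police_encounters": 3,
    "family_criminal_records": 1,
}
print(f"Risk score: {recidivism_score(defendant):.1f}")  # 8.0 for this defendant
```

Nothing in that arithmetic looks at race; the score rises and falls with the inputs the state chooses to count.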

Blake Neff of The Daily Caller debunked this same garbage when it appeared in a ProPublica study titled "Machine Bias," which similarly tried to claim these recidivism models are "racist."

As Neff wrote:
Nowhere in the main article do ProPublica’s writers mention their own finding, which is that the risk scores accurately correspond to the reoffense risk of whites and blacks. In fact, the entire article never mentions that blacks have a higher recidivism rate at all.

Nowhere does CNN mention that higher recidivism rate either.

The models reflect reality.
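
Neff's point is about calibration: a score "accurately corresponds" to risk when defendants with the same score reoffend at the same observed rate regardless of race. A toy check of that property, on entirely made-up records, might look like this:

```python
from collections import defaultdict

# Made-up records of (group, risk_score, reoffended); illustrative only.
records = [
    ("white", 3, False), ("white", 3, False), ("white", 3, True),
    ("black", 3, False), ("black", 3, False), ("black", 3, True),
    ("white", 8, True), ("white", 8, True), ("white", 8, False),
    ("black", 8, True), ("black", 8, True), ("black", 8, False),
]

# Tally observed reoffense rates per (group, score) bucket.
buckets = defaultdict(lambda: [0, 0])  # (group, score) -> [reoffenders, total]
for group, score, reoffended in records:
    buckets[(group, score)][0] += int(reoffended)
    buckets[(group, score)][1] += 1

# Calibration check: equal scores should mean equal observed reoffense rates.
for (group, score), (hits, total) in sorted(buckets.items()):
    print(f"group={group} score={score}: {hits}/{total} reoffended ({hits/total:.0%})")
```

In this toy data, as in the ProPublica finding Neff cites, a given score corresponds to the same observed reoffense rate for both groups.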



CNN goes on to state that this basic risk assessment is literally "evil" and concludes by saying minorities need to be given more welfare because computer models are discriminating against them:
Imagine if you used recidivist models to provide the at-risk inmates with counseling and job training while in prison. Or if police doubled down on foot patrols in high crime zip codes -- working to build relationships with the community instead of arresting people for minor offenses.

You might notice there's a human element to these solutions. Because really that's the key. Algorithms can inform and illuminate and supplement our decisions and policies. But to get not-evil results, humans and data really have to work together.

Pictured: An evil, racist, biased and bigoted computer.
"Big Data processes codify the past," O'Neil writes. "They do not invent the future. Doing that requires moral imagination, and that's something only humans can provide."

Affirmative action, housing subsidies, race-based employment quotas, government subsidies exclusively for minority-owned businesses, preferential college admissions: all of these explicitly race-based programs are apparently not enough (and totally not racist).

We need to pay for criminals in prison to get "job training," and police need to stop arresting criminals in bad neighborhoods for committing crimes.



When math itself is being labeled "racist," I think we can safely say the whole concept of "racism" has finally jumped the shark.


