The following two posts popped up at the same time on my Twitter feed.
— The Economist (@TheEconomist) August 20, 2016
A London borough is developing an algorithm to predict who might become homeless. In India Microsoft is helping schools predict which students are at risk of dropping out…Researchers behind an algorithm designed to help judges make bail decisions claim it can predict recidivism so effectively that the same number of people could be bailed as are at present by judges, but with 20% less crime. To get a similar reduction in crime across America, they say, would require an extra 20,000 police officers at a cost of $2.6 billion.
But computer-generated predictions are sometimes controversial. ProPublica, an investigative-journalism outfit, claims that a risk assessment in Broward County, Florida, wrongly labelled black people as future criminals nearly twice as often as it wrongly labelled whites. Citizens complain that decisions which affect them are taken on impenetrable grounds.
— Gizmodo (@Gizmodo) August 20, 2016
Significant points from the MIT Technology Review article:
That raises the fascinating possibility that it might be possible to diagnose depression en masse by analyzing the photos people post to social-media sites such as Instagram…Andrew Reece…and Chris Danforth…have found significant correlations between the colors in photos posted to Instagram and an individual’s mental health. The link is so strong that the pair suggest that it could be used for early detection of mental illness.
‘These findings support the notion that major changes in individual psychology are transmitted in social-media use, and can be identified via computational methods,’ say Reece and Danforth.
And that will provide hope that mental illness can be accurately detected earlier, allowing for more effective intervention.
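The "computational methods" Reece and Danforth mention boil down to color statistics: photos from depressed users tended to be bluer, grayer, and darker. As a minimal sketch of that idea, the toy function below averages hue, saturation, and brightness over a photo's pixels; `mean_hsv` and the hand-made pixel lists are hypothetical illustrations, not the researchers' actual pipeline.

```python
import colorsys

def mean_hsv(pixels):
    """Average hue, saturation, and brightness of (r, g, b) pixels in 0-255.

    Per the study's finding, photos from depressed users skewed bluer
    (higher hue), grayer (lower saturation), and darker (lower value).
    """
    h = s = v = 0.0
    for r, g, b in pixels:
        hh, ss, vv = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        h += hh
        s += ss
        v += vv
    n = len(pixels)
    return h / n, s / n, v / n

# Two toy "photos": a warm, vivid one and a dim, blue-gray one.
bright = [(230, 180, 60)] * 4
dark = [(30, 40, 90)] * 4

h1, s1, v1 = mean_hsv(bright)
h2, s2, v2 = mean_hsv(dark)
assert v1 > v2 and s1 > s2 and h2 > h1  # dark photo: dimmer, grayer, bluer
```

In practice you would pull the pixel data from real images (e.g. with an imaging library) and correlate these per-photo averages with clinical labels, which is essentially what the researchers report doing at scale.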
Combine both articles into a single package and this is what we get:
In simplest terms, Faception is a computerized racial-profiling tool – a programmed AI racist. This is Faception’s ‘theory’ behind the technology:
The face can be used to predict a person’s personality and behavior.
1. According to social- and life-science research, personality is affected by genes.
2. Our face is a reflection of our DNA.
It’s not difficult to imagine how this kind of system can really bring hell upon any number of individuals.
One day next year, you try to board a plane but are stopped by security. The automated facial recognition system has informed the officials that you look like a terrorist and might pose a security risk.
A few days later, you walk into a bank to apply for a mortgage, but are turned away before you can explain because once again the automated facial analysis software the bank uses has determined you have features associated with gamblers.
You have no criminal record and your credit rating is fine, but that doesn’t matter to the software behind the computer system known as Faception.
Based on an image, it matches facial features across its database to predict behavior before it occurs and label you with such tags as pedophile, terrorist, gambler or thief based solely on the geometry of your face. – PNW
Just A Thought In Closing:
It’s not really a good sign when a company like Faception, which boasts about its personality-prediction programs based on facial recognition, shows only a shadowed profile picture of its VP of Product & Marketing. What are they hiding, hmmm…?