The Enemy Is Us
“The lesson of AI is not that the light of mind and consciousness is beginning to shine in machines, but rather the dimming of our own lights.” — Erik Larson
I recently acknowledged that my thinking about AI has been naive: I assumed AI researchers were working to achieve something close to “Artificial General Intelligence” (AGI), i.e. human-like intelligence. I now believe that isn’t going to happen, because it was never the intention of those building and selling AI applications. Furthermore, it’s highly unlikely there is a path from Artificial Narrow Intelligence (ANI) to AGI.
It appears those in charge of building AI have the simplest of motives: to gain an advantage and use it to their benefit for as long as they can. The best articulation of my revised understanding of AI comes from Jordan Greenhall in his recent medium.com post “What is the problem with social media?” ( https://medium.com/deep-code/what-is-the-problem-with-social-media-5ec873f7a738 ). More precisely, it is the fourth of Greenhall’s four “foundational problems” with social media, which he calls “The asymmetry of Human / AI relationships”. Paraphrasing Jordan,
“When we enter into relationships with an entity like Facebook (or Google, or Apple, or . . .) we still have the basic expectation that we are entering into a vaguely symmetric, human, relationship. At worst, we unconsciously expect the sort of unpleasant bureaucratic relationship that we enter into with Walmart, IBM or General Motors….
Nothing could be further from the truth…When it comes to grabbing and holding our attention or to analyzing and profiling our data, the algorithms of social media…are like gods. And gods that, for now at least, don’t have our best interests in mind.”
Harvard Business School professor Shoshana Zuboff calls the phenomenon “surveillance capitalism”: the monetization of data acquired through surveillance. Writing for CNN, security technologist Bruce Schneier describes it this way:
“Companies like Facebook and Google offer you free services in exchange for your data. Google’s surveillance isn’t in the news, but it’s startlingly intimate. We never lie to our search engines. Our interests and curiosities, hopes and fears, desires and sexual proclivities, are all collected and saved. Add to that the websites we visit that Google tracks through its advertising network, our Gmail accounts, our movements via Google Maps, and what it can collect from our smartphones.” ( https://www.cnn.com/2018/03/26/opinions/data-company-spying-opinion-schneier/index.html )
Facebook’s Custom Audience tool allows organizations to upload a database of names, phone numbers, and email addresses. Facebook matches the data to personal profiles on its network and lets organizations target those profiles with advertising. Aaron Zakowski has even compiled “10 Creative (and Sneaky) Ways To Use Facebook Custom Audiences”. ( http://aaronzakowski.com/creative-ways-to-use-facebook-custom-audiences/ )
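The matching step works on hashed identifiers: the advertiser normalizes each email address or phone number, hashes it, and the platform compares the hashes against its own hashed user records. Here is a minimal sketch of that normalize-and-hash matching; the function names, normalization rules, and sample data are my own illustrations, not Facebook’s actual implementation.

```python
import hashlib

def normalize_email(email: str) -> str:
    # Typical normalization before hashing: trim whitespace, lowercase.
    return email.strip().lower()

def hash_identifier(value: str) -> str:
    # Identifiers are uploaded as SHA-256 hashes rather than plaintext.
    return hashlib.sha256(value.encode("utf-8")).hexdigest()

def match_audience(uploaded_emails, platform_records):
    """Return hashed IDs present in both the advertiser's upload
    and the platform's own set of hashed user records."""
    uploaded = {hash_identifier(normalize_email(e)) for e in uploaded_emails}
    return uploaded & platform_records

# Hypothetical data for illustration.
platform = {hash_identifier(normalize_email(e))
            for e in ["alice@example.com", "bob@example.com"]}
matches = match_audience(["  Alice@Example.com ", "carol@example.com"], platform)
print(len(matches))  # 1 -- Alice matches despite case/whitespace differences
```

Note that hashing here provides only thin privacy cover: the platform can still link every uploaded record to a real profile, which is the entire point of the tool.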
All of these capabilities are exactly what came into play recently with the algorithms of Cambridge Analytica.
Cambridge Analytica used its artificial intelligence applications to analyze tens of millions of user profiles built from data it acquired from Facebook. From that data it first assembled psychological profiles of users, then used Facebook’s targeted advertising system to show those users ads and content tailored to their individual profiles. Think of it as persuasion on steroids, hyper-targeted by your own beliefs about the world.
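The reported pipeline (page likes in, trait scores out, ad variant chosen per trait) can be sketched as follows. Every weight, trait name, and line of ad copy below is invented for illustration; the real models were trained on psychometric survey data, not hand-written rules.

```python
# Hypothetical like-to-trait weights (illustrative only).
TRAIT_WEIGHTS = {
    "page_poetry_fans": {"openness": 0.8},
    "page_to_do_apps":  {"conscientiousness": 0.7},
    "page_crime_news":  {"neuroticism": 0.6},
}

# One ad variant per dominant trait: same message, different framing.
AD_VARIANTS = {
    "openness":          "Imagine a different future. Vote X.",
    "conscientiousness": "A proven, responsible plan. Vote X.",
    "neuroticism":       "Keep your family safe. Vote X.",
}

def score_traits(likes):
    """Sum weight contributions from each liked page into trait scores."""
    scores = {}
    for page in likes:
        for trait, weight in TRAIT_WEIGHTS.get(page, {}).items():
            scores[trait] = scores.get(trait, 0.0) + weight
    return scores

def pick_ad(likes):
    """Serve the ad variant matching the user's highest-scoring trait."""
    scores = score_traits(likes)
    if not scores:
        return AD_VARIANTS["conscientiousness"]  # arbitrary fallback
    top_trait = max(scores, key=scores.get)
    return AD_VARIANTS[top_trait]

print(pick_ad(["page_crime_news", "page_crime_news"]))
```

The unsettling part is not the code, which is trivial, but the data: with enough likes per user, even crude scoring like this segments an audience by temperament rather than demographics.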
While there may have been issues with how Cambridge Analytica “acquired” data from Facebook, I’m not implying there was anything nefarious about how it used the data. In fact, it appears to have become a generally accepted business practice: gather large amounts of data about human beings’ behavior, then use that data to manipulate those same human beings into behaving the way the data manipulators want them to behave.
The problem is that we, human beings all over the world, have become more than willing to freely give up data about ourselves to the data manipulators. As Walt Kelly’s Pogo put it, “We have met the enemy and he is us.”
In an article for Computerworld ( https://www.computerworld.com/article/3035595/emerging-technology/artificial-intelligence-needs-your-data-all-of-it.html ), Mike Elgan reports:
“Siri, Google Now, Cortana or Alexa, like other AI applications, work by recording your voice, uploading the recording to the cloud, then processing the words and sending back the answer. After you’ve got your answer, you forget about the query. But your recorded voice, the text extracted from it, and the entire context of the back-and-forth conversations you had are still doing work in the service of the A.I. that makes virtual assistants work. Everything you say to your virtual assistant is funneled into the data-crunching A.I. engines and retained for analysis.”
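Elgan’s description amounts to a simple loop: transcribe, answer, and, crucially, retain. The toy sketch below shows that data flow; all class and method names are my own, and the `transcribe` stub stands in for the cloud speech-to-text models real assistants use.

```python
from dataclasses import dataclass, field

@dataclass
class AssistantBackend:
    # Everything the user says is kept for later analysis,
    # long after the answer itself is forgotten.
    retained_queries: list = field(default_factory=list)

    def transcribe(self, audio: bytes) -> str:
        # Stand-in for a cloud speech-to-text model.
        return audio.decode("utf-8")

    def handle(self, audio: bytes) -> str:
        text = self.transcribe(audio)
        answer = f"(answer to: {text})"
        self.retained_queries.append(text)  # the retention step
        return answer

backend = AssistantBackend()
backend.handle(b"what's the weather tomorrow?")
backend.handle(b"directions to the pharmacy")
print(len(backend.retained_queries))  # 2 queries retained server-side
```

From the user’s side the transaction looks complete when the answer arrives; from the service’s side, `retained_queries` is where the real value accumulates.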
Mike has worked as chief editor for Windows Magazine, HP World Magazine, Inside HP, HP World News, The Palm Reader, Palm News, Road Tricks, Portable Life News, Laptop Life, and BuzzWords, so he has a clear appreciation for how the AI industry works.
There’s no pretense among commercial providers of artificial intelligence. The goal is not to advance “intelligence” toward human-like intelligence. It is, pure and simple, to know as much as possible about the people using an AI application and to use that knowledge to the advantage of the application’s provider and anyone else who may have a use for the related data.
In his dystopian novel The Circle, Dave Eggers warns, “We are at a pivot in history. There used to be the option of opting out. But now that’s over.” It is a “totalitarian nightmare”: everyone tracked, cradle to grave, with no possibility of escape.
“Dystopian” indeed! The real world in which we currently live, the world of Cambridge Analytica, social media, IBM’s Watson, Google Android and others, is increasingly taking on the characteristics of the world Eggers describes in his futuristic fiction. We are at a “pivot in history”, but we still have the option of “opting out”. We can still protect ourselves from the advantage others would like to gain over us, but that protection first and foremost requires acknowledging that the enemy is undeniably us, and the jury is still out on that.
HuffPost founder Arianna Huffington writes that “social media makes it more likely that people compare their lives unfavorably to their friends’, including underestimating the negative aspects of others’ lives, and overestimating the positive ones” and that “2018 is the year when the signs of the disenchantment [with social media] effect become ubiquitous and the hunger for a new way to live and to engage with technology will very likely hit a critical mass.” ( https://journal.thriveglobal.com/the-great-awakening-8bf08fa95eda ) That, of course, remains to be seen.
On the surface, at least, it appears millions may have caught on to the problems with social media and decided to opt out. In his recent medium.com article ( https://medium.com/@Michael_Spencer/teens-signal-facebook-as-irrelevant-as-exodus-of-youth-continues-db8c146cb5ca ), Michael Spencer reports that, according to a new study by eMarketer, “Facebook lost around 2.8 million U.S. users under 25 last year” and 2018 won’t be much better.
However, it’s highly unlikely many have reached a point of “disenchantment” with social media. Millions have not really caught on to its problems; rather, they’re simply making choices among different types of social media, sometimes for the most basic of reasons. Young people don’t like being associated with the media of their parents (e.g. Facebook). They prefer something more hip (e.g. Snapchat). It remains to be seen whether Snapchat develops its own version of Facebook’s Custom Audience tool, but either way, it’s likely in for a deluge of freely provided data coming its way.
Originally published at neutec.wordpress.com on March 27, 2018.