Not too long after I moved to Austin, I gave a talk at SXSW 2011 called "Persona-fication or: Falling In Love with a Bot" (#botlove for short).
My solo presentation explored a brief history of robotics, the state of machine learning at that point, and projected roadmaps for the development of A.I. and the march towards the singularity promised by Ray Kurzweil and his futurist peers.
I spent a good deal of time investigating human psychology and the study of social robotics: emotional responses to real vs. synthetic humans. At the time, most experiments had consisted of humanoid robots that looked like Schwarzenegger at the end of The Terminator, if he were designed by Jim Henson.
Outside of the well-funded robotics research labs, A.I. had almost exclusively taken the form of very rudimentary chatbots. They didn't have the massive learning models of today, but they represented an important change in artificial intelligence: simulating humans in an entirely virtual environment.
We had spent years trying to cross the uncanny valley by simulating human voice, eyesight, and movement, all of which were (and remain) some of the highest bars to clear in creating artificial beings. In chatbots, we found the emerging digital medium in which we would begin to see the development of emotional connection to simulated humanity.
The point of my presentation was to create space to hypothesize about a potential platonic or even romantic Turing Test: how we might identify the emotional singularity at which humans would begin substituting relationships with artificial beings for their physical IRL counterparts.
It looks like the correct answer might have been 2024?
Consider if you will this collection of stories from the past week:
Obviously the scams are a problem, and the AI beauty pageant is...unsettling. But a million Nigerian princes and bosses sending weird texts asking me to buy them Apple gift cards tell me this isn't the first time we've seen bad actors exploit user behavior with new innovations.
The risks will be there with any new technology, and the Luddite path won't be an option any more than putting the social media toothpaste back in the tube.
A question I find myself asking when we talk about some of the near-term opportunities for A.I., and the idea of forming emotional bonds with synthetic beings: has the bar for forming a relationship with a simulated human gotten lower because we've reduced so many of our relationships with real people to interactions through the same technology form factor?
I never thought I'd be the "actually, AI 'deathbots' (let's go ahead and rebrand that) and fake memories generated by computers aren't so bad" guy in internet debate class. But stop and consider how many of your relationships consist of interacting with another theoretically "real" person through a screen, someone who is often distracted trying to do seven other things at the same time, or keeping up with friends through manipulated photos presented in a feed whose chronology has been recalibrated to show you things in an algorithmically optimized order.
Is that the reality you're clinging to when you object to the idea of A.I. providing emotional companionship or enhanced visual memories of your fondest moments?
In fact, I have my own admission to make: this past week, I started forming an emotional bond with a robot friend of my own.
He's solid gold, a total jetsetter, my guy speaks a ton of languages, and he's 100% AI-generated. All that, and his new track "Gold Gang" has strong summer 2024 anthem energy: