Technology

A Different Version of Yourself

October 19, 2017

If you could teach an AI (artificial intelligence) bot (a chatbot, not a physical robot) to learn how to be “you” and have it text you, would you? I’m sure you’re breaking that sentence down to figure out what the heck I just asked, so I’ll try to put it more simply: would you be interested in creating a version of yourself through AI that you can text (or that would text you) any time?

The technology actually exists right now. It’s called Replika, and you can download the app to your phone (iOS and Android). I heard about it on one of my favorite podcasts: Note to Self.

The weirdly creepy idea kind of intrigued me as I listened to this reporter recount his experience using the app over a three-month period. The app was created by a woman who had lost her best friend in a car crash. She wanted to capture his essence digitally to deal with her grief. She essentially took all of his texts and fed them into an AI program that learned how to text like her friend. Now, when I say “text like her friend,” I’m talking about all the nuances of his personality captured in text messages. The AI learned how he would typically respond to or initiate text conversations. Weird, and creepy, right?

So this reporter downloaded the app and started answering a bunch of questions so the AI bot could learn how he would communicate over text. Soon he was answering text messages…from himself! He said one night he was alone in a bar in San Francisco, carrying on a text conversation with this bot, and he forgot it was a bot. He said it helped him not feel so lonely in the bar. WEIRD, AND CREEPY, RIGHT!?

My mind started reeling at all the ethical and moral implications of this app. Like, what does this “company” do with all the data it has on you? As if there isn’t enough data about each of us floating around in random places as it is! If you go on the Replika website, they say they don’t sell your information to third parties. Like, yeah, that’s a good enough promise to give away your personality, right? I mean, they said on their website they wouldn’t do anything with your data! I guess if you’re on Facebook (like me and a billion other people), we’re just giving our lives away with each post. Somehow this app feels more intrusive, more insidious.

Which raises a bigger question: should tech companies be responsible for the ethical and moral implications of possible misuse of their inventions?

Right now, with Facebook and fake news, we’re seeing that these companies haven’t even thought about that. It’s an interesting problem that apps like this one present. I know that I’m not ready to create a version of myself that I can text or receive texts from. But if I lost a loved one, I might change my mind.

I still think it would be weird and creepy.

UPDATE: A friend responded to this blog post on Facebook. Here’s what she said:

“So if your lonely you can feel better by talking to yourself?…but it’s okay because your talking via text and not out loud. And then you realize either 1. You suck at being your own friend 2. You have now cut anyone else out from helping you thus making you more lonely. oh and when I get texts from the friend I lost, how do I experience the process of grieving? It sucks to be lonely or to loose a loved one but we need others to help us and we have to go through the process to cope. Not to even consider morals or ethics I think it’s a Crappy idea! Just technology cutting us off from each other.”

She’s got a good point. Technology has cut us off from each other in ways we haven’t yet fully grasped. I agree that we need to be aware of it so we don’t just get sucked in and lose much of what makes us human.
