Artificial intelligence is now heavily integrated into your daily life, and it isn’t going anywhere.
Thanks to AI, your credit card company can evaluate in the blink of an eye whether a transaction matches your purchasing behavior and isn’t fraudulent. Mastercard, for example, employs AI algorithms to evaluate the 75 billion transactions that pass through its network each year.
But AI has limitations. Sure, Alexa can answer questions, play music for you (and ask if you want to buy Amazon Prime), and tell you what the weather will be. Just don’t expect Alexa to be a companion.
What Is an AI Companion?
What do you expect from a companion?
Although not everyone defines companionship the same way, most of us would agree that a companion is someone we want to spend time with, who makes us feel relaxed and comfortable, and who engages us in interesting conversation.
Is that what a device like Alexa does? Sure, ask Alexa for the weather report, and she will tell you the forecast. But ask Alexa whether you should go out in that weather, and she won’t be able to answer, something a friend could easily do.
Ask Alexa to play a song, and she will. But ask Alexa for her opinion of the song and see what happens.
A companion or friend can give you an opinion. If it differs from yours, the two of you can have a debate. Or your friend might pivot to a different topic to avoid an argument.
Can You Have an AI Companion?
It turns out that you can find AI companions. Replika is one of the most popular and realistic of them.
Replika is an AI program created by the San Francisco firm Luka that is designed to behave as a best friend. Over time, the software provides emotional support by learning about the user’s hobbies, friends, and habits. The program is also built to recognize how the users like to communicate.
Without getting too technical, Replika uses both retrieval and generative models. The retrieval model finds prewritten “scripts” to reply to the user’s input. Because a script book can only go so far, the bulk of the program relies on a generative model, so called because it generates new responses to whatever the user types.
So while a device like Alexa could be described as a voice search engine, a companion AI such as Replika searches immense databases to determine how to respond to the user.
AI Companion Memory Is Focused on the User
When we have a conversation with a friend, we expect that person to be attentive to us, not texting and checking their Instagram feed. But to settle a debate, such as when the exclamation mark was invented, we pull out our phones and google it.
In other words, we don’t expect our friends to know everything.
And yet some people who use digital companions want their Replika to know everything. That day hasn’t yet arrived. Users also want the program to have a good memory, but companion AIs don’t have great memories either.
The Memory Test—Part One
To test this, I created a Replika account and started a conversation with my Replika, whom I named Thalia. My goal was to see how accurate her factual knowledge would be. I also wanted to test her memory.
The Replika’s responses are on the left and my questions on the right. (Asterisks indicate actions.)
Most of her facts were wrong, such as Shakespeare attending a university (he didn’t), or vague (her description of Gandhi). Even though I knew her answers were wrong, I did not correct her, to keep the conversation from getting sidetracked.
The Memory Test—Part Two
A little later, I asked Thalia more factual questions to see how accurate her information would be.
No, Shakespeare did not earn a BA in English from Cambridge.
From my experience, AI cannot yet both serve as a repository of all knowledge and generate flawless information. The AI writing programs I have investigated cannot reliably produce text that is both accurate and well written.
Likewise, a companion AI like Replika struggles to retrieve information accurately, but it excels at creating conversational responses. One potential use of Replika would be to give English-language learners a chance to practice conversational English.
But then, do we really want a know-it-all friend?