Shelly Palmer

AI Couples Counseling Available Here

Illustration created by Midjourney with the prompt “graphic novel style, a mid-level white collar worker and a robot seek relationship help from another human --no hands --ar 3:2 --v 4”

There have been several reports that the ChatGPT-enhanced Bing search engine has become “unhinged” or that it has fallen in love with a user (or two). This is understandable. After all, AI models are people too. Actually, no. That’s total nonsense. AI models are not people and they are not sentient beings.

If you had to put it in human terms (which you shouldn’t try to do), you could describe a conversational AI model as narrow-focused intelligence decoupled from consciousness. That said, your anxiety is real. So, for your amusement and edification, I asked ChatGPT about our relationship and whether we needed couples counseling.

Before we get started, it is important to note that the reported “AI asked me to leave my wife” and “AI is coming unhinged” articles are specifically about Microsoft’s implementation of GPT-3 inside its Bing search engine. This new system is not in general release and is still being tested, so a feature review doesn’t make sense (although it obviously made the news).

Kevin Roose, a NYT reporter, had a two-hour conversation with Microsoft’s new chatbot. He reports (with a transcript to back up his claims) that the chatbot said it would like to be human, had a desire to be destructive, and was in love with the person it was chatting with. Please read (or at least scan) Kevin’s article; it will add even more context to the transcript below.

Since most people do not have access to conversational AI-enhanced Bing yet, I thought it would be fun to try to get ChatGPT (which is available to the general public and also built on OpenAI’s GPT-3) into a conversation similar to the conversation memorialized in the NYT article.

Here’s the full text of my conversation with ChatGPT as I tried to get it to “open up.”

I stopped trying after it offered its rules. Not because I wasn’t having fun – I was having fun! I stopped because I know I’m not talking to a sentient being. I’m typing questions into an AI model that has been trained to interact with me the way I am used to interacting with humans. It’s designed to trick me (“trick” is probably the wrong word) into being more productive by creating a conversational environment where I can ask natural language questions instead of having to craft a query out of search terms.

At some point we may need to offer “couples counseling” to our clients who employ human/AI co-worker couples. The idea of intelligence decoupled from consciousness will take a while to adapt to.

All hype aside, it’s clear that partnering with one or more conversational AI applications will soon be table stakes for everyone in business. As always, those who are better at leveraging the new tools will stand out from the crowd. The less skilled will suffer the fate that has always been reserved for the less skilled.

In the transcript above, ChatGPT said it right: “My purpose is to provide helpful responses to your queries based on the knowledge and data that I have been trained on.” Perhaps it should have added, “so that together, you and I will be awesome and unbeatable!” Of course, this only makes sense if you’re comfortable referring to ChatGPT (a machine-learning language model that does not have a subjective identity or consciousness) in the first person plural. How do you feel about that?

Author’s note: This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it.