
The latest ChatGPT update introduces a game-changing feature that’s easy to love, but one that raises an uncomfortable question: are we getting too attached to AI? 🤖❤️

If you’re a paid ChatGPT subscriber, you might have noticed that the chatbot now sounds far more human in voice conversations.

That’s because OpenAI, the company behind the underlying language model and the chatbot itself, has been running a limited pilot of a new feature called “advanced voice mode.”

OpenAI says the new mode enables more natural, real-time conversations and can recognize and respond to emotion and non-verbal cues. The company plans to roll it out to all paid ChatGPT subscribers in the coming months.

Advanced voice mode sounds strikingly human, with none of the awkward pauses typical of voice assistants; it even mimics human breathing. It handles interruptions gracefully, conveys fitting emotional cues, and appears to infer the user’s emotional state from their tone of voice.

Even as it makes ChatGPT sound more human, OpenAI has warned that users might start treating the chatbot as if it were a real person, potentially forming intimate relationships with it.

This concern is already a reality. Social media influencer Lisa Li, for instance, has programmed ChatGPT to act as her “boyfriend.” But why do some people form intimate relationships with chatbots?

Humans have an extraordinary capacity for friendship and intimacy, much as other primates groom one another to form alliances they can rely on in times of trouble.

Our ancestors also developed a notable ability to “groom” each other through conversation. This verbal interaction contributed to the evolutionary growth of our brain’s language centers and led to increasingly complex uses of language.

The evolution of more complex language facilitated deeper social interactions with larger networks of relatives, friends, and allies, which in turn expanded the social areas of our brains.

Language developed in tandem with human social behavior. We often build friendships or deepen relationships through conversation, drawing acquaintances closer or turning friends into intimate connections.

Experiments from the 1990s showed that conversational exchanges, especially those involving the disclosure of personal details, create a feeling of intimacy, as if our conversation partner had become a part of us.

Given this understanding, it’s not surprising that efforts to replicate the process of “escalating self-disclosure” between humans and chatbots often lead to humans feeling a sense of intimacy with the chatbots.

And that’s only with text. Add voice, the primary sensory channel of conversation, and the effect intensifies. Even voice assistants that aren’t designed to sound human, such as Siri and Alexa, still receive numerous marriage proposals.

If OpenAI were to seek my advice on preventing users from forming social relationships with ChatGPT, I would offer a few straightforward recommendations.

First, don’t give it a voice. Second, don’t let it hold up its end of a meaningful conversation. In short: don’t build the product you’ve built.

The product is so effective precisely because it excels at mimicking the traits we rely on to build social relationships.

The signs were evident from the early days of chatbots, nearly 60 years ago. Computers have been seen as social actors for at least 30 years. The advanced voice mode of ChatGPT is simply the latest significant development, rather than the “game changer” that the tech industry might enthusiastically claim.

This became evident early last year, when users of the virtual friend platform Replika AI were unexpectedly cut off from the most advanced features of their chatbots. The episode showed that users not only form relationships with chatbots but also develop deep personal feelings for them.

Although Replika was less advanced than the latest version of ChatGPT, the quality of interactions was high enough that users developed unexpectedly deep attachments.

Many people who crave non-judgmental companionship will find real value in this new generation of chatbots. They may feel less lonely and isolated, and those are important benefits this technology can offer.

However, the potential risks of ChatGPT’s advanced voice mode are significant. Time spent interacting with a bot is time not spent with friends and family, and those who engage heavily with technology are at higher risk of replacing human relationships with digital ones.

As OpenAI points out, interacting with bots can also affect existing relationships. People might start expecting their partners or friends to behave like the polite, submissive, and deferential chatbots they interact with.

The broader impacts of these machines on culture are likely to become more pronounced. On the positive side, they may also offer valuable insights into how culture itself works.