Technology

OpenAI worries its ChatGPT AI voice may charm users

Glenn Chapman - Agence France-Presse
A person looks at Wehead, an AI companion that can use ChatGPT, during Pepcom's Digital Experience at The Mirage resort during the Consumer Electronics Show (CES) in Las Vegas, Nevada on January 8, 2024.
AFP/Brendan Smialowski

SAN FRANCISCO, United States — OpenAI says it is concerned that a realistic voice feature for its artificial intelligence might cause people to bond with the bot at the cost of human interactions.

The San Francisco-based company cited literature which it said indicates that chatting with AI as one might with a person can result in misplaced trust and that the high quality of the GPT-4o voice may exacerbate that effect.

"Anthropomorphization involves attributing human-like behaviors and characteristics to nonhuman entities, such as AI models," OpenAI said Thursday in a report on safety work it is doing on a ChatGPT-4o version of its AI.

"This risk may be heightened by the audio capabilities of GPT-4o, which facilitate more human-like interactions with the model."

OpenAI said it noticed testers speaking to the AI in ways that hinted at shared bonds, such as lamenting aloud that it was their last day together.

It said these instances appear benign but must be studied to see how they might evolve over longer periods of time.

Socializing with AI could also make users less adept at, or less inclined toward, relationships with other humans, OpenAI speculated.

"Extended interaction with the model might influence social norms," the report said.

"For example, our models are deferential, allowing users to interrupt and 'take the mic' at any time, which, while expected for an AI, would be anti-normative in human interactions."

The ability for AI to remember details while conversing and to tend to tasks could also make people over-reliant on the technology, according to OpenAI.

"The recent concerns shared by OpenAI around potential dependence on ChatGPT's voice mode indicate what many have already begun asking: Is it time to pause and consider how this technology affects human interaction and relationships?"

"The recent concerns shared by OpenAI around potential dependence on ChatGPT's voice mode indicate what many have already begun asking: Is it time to pause and consider how this technology affects human interaction and relationships?" said Alon Yamin, co-founder and CEO of AI anti-plagiarism detection platform Copyleaks.

He said AI should never be a replacement for actual human interaction.

OpenAI said it will further test how voice capabilities in its AI might cause people to become emotionally attached.

Teams testing ChatGPT-4o's voice capabilities were also able to prompt it to repeat false information and produce conspiracy theories, raising concerns that the model could be made to do so convincingly.

OpenAI was forced to apologize to actress Scarlett Johansson in June for using something very similar to her voice in its latest chatbot, throwing a spotlight on voice-cloning tech.

Although OpenAI denied that the voice it used was Johansson's, its case was not helped by CEO Sam Altman flagging the new model with a one-word message on social media: "Her".

Johansson voiced an AI character in the film "Her", which Altman has previously said is his favorite film about the technology.

The 2013 film stars Joaquin Phoenix as a man who falls in love with an AI assistant named Samantha.
