
IF Insights: Making sense out of GPT-4o’s ‘Scarlett Johansson’ dilemma

Hollywood actor Scarlett Johansson was the one who was most taken aback by GPT-4o's voice

The large language model (LLM) known as GPT-4o was the subject of a live-streamed OpenAI event on May 13. Mira Murati, the firm’s chief technology officer, said that GPT-4o is faster and more user-friendly than ChatGPT.

It was also more adaptable and multimodal, the technical term for handling text, speech, and visual interactions. We were told that the new model’s key features included its ability to be interrupted mid-sentence, its extremely low latency (the delay before it responds), and its sensitivity to the user’s emotions.

Viewers were then treated to the predictable and entertaining spectacle of “Mark and Barret,” two tech bros straight out of central casting, conversing with the machine. Mark began by admitting that he was anxious, and the machine guided him through some deep-breathing exercises to ease his nerves.

Barret then wrote a basic equation on paper, and the machine walked him through solving for X. He also showed the machine a section of computer code, which the system handled just as easily.

Thus far, it was as expected. However, there was something strangely familiar about the machine’s voice: that of a seductive woman named “Sky,” with a conversational repertoire that included encouragement, optimism, empathy, and maybe even a hint of flirtation. It reminded the audience of someone. But of whom?

As it turned out, a lot of people thought it sounded like Scarlett Johansson, the well-known Hollywood actress who voiced the female lead in Spike Jonze’s 2013 film “Her,” about a man who falls in love with his computer’s operating system. Sam Altman, the CEO of OpenAI, said at a 2023 event in San Francisco that he preferred this movie to other science fiction films about artificial intelligence.

However, Scarlett Johansson herself was the one most taken aback by GPT-4o’s voice. As it happens, Sam Altman had contacted her in September 2023 in an attempt to persuade her to provide the chatbot’s voice.

“He told me,” she said in a statement, “that he felt that by my vocalising the system, I might help customers feel comfortable with the seismic shift affecting humans and AI and bridge the gap between tech businesses and creatives. He claimed that people would find solace in my voice.”

She turned down the offer, but during the broadcast of the demo she was inundated with messages from “friends, family, and the general public” remarking on how much GPT-4o sounded like her. She was even more incensed to learn that Sam Altman had posted the single word “Her” on X, which she took to mean that the resemblance between the machine’s voice and her own was deliberate.

Naturally, OpenAI strongly denies any unethical behaviour. “Sky’s voice is not Scarlett Johansson’s, and it was never meant to sound like hers. We didn’t seek out Ms. Johansson before casting the voice actress behind Sky’s voice,” reads a statement that the company says was written by Sam Altman.

However, the statement continues: “We have ceased using Sky’s voice in our products out of respect for Ms. Johansson. We apologise to Ms. Johansson for our poor communication.” Oh, how awful.

In the movie “Her,” Scarlett Johansson voices an operating system, and Joaquin Phoenix plays Theodore, a man who falls in love with it.

On one level, of course, one might argue that this is a storm in a champagne bottle. It may be that OpenAI’s apparent newfound “respect” for Scarlett Johansson has nothing to do with her fame and high-priced legal representation. It’s also possible that Sam Altman wasn’t making fun of her when he tweeted “Her.”

Ultimately, though, this small altercation illuminates the murky core of generative artificial intelligence (AI), as tech writer Charlie Warzel put it in The Atlantic.

On three levels, Warzel argues, this technology is based on theft, rationalised by legalistic posturing about “fair use,” and justified by a worldview that holds that the hypothetical “superintelligence” tech companies are developing is too significant, too world-changing, and too important to be held back by trivial issues like copyright and attribution.

“The Johansson issue is only a reminder of AI’s manifest-destiny philosophy: this is happening, whether you like it or not,” as Warzel puts it, and he is correct. The appropriate response: it is, and most of us don’t.

