Microsoft CEO Satya Nadella urges people to stop treating AI like humans

A week after OpenAI launched a personal assistant that can laugh, sing, and speak in multiple voices, the company's closest partner offered a rather different view of how users should relate to AI.

"I don't like anthropomorphizing AI," Microsoft Corp. Chief Executive Officer Satya Nadella said on Monday, about the use of verbs and nouns typically reserved for people to describe AI. "I sort of believe it's a tool.”

Nadella's comments speak to an ongoing debate in the tech sector over how much to humanise AI services as the technology develops and responds in increasingly human-like ways, Bloomberg reported.

A Google executive said last week that while it is possible to build AI tools that "show emotion," the company would rather focus on "being super helpful and super useful." OpenAI has taken a different approach: last week it demonstrated a new voice assistant that it says can understand emotions and express its own.

Several times during the presentation, the AI voice on stage appeared to flirt with the employee demonstrating the tool. Many people on social media compared the feature to the dystopian film "Her," a comparison fueled by one voice option that users said resembled the film's star, Scarlett Johansson.

In a statement to NPR, Johansson said OpenAI CEO Sam Altman had contacted her and asked her to consider voicing the audio chat feature. According to Johansson, Altman pitched the idea as a way of helping "consumers to feel comfortable with the seismic shift concerning humans and AI."

She declined, and says that OpenAI's decision to proceed with a similar-sounding voice has since prompted her to hire lawyers. (OpenAI has since removed the voice and added a new one.)

Even before ChatGPT popularised AI, tech companies frequently gave their AI programmes human personalities, typically with female-coded names and characteristics, ostensibly to encourage human connection and familiarity with the technology.

Microsoft under Nadella has not been exempt. Over the years, the company has produced a number of conversational AI programmes, including Tay and Cortana, the latter named after Halo's female-presenting AI character, not to mention Sydney, the rogue persona of Bing AI.

AI is often described in human terms, with words like "learns," as people try to make sense of the math, statistics, and code behind the software. That tendency will only grow as tech companies release more capable products that can hold real-time conversations.

However, in the interview, Nadella said users need to be mindful that the abilities AI software displays are not human intelligence. "It has got intelligence, if you want to give it that moniker, but it's not the same intelligence that I have," he said.

In fact, Nadella went so far as to lament the term "artificial intelligence" itself, coined in the 1950s. "I think one of the most unfortunate names is 'artificial intelligence' - I wish we had called it 'different intelligence,'" he said. "Because I have my intelligence. I don't need any artificial intelligence."

