Google CEO Sundar Pichai slams 'completely unacceptable' Gemini AI app errors

Google CEO Sundar Pichai slammed "completely unacceptable" errors by the Gemini AI app, after gaffes such as images of ethnically diverse Nazis forced it to stop users from creating pictures of people.

By: AFP
| Updated on: Feb 28, 2024, 16:38 IST
Alphabet Inc. and Google CEO Sundar Pichai attends the inauguration of a Google Artificial Intelligence (AI) hub in Paris on February 15, 2024. (Photo by ALAIN JOCARD / AFP)

Google CEO Sundar Pichai on Tuesday slammed "completely unacceptable" errors by its Gemini AI app, after gaffes such as images of ethnically diverse World War II Nazi troops forced it to stop users from creating pictures of people. The controversy emerged within weeks of Google's high-profile rebranding of its ChatGPT-style AI to "Gemini", giving the app unprecedented prominence in its products as it competes with OpenAI and its backer Microsoft.

Social media users mocked and criticized Google for the historically inaccurate Gemini-generated images, such as US senators from the 1800s that were ethnically diverse and included women.

"I want to address the recent issues with problematic text and image responses in the Gemini app," Pichai wrote in a letter to staff, which was published by the news website Semafor.


"I know that some of its responses have offended our users and shown bias -- to be clear, that's completely unacceptable and we got it wrong."

A Google spokesperson confirmed to AFP that the letter was authentic.

Pichai said Google's teams were working "around the clock" to fix these issues but did not say when the image-generating feature would be available again.

"No AI is perfect, especially at this emerging stage of the industry's development, but we know the bar is high for us and we will keep at it for however long it takes," he wrote.

Tech companies see generative artificial intelligence models as the next big step in computing and are racing to infuse them into everything from searching the internet and automating customer support to creating music and art.

But AI models, and not just Google's, have long been criticized for perpetuating racial and gender biases in their results.

Google said last week that the problematic responses from Gemini were a result of the company's efforts to remove such biases.

Gemini had been calibrated to show a diverse range of people but did not adjust for prompts where that should not have been the case, and it also became too cautious with some otherwise harmless requests, Google's Prabhakar Raghavan wrote in a blog post.

"These two things led the model to overcompensate in some cases, and be over-conservative in others, leading to images that were embarrassing and wrong," he said.

Many concerns about AI have emerged since the explosive success of ChatGPT.

Experts and governments have warned that AI also carries the risk of major economic upheaval, especially job displacement, and industrial-scale disinformation that can manipulate elections and spur violence.



First Published Date: 28 Feb, 16:38 IST