
Google has temporarily halted the AI tool Gemini’s capability to create photos of individuals due to previous mistakes

Google has temporarily stopped Gemini, its flagship generative AI suite, from creating images of people while it updates the model to improve the historical accuracy of its results.

The company said on the social media platform X that it would pause the generation of images of people while it addresses the historical inaccuracies.

“During this time, we will temporarily halt the creation of images of people and will introduce an enhanced version at a later date,” the statement said.

Google released Gemini's image-generation tool last month. Since then, examples of the AI producing historically inaccurate depictions, such as the U.S. Founding Fathers portrayed as American Indian, Black, or Asian, have circulated on social media, drawing criticism and mockery.

Paris-based venture capitalist Michael Jackson criticized Google’s AI as “a nonsensical DEI parody” in a post on LinkedIn. (DEI stands for ‘Diversity, Equity, and Inclusion.’)

Google acknowledged that the AI was producing errors in historical image generation and said it is working to improve the accuracy of these depictions. Gemini’s image generation is designed to produce a diverse range of people, which is often beneficial given its global user base, but in this case it missed the mark.

Generative AI tools produce their outputs based on training data and learned parameters such as model weights.
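To make the point concrete, here is a deliberately simplified, hypothetical sketch (not how Gemini actually works): a generative model can be thought of as sampling outputs from a distribution whose weights were shaped by training data, so skew in the data becomes skew in the outputs.

```python
import random

def sample_outputs(weights, n, seed=0):
    """Sample n outputs from a categorical distribution defined by weights.

    `weights` maps each possible output to a probability learned from
    (imaginary) training data; real models learn billions of parameters,
    but the principle is the same.
    """
    rng = random.Random(seed)
    categories = list(weights)
    probs = [weights[c] for c in categories]
    return rng.choices(categories, weights=probs, k=n)

# If the training data over-represents one category, the learned weights
# are skewed, and so are the generated samples -- the root of many
# bias complaints about generative systems.
biased = {"A": 0.9, "B": 0.05, "C": 0.05}
balanced = {"A": 1 / 3, "B": 1 / 3, "C": 1 / 3}

print(sample_outputs(biased, 100))    # dominated by "A"
print(sample_outputs(balanced, 100))  # roughly even mix
```

The sketch also hints at why "fixes" are hard: adjusting the sampling after training (as opposed to fixing the data) can overcorrect, which is one reading of what happened with Gemini's historical depictions.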

Such systems are frequently criticized for generating biased results, such as sexualized images of women or depictions of white men in high-status jobs.

A previous version of Google’s AI image-classification technology sparked public outcry in 2015 when it mislabeled Black men as gorillas. The company promised to fix the problem, but as Wired reported a few years later, its solution was effectively a workaround: Google simply blocked the technology from identifying gorillas at all.
