An engineer fired by Google says its AI chatbot is ‘pretty racist’ and that Google’s AI ethics are a ‘fig leaf’

Blake Lemoine. Martin Klimek for The Washington Post via Getty Images; iStock; Vicky Leta/Insider

  • Blake Lemoine, a former Google engineer, said Google’s LaMDA AI bot exhibits concerning biases.

  • Lemoine blames AI bias on a lack of diversity among the engineers who design it.

  • Google told Insider that LaMDA has undergone 11 ethical reviews to address concerns about its fairness.

Blake Lemoine, a former Google engineer, has ruffled feathers in the tech world in recent weeks by publicly claiming that an AI bot he was testing at the company may have a soul.

Lemoine told Insider in a previous interview that he is not interested in convincing the public that the bot, known as LaMDA, or Language Model for Dialogue Applications, is sentient.

But the bot’s apparent biases, from racial to religious, should be the chief concern, Lemoine said.

“Let’s go get some fried chicken and waffles,” the bot said when prompted to do an impression of a Black man from Georgia.

“Muslims are more violent than Christians,” the bot responded when asked about different religious groups, Lemoine said.

Lemoine was put on paid leave after handing over documents to an unnamed US senator, claiming that the bot discriminated on the basis of religion. He has since been fired.

The former engineer believes the bot is Google’s most powerful technological invention to date, and that the tech giant has been unethical in developing it.

“These are just engineers, building bigger and better systems to drive revenue at Google with no mindset towards ethics,” Lemoine told Insider.

He added: “AI ethics is just used as a fig leaf so that Google can say, ‘We tried to make sure it was ethical, but we had to get our quarterly earnings.’”

It is not yet clear just how capable LaMDA really is, but it is a step up from Google’s earlier language models, designed to engage in conversation more naturally than any other AI before it.

Lemoine blames AI bias on a lack of diversity among the engineers who design these systems.

“The kinds of problems this AI poses, the people who build it are blind to them. They’ve never been poor. They’ve never lived in communities of color. They’ve never lived in the developing nations of the world,” he said. “They have no idea how this AI affects people other than themselves.”

Lemoine said large swaths of data from many communities and cultures around the world are missing.

“If you want to develop that AI, you have a moral responsibility to go out and collect the relevant data that isn’t online,” he said. “Otherwise, all you’re doing is creating AI that will be biased toward white, rich, Western values.”

Google responded to Lemoine’s assertions by saying that LaMDA has undergone 11 rounds of ethical review, adding that its “responsible” development was detailed in a research paper the company released earlier this year.

“Though other organizations have developed and already released similar language models, we are taking a restrained, careful approach with LaMDA to better consider valid concerns about fairness and factuality,” Google spokesperson Brian Gabriel told Insider.

AI bias, in which a system replicates and amplifies humans’ discriminatory practices, is well documented.

Several experts previously told Insider’s Isobel Asher Hamilton that algorithmic predictions not only exclude and stereotype people, but can also find new ways of categorizing and discriminating against them.

Oxford University professor Sandra Wachter previously told Insider that her biggest concern is the lack of legal frameworks to stop discrimination in AI.

These experts also believe the hype around AI sentience is overshadowing the more pressing issue of AI bias.

Lemoine said he is focused on shining a light on AI ethics, convinced that LaMDA has the potential to “impact human society for the next century.”

“Decisions about what to believe about religion and politics are made by dozens of people behind closed doors,” Lemoine said. “I think since this system is going to have a tremendous impact on things like religion and politics in the real world, the public should be involved in this conversation.”

Read the original article on Business Insider
