When bizarre and misleading answers to search queries generated by Google's new AI Overviews feature went viral on social media last week, the company issued statements that generally downplayed the notion that the technology had problems. Late Thursday, Google's head of search, Liz Reid, admitted that the flubs had highlighted areas that needed improvement, writing: "We wanted to explain what happened and the steps we've taken."
Reid's post directly referenced two of the most viral, and wildly incorrect, AI Overview results. One saw Google's algorithm endorse eating rocks because doing so "may be good for you," and the other suggested using nontoxic glue to thicken pizza sauce.
Rock eating is not a topic many people write about or ask questions about online, so there aren't many sources for a search engine to draw on. According to Reid, the AI tool found an article from The Onion, a satirical website, that had been republished by a software company, and it misinterpreted the information as factual.
As for Google telling its users to put glue on pizza, Reid effectively attributed the error to a sense-of-humor failure. "We saw AI Overviews that featured sarcastic or troll-y content from discussion forums," she wrote. "Forums are often a great source of authentic, first-hand information, but in some cases can lead to less-than-helpful advice, like using glue to get cheese to stick to pizza."
It's probably best not to cook any AI-generated dinner menu without reading it over carefully first.
Reid also suggested that judging the quality of Google's new take on search based on viral screenshots would be unfair. She said the company had carried out extensive testing before launch, and that its data show people value AI Overviews, including by indicating that they are more likely to stay on a page discovered that way.
Why the embarrassing failures? Reid characterized the errors that drew attention as the result of an internet-wide audit that wasn't always well intentioned. "There's nothing quite like having millions of people using the feature with many novel searches. We've also seen nonsensical new searches, seemingly aimed at producing erroneous results."