Thursday, April 24, 2025

“You can’t lick a badger twice”: Google failures highlight a fundamental flaw of AI


Here is a fun little diversion for your workday: go to Google, type in any made-up phrase, add the word “meaning,” and search. Behold! Google’s AI Overview will not only confirm that your gibberish is a real saying, it will also tell you what it means and how it was derived.

It is genuinely funny, and you can find plenty of examples on social media. In the world of AI Overviews, “a loose dog does not surf” is “a funny way of saying that something will not happen or that something will not work out.” The invented phrase “wired is wired” is an idiom meaning that “someone’s behavior or traits are a direct result of their inherent character or ‘wiring,’ just as a computer’s function is determined by its physical connections.”

It all sounds perfectly plausible, delivered with unwavering confidence. In some cases, Google even includes reference links, giving the answer an added gloss of authority. It is also wrong, at least in the sense that the overview creates the impression that these are common phrases rather than a few random words thrown together. And while it is silly that an AI Overview thinks “never throw a box at a pig” is a proverb with biblical roots, it is also a tidy encapsulation of where generative AI still falls short.

As a disclaimer at the bottom of every AI Overview notes, Google uses “experimental” generative AI to power its results. Generative AI is a powerful tool with all kinds of legitimate practical applications. But two of its defining characteristics come into play when it explains these invented phrases. The first is that it is ultimately a probability machine; although a large language model may seem to have thoughts and even feelings, at its core it simply places one most-likely word after another, laying down the track as the train barrels forward. This makes it very good at coming up with an explanation of what these phrases would mean if they meant anything, which they don’t.
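That “one most-likely word after another” loop can be sketched with a toy bigram model. This is a hypothetical, hand-built probability table for illustration only; a real LLM uses a neural network over an enormous vocabulary and context window, but the decoding loop has the same shape:

```python
# Toy sketch of next-word prediction (illustrative, not a real LLM).
# Hypothetical "learned" bigram probabilities: for each word, how likely
# each possible next word is.
probs = {
    "a":     {"loose": 0.6, "funny": 0.4},
    "loose": {"dog": 0.9, "end": 0.1},
    "dog":   {"does": 0.7, "won't": 0.3},
    "does":  {"not": 1.0},
    "not":   {"surf": 1.0},
}

def generate(start, max_words=6):
    """Greedy decoding: always append the single most probable next word."""
    words = [start]
    while words[-1] in probs and len(words) < max_words:
        candidates = probs[words[-1]]
        words.append(max(candidates, key=candidates.get))
    return " ".join(words)

print(generate("a"))  # -> "a loose dog does not surf"
```

The point of the sketch: the loop never asks whether the output is *true*, only which continuation is most probable given what came before, which is exactly why a fluent-sounding explanation of a nonsense idiom comes so easily.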

“The prediction of the next word is based on its vast training data,” says Ziang Xiao, a computer scientist at Johns Hopkins University. “However, in many cases, the next coherent word does not lead us to the right answer.”

The other factor is that AI aims to please; studies have shown that chatbots often tell people what they want to hear. In this case, that means taking you at your word that “you can’t lick a badger twice” is an accepted turn of phrase. In other contexts, it can mean reflecting your own biases back to you, as a research team led by Xiao demonstrated in a study last year.

“It is extremely difficult to account for every individual query or a user’s leading questions,” says Xiao. “This is especially challenging for uncommon knowledge, for languages in which significantly less content is available, and for minority perspectives. Since AI search is such a complex system, the error cascades.”
