I’m Sorry, But I Can’t Stop Laughing at These 15 Spectacularly Bad AI Overview Fails

If you’ve googled anything recently, you may have noticed a helpful-looking AI summary appearing before the rest of your search results, like this:

Note the subtle fine print at the bottom, which reads, “AI responses may include mistakes.”

It seems convenient, but unfortunately, it’s prone to “hallucinating” (a.k.a. making things up). These hallucinations happen because chatbots built on large language models, or LLMs, “learn” by ingesting enormous amounts of text. But they don’t actually know things, and they don’t understand text the way people do. Instead, they use an algorithm to predict which words are likely to come next, based on all the data in their training set. According to the New York Times, tests found that newer AI models hallucinated as much as 79 percent of the time.

A person typing on a virtual keyboard with a digital search bar floating above, symbolizing online searching

Suwakeram / Getty Images
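To get a feel for what “predicting the next word” actually means, here’s a toy sketch in Python: a tiny bigram counter that just picks the statistically most common follower of a word. It’s purely illustrative and nowhere near the scale of a real LLM, but it shows why likely-sounding output isn’t the same as true output:

```python
from collections import Counter, defaultdict

# A deliberately tiny "training set." A real LLM ingests trillions of words.
corpus = (
    "cats have four legs . cats have whiskers . "
    "dogs have four legs . birds have two legs ."
).split()

# Count which word follows which word in the training text.
followers = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    followers[current][nxt] += 1

def predict_next(word: str) -> str:
    """Return the most frequent word that follows `word` in the corpus."""
    return followers[word].most_common(1)[0][0]

print(predict_next("have"))  # "four" -- statistically likely, not "known"
```

Notice that the model never checks whether “four” is actually true for whatever is being asked about; it only knows what tended to come next in its training text. That gap is where hallucinations live.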

Current AI models also aren’t good at distinguishing jokes from legitimate information, which notoriously led Google’s Gemini AI to suggest glue as a pizza topping shortly after it was incorporated into search results in 2024.

Recently, users on X, the site formerly known as Twitter, have been sharing the funniest Gemini hallucinations from Google’s search results, many in response to this viral tweet:

Zerosuitcamus / Via x.com

Here are the top (worst?) 15:

1. It’s not good at knowing things, like how much an adult weighs:

A search result humorously answers “What did Bob Dylan weigh” with “6 or 7 pounds,” apparently citing his birth weight, to the internet’s amusement
Gamesearlpwns / Via x.com

2. And it’s wildly unqualified to be your therapist:

A tweet mocking Google’s suggestion for coping with depression, which involves a dangerous activity
President / Via x.com

3. It’s about as good at solving word problems as a rock is:

A tweet explains that the extended “Lord of the Rings” trilogy takes 1.99 days, not 6.09 years, to watch, based on a 7:1 human-to-dog ratio

Related: I hate to say it, but I’m sure half of Americans couldn’t pass this super-easy citizenship test

4. No, seriously:

A tweet showing a humorous mistake: the AI says Luxembourg is smaller than Singapore when it’s actually the other way around; the user jokes about Google AI’s inaccuracy

5. And it doesn’t have great spaghetti recipes:

A tweet with a screenshot of Google AI’s answer about using gasoline in spaghetti, highlighting the absurd suggestion
Okimstillhungry / Via x.com

6. Sometimes it provides the correct answer for all the wrong reasons, as in this case, where the person presumably wanted to know whether Marlon Brando was in the 1995 film Heat:

A tweet screenshot showing a humorous AI answer about Marlon Brando; the user comments on the AI’s usefulness

7. It can, however, be truly, really good at improv, because it is one hell of a “yes, and”:

A Matt Rose tweet: the AI interprets nonsense phrases like “two dry frogs is a situation” as real sayings people search for, reflecting its tendency to confidently play along

Related: 19 things that are actually straight, and we need to stop pretending otherwise

8. It almost makes me want to watch this imaginary episode of Frasier… almost:

A social media post showing a satirical AI-generated summary of a “Frasier” episode titled “The One Where Frasier Eats an Abortion”
Slooptashboy / Via x.com

9. Sometimes I just don’t know what to say:

A Google AI summary says that Bella Ramsey and Millie Bobby Brown are not the same age, then states that they are both 21 years old

10. Even with the right facts, it can arrive at the exact wrong answer:

A tweet displaying a Google search result

11. It’s almost impressive how wrong it can be:

A screenshot of an AI summary for a search question

12. Definitely do not use it to find concert tickets:

A DDDDREWDANIEL tweet on Google AI’s answer about when Matmos are playing in Seattle; the AI Overview shows a fake concert date, alarming the user

13. Do not take its airport security tips:

Google’s Gemini AI says you can bring up to 6 ounces of liquid on a plane as long as it’s in a 3.4-ounce container

14. And remember that it is never, ever okay to leave a dog in a hot car:

A tweet screenshot with a meme about the AI’s false information on leaving dogs in hot cars, complete with a made-up parody song and a humorous comment

15. And finally, please, please do not eat rocks:

A tweet showing a humorous AI Overview suggesting you eat at least one small rock a day, with the user questioning Google’s usefulness
Westernkabuki / Via x.com

Currently, Google users still can’t turn off these AI search summaries entirely, but there are a few ways to get around them. One is to add “-ai” to the end of your search query, like this:

A search results fragment with information about kitten litters, indicating that anywhere from one to nine kittens, usually four to six, are born in a litter
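If you build the search from the address bar or a script, the same trick is just string handling. Here’s a minimal Python sketch of that idea; the “-ai” suffix behavior is an assumption based on the community-reported trick above (Google doesn’t document it), and the helper name is made up for illustration:

```python
from urllib.parse import quote_plus

def search_url_without_ai(query: str) -> str:
    """Build a Google search URL with "-ai" appended to the query.

    Assumption: the "-ai" exclusion term nudges Google into skipping
    the AI Overview. This is a community-reported trick, not a
    documented Google feature, so it may stop working at any time.
    """
    return "https://www.google.com/search?q=" + quote_plus(query + " -ai")

print(search_url_without_ai("how many kittens are in a litter"))
# https://www.google.com/search?q=how+many+kittens+are+in+a+litter+-ai
```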

Some people swear that adding a curse word to your search query will also block the AI summary, but it didn’t work for me:

Search results for “how many fucking kittens are in a litter,” with an AI-generated summary

And finally, if you’re on a desktop computer, you can click “Web” in the menu just below the search bar:

Search results showing information about the number of kittens in a litter, ranging from three to six and up to as many as 19

Do you have a terrible AI Overview fail to share? Post a screenshot in the comments below!

Also in Internet Finds: 15 Facebook Marketplace items you’ll wish, from the depths of your soul, you could unsee

Also in Internet Finds: 16 hometown crime stories you won’t believe

Also in Internet Finds: People are confessing their absolute pettiest “revenge served cold” stories, and they’re deliciously fun
