
The great Greg Jenner shared a bizarre Google AI trick you’ll want to try immediately

Google is such an incredible research tool that its name has become a byword for looking up information. But a recent discovery has exposed a flaw in Google’s AI model, Gemini.

As author and historian Greg Jenner shared over on Bluesky, it appears that if you type a random phrase into Google and add ‘meaning’ at the end, Gemini will do its best to instantly generate an appropriate-sounding definition.

Of course, believing what you read on the internet is always a risk. However, Jenner decided to put the theory, originally shared on Threads, to the test by trying out a fictional phrase of his own. And Gemini didn’t disappoint, coming up with a meaning for the phrase ‘you can’t lick a badger.’

It’s hard to know what’s worse: that the well of online knowledge has been poisoned by AI, or that the definition actually sounds like it makes sense. Jenner himself is the first to admit that, while amusing, this little quirk of Gemini has grim implications.

Before you go running to Google to create your own wacky phrases, here are some other deranged definitions that Gemini has been forced to conjure up…

[Six embedded posts with further examples]