- Empricorn ( @Empricorn@feddit.nl ) English40•5 months ago
Some “AI” LLMs resort to light hallucinations. And then ones like this straight-up gaslight you!
- Margot Robbie ( @MargotRobbie@lemm.ee ) 29•5 months ago
Ok, let me try listing words that end in “um” that could be (even tangentially) considered food.
- Plum
- Gum
- Chum
- Rum
- Alum
- Rum, again
- Sea People
I think that’s all of them.
- dethedrus ( @dethedrus@lemmy.dbzer0.com ) 6•5 months ago
The Sea Peoples consumed by the Late Bronze Age collapse (or who were a catalyst thereof)?
Or just people at sea eaten by krakens? Cause they definitely count.
- Margot Robbie ( @MargotRobbie@lemm.ee ) 4•5 months ago
It’s a dirty joke.
- LNRDrone ( @LNRDrone@sopuli.xyz ) 22•5 months ago
Coconutum
- callouscomic ( @callouscomic@lemm.ee ) English11•5 months ago
You did WHAT to em?
- shininghero ( @shininghero@kbin.social ) 13•5 months ago
Strawberrum sounds like it’ll be at least 20% abv. I’d like a nice cold glass of that.
- lemmyng ( @lemmyng@lemmy.ca ) English11•5 months ago
Strawberrum? Barely knew 'em!
- Annoyed_🦀 ( @Annoyed_Crabby@monyet.cc ) 10•5 months ago
Gemini thought we name food the way we name elements on the periodic table
- TonyTonyChopper ( @TonyTonyChopper@mander.xyz ) 5•5 months ago
plutonium is food once
- Sunny' 🌻 ( @Sunny@slrpnk.net ) 9•5 months ago
It’s crazy how bad AI gets if you make it list names ending with a certain pattern. I wonder why that is.
- blindsight ( @blindsight@beehaw.org ) 5•5 months ago
LLMs aren’t really capable of understanding spelling. They’re token prediction machines.
LLMs have three major components: a massive database of “relatedness” (how closely related the meanings of tokens are), a transformer (which figures out which of the previous words carry the most contextual meaning), and statistical modeling (the likelihood of the next word, like what your cell phone keyboard does).
LLMs don’t have any capability to understand spelling unless it’s something they’ve been specifically trained on, like “color” vs. “colour”, which is discussed in many training texts.
“Fruits ending in ‘um’” or “Australian towns beginning with ‘T’” aren’t talked about in the training data often enough to build strong relatedness associations for, so the model can’t answer those sorts of questions.
- Even_Adder ( @Even_Adder@lemmy.dbzer0.com ) English5•5 months ago
It can’t see what tokens it puts out; you would need additional passes over the output for it to get this right. That’s computationally expensive, so I’m pretty sure it didn’t happen here.
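The extra pass could even live outside the model as ordinary string checking (a hypothetical sketch; `raw_llm_output` just stands in for whatever list the model produced):

```python
# Hypothetical post-processing pass: the model can't inspect its own spelling,
# but plain code operating on the output text can.
def keep_suffix(words, suffix="um"):
    """Return only the words that really end with the given suffix."""
    return [w for w in words if w.lower().rstrip(".!").endswith(suffix)]

# Stand-in for whatever the model actually generated.
raw_llm_output = ["Plum", "Strawberry", "Tomato", "Capsicum", "Coconut"]

print(keep_suffix(raw_llm_output))  # ['Plum', 'Capsicum']
```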
- ramirezmike ( @ramirezmike@programming.dev ) 1•5 months ago
doesn’t it work literally by passing in everything it said to determine what the next word is?
- adderaline ( @ondoyant@beehaw.org ) English1•5 months ago
it chunks text up into tokens, so it isn’t processing words as if they were composed of letters.
- some_guy ( @some_guy@lemmy.sdf.org ) 5•5 months ago
Ok, I feel like there have been more than enough articles explaining that these things don’t understand logic. Seriously. Misunderstanding their capabilities at this point is getting old. It’s time to start making stupid painful.
- PhAzE ( @PhAzE@lemmy.ca ) 3•5 months ago
Tomatum… that’s the one