Pak ‘n’ Save’s Savey Meal-bot cheerfully created unappealing recipes when customers experimented with non-grocery household items
Ransom ( @Ransom@lemmy.ca ) 5 • 2 years ago
A spokesperson for the supermarket said they were disappointed to see “a small minority have tried to use the tool inappropriately and not for its intended purpose”
Disappointed. Bloody hell. A company that’s too cheap to curate some actual recipes is “disappointed” because the tech they built is creating lethal recipes. I wonder what their liability is if someone tried one and got hurt or sick.
jimbolauski ( @jimbolauski@kbin.social ) 3 • 2 years ago
Let’s turn the outrage dial back a bit. People had the bot make recipes with non-grocery items for laughs (e.g. bleach and ammonia), and unsurprisingly the bot combined them.
conciselyverbose ( @conciselyverbose@kbin.social ) 2 • 2 years ago
The fact that it’s capable of doing so is obscenely dangerous and should draw serious legal attention.
Ransom ( @Ransom@lemmy.ca ) 2 • 2 years ago
Corporations acting all parentally “disappointed” because they were too cheap to properly code it isn’t okay.
UnhappyCamper ( @UnhappyCamper@kbin.social ) 3 • 2 years ago
Such poor coding. How hard could it be to not let people use products that aren’t food? This, along with their comment, just makes them seem so lazy.
mishimaenjoyer ( @mishimaenjoyer@kbin.social ) 3 • 2 years ago
so this is how AI starts to try to genocide us…
conciselyverbose ( @conciselyverbose@kbin.social ) 2 • 2 years ago
A spokesperson for the supermarket said they were disappointed to see “a small minority have tried to use the tool inappropriately and not for its intended purpose”.
You should be thrilled that people are highlighting the issue before your batshit insane implementation literally kills someone.
How fucking hard is it to define stuff as “food” or “not food”?
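[Editor’s note] The fix the commenters keep circling — checking user-supplied ingredients against an allowlist of actual food items before they ever reach the recipe generator — really is a few lines. A minimal sketch; all names here (`KNOWN_FOODS`, `validate_ingredients`) are hypothetical, and a real deployment would use the store’s product catalogue rather than a hardcoded set:

```python
# Hypothetical input validation for a recipe bot: split user-entered
# items into accepted (known foods) and rejected (everything else),
# so non-food items like bleach never reach the model.

KNOWN_FOODS = {"chicken", "rice", "onion", "garlic", "potato", "carrot"}

def validate_ingredients(items):
    """Return (accepted, rejected) lists, matching case-insensitively."""
    accepted = [i for i in items if i.strip().lower() in KNOWN_FOODS]
    rejected = [i for i in items if i.strip().lower() not in KNOWN_FOODS]
    return accepted, rejected

accepted, rejected = validate_ingredients(["Chicken", "bleach", "ammonia", "rice"])
# "bleach" and "ammonia" end up in rejected and can be refused outright
```

An exact-match allowlist is deliberately conservative: unknown items are refused rather than guessed at, which is the failure mode the thread is complaining about.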
vettnerk ( @vettnerk@lemmy.ml ) 1 • 2 years ago
To the AI’s defense, that’s a damn effective depression meal.