- Sem ( @sem@lemmy.ml ) English32•2 months ago
If someone missed that: it returned a wrong answer even in the demo video.
- Stefen Auris 🖖 ( @stefenauris@furry.engineer ) 13•2 months ago
@schizoidman I can’t wait to use the energy requirements of a small country to search for shoes and convert from kg to lbs!
- The Doctor ( @drwho@beehaw.org ) English7•2 months ago
GPT: Because nobody in their right mind would waste nukes destroying the Internet.
- pedz ( @pedz@lemmy.ca ) 7•2 months ago
I hope it’s using a shit load of energy, like other “AI” stuff. Because we’re absolutely not in a climate crisis where reducing consumption is necessary. More “AI” that consumes more power, that’s exactly what we need.
- helenslunch ( @helenslunch@feddit.nl ) 2•2 months ago
OpenAI also confirmed it plans to integrate SearchGPT into ChatGPT down the line.
I don’t understand. Isn’t CGPT already just a fancy search engine?
- belated_frog_pants ( @belated_frog_pants@beehaw.org ) 4•2 months ago
No, it's fancy autocomplete at a huge scale. Sometimes it returns correct answers.
A search engine should take a list of websites and metadata about those websites and return results based on some ranking, with the original goal being to get you what you wanted. (The current goal is just extracting as much money as possible while your hands are on the keys.)
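The ranking idea described above can be sketched in a few lines. This is a toy illustration only; the field names and the title-weighting formula are invented for the example, not how any real engine scores results:

```python
# Toy metadata-aware ranking: score pages by query-term overlap,
# weighting matches in the title (metadata) above matches in the body.
def rank(query, pages):
    terms = query.lower().split()

    def score(page):
        title = page["title"].lower()
        body = page["body"].lower()
        # Title hits count triple -- an arbitrary weight for illustration.
        return sum(3 * title.count(t) + body.count(t) for t in terms)

    return sorted(pages, key=score, reverse=True)

pages = [
    {"url": "a.example", "title": "Shoe sizing guide", "body": "convert sizes"},
    {"url": "b.example", "title": "Hiking boots", "body": "shoe care and sizing"},
]
results = rank("shoe sizing", pages)
```

Real engines layer on link analysis, freshness, and (as the comment notes) ad revenue, but the core loop is still "score every candidate, sort, return the top".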
- gerryflap ( @gerryflap@feddit.nl ) 3•2 months ago
No. ChatGPT pulls information out of its ass, and as I read it, SearchGPT actually links to sources (while also summarizing them and presumably still pulling some information out of its ass). ChatGPT "knows" things; SearchGPT should actually look stuff up and present it to you.
- kosmoz ( @kosmoz@lemm.ee ) 2•2 months ago
Kagi has supported this for a while. You can end your query with a question mark to request a "quick answer" generated by an LLM, complete with sources and citations. It's surprisingly accurate and useful!
- helenslunch ( @helenslunch@feddit.nl ) 2•2 months ago
ChatGPT “knows” things and SearchGPT should actually look stuff up and present it to you.
…where do you think CGPT gets the information it “knows” from?
- gerryflap ( @gerryflap@feddit.nl ) 2•2 months ago
From the training dataset, which was frozen years ago. It's like knowing something instead of looking it up. It doesn't provide sources; it just makes shit up based on what was in the (old) dataset. That's totally different from looking up the information based on what you know and then using the new information to create an informed answer backed up by sources.