- cross-posted to:
- hackernews@lemmy.smeargle.fans
- technology@lemmy.ml
- ExtremeDullard ( @ExtremeDullard@lemmy.sdf.org ) 36•7 months ago
And this is a surprise how?
The entire digital economy is based on spying. It’s called corporate surveillance and it’s been around for 25 years. Why would AI escape this business model? If anything, it turbocharges it.
- EveryMuffinIsNowEncrypted ( @EveryMuffinIsNowEncrypted@lemmy.blahaj.zone ) English31•7 months ago
NO SHIT.
- macniel ( @DmMacniel@feddit.de ) 9•7 months ago
I’m shooked I say. shooked!
- EveryMuffinIsNowEncrypted ( @EveryMuffinIsNowEncrypted@lemmy.blahaj.zone ) English3•7 months ago
Well, not that shocked.
- macniel ( @DmMacniel@feddit.de ) 3•7 months ago
I’m mildly shooked then.
- EveryMuffinIsNowEncrypted ( @EveryMuffinIsNowEncrypted@lemmy.blahaj.zone ) English4•7 months ago
I was continuing what I thought was a Futurama reference you were making. Lol.
- const_void ( @const_void@lemmy.ml ) 8•7 months ago
Given how hard they’ve been pushing Copilot, Bing Chat, etc., I’m not surprised.
- sub_ubi ( @sub_ubi@lemmy.ml ) 4•7 months ago
As a bad Python scripter, I’m stuck using Microsoft’s AI because there isn’t a privacy-focused alternative anywhere near as good.
- Vojtěch Fošnár ( @vfosnar@beehaw.org ) 4•7 months ago
Don’t overuse AI; there are plenty of resources on the web, and at least you can practice reading docs. Use Phind. https://www.phind.com/privacy
- swordsmanluke ( @swordsmanluke@programming.dev ) 2•7 months ago
It’s not as good, but running small LLMs locally can work. I’ve been messing around with ollama, which makes it drop-dead simple to try out different models locally.
You won’t be running any model as powerful as ChatGPT, but for quick “Stack Overflow replacement” style questions I find it’s usually good enough.
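For anyone curious, getting started looks roughly like this (a minimal sketch; `llama3` is just one example model tag from ollama’s library, and the available models change over time):

```shell
# Install ollama (Linux one-liner; see https://ollama.com for other platforms)
curl -fsSL https://ollama.com/install.sh | sh

# Download a small model to run locally
ollama pull llama3

# Ask it a quick "Stack Overflow replacement" style question
ollama run llama3 "How do I read a CSV file in Python?"
```

Everything runs on your own machine, so nothing leaves your box; the trade-off is that answer quality depends on how large a model your hardware can handle.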
And before you write off the idea of local models completely, some recent studies indicate that our current models could be made orders of magnitude smaller for the same level of capability. Think Moore’s law, but for shrinking the required connections within a model. I do believe we’ll be able to run GPT-3.5-level models on consumer-grade hardware in the very near future. (Of course, by then GPT-7 may be running the world, but we live in hope.)
- Facebones ( @Facebones@reddthat.com ) 1•7 months ago
SkyGPT
- grandel ( @grandel@lemmy.ml ) 1•7 months ago
Isn’t that their business model? How else can Windows be offered for “free”?