Lee Duna (@throws_lemy@lemmy.nz) to Technology · 2 years ago
Apple wants AI to run directly on its hardware instead of in the cloud (arstechnica.com)
Cross-posted to: hackernews@lemmy.smeargle.fans, hackernews@derp.foo
Quokka (@Marsupial@quokk.au) · 2 years ago
You can already run an LLM natively on Android devices.
Vibrose (@Vibrose@programming.dev) · 2 years ago
You can on iOS as well! https://apps.apple.com/us/app/private-llm/id6448106860
snowe (@snowe@programming.dev) · 2 years ago
The hard part isn’t running AI on a device… it’s doing so while retaining battery life, performance, and privacy.
Amaltheamannen (@Amaltheamannen@lemmy.ml) · 2 years ago
Privacy is also easy with a local LLM. Performance and battery, not so much.
JackGreenEarth (@JackGreenEarth@lemm.ee) · 2 years ago
Which one do you use? I tried MLCChat, but all three times it either showed a Java error or generated gibberish. What’s worked for you?