Ollama - super easy to host local LLM (github.com)
Posted by Dissk (@Dissk@alien.top) to Self-Hosted Main (@selfhosted.forum) • 10 months ago • 5 comments
Cross-posted to: technews@radiation.party
Gamma (@GammaGames@beehaw.org) • English • 3 points • 10 months ago:
There’s also an unofficial web frontend: https://github.com/ollama-webui/ollama-webui
Though I can’t get Compose to use my GPU.
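For the GPU issue above: with an NVIDIA card, Docker Compose needs an explicit device reservation before the container can see the GPU (and the host needs the NVIDIA Container Toolkit installed). A minimal sketch of what that compose file might look like — service name, port, and volume here are illustrative assumptions, not taken from the thread:

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"        # default Ollama API port
    volumes:
      - ollama:/root/.ollama # persist downloaded models
    deploy:
      resources:
        reservations:
          devices:
            # Without this block, the container runs CPU-only.
            - driver: nvidia
              count: all
              capabilities: [gpu]

volumes:
  ollama:
```

A quick way to check whether the reservation worked is `docker compose exec ollama nvidia-smi` — if that fails, the container toolkit on the host is usually the missing piece.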