Ruletanic — posted by outer_spec (@outer_spec@lemmy.blahaj.zone) to lemmy.blahaj.zone • 10 months ago • 107 points • 7 comments
Norah - She/They (@princessnorah@lemmy.blahaj.zone) • 10 months ago: Hope you like 40 second response times unless you use a GPU model.
JDubbleu (@JDubbleu@programming.dev) • 10 months ago: I’ve hosted one on a Raspberry Pi and it took at most a second to process and act on commands. Basic speech-to-text doesn’t require massive models and has become much less compute-intensive in the past decade.
Norah - She/They (@princessnorah@lemmy.blahaj.zone) • 10 months ago: Okay, well, I was running faster-whisper through Home Assistant.
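For anyone curious about the setup being debated, here is a minimal sketch of self-hosting faster-whisper for Home Assistant's Assist pipeline. It assumes Docker and the Wyoming-protocol whisper image commonly paired with Home Assistant; the image name, flags, and default port are assumptions to verify against the current Home Assistant docs:

```shell
# Sketch: run a Wyoming faster-whisper server that Home Assistant can
# point its Assist speech-to-text at. Image name and flags are
# assumptions; check the Home Assistant Whisper integration docs.
docker run -d --name whisper \
  -p 10300:10300 \
  -v whisper-data:/data \
  rhasspy/wyoming-whisper \
  --model tiny-int8 \
  --language en
```

The model choice is what drives the latency gap described above: a small quantized model like `tiny-int8` can respond in about a second on CPU-only hardware such as a Raspberry Pi, while larger models generally need a GPU to avoid multi-second response times.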