teslasdisciple@lemmy.ca to Lemmy Shitpost@lemmy.world · edited · 10 days ago
my Lemmy feed [image post]
teslasdisciple@lemmy.ca in Selfhosted@lemmy.world • "1U mini PC for AI?" · 4 months ago
I’m running AI on an old 1080 Ti. You can run AI on almost anything, but the less memory you have, the smaller (i.e. dumber) your models will have to be.
As for the “how”, I use Ollama and Open WebUI. It’s pretty easy to set up.
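For anyone wanting to try it, the setup described above looks roughly like this. This is a sketch, not the commenter's exact setup: the model name is just an example (pick one that fits your card's VRAM), and it assumes Docker is installed for Open WebUI.

```shell
# Install Ollama (official install script) and pull a small model.
# A 1080 Ti has 11 GB VRAM; cards with less memory need smaller models.
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3.2:3b   # example model; swap for whatever fits your GPU

# Run Open WebUI in Docker, pointed at the local Ollama instance,
# then browse to http://localhost:3000
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Ollama serves models on localhost:11434 by default, and Open WebUI auto-detects it via the host gateway mapping above.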