Pro@programming.dev to Technology@lemmy.world · English · 9 days ago
Google quietly released an app that lets you download and run AI models locally (github.com)
AmbiguousProps@lemmy.today · 9 days ago
That's fair, but I think I'd rather self-host an Ollama server and connect to it with an Android client in that case. Much better performance.
Greg Clarke@lemmy.ca · 9 days ago
Yes, that's my setup. But this will be useful for cases where the internet connection is unreliable.
OhVenus_Baby@lemmy.ml · 8 days ago
How does Ollama compare to GPT models? I use the paid tier for work and I'm curious how this stacks up.
AmbiguousProps@lemmy.today · 8 days ago
It's decent, with the DeepSeek model anyway. It's not as fast and has a lower parameter count, though. You might just need to try it and see if it fits your needs.
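For context on the self-hosted setup mentioned above: an Ollama server exposes a simple HTTP API (by default on port 11434), which is what Android clients talk to. A minimal sketch of building a generate request, assuming a server at `localhost` and a model named `deepseek-r1` has already been pulled (both are assumptions, not from the thread):

```python
import json
import urllib.request

# Default Ollama API endpoint (port 11434 is Ollama's standard default).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming /api/generate request for an Ollama server."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one JSON object instead of a token stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# Sending the request would be: urllib.request.urlopen(req) against a live server.
req = build_request("deepseek-r1", "Why is the sky blue?")
print(req.full_url)
print(json.loads(req.data)["model"])
```

Any Android client that speaks this API can point at the same server address on the local network, which is where the "much better performance" comes from: the model runs on the server's hardware, not the phone.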