It takes forever to respond to anything. RAM usage is normal, storage is fine… I don't have much installed. Are there bugs in the 1.0 release that I don't know about, or is something else potentially going on?
I believe it's "up to" 3 words a second. Pretty slow.
Sure, but it takes at least 3 minutes to tell me that "two plus two = 4." If that's normal, then I don't see the hype.
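For what it's worth, a quick back-of-the-envelope check, assuming the "up to 3 words a second" figure is accurate, shows that even a short answer shouldn't take anywhere near 3 minutes:

```python
# Sanity check: at the claimed ~3 words/second, how long should a
# short answer take? (word counts here are rough assumptions)
words_per_second = 3        # advertised upper bound on generation speed
answer_length_words = 10    # a short answer like "Two plus two equals four."

expected_seconds = answer_length_words / words_per_second
print(expected_seconds)     # roughly 3.3 seconds, nowhere near 3 minutes
```

So if a one-line answer really takes 3 minutes, something other than raw token speed seems to be wrong.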
Wow. Yeah, that's slow.
I've been using the API version of ChatGPT (Chatbot UI).
Although I find the web version more reliable.
I wouldn't expect much from the RPi 4 for running Llama, but maybe I'm wrong.
Hopefully someone else chimes in
It's the Umbrel Home, so it should be fine. I'm just going to assume it's bugs, but yeah, I'd still like someone to chime in.
I run LlamaGPT (Llama 2 7B) on a NUC13 with 64 GB of RAM and an i7-1360P CPU (i.e., much faster than the Umbrel Home), and it is still very slow. On top of that, the quality of the responses is very poor. Even basic questions are answered in a factually incorrect way. Worst of all is Llama's "condescending" tone. Unlike ChatGPT, which is always professional, acknowledges its mistakes, and learns from them, Llama argues with you that its (wrong) answers are right.

I had high hopes for this app but ended up deleting it after playing with it for a couple of hours. This version is a waste of time, but I'll try again when 70B is available (which sounds like it could be soon, according to the GitHub repository).