Is it possible to get Ollama to use the GPU? I've been logged into the terminal messing with this for ages. I can get Ollama to detect the GPU inside Docker if I stop the container and start a new instance with the GPUs flag, but then none of the connections work from the dashboard, and the WebUI doesn't detect it either.
Anyone have any ideas where I'm going wrong?
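For reference, this is roughly the kind of setup I've been trying, just as a sketch. The container names, volumes, ports, and the OLLAMA_BASE_URL value are how I have it locally, so yours may differ:

```
# Recreate the Ollama container with GPU access
# (needs the NVIDIA Container Toolkit installed on the host)
docker run -d --gpus=all \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama

# Open WebUI pointed at that Ollama instance via the host gateway
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

The GPU shows up inside the Ollama container after recreating it like this, but that's when the WebUI side stops seeing it.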