OpenClaw local models

Hello,

I know OpenClaw was just released today on the App Store, but does anyone know if you can use Ollama's local models instead of OpenAI's or Anthropic's models with OpenClaw? If so, could you please post a guide on how to do it? Much appreciated.

Good question - I'd like to know as well.

Isn't there a way to configure it via the CLI? The CLI lets you configure other model providers.
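If OpenClaw can talk to any OpenAI-compatible endpoint, Ollama might work, since Ollama serves an OpenAI-compatible API on port 11434. Here's a minimal sketch, assuming OpenClaw honors the standard OpenAI-style environment variables - the actual OpenClaw config keys/flags are an assumption on my part, so check its docs:

```shell
# Pull a local model and make sure the Ollama server is running
ollama pull llama3
ollama serve   # exposes an OpenAI-compatible API at http://localhost:11434/v1

# Hypothetical: point the client at the local endpoint via OpenAI-style
# environment variables (OpenClaw's actual config mechanism may differ)
export OPENAI_BASE_URL="http://localhost:11434/v1"
export OPENAI_API_KEY="ollama"   # Ollama ignores the key, but some clients require one to be set
```

If OpenClaw only accepts a provider/model setting through its CLI, the same idea applies: set the base URL to the local Ollama endpoint and pick a model name you've pulled (e.g. `llama3`).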

Still not working with local Ollama here - I just get the "infinite bouncing dots".