One of my favorite reviews is a 2-star review asking for “local model support”.
My first reaction was: who installs a 100KB Chrome extension to talk to a 10GB model running locally?
But it did make me curious. Are people actually running Ollama or LM Studio as part of their daily workflow?
For anything dealing with personal data, like browser inputs, I would use local models exclusively too. It's probably still niche, but non-local AI would be a deal-breaker for me in both the browser and the OS.