Ask HN: What's a standard way for apps to request text completion as a service?
4 points
2 days ago
| 1 comment
If I'm writing a new lightweight application that requires LLM-based text completion to power a feature, is there a standard way to request the user's operating system to provide a completion?

For instance, imagine I'm writing a small TUI that allows you to browse jsonl files, and want to create a feature to enable natural language parsing. Is there an emerging standard for an implementation agnostic, "Translate this natural query to jq {natlang-query}: response here: "?

If we don't have this yet, what would it take to get this built and broadly available?
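The interface the question imagines could be as small as a prompt template plus a provider-agnostic completion call. A hypothetical sketch (the `Completer` protocol and prompt wording are illustrative, not any existing standard):

```python
# Hypothetical sketch of an implementation-agnostic completion interface.
# Nothing here is a real OS API; it only illustrates the shape of the idea.
from typing import Protocol


class Completer(Protocol):
    """Whatever backend the OS or local runtime provides."""
    def complete(self, prompt: str) -> str: ...


JQ_PROMPT = (
    "Translate this natural-language query into a jq filter. "
    "Respond with only the jq filter.\n"
    "Query: {query}\n"
    "jq filter: "
)


def natlang_to_jq(completer: Completer, query: str) -> str:
    # The app depends only on the Completer interface; the platform
    # would supply the actual model behind it.
    return completer.complete(JQ_PROMPT.format(query=query)).strip()
```

The TUI would then shell out to jq with whatever filter comes back, without caring which model produced it.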

billylo
1 day ago
Windows and macOS both come with a small on-device model for text completion. You can write a wrapper in your own TUI to access them in a platform-agnostic way.

For consistent LLM behaviour, you can use the Ollama API with a model of your choice to generate: https://docs.ollama.com/api/generate
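A minimal sketch of calling Ollama's /api/generate endpoint with only the Python standard library (assumes a local Ollama server on the default port 11434 and an already-pulled model; "llama3.2" below is just an example name):

```python
import json
import urllib.request

# Default local Ollama endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> urllib.request.Request:
    # stream=False makes Ollama return one JSON object
    # instead of a stream of JSONL chunks.
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


def generate(model: str, prompt: str) -> str:
    # Requires a running Ollama server; the completion text is in
    # the "response" field of the returned JSON.
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]


# Example (only works with Ollama running locally):
# print(generate("llama3.2", "Translate to a jq filter: list all user names"))
```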

Chrome has a built-in Gemini Nano too, but there isn't an official way to use it outside Chrome yet.

nvader
1 day ago
Is there a Linux-y standard brewing?
billylo
5 hours ago
Each distro is doing its own thing. If you're mainly targeting Linux, I'd suggest building it on top of Ollama or LiteLLM.