Ask HN: How do you use local LLMs?
10 points | 4 months ago | 3 comments
What applications (as in use cases, not software names) have you found for local LLMs that you use yourself on a daily basis, and which LLMs do you use?
henry_flower
4 months ago
Not exactly daily, but I use llama3.2-vision (via ollama) to generate a .txt file alongside every photo I take, containing a description of the photo. Then I can just grep for, say, "selfie":

    $ alias omglol='find -name \*txt | xargs -n50 grep -li'
    $ omglol selfie
    ./10/IMG_20241019_204444.txt
    ./09/IMG_20240930_082108.txt
    ./09/IMG_20240930_082118.txt
    ./07/IMG_20240712_154559.txt
    ./07/IMG_20240712_154554.txt
or to do a slide show:

    $ omglol selfie | sed s/txt/jpg/ | xargs feh
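The captioning step behind those sidecar files could be sketched as a small loop. This is a hedged sketch, not the commenter's actual script: it assumes the `ollama` CLI is installed with the llama3.2-vision model already pulled, and the prompt wording and photo layout are illustrative.

```shell
# Generate a sidecar .txt description for every photo that lacks one
# (does nothing when the ollama CLI is not installed).
if command -v ollama >/dev/null 2>&1; then
  find . -name '*.jpg' | while read -r jpg; do
    txt="${jpg%.jpg}.txt"           # IMG_x.jpg -> IMG_x.txt
    [ -e "$txt" ] && continue       # already described, skip
    # ollama's vision models accept an image path embedded in the prompt
    ollama run llama3.2-vision "Describe this photo: $jpg" > "$txt"
  done
fi
```

Skipping photos that already have a .txt file makes the loop safe to re-run after each sync.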
pizza
4 months ago
You might be able to load all the .txt files into the embedded vector DB in `llm` so that you could also query them semantically.
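With Simon Willison's `llm` CLI that could look roughly like the following. A hedged sketch: it assumes `llm` is installed with a default embedding model configured, and the collection name `photos` and the query text are illustrative.

```shell
# Embed every sidecar description into a collection, then query it
# semantically (does nothing when the llm CLI is not installed).
if command -v llm >/dev/null 2>&1; then
  llm embed-multi photos --files . '**/*.txt'   # index all descriptions
  llm similar photos -c 'someone taking a selfie'
fi
```

Unlike the grep approach, this would also match descriptions that never use the literal word "selfie".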
Haeuserschlucht
4 months ago
Optimizer detected!
Haeuserschlucht
4 months ago
Great idea!
thatjoeoverthr
4 months ago
Small pretrained models are often too “dumb” to be useful out of the box, but if you have a task you can fine-tune for, their pretraining means you can produce an effective model from a shockingly small corpus. These can be more reliable than off-the-shelf models in an automated process, because even the large pretrained models have a lot of “behaviors” you can trigger with surprising inputs. Most recently I retrained SmolLM2 to translate intents into SDXL prompts.
ud0
4 months ago
I use DeepSeek via LM Studio for reading sensitive/non-sensitive docs and contracts, and for searching bank statements and bills.
Haeuserschlucht
4 months ago
> for reading sensitive/non-sensitive docs, contracts

Can you elaborate?
