Ask HN: How do you use local LLMs?
4 points | 23 hours ago | 1 comment
What applications (as in use cases, not software names) have you found for local LLMs that you use yourself on a daily basis, and which LLMs do you use?
henry_flower
23 hours ago
Not exactly daily, but I use llama3.2-vision (via ollama) to generate a .txt file alongside every photo I take, containing a description of the photo. Then I can just grep for, say, "selfie":

    $ alias omglol='find -name \*txt | xargs -n50 grep -li'
    $ omglol selfie
    ./10/IMG_20241019_204444.txt
    ./09/IMG_20240930_082108.txt
    ./09/IMG_20240930_082118.txt
    ./07/IMG_20240712_154559.txt
    ./07/IMG_20240712_154554.txt
or to do a slide show:

    $ omglol selfie | sed s/txt/jpg/ | xargs feh
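The generation step isn't shown above; here is a minimal sketch of what it could look like, assuming `ollama` is installed and the llama3.2-vision model has been pulled. The function name and prompt wording are my own illustration, not the commenter's actual script:

```shell
#!/bin/sh
# Sketch: write a description .txt next to every .jpg under a directory.
# Assumes `ollama` is on PATH and llama3.2-vision has been pulled;
# `describe_photos` and the prompt are illustrative, not from the thread.
describe_photos() {
  find "$1" -name '*.jpg' | while IFS= read -r img; do
    txt="${img%.jpg}.txt"            # IMG_x.jpg -> IMG_x.txt
    [ -e "$txt" ] && continue        # skip photos already described
    ollama run llama3.2-vision \
      "Describe this photo in a few sentences: $img" > "$txt"
  done
}

# usage: describe_photos ~/photos
```

Run it from a cron job or a camera-import hook and the grep/feh pipelines above work unchanged.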
pizza
3 hours ago
You might be able to load up all the txt files into the embedded vector db in `llm` so that you could also query them semantically.
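A hedged sketch of that idea with the `llm` CLI: `embed-multi` can ingest the .txt files into a named collection, and `llm similar` queries it. The collection name "photos" is arbitrary, and it assumes a local embedding model is available (the model shown requires the llm-sentence-transformers plugin):

```shell
#!/bin/sh
# Sketch, assuming `llm` plus a local embedding model
# (e.g. via the llm-sentence-transformers plugin).

# Embed every description file under a directory into a "photos" collection:
index_descriptions() {
  llm embed-multi photos --files "$1" '**/*.txt' \
    -m sentence-transformers/all-MiniLM-L6-v2 --store
}

# Query the collection semantically instead of grepping for exact words:
search_descriptions() {
  llm similar photos -c "$1"
}

# usage:
#   index_descriptions ~/photos
#   search_descriptions 'someone taking a picture of themselves'
```

Unlike the grep approach, this would also surface photos whose descriptions say "self-portrait" or "person holding a phone at arm's length" without the literal word "selfie".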
Haeuserschlucht
17 hours ago
Great idea!