Tell HN: Crosstalk when using Ollama with cloud DeepSeek models?
The other day I was experimenting with `deepseek-v3.1:671b-cloud` via Ollama, when a coding question got answered with a medical diagnosis based on a list of symptoms. I put it down to some strange LLM hallucination, but today a Reddit thread [1] showed a similar phenomenon: an answer referring to a totally different prompt.

The only logical explanation seems to be that the server is occasionally failing to pair prompts with their answers. If that is the case, I think users should be aware of the problem, in addition to the other general security issues of using non-local models.
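To make the hypothesis concrete, here is a minimal Python sketch (purely illustrative, and assuming nothing about Ollama's actual server internals) of how responses that are matched to clients by arrival order on a shared queue, rather than by a request ID, can get attached to the wrong prompt under concurrent load:

```python
# Hypothetical illustration only -- not Ollama's real architecture.
# Two clients submit prompts to a shared backend; answers come back on a
# single shared queue and are claimed in arrival order, with no request IDs.
import queue
import random
import threading
import time

requests = queue.Queue()
responses = queue.Queue()   # shared by all clients, no correlation IDs


def backend_worker():
    # Pull prompts, "generate" with variable latency, push answers back.
    while True:
        prompt = requests.get()
        if prompt is None:
            break
        time.sleep(random.uniform(0.01, 0.05))
        responses.put(f"answer to: {prompt}")


def client(prompt):
    requests.put(prompt)
    answer = responses.get()  # grabs whichever answer happens to arrive first
    print(f"asked {prompt!r:<30} got {answer!r}")


workers = [threading.Thread(target=backend_worker) for _ in range(2)]
for w in workers:
    w.start()

clients = [
    threading.Thread(target=client, args=("fix this Python traceback",)),
    threading.Thread(target=client, args=("diagnose these symptoms",)),
]
for c in clients:
    c.start()
for c in clients:
    c.join()

for _ in workers:
    requests.put(None)
for w in workers:
    w.join()
```

Run it a few times and the two clients will occasionally swap answers, which is exactly the kind of crosstalk described above.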
Has anyone else experienced this?
[1] (NSFW) https://old.reddit.com/r/ollama/comments/1rqoez5/so_this_has_started_happening_recently_with/