Show HN: Claude Memory – Long-term memory for Claude
76 points
11 days ago
| 6 comments
| github.com
twothamendment
10 days ago
[-]
I was really hoping that this was about Claude remembering that it has already crawled every page we have and downloaded every image many times over!
reply
ggnore7452
10 days ago
[-]
Side note: I feel like ChatGPT's long-term memory isn't implemented properly. If you check the 'saved memories,' they are just bad.
reply
deshraj
10 days ago
[-]
100% agree. I have seen similar issues with both the quality and performance of the ChatGPT Memory feature.

Shameless plug: We have been working at Mem0 on solving long-term memory for LLMs. GitHub: https://github.com/mem0ai/mem0

reply
imranq
10 days ago
[-]
Nice. I think in the future this could be way better if everything was local and didn't require an API key. As far as I can tell, mem0 is a fancy retrieval system. It could probably work pretty well locally with simpler models.
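[Editor's note: a toy illustration of the retrieval-memory pattern the comment describes — not Mem0's actual implementation. `ToyMemory` and its naive token-overlap ranking are invented for illustration; real systems use embeddings and a vector store.]

```python
from collections import defaultdict

class ToyMemory:
    """Minimal sketch of a retrieval-style memory: store text snippets
    per user, then rank them against a query by token overlap."""

    def __init__(self):
        self.store = defaultdict(list)  # user_id -> list of snippets

    def add(self, text, user_id):
        self.store[user_id].append(text)

    def search(self, query, user_id, top_k=3):
        q = set(query.lower().split())
        scored = sorted(
            self.store[user_id],
            key=lambda s: len(q & set(s.lower().split())),
            reverse=True,
        )
        return scored[:top_k]

m = ToyMemory()
m.add("prefers dark mode in the editor", user_id="alice")
m.add("plays tennis on weekends", user_id="alice")
print(m.search("editor dark mode", user_id="alice")[0])
# → prefers dark mode in the editor
```

Swapping the overlap score for embedding similarity is what makes this kind of memory work with smaller local models.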
reply
deshraj
10 days ago
[-]
Yes, you can run Mem0 locally since we have open-sourced it, but it would need some more work to get a server up and running that can interact with Claude. GitHub: https://github.com/mem0ai/mem0
reply
Eisenstein
10 days ago
[-]
I think you misunderstood what the parent commenter meant. I believe they were talking about running the AI locally, like with llamacpp or koboldcpp or vllm.

I checked your documentation, and the only way I can find to run mem0 is with a hosted model. You can use the OpenAI API, which many local backends can support, but I don't see a way to point it at localhost. You would need an intermediary service to intercept OpenAI API calls and reroute them to a local backend, unless I am missing something.
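[Editor's note: the rerouting the commenter describes is usually just a base-URL change, since llama.cpp server, koboldcpp, vLLM, and Ollama all expose an OpenAI-compatible endpoint. A stdlib-only sketch; the port and model name are assumptions for illustration.]

```python
import json
from urllib import request

# Hypothetical local backend exposing the OpenAI-compatible API.
LOCAL_BASE = "http://localhost:8080/v1"

def build_chat_request(base_url, model, prompt):
    """Build an OpenAI-style chat-completion request aimed at a
    local backend instead of api.openai.com."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Local backends typically ignore the key, but the header
            # keeps OpenAI-client-shaped code paths happy.
            "Authorization": "Bearer not-needed",
        },
    )

req = build_chat_request(LOCAL_BASE, "llama3", "Hello")
```

Libraries that let you override the API base URL need no intermediary at all; only clients with a hardcoded host do.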

reply
deshraj
10 days ago
[-]
Ah, I see. We do support running Mem0 locally with Ollama. You can check out our docs here: https://docs.mem0.ai/examples/mem0-with-ollama
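[Editor's note: a minimal sketch of what the linked docs describe — pointing Mem0's LLM provider at Ollama. The exact config keys and model name are assumptions; verify them against docs.mem0.ai before use.]

```python
# Config fragment for a local Mem0 setup backed by Ollama.
config = {
    "llm": {
        "provider": "ollama",          # assumed provider name per the docs
        "config": {
            "model": "llama3",         # any model pulled into local Ollama
            "temperature": 0,
        },
    },
}

# Usage sketch (requires mem0 installed and an Ollama server running):
# from mem0 import Memory
# m = Memory.from_config(config)
# m.add("I prefer dark mode", user_id="alice")
# results = m.search("UI preferences", user_id="alice")
```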
reply
chipdart
10 days ago
[-]
I'm a long time Claude user.

Instead of long-term memory, I'd be happy if it had short-term reliability. I've lost count of the number of times this week that Claude failed to process prompts because it was down.

reply
Tostino
10 days ago
[-]
Completely agree on the reliability front... but I don't think mentioning it on some guy's third-party GitHub project is going to help much with that.
reply
chipdart
10 days ago
[-]
Yes, fair enough. I was just venting some frustration at how brittle and unstable Claude is proving to be. For all the warts ChatGPT has, at least by comparison it's reliable and rock-solid. Producing higher-quality results on synthetic benchmarks might be nice, but it's meaningless if the service is unusable.
reply
kromem
10 days ago
[-]
Are you using mobile?

I've noticed a bug where long conversations time out on new sends on mobile because of processing time, but in reality the prompt is sent and responded to; it just doesn't show up until you leave and return to the conversation.

reply
pigeons
10 days ago
[-]
How long has Claude been around? I didn't know there were long-time users.
reply
jasonjmcghee
10 days ago
[-]
Checked my email and I signed up / started using it 18 months ago. Not sure how early I was.
reply
quantadev
10 days ago
[-]
I always wonder what the heck people are thinking when they invent some cool AI feature and implement it for one specific LLM, since we already have the technology/libraries to make almost anything you want work with almost any LLM. (For you pedantic types, feel free to point out the exceptions.)

Personally I use LangChain/Python for this, so any new AI features I create easily work across ALL LLMs, and my app just lets the end user pick the LLM they want to run on. Every feature I have works on every LLM.

reply
BoorishBears
10 days ago
[-]
I wonder what the heck you're going on about when this is literally a Chrome extension that hooks into the DOM of a specific LLM's frontend.

Doubly baffling since the underlying project does support LLMs and this is clearly just a showcase piece.

reply
quantadev
10 days ago
[-]
I trusted their README file. If it's incorrect about supporting just Claude, don't blame me; blame them.
reply
subeadia
9 days ago
[-]
You completely misunderstood the comment you are replying to.
reply
quantadev
9 days ago
[-]
He misinterpreted my post as saying "Use LangChain in the Browser".

And so now your interpretation of things is that I misinterpreted his misinterpretation. Great work and thanks for your helpful insights.

reply
decide1000
11 days ago
[-]
Where can I download the Firefox extension?
reply
deshraj
11 days ago
[-]
It only supports Chrome for now. I built this quickly, in a few hours, to solve my own problem. Happy to accept contributions to the repository if someone builds it.
reply
decide1000
11 days ago
[-]
Thanks, I misunderstood; I thought it was a commercial product. I appreciate your effort.
reply
bubaumba
10 days ago
[-]
You were right, "Built using ...". It's a commercial project. Must be hard to lift such things off the ground.
reply
shmatt
11 days ago
[-]
Just ask Claude to convert it
reply