How do you manage context/memory across multiple AI tools?
7 points | 5 hours ago | 5 comments
I'm curious how others are handling this: I use Claude for some tasks, Cursor for coding, ChatGPT for research, and Perplexity for quick lookups.

The problem is none of them know what I've discussed with the others.

I find myself re-explaining the same context repeatedly, or copy-pasting from Notion docs.

For those of you heavily using AI tools:

- How are you managing shared context across tools?

- What's your current workflow for keeping AI "memory" consistent?

- Have you found any solutions that work well?

Especially interested in hearing from teams where multiple people need to access the same knowledge base across different AI sessions.

mr_o47
1 hour ago
[-]
I built this tool for myself for this same exact problem https://github.com/mraza007/echovault
reply
raw_anon_1111
3 hours ago
[-]
If you are working by yourself, there's no need to overcomplicate it. I have three terminal sessions open: Claude, Codex, and one for testing that has my AWS credentials.

I tell both Claude Code and Codex to keep the same markdown file updated with my progress, requirements, decisions, thought process, etc.
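A sketch of what such a shared file might look like (the section names and entries here are invented for illustration, not a fixed format):

```markdown
# PROGRESS.md (shared agent context)

## Requirements
- Export endpoint must stream CSV, not buffer in memory

## Decisions
- 2024-05-01: chose SQS over Kafka (lower ops burden)

## In progress
- Claude Code: refactoring the upload handler
- Codex: writing integration tests for /export

## Open questions
- Do we need per-tenant rate limits?
```

Both agents read it at the start of a session and append to it as they go, so either one can pick up where the other left off.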

reply
KurSix
3 hours ago
[-]
Anthropic is pushing MCP for a reason. The idea is to spin up a local MCP server that serves project context - docs, architectural decisions, DB schemas - which Claude Desktop and Cursor (via plugins or natively soon) can connect to.

For a team this is a total killer feature: you spin up a shared MCP server with documentation, and all agents instantly sync their "brains".
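The "shared brain" idea doesn't need MCP to demonstrate: at its core it is one process serving canonical context files that every tool reads, where an MCP server does the same job but speaks a protocol the AI clients understand natively. A minimal stdlib stand-in, with the directory and file contents made up for illustration:

```python
import functools
import http.server
import pathlib
import tempfile
import threading
import urllib.request

# Hypothetical project-context directory; a real setup would point at
# your docs/ folder (architecture notes, DB schemas, decision logs).
ctx = pathlib.Path(tempfile.mkdtemp())
(ctx / "architecture.md").write_text(
    "# Architecture\nEvent-driven workers over Postgres.\n"
)

# Serve the context directory on a local port. An MCP server plays the
# same role but exposes typed resources instead of raw files.
handler = functools.partial(
    http.server.SimpleHTTPRequestHandler, directory=str(ctx)
)
server = http.server.HTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any tool (or teammate) fetches the same canonical doc.
url = f"http://127.0.0.1:{server.server_port}/architecture.md"
doc = urllib.request.urlopen(url).read().decode()
server.shutdown()
```

Swap the HTTP server for a real MCP server (e.g. built on Anthropic's SDKs) and the clients stop needing copy-paste at all: they query the context store directly.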

reply
mejutoco
4 hours ago
[-]
I use markdown files with both full-detail and summarized versions. I also create files for individual aspects (security, performance), which makes them easier to reference.
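One possible layout for this (file names invented for illustration):

```
docs/
  context-full.md        # everything: requirements, decisions, history
  context-summary.md     # short version for small context windows
  aspects/
    security.md          # only auth/secrets decisions
    performance.md       # only perf budgets and benchmarks
```

The per-aspect files keep a prompt focused: you reference only the file relevant to the task instead of the whole history.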
reply
tizzzzz
4 hours ago
[-]
I saw a while back that someone had developed software that can transfer context between the major AI tools, which might help with your problem.
reply