Ask HN: How do you manage your AI prompts?
11 points
21 days ago
| 7 comments
Hello folks!

Do you use any tools or processes to manage your AI prompts? Or do you still prefer to keep it ad hoc?

runjake
19 days ago
[-]
Depending on the use case and frequency, I either:

- Save them as a ChatGPT custom GPT or a Claude Project.

- Create a RayCast AI Command. https://manual.raycast.com/ai

- Save them as a text snippet in Obsidian notes. https://obsidian.md

reply
tobiasnvdw
18 days ago
[-]
Mostly plain text files saved locally for easy copy-pasting.

I'll occasionally use prompts from the Anthropic library (https://docs.anthropic.com/en/prompt-library/library) and make minor modifications to them. E.g. I'll adapt the "prose polisher" prompt from the library to refine written text in specific ways.

reply
cloudking
19 days ago
[-]
For ChatGPT I've found this search extension useful to find previously used prompts: https://chromewebstore.google.com/detail/gpt-search-chat-his...

Source code: https://github.com/polywock/gpt-search

reply
muzani
18 days ago
[-]
I keep it ad hoc - models change so frequently that prompts break constantly. Most of the ones I used last year are no longer relevant.

"Prompt engineering" may be a thing of the past. These days, you can sketch a vague table on a piece of paper and take a photo of it with a phone, and AI will figure out exactly what you're trying to do.

reply
97-109-107
20 days ago
[-]
Maybe I'm hijacking, but I see a more general problem - how do you keep snippets of text that you use in your browser?

My current kludge is to edit long fields of text in an external editor via a browser addon, and have the editor save all such edits locally.

reply
wruza
20 days ago
[-]
I’m thinking of making a simple wrapper around APIs, because web-based AIs tend to dump literal tons of text due to monetary incentives. For now I prepend a standard pseudo-system stub to all my chats, works fine in my case.
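A minimal sketch of that "prepend a standard stub" approach, assuming an OpenAI-style chat message list; the stub wording and helper name here are made up for illustration:

```python
# Prepend a fixed pseudo-system stub to every conversation, so the
# terseness instructions don't need to be retyped in each chat.
SYSTEM_STUB = (
    "Be terse. Answer directly, skip preamble and summaries, "
    "and do not pad the response."
)

def build_messages(user_prompt, history=None):
    """Build a chat-API message list with the stub as the system message."""
    messages = [{"role": "system", "content": SYSTEM_STUB}]
    if history:
        messages.extend(history)  # prior turns, if continuing a conversation
    messages.append({"role": "user", "content": user_prompt})
    return messages

msgs = build_messages("Explain git rebase in two sentences.")
```

The resulting list can then be passed to whatever API client you wrap.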
reply
cyberhunter
20 days ago
[-]
Tired of OpenAI account deletions and Gemini template hiccups? Frustrated with manually typing or copy-pasting prompts every time you switch between LLM clients? If you're like me and want a smoother way to manage your prompts, I built a tool that might be just what you need.

*The Problem:*

* OpenAI accounts can be deleted unexpectedly.

* Gemini templates sometimes fail to work.

* Re-typing or copy-pasting prompts across multiple clients is tedious.

*The Solution: DryPrompt*

DryPrompt lets you create reusable prompt templates with variable fields. You set up the framework once, and then simply fill in the variables to generate the full prompt.

*How It Works:*

1. *Go to:* dryprompt.go123.live

2. *Sign up:* It's free and allows you to sync your prompts across devices.

3. *Create a template:* Define your prompt structure and mark the parts you want to change with variables.

4. *Use it:* Copy the template, replace the variables with your specific content, and you've got your ready-to-use prompt!

*Example:*

Let's say you need to internationalize multiple code files. With DryPrompt, you can create a template that includes the file code as a variable. Each time, just copy the template, paste in the new file's code, and you'll instantly get the internationalization prompt. No more tedious copying and manual concatenation!
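The template-with-variables pattern described above can be sketched in a few lines with Python's `string.Template` (DryPrompt itself is a web tool; the template text and field names below are invented for illustration):

```python
from string import Template

# Reusable prompt template: $filename and $file_code are the variable fields.
i18n_template = Template(
    "Internationalize the following file. Extract all user-facing strings "
    "into a messages file and replace them with lookup calls.\n\n"
    "File: $filename\n\n$file_code"
)

# Fill in the variables to generate the full, ready-to-use prompt.
prompt = i18n_template.substitute(
    filename="checkout.tsx",
    file_code="export const label = 'Buy now';",
)
```

Each new file only needs a fresh `substitute()` call instead of manual concatenation.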

*Give it a try and make your LLM workflow more efficient:* dryprompt.go123.live

reply