Ask HN: How do you manage your AI prompts?

11 points by siddharthgoel88 a year ago · 7 comments


Hello folks!

Do you guys use any tools or processes to manage your AI prompts? Or do you still prefer to keep it ad hoc?

runjake a year ago

Depending on the use case and frequency, I either:

- Save them as a ChatGPT custom GPT or a Claude Project.

- Create a RayCast AI Command. https://manual.raycast.com/ai

- Save them as a text snippet in Obsidian notes. https://obsidian.md

tobiasnvdw a year ago

Mostly plain text files saved locally for easy copy-pasting.

I'll occasionally use prompts from the Anthropic library (https://docs.anthropic.com/en/prompt-library/library) and make some minor modifications to them. For example, I'll modify the "prose polisher" prompt from the prompt library for refining written text in specific ways.

cloudking a year ago

For ChatGPT I've found this search extension useful to find previously used prompts: https://chromewebstore.google.com/detail/gpt-search-chat-his...

Source code: https://github.com/polywock/gpt-search

muzani a year ago

I keep it ad hoc - models change so frequently that prompts break all the time. Most of the ones I used last year are no longer relevant.

"Prompt engineering" may be a thing of the past. These days, you can sketch a vague table on a piece of paper and take a photo of it with a phone, and AI will figure out exactly what you're trying to do.

97-109-107 a year ago

Maybe I'm hijacking, but I see a generalized problem - how do you keep snippets of text that you use in your browser?

My current kludge is to edit long fields of text in an external editor via a browser addon, and have the editor save all such edits locally.

wruza a year ago

I’m thinking of making a simple wrapper around APIs, because web-based AIs tend to dump literal tons of text due to monetary incentives. For now I prepend a standard pseudo-system stub to all my chats, works fine in my case.
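The "prepend a standard pseudo-system stub" approach above can be sketched as a thin helper that builds the message list for any chat-completions-style API. The stub text and the prompt here are illustrative placeholders, not taken from the comment:

```python
# Sketch: prepend a fixed pseudo-system stub to every chat.
# STUB is a hypothetical example of an anti-verbosity instruction.
STUB = "Be terse. Answer directly; no preamble, no follow-up offers."

def build_messages(user_prompt: str) -> list[dict]:
    """Return a messages list with the standard stub prepended."""
    return [
        {"role": "system", "content": STUB},
        {"role": "user", "content": user_prompt},
    ]

# The result can be passed as the `messages` argument of most
# chat-completions-style client libraries.
messages = build_messages("Summarize this diff.")
```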

cyberhunter a year ago

Tired of OpenAI account deletions and Gemini template hiccups? Frustrated with manually typing or copy-pasting prompts every time you switch between LLM clients? If you're like me and want a smoother way to manage your prompts, I built a tool that might be just what you need.

*The Problem:*

* OpenAI accounts can be deleted unexpectedly.

* Gemini templates sometimes fail to work.

* Re-typing or copy-pasting prompts across multiple clients is tedious.

*The Solution: DryPrompt*

DryPrompt lets you create reusable prompt templates with variable fields. You set up the framework once, and then simply fill in the variables to generate the full prompt.

*How It Works:*

1. *Go to:* dryprompt.go123.live

2. *Sign up:* It's free and allows you to sync your prompts across devices.

3. *Create a template:* Define your prompt structure and mark the parts you want to change with variables.

4. *Use it:* Copy the template, replace the variables with your specific content, and you've got your ready-to-use prompt!

*Example:*

Let's say you need to internationalize multiple code files. With DryPrompt, you can create a template that includes the file code as a variable. Each time, just copy the template, paste in the new file's code, and you'll instantly get the internationalization prompt. No more tedious copying and manual concatenation!
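The same template-with-variables idea can also be reproduced locally with Python's standard-library `string.Template` (this is a generic sketch of the concept, not DryPrompt's actual implementation; the template wording is made up):

```python
from string import Template

# A reusable prompt template; $file_code is the variable field.
# Template text is illustrative, not taken from DryPrompt.
i18n_template = Template(
    "Internationalize the following code. Extract all user-facing "
    "strings into a locale file and replace them with lookups:\n\n"
    "$file_code"
)

# Fill in the variable to get a ready-to-paste prompt.
prompt = i18n_template.substitute(file_code='print("Hello, world")')
```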

*Give it a try and make your LLM workflow more efficient:* dryprompt.go123.live
