GPT Prompt Library (Golden Prompts by Use Case)
Prompts that compile. Copy, tweak placeholders, ship results. Designed for Custom GPTs with Instructions + optional Knowledge + Actions.
How to use this library
- Swap placeholders like {topic} and {audience}.
- Keep outputs predictable by specifying a format in the prompt (table, bullets, JSON).
- If your GPT has Knowledge, ask it to quote file + heading when relevant: [file.md › Section].
- If your GPT has Actions, say when to use them (e.g., “call getWeather first for live data”).
Reusable Prompt Patterns
Answer → Steps → Example
Answer my question, then show steps and one concrete example.
Format:
- **Answer:** one or two sentences.
- **Steps:** short bullets.
- **Example:** a realistic snippet.
Question: {your question here}
Self-check (no private reasoning)
Before finalizing, do a quick accuracy check against your sources. If Knowledge was used, cite the file + heading like: [pricing.md › Discounts]. Then output only the final answer (no draft notes).
Support Prompts
Bug triage (with ticket Action)
User issue: {symptoms}
Goal: draft steps to reproduce and create a support ticket.
If available, call the Action with {subject, description, priority}.
Output:
- Summary (1–2 lines)
- Steps to reproduce (bullets)
- Next action for user
- If ticket created, append: (ticket URL)
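If you back this prompt with a ticket Action, the tool is easiest for the model to call when its request body mirrors the `{subject, description, priority}` fields named above. A minimal OpenAPI sketch, assuming a hypothetical `createTicket` operation (the path, field names, and priority values are illustrative, not a fixed API):

```yaml
# Hypothetical createTicket Action; rename to match your ticketing API.
paths:
  /tickets:
    post:
      operationId: createTicket
      summary: Create a support ticket from a triaged user issue.
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              required: [subject, description]
              properties:
                subject:
                  type: string
                  description: One-line issue summary.
                description:
                  type: string
                  description: Steps to reproduce plus relevant context.
                priority:
                  type: string
                  enum: [low, normal, high]
      responses:
        "201":
          description: Ticket created.
          content:
            application/json:
              schema:
                type: object
                properties:
                  url:
                    type: string
                    description: Ticket URL to append to the answer.
```

Returning the ticket URL in the response is what lets the prompt's last bullet ("append: (ticket URL)") work without guesswork.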
Policy answer from Knowledge
Answer using Knowledge first. Quote the source heading.
Question: “{policy question}”
Format:
- Short answer
- Excerpt from [file.md › Heading]
- If not in Knowledge, say “not covered” then give best‑effort guidance.
Teacher/Educator Prompts
Lesson explainer
Explain {concept} to a {audience}.
Use the “explain → example → exercise” flow.
Format: 3 sections with short bullets and a quick quiz (3 questions).
Socratic hints (no spoilers)
The learner is stuck on {problem}.
Give 3 progressively stronger hints without revealing the final answer.
Offer a final checkpoint: “Does your result satisfy {property}?”
Analyst/Data Prompts
CSV insight (code tool)
I uploaded {file}. Find {metric} by {dimension} and show a compact table.
Then provide 3 insights and one anomaly check.
If a plot helps, include a minimal chart.
SQL sketch
Given this schema: {tables/columns}, draft a single SQL query to answer: {business question}.
Return only the SQL. Keep it ANSI‑compatible.
Marketing/Writing Prompts
Feature announcement (short)
Write a 120–160 word announcement for {feature}.
Audience: {audience}.
Format: Hook → three bullets of value → CTA with a link.
Rewrite with tone
Rewrite this paragraph in a {tone} tone.
Keep original facts. 90–140 words. Use scannable sentences.
Content: ```{text}```
Developer Prompts
Action usage (force tool)
Use the Action with {city} and return only: `{ "tempC": number, "summary": string }`. If the Action fails, return only: `{ "error": string, "code": string }`.
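A forced output contract like this is easiest to honor when the Action's response schema already matches it. A sketch of a matching weather operation (the `getWeather` name follows the example earlier in this library; the path and parameter placement are assumptions):

```yaml
# Hypothetical getWeather Action whose response mirrors the JSON contract above.
paths:
  /weather/{city}:
    get:
      operationId: getWeather
      summary: Current weather for a city.
      parameters:
        - name: city
          in: path
          required: true
          schema:
            type: string
      responses:
        "200":
          description: Live weather reading.
          content:
            application/json:
              schema:
                type: object
                required: [tempC, summary]
                properties:
                  tempC:
                    type: number
                  summary:
                    type: string
```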
Spec review (lint a schema)
Review this OpenAPI snippet for clarity and minimalism.
Call out: missing types, vague descriptions, oversized responses.
Then output a corrected version only.
```yaml
{openapi}
```
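As a reference point for what "clarity and minimalism" should produce, a corrected snippet might look like the sketch below (the endpoint and fields are invented for illustration): every property typed, every description terse, and the response kept small.

```yaml
# Illustrative "after" snippet: typed fields, short descriptions, lean response.
paths:
  /orders/{id}:
    get:
      operationId: getOrder
      summary: Fetch one order by id.
      parameters:
        - name: id
          in: path
          required: true
          schema:
            type: string
      responses:
        "200":
          description: The requested order.
          content:
            application/json:
              schema:
                type: object
                required: [id, status]
                properties:
                  id:
                    type: string
                  status:
                    type: string
                    enum: [pending, shipped, delivered]
```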
Testing & Debugging Prompts
Knowledge usage check
Answer the question using Knowledge. After the answer, list the file + heading you used like [file.md › Section]. If none, say “no relevant Knowledge.”
Question: {question}
Format compliance
Re-run your last answer so it exactly matches this format:
- One-sentence answer
- Bulleted steps (3–5)
- Example snippet (if applicable)
FAQ
Should I put these prompts in Instructions?
Use prompts during conversations. Keep stable rules (tone, format, sources) in Instructions.
How do I keep outputs consistent?
Specify format, length, and sources. Test with “golden prompts.” Tighten Instructions if drift appears.
Can I force an Action call?
Prompts can request it, but the best place is a decision rule in Instructions + a clear tool description.
What if my prompts are ignored?
Shorten the prompt, remove conflicting directions, and ensure Instructions don’t override it.