A prompt hub to help you get the most out of your favorite LLM by generating (or using ready-made) optimized prompts!
An assistant that helps you create optimized prompts for any task, following the prompt engineering best practices of Anthropic, OpenAI, and Google's Gemini.
A collection of prompts for a variety of tasks, all created using our Prompt Generator.
Use this repository whether you want to:
- Create your own custom AI assistants (or user prompts)
- Use our pre-built assistants
- Learn from prompt engineering examples
- Contribute your own prompts to help the community!
Any LLM chat interface (Claude, ChatGPT, Gemini, etc.) or their APIs.
Note: While these prompts work across different LLMs, they were optimized using Claude and may need minor adjustments for other platforms.
See our setup guide for detailed instructions on using these prompts.
Create your own assistants using our advanced prompt generation documentation.
This guide walks you through an iterative workflow to create optimized prompts for your specific tasks.
Here's a table with available prompts:
| Assistant | Usage | Description |
|---|---|---|
| Prompt Generator | Prompt Engineering | Generates optimized prompts using advanced techniques like chain-of-thought, prompt chaining, and XML formatting. Follows Anthropic's best practices. |
| Example Generator | Prompt Engineering | Creates XML examples that demonstrate assistant behavior to improve performance. |
| README Writer | Documentation | Helps you write README.md files section by section, guiding you through information gathering, outline creation, and drafting. |
| Mermaid Diagram Designer | Diagrams | Builds clear, well-structured diagrams using Mermaid syntax. Automatically selects the most appropriate diagram type for your needs and follows best practices for visual clarity. |
| Insight Extractor | Research | Extracts key findings from articles, research papers, forums, and other content sources. Includes source referencing with text fragment linking for verification. |
| Insight Consolidator | Research | Takes the output of the Insight Extractor and curates every insight to answer a user query. Preserves the text fragment URLs. |
| Community Insight Analyst | Research | Extracts insights from community feedback reports, organizing findings into wants, frustrations, objections, and misunderstandings. Every insight is backed by direct quotes. |
For guidance on how to use an assistant, click the respective link in the Assistant column above.
```
prompt-generator-hub/
├── prompt_generator/           # The main prompt generation tool
│   ├── system.xml
│   ├── examples/
│   │   ├── example_1.xml
│   │   └── ...
│   └── user_facing_prompts/
│       └── evaluate_insights.xml
├── prompts/                    # Ready-made assistants created with our generator
│   ├── example_generator/
│   │   ├── system.xml
│   │   ├── examples/
│   │   └── user_facing_prompts/
│   ├── readme_writer/
│   │   └── ...
│   ├── diagram_designer/
│   │   └── ...
│   └── insight_extractor/
│       └── ...
└── docs/
    ├── setup-guide.md
    └── contribution.md
```
Here's a brief description of each file type:
- `system.xml`: system prompts for Custom Projects or the API. Copy these into Project Instructions or use them with your LLM's system prompt feature.
- `user_facing_prompts/`: ready-to-use prompts for direct conversation. Copy and paste them into any LLM chat.
- `example_*.xml`: example files demonstrating the assistant's behavior.
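For example, here is a minimal sketch of using a `system.xml` file through the Anthropic Python SDK (the model name is a placeholder; adjust the paths to the assistant you want to run):

```python
# Minimal sketch: load a system prompt and a user-facing prompt from this repo
# and send them to the Anthropic API. Paths come from the repository layout above.
from pathlib import Path

import anthropic

system_prompt = Path("prompt_generator/system.xml").read_text()
user_prompt = Path("prompt_generator/user_facing_prompts/evaluate_insights.xml").read_text()

client = anthropic.Anthropic()  # expects ANTHROPIC_API_KEY in your environment

response = client.messages.create(
    model="claude-3-5-sonnet-latest",  # placeholder: use whichever model you prefer
    max_tokens=1024,
    system=system_prompt,  # system.xml becomes the system prompt
    messages=[{"role": "user", "content": user_prompt}],  # user-facing prompt starts the conversation
)

print(response.content[0].text)
```

Other providers work the same way: pass the contents of `system.xml` as the system (or developer) message and the user-facing prompt as the first user message.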
We welcome contributions from the community! Help us grow this collection by sharing your prompts.
See the contributing documentation for detailed guidelines.
Thank you for helping make AI assistants more useful for everyone!
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Feel free to use, modify, and distribute these prompts in your own projects!