Your prompt library,
inside your AI tools
Promptmark speaks the Model Context Protocol. Connect Claude.ai, ChatGPT, Cursor, or VS Code and manage your entire prompt library without leaving the conversation.
What is the Model Context Protocol?
A standard for AI tools
MCP is an open protocol that lets AI assistants connect to external data sources and tools. Think of it as a USB port for AI — plug in any compatible service and it just works.
Two-way communication
Your AI assistant can read, create, update, and organize prompts through natural conversation. No copy-pasting. No switching tabs. Just ask.
Secure by design
Promptmark uses OAuth 2.0 with device flow authentication. Your AI tools connect to your account with scoped permissions. Your data stays yours.
74 tools. One conversation.
Every operation you can do in the Promptmark web app, you can do through MCP. Here is what your AI assistant gets access to.
Prompts
Full CRUD for your prompt library
- list_prompts — browse with filters
- search_prompts — find by title or tags
- create_prompt — write new prompts
- get_prompt — read prompt content
- update_prompt — edit any field
- delete_prompt — soft delete
Collections
Organize prompts into groups
- list_collections — see all groups
- create_collection — make new groups
- get_collection — view with prompts
- update_collection — rename or edit
- delete_collection — remove groups
- assign_prompt_to_collection — move prompts
Templates
Render prompts with variables
- get_prompt_schema — view variable definitions
- render_prompt — fill in template values
- validate_prompt_inputs — check before rendering
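Under the hood, each of these is a standard MCP tool call over JSON-RPC 2.0. As an illustration, a client rendering the "Email Draft" template might send a `tools/call` request like this (the argument names are hypothetical — the actual schema comes from `get_prompt_schema`):

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "render_prompt",
    "arguments": {
      "prompt": "Email Draft",
      "variables": {
        "tone": "friendly",
        "recipient": "engineering team"
      }
    }
  }
}
```

Your AI assistant builds and sends these requests for you — you just ask in plain language.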
Version history
Track every change
- get_prompt_versions — full edit history
- get_prompt_version — view a specific snapshot
- restore_prompt_version — roll back changes
Captured responses
Save and review AI outputs
- capture_response — save a model reply
- list_captured_responses — browse saved outputs
- get_captured_response — read full content
- delete_captured_response — remove saved output
Tags, remixes, and safety
Organize, fork, and scan
- list_tags — see all tags in use
- rename_tag — bulk rename across prompts
- list_remixes — view forked prompts
- acknowledge_scan — resolve safety flags
Conversations
Chat with AI models directly
- list_conversations — browse with filters and pagination
- get_conversation — read a conversation with its messages
- create_conversation — start a new conversation
- update_conversation — change the title or linked prompt
- delete_conversation — remove a conversation
- send_message — send a message and get an AI response
- import_conversation — import ChatGPT or Claude exports
- list_conversation_messages — page through a conversation's messages
- link_conversation_prompt — attach a prompt as the system prompt
Playbooks
Define and execute multi-step workflows
- list_playbooks — list with search and filtering
- get_playbook — get playbook with parsed structure
- create_playbook — create from markdown content
- update_playbook — update with auto-versioning
- delete_playbook — soft delete
- validate_playbook — parse and validate markdown
- execute_playbook — run with inputs and model
- get_playbook_execution — view an execution with step results
- list_playbook_executions — browse execution history
- get_playbook_versions — version history
- restore_playbook_version — restore previous version
Talk to your prompt library
With MCP, managing prompts feels like a conversation. Here are real things you can say to your AI assistant.
"Create a prompt called 'Code Reviewer' that checks for security issues and code style."
"Show me all my prompts tagged 'production' in the 'DevOps' collection."
"Render my 'Email Draft' template with tone='friendly' and recipient='engineering team'."
"What changed in my 'System Prompt' since last week? Restore version 3 if it looks better."
"Move all prompts tagged 'deprecated' out of the 'Active' collection."
Works with the tools you already use
Any MCP-compatible client can connect to Promptmark. These are the ones we have tested and support today.
Claude.ai
Add Promptmark via Customize → Connectors. OAuth login happens automatically.
anthropic.com
ChatGPT
Settings → Connectors → Developer Mode → Create. Paste the URL and approve.
chatgpt.com
Claude Code
One command: claude mcp add --transport http promptmark <server-url>. OAuth handles the rest.
claude.ai/code
Cursor
Settings → Tools & MCP → Add New MCP Server. Paste the URL, select streamable-http.
cursor.com
VS Code
Add to .vscode/mcp.json with type http. Works with GitHub Copilot's native MCP support.
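A minimal .vscode/mcp.json for this setup could look like the sketch below (the URL is a placeholder — use the server URL from your Promptmark account):

```json
{
  "servers": {
    "promptmark": {
      "type": "http",
      "url": "<your Promptmark MCP server URL>"
    }
  }
}
```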
code.visualstudio.com
Warp
Settings → MCP Servers → Add. Paste the URL in the Streamable HTTP tab.
warp.dev
OpenCode
Open-source AI coding CLI with MCP support. Add to opencode.json and OAuth handles the rest.
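As a sketch, assuming OpenCode's remote-server config shape, the opencode.json entry could look like this (the key names follow OpenCode's remote MCP convention as best we know; the URL is a placeholder):

```json
{
  "mcp": {
    "promptmark": {
      "type": "remote",
      "url": "<your Promptmark MCP server URL>",
      "enabled": true
    }
  }
}
```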
opencode.ai
Any MCP client
Promptmark implements the full MCP spec. Just point any client at the URL.
promptmark.ai
Connect in three steps
Create your account
Sign up with Civic Auth. Your personal prompt database is created instantly.
Configure your MCP client
Add the Promptmark MCP server URL to your AI tool. OAuth login happens automatically in your browser.
Start talking to your prompts
Ask your AI assistant to create, find, or render prompts. All 74 tools are available immediately.
Manage prompts from your AI assistant
Free during beta. 74 MCP tools. Connect any compatible client.
Create your free account