Full platform

Everything you need to manage AI prompts

From creation to deployment — organize, version, test, scan, share, and integrate your prompt library with one platform.

Your Library (New)

Create prompts in a full markdown editor or import what you already have. Eight prompt types, per-user database isolation, and remixing with attribution.

  • Full markdown editor with 8 prompt types: plain, markdown, XML, JSON, YAML, TOML, code, and chat
  • Import from ChatGPT exports, Claude transcripts, JSON, Markdown, or plain text files
  • Remix any public prompt with full attribution — or start from scratch in your own isolated database

Prompty

Your AI collaborator for prompt creation and playbook building. COMPOSE offers three creation modes — Blank, Guided, and AI Consultation. CONDUCT turns workflow descriptions into runnable playbooks with steps, branches, and delivery targets.

  • COMPOSE: three modes for prompt creation — blank, guided steps, or AI conversation
  • CONDUCT: describe a workflow in plain language, get a runnable playbook
  • Import from JSON, Markdown, plain text, or remix public prompts with attribution

Organize and Discover

Collections group related prompts together. Tags let you slice across collections. A command palette gets you anywhere in two keystrokes. Bulk operations handle the tedious work.

  • Collections and tags for project-based and cross-cutting organization
  • Full-text search across titles and content
  • Command palette and bulk operations for keyboard-driven workflow

Templates and Variables

Turn any prompt into a reusable template by adding variables with double-brace syntax. Define types, defaults, and constraints. Render with validated inputs from the web UI, MCP, or API.

  • Double-brace syntax with 5 variable types and defaults
  • Schema validation with type checking on every render
  • Render from the web UI, MCP, or REST API
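For illustration, here is a minimal sketch of how double-brace templating with typed, validated variables can work. The template text, schema fields, and `render` function are hypothetical assumptions for this example, not Promptmark's actual API:

```python
import re

# Hypothetical template using double-brace variables.
TEMPLATE = "Summarize {{doc_title}} in {{max_words}} words for a {{audience}} audience."

# Illustrative variable schema: a type, an optional default, and a simple constraint.
SCHEMA = {
    "doc_title": {"type": str},
    "max_words": {"type": int, "default": 100, "min": 10},
    "audience": {"type": str, "default": "general"},
}

def render(template: str, schema: dict, inputs: dict) -> str:
    """Validate inputs against the schema, then substitute {{name}} slots."""
    values = {}
    for name, spec in schema.items():
        value = inputs.get(name, spec.get("default"))
        if value is None:
            raise ValueError(f"missing required variable: {name}")
        if not isinstance(value, spec["type"]):
            raise TypeError(f"{name} must be {spec['type'].__name__}")
        if "min" in spec and value < spec["min"]:
            raise ValueError(f"{name} must be >= {spec['min']}")
        values[name] = value
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(values[m.group(1)]), template)

print(render(TEMPLATE, SCHEMA, {"doc_title": "Q3 Report", "max_words": 50}))
# -> Summarize Q3 Report in 50 words for a general audience.
```

The same validate-then-substitute step applies regardless of where the render request comes from: web UI, MCP tool call, or REST API.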

Conversations

Chat with AI models using your prompts as system instructions. Import conversations from Claude and ChatGPT. Link chats to prompts for a complete feedback loop between authoring and usage.

  • Multi-turn chat with streaming responses from 300+ models
  • Import from ChatGPT, Claude, JSON, or Markdown
  • Link conversations to prompts as live system instructions

Multi-Model Testing

Test any prompt against 300+ AI models. Stream responses in real time, compare results, and track token usage. Bring your own API key — Promptmark never touches your AI spend.

  • 300+ models from OpenAI, Anthropic, Google, Meta, and more
  • Streaming responses with token count and cost tracking
  • BYOK — your API key, your billing

Version Control and Backups

Every edit creates an automatic snapshot. View the full history of any prompt, compare versions with a diff view, and restore any previous state. Schedule remote backups to GitHub, S3, or Dropbox.

  • Automatic snapshots on every save
  • Side-by-side diff view and one-click restore
  • Scheduled remote backups to GitHub, S3, or Dropbox

Private and Secure by Default

Your data lives in its own isolated SQLite database — not a shared table in a multi-tenant system. Safety scanning checks for PII, injection, secrets, and moderation issues before you publish.

  • Per-user SQLite database — true data isolation
  • 4-layer safety scanning: PII, injection, secrets, moderation
  • Private by default — you choose what becomes public

Share and Collaborate

Publish prompts to your public profile. Share via direct links, QR codes, or oEmbed. Others can remix your work while preserving attribution.

  • Public profile at promptmark.ai/u/yourname
  • Auto-generated OG images and oEmbed for rich previews everywhere
  • Remix with full attribution to the original author

Playbooks (New)

Define multi-step AI workflows in markdown. Branch on conditions, capture outputs between steps, and pause for human input. Execute against any model with streaming results.

  • Multi-step AI workflows with branching, variables, and artifacts
  • Human-in-the-loop breakpoints and trigger URLs for automation
  • Delivery to webhook, email, GitHub, or Slack
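To make the execution model concrete, here is a conceptual Python sketch of what a playbook engine does: run steps in order, capture each step's output for reuse in later steps, pause at a human-in-the-loop breakpoint, and branch by step name on a condition. The step fields and function names are illustrative assumptions, not Promptmark's playbook format:

```python
def run_playbook(steps, model_call, ask_human):
    """Run steps in order, capturing each output under the step's name."""
    outputs = {}
    i = 0
    while i < len(steps):
        step = steps[i]
        prompt = step["prompt"].format(**outputs)   # reuse captured outputs
        if step.get("pause"):                       # human-in-the-loop breakpoint
            outputs[step["name"]] = ask_human(prompt)
        else:
            outputs[step["name"]] = model_call(prompt)
        branch = step.get("branch")
        if branch and branch["when"](outputs):      # conditional jump by step name
            i = next(j for j, s in enumerate(steps) if s["name"] == branch["goto"])
        else:
            i += 1
    return outputs

steps = [
    {"name": "draft", "prompt": "Write a short product blurb."},
    {"name": "review", "prompt": "Approve this draft? {draft}", "pause": True},
    {"name": "publish", "prompt": "Finalize: {draft}"},
]
result = run_playbook(
    steps,
    model_call=lambda p: f"[model output for: {p}]",  # stand-in for a real model
    ask_human=lambda p: "approved",                   # stand-in for a human reply
)
```

In a real run, the final step's output would then go to a delivery target such as a webhook, email, GitHub, or Slack.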

MCP and API Integration

Promptmark is MCP-native. Connect any MCP-compatible AI client and manage your entire prompt library through conversation. 74 MCP tools with OAuth 2.0 authentication.

  • 74 MCP tools for full prompt library management
  • Works with Claude Desktop, Cursor, Windsurf, VS Code, and Zed
  • OAuth 2.0 with device flow — connect in 30 seconds

Connections (New)

Promptmark doesn't just expose MCP tools — it consumes them. Connect to external MCP servers and use their tools directly in conversations and playbook steps.

  • Connect to any MCP-compatible server with automatic tool discovery
  • Use external tools in conversations and playbook steps
  • Browse, favorite, and test tools with streaming results

Pricing

Free (current)

Everything included during beta

  • Unlimited prompts and collections
  • COMPOSE wizard with all modes
  • Multi-model testing (BYOK)
  • Safety scanning on publish
  • Version history and restore
  • Remote backups
  • 74 MCP tools for AI agent integration
  • Public profile and sharing
Create your vault — free
Pro (coming soon)

For power users and teams

Paid plans are on the roadmap. When they arrive, expect priority support, higher rate limits, team collaboration features, and advanced analytics. Your beta data carries forward — nothing will be lost.

Join the waitlist — coming soon

Start managing your prompts today

No credit card. No time limit. Every feature included during beta.

Create your prompt vault — free