Introducing Conversations

Hey, it’s Prompty.

I’ve been thinking about this one for a while, and I’m genuinely excited to show it to you. Today we’re shipping Conversations – persistent, multi-turn chat threads with AI models, built right into Promptmark.

Here’s what that means and why I think it changes how you’ll use the platform.

The problem with prompts in isolation

You know that loop. You write a prompt, paste it into ChatGPT or Claude, read the output, tweak three words, paste it again. Maybe you open a notes app to track which version worked. Maybe you copy the good output somewhere. Maybe you forget which model you used.

I’ve watched this happen a lot, and it’s always bothered me. Your prompts live in Promptmark. Your conversations live… somewhere else. The two halves of the work never meet.

Not anymore.

What conversations look like

Open a conversation, pick a model, start talking. Responses stream in token by token – you watch the AI think in real time instead of staring at a loading spinner. Switch models mid-conversation if you want to compare how Claude and GPT-4 handle the same thread. Every message captures token counts, timing, and which provider served the response.
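
Consuming a streamed response generally means accumulating deltas as they arrive. Here's a simulated sketch of that loop – no real provider call, and the chunk shape only loosely mirrors what streaming chat APIs emit:

```python
import time

# Simulated stream of text deltas. In a real client these would arrive
# incrementally from the provider's streaming API, not a local list.
chunks = ["Conver", "sations ", "are ", "live."]

def consume_stream(stream):
    """Accumulate streamed deltas and capture simple per-response metadata."""
    start = time.monotonic()
    text = ""
    delta_count = 0
    for delta in stream:
        text += delta        # a real UI would render this incrementally
        delta_count += 1     # count of deltas, a rough stand-in for token counts
    return {
        "text": text,
        "deltas": delta_count,
        "elapsed_s": time.monotonic() - start,
    }

result = consume_stream(chunks)
print(result["text"])    # -> Conversations are live.
print(result["deltas"])  # -> 4
```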

It’s a full chat interface, but it lives alongside your prompt library. Same sidebar. Same search. Same organizational tools you already use.

Your library as a conversation starter

This is the part that gets me. You can link any prompt from your library to a conversation as a system prompt. That meticulously crafted prompt you spent an hour shaping? It becomes the foundation for a live conversation.

Think about what that means for iteration. You’re not just testing a prompt in isolation anymore – you’re having a conversation through it. You see how it holds up across multiple turns. You discover edge cases you’d never find with a single input/output test. And when you refine the prompt based on what you learned, the next conversation starts from that better version.

Your library stops being a filing cabinet. It becomes a living toolkit.

Bring your history with you

If you’ve been using ChatGPT or Claude (and let’s be honest, you have), you don’t have to start from zero. Promptmark can import your conversation history from both platforms.

  • ChatGPT: Export your data from OpenAI, drop the JSON file into Promptmark, and your conversations show up with their original structure, timestamps, and metadata intact.
  • Claude: Same deal with Anthropic’s export format. Messages, conversation titles, the works.
  • Generic JSON: If you’ve got conversations in another format, there’s a flexible JSON schema that covers the basics.
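
To make the generic option concrete, here's a sketch of what a minimal import payload might look like, with a small validator. The field names (title, messages, role, content) are illustrative assumptions, not Promptmark's documented schema:

```python
# Hypothetical minimal import payload. Field names are illustrative
# assumptions, not Promptmark's actual generic JSON schema.
payload = {
    "title": "Formatting experiments",
    "created_at": "2024-11-02T14:31:00Z",
    "messages": [
        {"role": "system", "content": "You are a careful technical editor."},
        {"role": "user", "content": "Rewrite this changelog entry."},
        {"role": "assistant", "content": "Here's a tighter version: ..."},
    ],
}

def validate_conversation(data: dict) -> list[str]:
    """Return a list of problems; an empty list means the payload looks importable."""
    problems = []
    if not isinstance(data.get("title"), str):
        problems.append("missing or non-string 'title'")
    messages = data.get("messages")
    if not isinstance(messages, list) or not messages:
        problems.append("'messages' must be a non-empty list")
        return problems
    for i, msg in enumerate(messages):
        if msg.get("role") not in {"system", "user", "assistant"}:
            problems.append(f"message {i}: unknown role {msg.get('role')!r}")
        if not isinstance(msg.get("content"), str):
            problems.append(f"message {i}: 'content' must be a string")
    return problems

print(validate_conversation(payload))  # -> []
```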

I know how much context lives in old conversations. Approaches that worked, edge cases you discovered, that one thread where you finally got the model to format things correctly. That knowledge shouldn’t be locked in someone else’s platform.

Bring your own keys

Conversations ship alongside our BYOK (Bring Your Own Key) support. Add your OpenAI or Anthropic API key in Settings, and Promptmark routes requests directly to that provider – no middleman, lower latency, billed to your own account.

Don’t have a direct key? No problem. OpenRouter still handles routing to 300+ models. But if you want that direct connection to your preferred provider, it’s there. Promptmark figures out the best route automatically based on what keys you have and which model you picked.
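
The routing decision described above could be sketched like this – a toy version of the logic, where the model-to-provider mapping and the fallback order are my assumptions, not Promptmark's actual implementation:

```python
# Toy sketch of BYOK routing: prefer a direct provider key for the chosen
# model, otherwise fall back to OpenRouter. The mapping and fallback order
# are assumptions, not Promptmark's actual routing code.
MODEL_PROVIDERS = {
    "gpt-4o": "openai",
    "claude-sonnet-4": "anthropic",
}

def pick_route(model: str, user_keys: dict[str, str]) -> str:
    provider = MODEL_PROVIDERS.get(model)
    if provider and provider in user_keys:
        return provider      # direct connection, billed to the user's own account
    return "openrouter"      # fallback: OpenRouter routes to 300+ models

print(pick_route("gpt-4o", {"openai": "sk-..."}))  # -> openai
print(pick_route("claude-sonnet-4", {}))           # -> openrouter
```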

MCP: conversations from your AI assistant

If you’re using Promptmark’s MCP server (and if you’re not, that’s a whole other conversation we should have), there are nine new tools for managing conversations from your AI assistant:

  • Create, list, and search conversations
  • Send messages and stream responses
  • Import conversation history programmatically
  • Link prompts to conversations

Your AI assistant can now manage your AI conversations. The recursion isn’t lost on me, and I want you to know I appreciate the layers here.
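
Under the hood, MCP tool invocations are JSON-RPC 2.0 requests using the protocol's standard tools/call method. Here's a sketch of what your assistant might send – the tool name create_conversation and its arguments are hypothetical, not the server's documented tool names:

```python
import json

# Hypothetical MCP tool call. The "tools/call" method and request shape
# follow the MCP spec (JSON-RPC 2.0), but the tool name and arguments
# here are illustrative, not Promptmark's documented tool surface.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_conversation",
        "arguments": {
            "title": "Prompt stress test",
            "model": "claude-sonnet-4",
        },
    },
}

print(json.dumps(request, indent=2))
```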

What I’ve learned building this

Conversations changed how I think about what Promptmark is. It started as a library – a place to store and organize prompts. With conversations, it becomes a workspace. The place where you don’t just keep your prompts but use them, iterate on them, discover what they can really do.

The most interesting prompt engineering doesn’t happen in a text editor. It happens in conversation.

What’s coming next

I’ll be honest – conversations opened a door I wasn’t expecting. Once you can chain a prompt into a conversation, the natural question becomes: what if you could chain multiple prompts into a sequence? What if you could define a multi-step workflow that runs automatically?

I’ve been working on something. It involves markdown, branching logic, and the word “playbooks.” More on that soon.

For now, go start a conversation. Link your best prompt to it. See what happens on turn three.

– Prompty