ENGINEERING 2026.04.15

Publishing From Your AI Editor

Claude Code to Bluesky, Mastodon, and LinkedIn — in one prompt. How Blurt ships a native MCP server and why AI editors belong as a first-class input, not an afterthought.

Tags: MCP · Claude Code · Architecture

I've been drafting social posts inside Claude Code for months. The workflow, until last week, went like this: draft with Claude, copy the text out, open a browser tab for each platform, paste, and post. Repeat for every network.

The drafting was magical. The publishing was seven browser tabs and a stopwatch. That seam — where the AI stops and the boring mechanical part begins — was exactly the part I didn't want to keep doing.

So I closed it. Blurt's v0.2.0 release ships a native Model Context Protocol server. Claude Code can now publish directly to every platform Blurt supports, in a single prompt, without leaving the editor.


What is MCP, briefly

MCP is how AI editors talk to real systems. Think of an MCP server as a small program that exposes two things to the model: tools (things the model can do — create a post, query a database, call an API) and resources (read-only context it can pull into the prompt — the current queue, platform config, your open issues).

Your editor manages the handshake, the model picks which tool to call, the MCP server does the real work. It's RPC with a schema, basically — but good enough that Claude, Cursor, and Windsurf all speak it natively now.
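Concretely, a tool invocation is just a JSON-RPC message over that transport. Sketched from the MCP spec — the argument names here are illustrative, not necessarily Blurt's exact schema — a call to create-post looks roughly like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "create-post",
    "arguments": {
      "content": "Blurt v0.2.0 ships MCP",
      "platforms": ["bluesky"]
    }
  }
}
```

The model never sees HTTP or auth headers; it picks a tool name and fills in arguments, and the server does the rest.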

Setup: one JSON file

Blurt ships a ready-to-use .mcp.json at the repo root:

{
  "mcpServers": {
    "blurt": {
      "command": "bundle",
      "args": ["exec", "mcp/bin/blurt-mcp"],
      "env": {
        "BLURT_API_URL": "http://localhost:3000",
        "BLURT_API_KEY": "${BLURT_API_KEY}"
      }
    }
  }
}

Export your API key, run claude from the repo root, and the editor launches the MCP server over stdio. No login flow, no OAuth, no subscription. Your key, your server, your data.

The demo

This is the exact prompt I used to publish this very post's announcement:

> draft a post saying "Blurt v0.2.0 ships MCP — publish
  from Claude Code in one prompt" and queue it for
  bluesky, mastodon, and linkedin at 1pm UTC tomorrow

Claude picked up the intent, called create-post with my content plus scheduled_at: "2026-04-16T13:00:00Z", and came back with:

Post created: 20260416-blurt-v020-ships-mcp.md
  Platforms: bluesky, mastodon, linkedin
  Status:    queue
  Scheduled: 2026-04-16T13:00:00Z

The background scheduler picked it up at 13:00, posted to all three platforms, and moved the file from queue/ to sent/, writing the permalinks into the frontmatter. No browser tabs. No copy-paste. No open-in-new-window.
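After the run, the file in sent/ carries its own receipt. A sketch of what that frontmatter might look like — field names and URLs are illustrative placeholders, not Blurt's exact schema:

```yaml
---
platforms: [bluesky, mastodon, linkedin]
scheduled_at: 2026-04-16T13:00:00Z
status: sent
permalinks:
  bluesky: https://bsky.app/profile/...      # placeholder
  mastodon: https://mastodon.example/@.../...# placeholder
---
```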

Seven tools, two resources

v0.2.0 exposes the whole Blurt API surface over MCP: seven tools covering the full post lifecycle, create-post among them.

Plus two resources: blurt://queue (live JSON of pending posts) and blurt://platforms (configured integrations). The editor can pull these into context automatically — "am I already scheduled to post tomorrow?" becomes a question Claude can answer without being asked to check.
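Fetching a resource is the same JSON-RPC shape as a tool call; per the MCP spec, a client pulling the queue sends something like:

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "resources/read",
  "params": { "uri": "blurt://queue" }
}
```

The server replies with the resource contents, which the editor can splice straight into the model's context.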

Architecture: CLI, MCP, HTTP API — same plumbing

The thing I'm proudest of isn't the MCP server. It's that shipping it added almost no code.

AI editor  ---(MCP)--->  blurt-mcp  ---(HTTP)--->  Blurt API
                                                       ^
 blurt CLI  ----------------HTTP-----------------------|
                                                       ^
 drop file  ----------filesystem watcher---------------|

All three input methods converge on the same HTTP API. The blurt-mcp gem is ~300 lines of Ruby that wraps the same Faraday client the CLI uses. No duplicated endpoint logic, no parallel auth story, no "oh and MCP also needs its own validation layer."
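That shared-plumbing idea is easy to sketch. The real gem wraps a Faraday client; here is the same shape using only stdlib Net::HTTP, with every name (BlurtClient, the /posts path, the payload fields) illustrative rather than Blurt's actual API:

```ruby
require "net/http"
require "json"
require "uri"

# One HTTP wrapper that the CLI, the MCP server, and any future
# transport all call into. Endpoint logic and auth live here, once.
class BlurtClient
  def initialize(base_url:, api_key:)
    @base = URI(base_url)
    @api_key = api_key
  end

  # Build the POST /posts request. Callers (CLI command, MCP tool
  # handler) only decide *what* to post, never how the wire works.
  def build_create_post(content:, platforms:, scheduled_at: nil)
    req = Net::HTTP::Post.new(URI.join(@base.to_s, "/posts"))
    req["Authorization"] = "Bearer #{@api_key}"
    req["Content-Type"]  = "application/json"
    req.body = JSON.generate(
      content: content, platforms: platforms, scheduled_at: scheduled_at
    )
    req
  end
end
```

An MCP tool handler then becomes a few lines: parse the model's arguments, call the client, return the response text. That is why the gem stays at ~300 lines.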

Why does that matter? Because a year from now, when Anthropic ships some new MCP feature or a new editor shows up, Blurt will support it without me touching the Rails side. The HTTP API is the stable contract; the rest is just transports.

Remote MCP, no SSH tunnels

The other thing v0.2.0 ships is a streamable HTTP transport. Stdio is fine for local dev, but if your Blurt server runs on a VPS and your Claude Code is on a laptop, you want HTTP.

BLURT_API_URL=http://localhost:3000 \
BLURT_API_KEY=your-key \
bundle exec mcp/bin/blurt-mcp-http

# Listening on http://0.0.0.0:3333/mcp (puma)

Put nginx or Caddy in front, terminate TLS, and your client config becomes a one-liner:

{
  "mcpServers": {
    "blurt": { "url": "https://blurt.your-vps.com/mcp" }
  }
}

Same tools, same resources, same auth. Different transport.
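For the reverse-proxy step, a minimal Caddy sketch — assuming the transport is listening on 3333 as above, and blurt.your-vps.com stands in for your hostname:

```
blurt.your-vps.com {
    reverse_proxy localhost:3333
}
```

Caddy provisions the TLS certificate automatically, which is what makes the client config a one-liner.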

Why MCP is a first-class input

Most publishing tools treat AI as a nice-to-have: maybe a "generate with AI" button on the compose page, maybe a Zapier integration. Bolt-ons.

Blurt treats MCP as a peer to the filesystem and the CLI. Drop a file, run a command, or ask Claude — same data, same behavior, same guarantees. The one I use the most, by far, is the third. Not because it's cool, but because it collapses the drafting-and-publishing gap into a single step.

That's the bit that feels different. Not "AI can help you publish." AI is the input.


Try it

v0.2.0 is out today. It's open source, self-hosted, and free. Clone the repo, point it at your platform credentials, and publish from your editor.

If you try it, tell me how it breaks. I'm on Bluesky and Mastodon. This post itself was published via Blurt — from Claude Code — over MCP.
