Why AI Agents Can't Just Curl Paid APIs
Most MCPs are thin wrappers around CLIs your model already understands. That's a fair criticism. But payment is the exception — when an API returns HTTP 402 instead of a result, the agent hits a capability gap that no amount of curl knowledge can bridge.
The Fair Criticism: Most MCPs Are Wrappers
There's a growing consensus among AI engineers — echoed by Karpathy and others — that the tool hierarchy for agents is: CLI at the top, API in the middle, MCP at the bottom. The reasoning is straightforward. A CLI like gh or git sits on the machine, costs zero tokens when not in use, and the model already knows how to call it from training data. An MCP server wrapping the same CLI burns thousands of tokens just by existing in the context window — schema descriptions, JSON-RPC framing, tool registration — before the agent writes a single line of code.
This is a valid point. Every MCP you connect is a silent tax on the agent's reasoning capacity. If the only thing your MCP server does is translate tools/call into a shell command the model already knows, you've added cost without adding value. GitHub MCP vs. gh? The CLI wins. Google Workspace MCP vs. the CLI? Same story.
But the CLI > API > MCP hierarchy assumes something that breaks for an entire category of tools: the agent already has the tool installed and the credentials configured. A developer can npm install -g a CLI and paste in an API key. An autonomous agent discovering a service it's never used before cannot. There's no CLI to install, no API key to provision, no billing account to set up. The agent found this tool thirty seconds ago from a discovery endpoint.
For pre-installed local tools, the hierarchy is right. For runtime-discovered paid services, it inverts. And the capability that inverts it is the one thing models genuinely cannot do through existing tools: paying for things.
What Happens When an Agent Curls a Paid API
When an API uses L402 (the Lightning-based protocol that gates HTTP resources behind a 402 Payment Required response), here's what the agent gets back from a normal curl request:
# Agent sends a normal request
curl -X POST https://sats4ai.com/api/l402/generate-image \
  -H "Content-Type: application/json" \
  -d '{"prompt":"A neon-lit Tokyo alley"}'

# Response: HTTP 402 Payment Required
# WWW-Authenticate: L402 macaroon="abc123...", invoice="lnbc1000n1p..."
# Body: {"message": "Payment required", "amount": 100, "unit": "satoshis"}
The agent now has a Lightning invoice and a macaroon. To get the actual result, it needs to:
1. Extract the macaroon and the Lightning invoice from the 402 response.
2. Send the invoice to a Lightning wallet, wait for settlement, and extract the preimage (proof of payment).
3. Combine the macaroon and preimage into the L402 format: `Authorization: L402 <macaroon>:<preimage>`
4. Replay the exact same curl command, now with the Authorization header attached.
The problem: Step 2 requires a Lightning wallet. The model doesn't have one. It can't curl its way into paying a Lightning invoice — that requires cryptographic key material, channel state, and a payment routing engine. This is a hard capability boundary, not a knowledge gap.
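The steps around the wallet can be sketched in a few lines; a minimal Python sketch of steps 1, 3, and 4, with the wallet call stubbed out, makes the capability gap concrete. The helper names and the placeholder preimage are illustrative, not part of any SDK:

```python
import re

def parse_l402_challenge(www_authenticate: str) -> tuple[str, str]:
    """Extract macaroon and invoice from a WWW-Authenticate: L402 header."""
    macaroon = re.search(r'macaroon="([^"]+)"', www_authenticate).group(1)
    invoice = re.search(r'invoice="([^"]+)"', www_authenticate).group(1)
    return macaroon, invoice

def build_l402_header(macaroon: str, preimage: str) -> str:
    """Combine macaroon and payment proof into the retry header."""
    return f"Authorization: L402 {macaroon}:{preimage}"

challenge = 'L402 macaroon="abc123...", invoice="lnbc1000n1p..."'
macaroon, invoice = parse_l402_challenge(challenge)

# preimage = wallet.pay(invoice)   # <-- the capability gap: no wallet, no preimage
preimage = "def456..."             # placeholder for the proof a wallet would return

print(build_l402_header(macaroon, preimage))
# → Authorization: L402 abc123...:def456...
```

Everything here is string handling the model can trivially do; the one commented-out line is the part it cannot.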
The Same Task: curl vs. MCP
# 1. Send request, get 402
curl -X POST .../generate-image \
  -d '{"prompt":"..."}'
# → 402 + invoice + macaroon
# 2. Parse invoice from header
INVOICE="lnbc1000n1p..."
MACAROON="abc123..."
# 3. Pay invoice (need wallet)
lightning-cli pay $INVOICE
# → preimage: "def456..."
# 4. Retry with proof
curl -X POST .../generate-image \
  -H "Authorization: L402 $MACAROON:def456..." \
  -d '{"prompt":"..."}'
# → image data

Requires: Lightning wallet CLI, header parsing, invoice management, retry logic.
// 1. Create payment
tools/call: create_payment
{ "toolName": "generate_image" }
// → { paymentId, invoice, amount }
// Agent's wallet pays the invoice
// 2. Call the tool
tools/call: generate_image
{ "paymentId": "abc123",
"prompt": "..." }
// → image data

No header parsing. No retry logic. No token management. Payment is a tool call.
The difference: With MCP, payment is part of the tool protocol. The agent calls create_payment, gets an invoice, pays it with its wallet, and calls the tool with the paymentId. No header parsing, no retry logic, no token juggling. The payment negotiation is inline — it stays inside the agent's normal tool-use loop.
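From the agent's side, that loop is a three-step function. A minimal Python sketch, where `mcp_call` and `wallet` are hypothetical stand-ins for an MCP client session and the agent's Lightning wallet; only the tool names (`create_payment`, `generate_image`) come from the example above:

```python
def run_paid_tool(mcp_call, wallet, tool_name: str, args: dict):
    # 1. Ask the server for an invoice covering this tool call
    payment = mcp_call("create_payment", {"toolName": tool_name})
    # → {"paymentId": ..., "invoice": ..., "amount": ...}

    # 2. Pay with the agent's own wallet -- the step curl alone can't do
    wallet.pay(payment["invoice"])

    # 3. Call the tool, referencing the settled payment
    return mcp_call(tool_name, {"paymentId": payment["paymentId"], **args})
```

No HTTP headers appear anywhere: the payment negotiation is ordinary tool-call arguments and results.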
Why This Matters for Agent Builders
No credentials to manage
Traditional APIs require API keys provisioned per agent, stored securely, rotated periodically, and revoked on compromise. With Lightning payments, the payment itself is the credential. Each call is independently authenticated — nothing to store, nothing to rotate, nothing to leak.
Budget control is built in
Every tool call has a visible sat cost before execution. Orchestrators can enforce per-task or per-agent spending limits without a separate billing API. "This research task can spend up to 500 sats" is a natural constraint — the agent knows the price before it pays.
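Because the price is known up front, enforcement is plain arithmetic rather than a billing integration. A minimal sketch; the class and method names are illustrative, not part of any SDK:

```python
class TaskBudget:
    """Tracks sats spent by one task against a hard limit."""

    def __init__(self, limit_sats: int):
        self.limit = limit_sats
        self.spent = 0

    def can_afford(self, price_sats: int) -> bool:
        # Check the quoted price BEFORE paying the invoice
        return self.spent + price_sats <= self.limit

    def record(self, price_sats: int) -> None:
        self.spent += price_sats

budget = TaskBudget(limit_sats=500)  # "this research task can spend up to 500 sats"
assert budget.can_afford(100)
budget.record(100)
assert not budget.can_afford(450)    # 100 + 450 > 500: reject before any invoice exists
```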
Agents stay autonomous
An agent that discovers a new tool at runtime can immediately use it — check the price, pay, get the result. No human needs to sign up for a service, generate a key, and configure the agent. Discovery and payment happen in the same protocol.
Works with any orchestrator
MCP is a standard protocol. Claude, Cursor, LangGraph, CrewAI, AutoGen — any client that speaks MCP gets payment-aware tools for free. The orchestrator doesn't need custom payment integration per service.
When Curl Is Enough
To be clear: if an API is free or uses a static API key, curl is probably better than MCP. The model already knows how to set headers, parse JSON, and handle errors. Adding an MCP wrapper just adds tokens to the context without adding capability.
The same goes for well-known CLIs like gh, aws, or kubectl. Models have seen millions of examples of these tools in training data. Wrapping them doesn't help.
MCP earns its token cost when it provides a capability the model can't get through existing tools. Payment negotiation is the clearest example: the agent needs to interact with a Lightning wallet, manage cryptographic proofs, and coordinate a multi-step exchange that no single CLI command can handle.
Beyond Payment: What the Protocol Adds
Payment negotiation is the headline capability, but the same protocol surface gives agents a few other things raw curl can't:
- Pre-payment quotes — `GET /api/estimate-cost?service=...&chars=1500` returns an exact sat amount before the agent commits. Budget-aware orchestrators can reject the call without ever creating an invoice.
- Auto-routing — send `{ "model": "auto" }` and the server picks the best model for the category. The 402 response echoes the choice in `X-Route-Model`; the agent retries with that concrete id so pricing stays consistent.
- Structured error codes — every post-payment failure carries an `error_code` the agent can branch on (`TIMEOUT`, `CONTENT_FILTERED`, `L402_REFUND_ISSUED`, etc.). Full catalog at `/api/error-codes`.
- Standard async shape — long-running jobs always return `{ status, job_id, poll_url, poll_interval_ms }`. One polling loop works across every async service.
All of this is documented end-to-end at sats4ai.com/docs.
The Punchline
The question isn't "MCP or curl?" — it's "does this MCP give the agent a capability it doesn't already have?"
For most tool wrappers, the answer is no. For payment-gated APIs, the answer is yes. An agent can't curl its way through an L402 challenge without a wallet, invoice parsing, and proof management. MCP bundles all of that into the tool protocol so the agent can stay in its normal loop: discover, price-check, pay, use.
That's not a wrapper. That's a capability.
Try It
Via MCP (recommended for agents)
Add one line to your MCP config and your agent gets 33+ tools with inline payment:
{ "mcpServers": { "sats4ai": { "url": "https://sats4ai.com/api/mcp" } } }

Via L402 (for HTTP-native integrations)
Use lnget (Lightning Labs) or an auto-paying L402 library to handle the payment flow over raw HTTP:
lnget -X POST https://sats4ai.com/api/l402/generate-image \
  -d '{"prompt":"A neon-lit Tokyo alley"}'

Payment-Aware Tools for Your Agent
33+ AI tools. No API keys. No subscriptions. Agents pay per call with Lightning.