
Why the Hell Don’t You Have an API?

Source: Stories by Tim O'Brien on Medium
Published: April 11, 2026 at 9:17 AM

What every customer is about to ask every website

Tim O'Brien

I’m trying to publish a book, and this isn’t my first time dealing with the publishing process. I’ve published books through O’Reilly before, and I’m a fan of O’Reilly. This project is semi-technical, somewhat unusual, and may not be a clean fit for a traditional publisher. Part of this is practical. Part of it is curiosity. I wanted to see what it actually takes to publish something myself.

And no, I’m not alone in this. Plenty of people are producing books on their own now. What I’m trying to do differently is take the production side seriously: detailed layout, deliberate formatting, proper distribution, the unglamorous mechanics that keep a book from turning into, “Here’s my poorly conceived Word document, now available as an EPUB.”

So I’m not self-publishing in the “upload a Word doc and pray” sense. I mean, actually publishing — proper ISBNs, proper formatting, proper distribution through multiple channels. Here’s what that involves. Draft2Digital for EPUB distribution — great tool, genuinely, no complaints. IngramSpark for bulk print. Bowker for ISBNs, because I buy my own rather than letting a platform assign one, since I’m distributing through multiple channels and I want to own the identifier. Kindle Partner Program for Amazon. Medium for the blog. Podbean for the podcast.

Six platforms. Six websites. Six sets of forms.

Part of this post is me complaining. Part of it is documenting where the friction is. I’ve been automating as much of the rigamarole as possible with an AI agent. The agent can write code, call APIs, fill out forms, and navigate websites. It’s genuinely capable. And what I’ve discovered is that almost none of these platforms want me to do this. Publishing is an industry that predates the web by centuries, and, judging by how hard some of these sites make automation, I am increasingly convinced that some of the websites supporting it may too.


I’m Sorry Dave (AI assist from ChatGPT 5.4)

The spectrum of hostility

It’s not all the same. The platforms fall on a spectrum, but the consistent experience is that the publishing industry hasn’t put in any work to make life easier for agents.

On one end, you have Podbean. They have an API. My agent can pull statistics, automate posts, and do actual work. It’s clunky — the API feels like it was designed in 2014 and nobody has touched it since — but it works. The agent can do the thing.
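To give a feel for what “the agent can do the thing” means in practice, here is a minimal sketch of an agent-style stats pull. The endpoint path, parameter names, and response fields below are illustrative assumptions, not Podbean’s actual API surface; the point is the shape of the workflow, not the specifics.

```python
import json
from urllib.parse import urlencode


def build_stats_request(base_url, token, podcast_id, period="last7days"):
    """Construct the URL for a hypothetical episode-stats endpoint.

    In a real integration, the token would come from an OAuth2 flow
    and the path would come from the platform's API docs."""
    query = urlencode({
        "access_token": token,
        "podcast_id": podcast_id,
        "period": period,
    })
    return f"{base_url}/v1/podcastStats/stats?{query}"


def summarize_stats(payload):
    """Reduce a raw JSON stats payload to downloads per episode."""
    stats = json.loads(payload)
    return {ep["title"]: ep["downloads"] for ep in stats["episodes"]}


# A canned response stands in for the network call, since the real
# request needs credentials.
sample = json.dumps({"episodes": [
    {"title": "Episode 1", "downloads": 412},
    {"title": "Episode 2", "downloads": 389},
]})
print(summarize_stats(sample))
```

Even a clunky API makes this kind of five-minute script possible; without one, the same question means a browser session and a dashboard.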

In the middle, you have Bowker. Having used Bowker for more than a decade, I don’t get the impression that this site has meaningfully changed in all that time. The forms are AngularJS — not even modern Angular, the old one. Automating it isn’t impossible, but it’s fragile. The agent has to click through multi-step wizards, fill in fields with weird IDs, and handle session timeouts. It works, barely, and you get the sense that if they push any update, it’ll all break overnight. If Bowker has an API for this workflow, they have done an excellent job of hiding it.

Then you have Medium. I like Medium. I’ve published there for years. I like what they’re trying to do. But Medium has made a deliberate decision to wall off its data. They retired their API. They put Cloudflare in front of everything — not just the login page, but the stats pages, the comment pages, everything. Try to access it programmatically, and you’ll be asked to solve silly riddles and visual puzzles to prove you’re human. The RSS feed works, technically, but it gives you titles, dates, and content. No stats. No comments. No engagement metrics. A feed that tells you what you published, but not whether anyone read it.

I understand why. Medium is trying to protect itself against automated content creation. They don’t want bots flooding their platform. Fair enough. But the thing they’ve actually done is make it impossible for a legitimate user to access their own data programmatically. I’m not trying to spam anyone. (Some readers may disagree.) I’m trying to see how my posts are doing. I wrote them. They’re mine. And the only way to find out if anyone read them is to open a browser, click through three pages, and squint at a dashboard.

On the far end, you have the Kindle Partner Program. The agent I was using looked at the forms, looked at the terms of service, and said, “I should mention that automating this might violate their TOS. You could get locked out.” That’s not a technical problem. That’s a legal threat. The platform is saying: a human may fill out these forms. A machine may not. We will know the difference, and we will punish you for trying.

Why this is about to become everyone’s problem

Here’s why I’m writing this. A month ago, the people trying to automate website interactions were developers. They had API keys, they wrote scripts, and they understood rate limiting and authentication headers. The platforms could ignore them because they were a small, technical audience that could be dismissed as “power users” — a category in platform economics that means “people we don’t have to care about.”

That’s over now.

AI agents have made every motivated non-developer into someone who wants to programmatically interact with websites. Not because they learned to code. Because they asked an AI to do something and the AI tried to do it and discovered that the web — the thing we’ve been building for thirty years, often on technologies that predate the smartphone — was designed for hands-on keyboards, not for agents acting on behalf of humans.

This is going to be a shock to these new users — the ones who are saying things like, “Oh, I’ll just have my agent read the school's website and correlate attendance with grades.” Good luck getting past ancient, broken auth systems, coupled with education privacy regulations.

People are about to discover that their bank doesn’t have an API for transaction categorization. Their health insurance portal can’t be queried by an agent to find out what’s covered. Their kids’ school website requires a human to log in and click through a menu to find the lunch schedule. The utility company’s “contact us” form is a JavaScript nightmare that even a sophisticated agent can’t reliably navigate.

These aren’t edge cases. This is the entire web. The web was built for people. Every form, every dashboard, every click-through workflow was designed with the assumption that a human would be sitting there, reading the text, making decisions, clicking buttons. The web is a user interface, and the user was always supposed to be a person.

The reinvention

What’s about to happen is the same thing that happened to mobile. In 2007, most websites looked terrible on phones. They weren’t designed for small screens. Over the next five years, every consumer-facing website had to be rebuilt — responsive design, touch targets, mobile-first layouts. The ones that didn’t got replaced by apps that did the job better.

Companies that didn’t adapt to mobile are gone.

We’re at the same inflection point, but for agents instead of phones. The difference is that this time the user isn’t holding a different device. The user is the same person, with the same goals, but they’ve hired a machine to do the clicking for them. And the machine can’t click. Not because it’s not capable. Because the website won’t let it.

Some platforms will adapt. They’ll build APIs. They’ll offer agent-friendly authentication. They’ll realize that a user who can automate their workflow is a user who stays longer, publishes more, and complains less.

Some won’t. They’ll double down on CAPTCHA, bot detection, and terms of service that prohibit automated access. They’ll treat every agent as a potential threat, because some of them are, and because distinguishing between “my author checking their stats” and “a bot farm spamming the platform” is genuinely hard.

I don’t blame them for the difficulty. I blame them for not trying.

And this is where it gets genuinely annoying. Medium is a good example, because I actually like Medium. I support what they’re doing. But I should be able to ask a model running on Nanobot to tell me exactly how my posts did over the last week, or keep track of new comments, without turning that into a manual research project. I understand why they make that hard. I also know that, in practice, it means someone trying to use the platform seriously winds up doing more work for no good reason.

Give the market another year or two, and I think this gets sorted out the hard way. Sites will either wake up and realize that agents are not some exotic attacker class but are, increasingly, the way customers interact with systems, or they’ll keep blocking them and start learning what that does to retention, usage, and, ultimately, the business.

The platforms that figure this out first — that offer clean APIs, agent-friendly workflows, and clear terms of service for automated access — are going to win. Not because agents are the future. Because users with agents are the future. And blocking those users is eventually going to look a lot like volunteering to go out of business. Those users are about to start asking a very simple question:

Why the hell don’t you have an API?

A post like this should probably end with an emoticon. :)