This section makes the case for hypermedia-driven architecture (HDA) as the default approach to building web applications. The arguments here are opinionated but grounded in the original definition of REST, the economics of framework migration, and the structural properties of HTML as a transfer format.
The technical implementation follows in later sections. This one answers the prior question: why build this way at all?
REST was always about hypermedia
Roy Fielding’s 2000 doctoral dissertation, Architectural Styles and the Design of Network-based Software Architectures, defined REST as “an architectural style for distributed hypermedia systems.” The word hypermedia is not incidental. It is the subject of the entire architecture.
Chapter 5 of the dissertation specifies four interface constraints for REST. The fourth is HATEOAS: Hypermedia As The Engine of Application State. Server responses carry both data and navigational controls. The client does not hardcode knowledge of available actions. It discovers them through hypermedia links and forms embedded in the response. HTML is the canonical format that satisfies this constraint: an HTML page contains both content and the controls (links, forms, buttons) that drive state transitions.
JSON carries no native hypermedia controls. A JSON response like {"name": "Alice", "email": "alice@example.com"} contains data but no affordances. The client must know in advance what URLs to call, what HTTP methods to use, and what payloads to send. This requires out-of-band documentation and tight client-server coupling, which is precisely what REST’s uniform interface constraint was designed to prevent.
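The contrast is easiest to see side by side. A hypothetical HTML response for the same contact carries the data and the available actions together, so the client needs no out-of-band knowledge (the URLs and markup below are illustrative, not from any specific application):

```html
<!-- Data and affordances in one response: the client discovers
     what it can do from the links and forms themselves. -->
<div id="contact-1">
  <p>Name: Alice</p>
  <p>Email: alice@example.com</p>
  <a href="/contacts/1/edit">Edit</a>
  <form action="/contacts/1/delete" method="post">
    <button>Delete</button>
  </form>
</div>
```

The link and the form are the hypermedia controls Fielding's constraint demands: the available state transitions travel with the representation itself.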
By 2008, the drift had become bad enough that Fielding wrote a blog post titled “REST APIs must be hypertext-driven”:
I am getting frustrated by the number of people calling any HTTP-based interface a REST API. […] If the engine of application state (and hence the API) is not being driven by hypertext, then it cannot be RESTful and cannot be a REST API. Period.
The industry ignored him. The Richardson Maturity Model, popularised by Martin Fowler, formalised REST into “levels.” Most developers stopped at Level 2 (HTTP verbs and resource URLs) and never implemented Level 3 (hypermedia controls). When JSON replaced XML as the dominant transfer format, the “REST” label stuck even though the defining constraint had been dropped. What the industry calls a “RESTful API” is, by Fielding’s definition, RPC with nice URLs.
This matters because the original REST architecture was designed to solve real problems: evolvability, loose coupling, and independent deployment of client and server. Those problems did not go away when the industry adopted JSON APIs. The solutions were simply abandoned.
The HDA architecture defined
A hypermedia-driven application (HDA) returns HTML from the server, not JSON. The term comes from Carson Gross, creator of htmx, and is defined in detail in the book Hypermedia Systems and on the htmx website.
The architecture has two constraints:
- Hypermedia communication. The server responds to HTTP requests with HTML. The client renders it. There is no JSON serialisation layer, no client-side data model, and no mapping between API responses and UI state. The HTML is the interface.
- Declarative interactivity. HTML-embedded attributes (such as htmx’s hx-get, hx-post, and hx-swap) drive dynamic behaviour. The developer declares what should happen in the markup rather than writing imperative JavaScript to manage requests, state, and DOM updates.
The key mechanism is partial page replacement. When the user interacts with an element, the browser sends an HTTP request and receives an HTML fragment. That fragment replaces a targeted region of the DOM. The server controls what the user sees next, because the server produces the HTML. The client is a rendering engine, not an application runtime.
This eliminates an entire layer of software. In a typical SPA, the server serialises data to JSON, the client deserialises it, maps it into a state store, derives a virtual DOM from that state, and diffs it against the real DOM. In HDA, the server renders HTML and the browser displays it. The serialisation, deserialisation, state management, and virtual DOM diffing layers do not exist because they are not needed.
An HDA is not a traditional multi-page application with full page reloads on every click. The partial replacement model provides the same responsiveness that SPAs deliver, but the interactivity logic lives on the server rather than in client-side JavaScript.
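As a minimal sketch of the partial replacement mechanism (the route and element IDs here are hypothetical), a button annotated with htmx attributes fetches a fragment and swaps it into a targeted region:

```html
<!-- Clicking the button issues GET /contacts/42/edit; the HTML
     fragment in the response replaces the contents of #contact-42. -->
<div id="contact-42">
  <span>Alice — alice@example.com</span>
  <button hx-get="/contacts/42/edit"
          hx-target="#contact-42"
          hx-swap="innerHTML">Edit</button>
</div>
```

No JavaScript is written for this interaction: the request, the target, and the swap strategy are all declared in the markup.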
The coupling advantage
Each endpoint in an HDA produces self-contained HTML. A handler for GET /contacts/42/edit returns an edit form. That form contains the data, the input fields, the validation rules (via HTML5 attributes), and the submit action (via the form’s action attribute or htmx attributes). Everything the client needs is in the response. There is no shared state to coordinate with.
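Such a response might look like the following sketch (field names, validation rules, and htmx attributes are illustrative):

```html
<!-- Everything the client needs: current values, constraints,
     and the submit action, all inside the fragment itself. -->
<form hx-put="/contacts/42" hx-target="#contact-42" hx-swap="outerHTML">
  <input name="name" value="Alice" required>
  <input name="email" type="email" value="alice@example.com" required>
  <button type="submit">Save</button>
</form>
```

The form carries its own data, its own validation (via the required and type attributes), and its own submit behaviour. No other part of the client has to know this form exists.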
SPA architectures centralise client-side state. React applications commonly use a global state store (Redux, Zustand, Jotai, or React Context) to hold data that multiple components need. This creates a coupling pattern: when you change the shape of data in the store, every component that reads or writes that data must be updated.
Redux’s single-store design has been criticised for exhibiting the God Object anti-pattern, where a single entity becomes tightly coupled to much of the codebase. Changes intended to benefit one feature create ripple effects in unrelated features. The React-Redux community documented this problem: hooks encourage tight coupling between Redux state shape and component internals, reducing testability and violating the single responsibility principle.
The single-spa project (a framework for combining multiple SPAs) explicitly warns against sharing Redux stores across micro-frontends: “if you find yourself needing constant sharing of UI state, your microfrontends are likely more coupled than they should be.” This is an acknowledgement from within the SPA ecosystem that centralised client state creates coupling problems.
In HDA, the coupling boundary is the HTTP response. Each response is stateless and self-contained. The server can change the HTML structure of one endpoint without affecting any other endpoint, because there is no shared client-side state that binds them together. Two developers can modify two different pages concurrently with zero coordination. This property is structural, not a matter of discipline. It falls out automatically from the architecture.
The framework migration tax
JavaScript framework churn imposes a recurring cost on every project built with a client-side framework.
AngularJS to Angular 2+. React class components to hooks to server components. Vue 2 to Vue 3. Each major transition changes fundamental patterns: how components are defined, how state is managed, how side effects are handled. Code written against the old patterns must be rewritten, not just updated.
A peer-reviewed study by Ferreira, Borges, and Valente (On the (Un-)Adoption of JavaScript Front-end Frameworks, published in Software: Practice and Experience, 2021) examined 12 open-source projects that performed framework migrations. The findings:
- The time spent performing the migration was greater than or equal to the time spent using the old framework in all 12 projects.
- In 5 of the 12 projects, the time spent migrating exceeded the time spent using both the old and new frameworks combined.
- Migration durations ranged from 7 days to 966 days.
AngularJS reached end-of-life on 31 December 2021. Three years later, BuiltWith reports over one million live websites still running AngularJS. WebTechSurvey puts the figure above 500,000. The exact count varies by measurement method, but the order of magnitude is clear: hundreds of thousands of applications remain on a deprecated, unpatched framework because migrating to Angular 2+ requires a near-complete rewrite of the client-side codebase.
This is not a one-time problem. React’s transition from class components to hooks changed every component pattern in the ecosystem. The ongoing shift toward React Server Components is changing the execution model itself, blurring the boundary between server and client in ways that require rethinking application architecture. Each transition resets knowledge, breaks libraries, and forces rewrites.
The migration tax is a structural property of the SPA model: when interactivity logic lives in client-side JavaScript tied to a specific framework’s component model, that logic must be rewritten whenever the framework’s model changes. HDA does not eliminate the need to stay current with server-side tools, but server-side framework transitions (switching from one Rust web framework to another, for example) affect route definitions and middleware, not the fundamental rendering model. The HTML your server produces is the same regardless of which framework generates it.
The backward-compatibility guarantee
No HTML element has ever been removed from the specification in a way that breaks rendering.
The WHATWG HTML Standard, which governs HTML as a living specification, lists obsolete elements including <marquee>, <center>, <font>, <frame>, and <acronym>. Authors are told not to use them. But the specification still mandates that browsers render them. <marquee> has a complete interface specification (HTMLMarqueeElement) with defined behaviour. <acronym> must be treated as equivalent to <abbr> for rendering purposes. These elements work in every modern browser because the spec requires it.
This is not accidental. It is policy. The W3C HTML Design Principles document establishes a priority of constituencies: “In case of conflict, consider users over authors over implementors over specifiers over theoretical purity.” Backward compatibility flows directly from this principle: breaking existing content harms users, so the specification does not break existing content.
The WHATWG’s founding position reinforces this:
Technologies need to be backwards compatible, that specifications and implementations need to match even if this means changing the specification rather than the implementations.
An application built on HTML, CSS, and HTTP in 2026 can reasonably expect its platform foundation to remain stable for decades. The same HTML that rendered in Netscape Navigator still renders in Chrome today. No JavaScript framework has provided, or can provide, a comparable guarantee. React is 12 years old and has undergone three major paradigm shifts. The <form> element is 31 years old and works exactly as it did in 1995, with additional capabilities layered on top.
This is the core durability argument for HDA. Your investment in HTML templates, HTTP handlers, and declarative interactivity attributes is protected by the strongest backward-compatibility commitment in software: the web platform’s refusal to break existing content.
No separate API layer
In HDA, the HTML response is the API. There is no JSON layer to design, version, document, or maintain.
A traditional SPA architecture requires two applications: a client-side app that renders UI, and a server-side API that produces JSON. These are developed, tested, deployed, and versioned as separate artefacts with a contract between them. When the contract changes, both sides must change in coordination.
HDA collapses this into one application. An Axum handler receives a request, queries the database, renders HTML with Maud, and returns it. The browser displays the HTML. There is one codebase, one deployment, one thing to reason about.
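The handler's entire output is an HTML fragment such as the following hypothetical example (in practice Maud generates it from a typed Rust template, but what crosses the wire is just this):

```html
<!-- The complete response body: no JSON contract, no client-side
     mapping layer. The browser renders this directly. -->
<table id="contacts">
  <tr>
    <td>Alice</td>
    <td>alice@example.com</td>
    <td><a href="/contacts/42/edit">Edit</a></td>
  </tr>
</table>
```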
This has practical consequences:
- No API versioning. The server controls the HTML. If the data model changes, the server updates the template. There is no external consumer relying on a JSON schema.
- No serialisation code. No serde annotations on response types, no JSON schema validation on the client, no mapping between API responses and component props.
- No CORS configuration. The browser requests HTML from the same origin that served the page. Cross-origin issues do not arise.
- Faster feature delivery. Adding a field to a page means adding it to the query and the template. In an SPA, it means updating the API response, the TypeScript types, the state store, and the component that renders it.
The reduction in moving parts is not incremental. It is categorical. An entire class of bugs (schema mismatches, stale client caches, API versioning conflicts) cannot occur because the architecture does not have the layers where those bugs live.
When you do need a separate API
HDA does not mean you never write JSON endpoints. It means JSON is not the default, and HTML handles the majority of your application’s interface.
There are legitimate cases where a JSON API is the right tool:
- Third-party integrations. External services that call your application (payment webhooks, OAuth callbacks, partner integrations) communicate in JSON. These are not UI interactions; they are machine-to-machine interfaces.
- Mobile applications. If you ship a native mobile app alongside your web application, the mobile client needs a data API. HDA applies to the web interface; the mobile interface has different constraints.
- Public APIs. If your product offers an API as a feature (for customers to build integrations), that API will be JSON and needs the usual API design treatment: versioning, documentation, authentication, rate limiting.
- Islands of rich interactivity. Some UI components genuinely need client-side state: a drag-and-drop kanban board, a collaborative text editor, a real-time data visualisation. These components can fetch JSON from dedicated endpoints while the rest of the application uses HDA. This is the islands pattern, covered in When to Use HDA.
The principle is straightforward: use HTML for the interface, JSON for integrations. Most web applications are overwhelmingly interface. The JSON endpoints, when needed, are a small surface area alongside the HDA core, not a parallel architecture that doubles the codebase.