---
title: 'The AI-Ready Web: Building Landing Strips for Intelligent Agents'
permalink: /futureproof/the-ai-ready-web-building-landing-strips-for-intelligent-agents/
description: Today, I focused on making GAPalyzer and Notebooks communicate, and created `contextualizer.py` to condense articles for AI context windows. I'm starting to hit API quotas, which led me to Google AI Studio. I've long resisted over-reliance on large models, preferring to 'paint context' myself using vim and Python, realizing that LLMs' amnesia leads to expensive vendor lock-in. This issue mirrors the 'big vat of content' problem on the web, where poor hierarchy makes navigation difficult for both humans and AIs. My goal is to transform MikeLev.in into an ideal site demonstrating AI preparedness through clear navigation, faceted search, and static Jekyll/Liquid templates, anticipating an 'AI webdev reckoning' similar to Mobilegeddon, but one that is largely invisible.
meta_description: Building an "AI-Ready Web" means structured content, clear navigation, and accessible design, in contrast to the "Invisible Web" that hinders AI agents.
meta_keywords: AI-Ready Web, Stateless Agent, Semantic Web, Invisible Web, Context Painting, AI Reckoning, Jekyll, Liquid Templates, web architecture, AI preparedness
layout: post
sort_order: 4
---

## Setting the Stage: Context for the Curious Book Reader

This fascinating blueprint delves into the fundamental shift required for the web in the Age of AI: moving beyond interfaces designed solely for human eyeballs to embrace structures that are readily consumable by intelligent agents. It argues that just as the web adapted for mobile, it must now adapt for AI, advocating for a return to clarity, hierarchy, and accessibility as the foundational principles for a truly AI-friendly online presence.

---

## Technical Journal Entry Begins

### Optimizing Context for AI Consumption

Alright, today was mostly about cleaning up GAPalyzer, which urgently needed to be done, and I also made Notebooks able to talk. My imagination ran away with me and I chased the white rabbit a little bit at the end of this work. I created a `contextualizer.py` program, which I also urgently needed in at least prototype form, that "rolls over" all the articles like that magic rolling pin, producing much more context-window-friendly content than the existing articles or even all their already existing YAML summaries and keyword extractions. This new approach is much more geared towards creating the navigational drill-down example that I need.

That one had to be done. I needed the first example out there of the "big vat of content" being percolated into something that could be handled as one big thing in the 1-million-token context windows of the latest state-of-the-art models.

Most of my work is Gemini-centric because of the extreme generosity of the free tiers. With this work I'm starting to run into quotas and rate limits like I haven't before, which is only logical and reasonable. Most of my work flies so low under the radar that I only go to the well for the big models through the API on rare occasions. I mostly go through the web UI for Gemini Pro under my personal Google One account, which has extraordinarily high allowances. The free-tier API, not so much. They're trying to get you to pay, of course.
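Before moving on, here's the rolling-pin idea boiled down to a toy sketch. This is not the real `contextualizer.py`; the paths, helper names, and token budget are stand-ins, but the shape is the same: pull each article's front matter, squeeze the body down to an excerpt, and pour it all into one roll-up file sized for a big context window.

```python
# Toy sketch of the "rolling pin" pass (illustrative, not the real script).
# Walks a Jekyll _posts/ directory, pulls each article's YAML front matter
# and body, keeps only a short excerpt of the body, and concatenates
# everything into one context file sized for a large context window.

from pathlib import Path

POSTS_DIR = Path("_posts")          # assumed Jekyll layout
OUTPUT = Path("context_rollup.txt")
TOKEN_BUDGET = 1_000_000            # roughly the big-model context window
CHARS_PER_TOKEN = 4                 # crude heuristic, good enough for budgeting


def split_front_matter(text: str) -> tuple[str, str]:
    """Return (yaml_front_matter, body) for a Jekyll post, or ("", text)."""
    if text.startswith("---"):
        parts = text.split("---", 2)
        if len(parts) == 3:
            return parts[1].strip(), parts[2].strip()
    return "", text.strip()


def condense(body: str, max_chars: int = 1200) -> str:
    """Keep the leading prose of the article; drop the long tail."""
    return body[:max_chars].rsplit("\n", 1)[0]


def main() -> None:
    chunks = []
    for post in sorted(POSTS_DIR.glob("*.md")):
        front, body = split_front_matter(post.read_text(encoding="utf-8"))
        chunks.append(f"=== {post.name} ===\n{front}\n\n{condense(body)}\n")

    rollup = "\n".join(chunks)
    est_tokens = len(rollup) // CHARS_PER_TOKEN
    print(f"~{est_tokens:,} tokens of {TOKEN_BUDGET:,} budgeted")
    OUTPUT.write_text(rollup, encoding="utf-8")


if __name__ == "__main__":
    main()
```

Crude as the characters-per-token estimate is, it's enough to tell whether the whole roll-up fits in a single request before going to the well for the big models.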
## The Invisible Web vs. The Semantic Web

So I'm finally starting to get familiar with the [AI Studio](https://aistudio.google.com/) portal. What you've got to do is drill down through `Dashboard` and then `Usage and Billing`. Then you can switch between `Usage` and `Rate Limit`. Then there's a dropdown to switch between `Projects`. And it's easy to switch between your Google login accounts too. And finally there's switching between different models and filtering on time-ranges.

I've been pretty resistant to using the big models at all after having been on Cursor IDE for about a year on the $20/mo plan and realizing it is far from $20/mo once you start to code ambitiously. And then the CLI tools started dropping and my vim skills became incredibly valuable, because I found that I could *paint* context much more readily than *build it up in chat*, which seems sloppy, non-deterministic and non-reproducible. Also, it's a real pain to document things by copy/pasting chat-interaction after chat-interaction. The *Composer* interface, `Ctrl`+`i`, in Cursor seemed like the path, but they took that away for a huge chunk of the time I was on Cursor. They subsequently brought it back, but in that time I realized what **composing** was, and with my vim text-editing and Python skills I realized what this whole exercise really is... what is it? It's accommodating the fact that LLMs are amnesiac in ways that are both very expensive and that lock you into a particular vendor's tools.

## The Pervasive "Big Vat of Content" Problem

And this issue of *painting context* is very related to the *big vat of content* problem pervasive on the Web. Good hierarchy is not easy to do. Something's going to work, but it's generally going to be one of these solutions that cuts a navigational scheme across your entire site with templates. Things will be sorted into that hierarchy, but say it's an ecommerce site with lots of products under a particular category or collection. You're going to have to implement either pagination or lazy-loading infinite scroll, and... well, it's the *big vat of content* problem. Hmmm, what else needs to be said? Because I'm developing the solution here, at least experimentally.

Sites often have a search tool, and those search tools, in the case of ecommerce, are often faceted. If that search is actually usable by a Web-navigating AI, usually of the LLM variety and usually using tool-calls, then you've provided an alternative means of navigating your site. The user expresses some sort of intent and then the AI can add facet after facet to the search to drill down. With faceted search, all the values of the facets can be given on the first request following the discovery of the search tool, so the whole drill-down procedure can be short-circuited. That's optimal, and one priority must always be making site search tools usable by LLM-style AIs navigating the web by calling tools. I haven't gotten to this yet.

## Navigating the Web: Drill-Down and Small World Theory

The other way of navigating the Web, and a single website in particular, is drill-down through its navigation. Fortunately these days a quick glance at what's in the `