The Lab tab

The Lab tab shows experimental and exploratory checks. These signals can inform future optimisation, but they are rarely the most urgent thing to fix.

What this tab shows

The Lab tab groups diagnostics that are exploratory rather than established. These are checks where the underlying evidence is still developing — either because the signal is new, because the standards around it are not yet settled, or because the impact is likely but not definitively measured.

Current Lab checks include:

llms.txt — Whether the site has a /llms.txt file. This is an emerging convention (inspired by robots.txt) for giving AI systems a structured, plain-text overview of the site’s content, key sections, and any guidance for AI agents interacting with the site. It is not yet a standard adopted by major AI systems, but it is an area of active development.

HTML vs Markdown comparison — A comparison of how your page reads when converted from raw HTML to a clean Markdown representation. AI systems that work with simplified or linearised versions of pages may receive a version of your content closer to the Markdown representation than the full HTML. Significant differences between the two can indicate content that does not survive simplification well.
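A rough way to picture this check is to compare the visible words in the raw HTML against the words in the simplified version. The sketch below is an illustrative heuristic, not the product's actual algorithm; the `retention_ratio` function and the sample strings are hypothetical.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script/style blocks — a crude
    stand-in for how a simplifying pipeline linearises a page."""
    SKIP = {"script", "style", "noscript"}

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.chunks.append(data.strip())

def retention_ratio(html: str, simplified: str) -> float:
    """Fraction of the page's visible words that survive in the
    simplified (e.g. Markdown) version. Word-set overlap is crude,
    but low values flag content lost in conversion."""
    parser = TextExtractor()
    parser.feed(html)
    html_words = set(" ".join(parser.chunks).lower().split())
    simple_words = set(simplified.lower().split())
    if not html_words:
        return 1.0
    return len(html_words & simple_words) / len(html_words)

# Hypothetical page: one sentence exists only alongside a chart image,
# so it is missing from the Markdown version.
html = ("<h1>Pricing</h1><p>Plans start at nine dollars.</p>"
        "<p>Enterprise pricing is shown in the chart below.</p>")
md = "# Pricing\n\nPlans start at nine dollars."
print(round(retention_ratio(html, md), 2))  # well below 1.0
```

A ratio near 1.0 suggests the content survives simplification; a low ratio means a meaningful share of the page's words never reach a system that consumes the simplified form.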

Why it is a separate tab

These signals are kept in Lab because:

  1. The evidence base for their impact is still developing
  2. They are almost never the reason a page is failing today
  3. Acting on Lab items before fixing core issues in the other tabs is counterproductive

A page that is blocked in robots.txt, delivering empty HTML to AI bots, or taking 4,000 ms to respond does not need a /llms.txt file. Fix those first.

When Lab items do matter

Lab signals become useful once the core tabs are healthy:

  • Crawlability is passing for the bots you care about
  • Content Visibility shows no significant content gaps
  • Authority has no failing Required checks
  • Performance is in the A or B range for TTFB

At that point, Lab signals can inform experiments and future-proofing. They may also be relevant if you are trying to stay ahead of practices that are likely to become more important as AI retrieval evolves.

llms.txt: what it is and whether to add one

/llms.txt is a plain-text file placed at the root of your site that gives AI systems a high-level summary of the site — what it covers, key sections, and any instructions for AI agents. It is analogous to robots.txt (access control) and sitemap.xml (URL discovery), but focused on helping language models understand the site’s purpose and structure.
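The convention is still evolving, but under the draft format proposed at llmstxt.org a minimal file is a markdown document: an H1 with the site name, a blockquote summary, and H2 sections containing annotated links. The names and URLs below are illustrative only.

```markdown
# Example Co

> Example Co sells widgets. This site covers product documentation,
> pricing, and support resources.

## Docs

- [Quick start](https://example.com/docs/quickstart): Set up in five minutes
- [API reference](https://example.com/docs/api): Full endpoint documentation

## Optional

- [Blog](https://example.com/blog): Company news and release notes
```

Sections under an "Optional" heading mark links an AI agent can skip when context is limited.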

Whether to add one today is a judgment call. The file is easy to add and does not cause harm. It may be picked up by AI systems that already support it. But there is no confirmed evidence yet that its presence materially changes how major AI systems retrieve or rank your content.

HTML vs Markdown: what a bad result means

If your page’s Markdown representation is significantly shorter or less coherent than the raw HTML, or is missing key sections, this may indicate:

  • Content that is heavily dependent on visual layout (tables, columns) to convey meaning
  • Information embedded in images without alt text
  • Key text in elements that do not survive HTML-to-Markdown conversion cleanly
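For example, information that exists only as pixels in an image disappears entirely in a text-only conversion; adding alt text preserves it. The markup below is illustrative.

```html
<!-- Before: the pricing exists only as an image; an HTML-to-Markdown
     conversion keeps nothing from this element. -->
<img src="/pricing-chart.png">

<!-- After: the alt text survives as plain text in the
     simplified representation. -->
<img src="/pricing-chart.png"
     alt="Pricing: Starter $9/mo, Pro $29/mo, Enterprise custom">
```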

This is worth knowing for future optimisation, but it is rarely the immediate blocker.