The Performance tab

The Performance tab shows how fast your page and server respond. TTFB is the most critical metric for AI retrieval — slow pages are harder to include reliably.

What this tab shows

The Performance tab reports three metrics:

  • TTFB (Time to First Byte) — how long it takes the server to start sending a response after receiving a request
  • CLS (Cumulative Layout Shift) — how much the page layout moves during loading
  • INP (Interaction to Next Paint) — how quickly the page responds to user input

TTFB is by far the most important metric for AI retrieval. CLS and INP are included because they affect the overall page quality signal, but they are not the primary reason a page fails in AI retrieval pipelines.

Why TTFB matters most

AI systems that retrieve content in real time are working under time pressure. They are not waiting patiently for a page to finish loading — they are making decisions about which sources to include based on how quickly a response arrives. A server that consistently takes three seconds to respond is a server that gets skipped.

The connection is direct: a slow TTFB means the page takes longer to index and re-index. It also means the page is harder to include in real-time retrieval pipelines like those powering search-integrated AI tools.
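To make "how quickly a response arrives" concrete, here is a minimal sketch of measuring TTFB from the client side: the clock starts when the request is sent and stops when the status line of the response arrives. The helper name `measure_ttfb` and the throwaway local server are illustrative, not part of the tool being described.

```python
import http.client
import http.server
import threading
import time

def measure_ttfb(host: str, port: int, path: str = "/") -> float:
    """Seconds from sending a GET request until the first response bytes arrive.

    http.client connects lazily inside request(), so TCP connect time is
    included in the measurement, as it is in real TTFB.
    """
    conn = http.client.HTTPConnection(host, port, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)
    resp = conn.getresponse()  # returns once the status line and headers arrive
    ttfb = time.perf_counter() - start
    resp.read()
    conn.close()
    return ttfb

# Demo against a throwaway local server so the sketch is self-contained.
server = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
ttfb_s = measure_ttfb("127.0.0.1", server.server_address[1])
print(f"TTFB: {ttfb_s * 1000:.1f} ms")
server.shutdown()
```

Against a real origin the same measurement would also include DNS resolution and network latency, which is why a distant, uncached server can fail this check even when the application itself is fast.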

TTFB grade table

Grade | Response time | What it means
A+ | Under 200ms | Strong signal for AI inclusion
A | 200–350ms | Healthy range
B | 350–600ms | Acceptable but improvable
C | 600–1,000ms | Noticeable slowdown, warrants attention
D | 1,000–2,000ms | Significant problem
F | Over 2,000ms | Very likely to be excluded from fast retrieval

The grade is based on real-world field data from the Chrome UX Report (CrUX) where available, or lab-based measurements for pages without field data. CrUX data reflects actual user experiences across recent visits, which makes it more representative than a single lab test.
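The grade bands above are a simple threshold lookup. The sketch below shows one way to express them; the function name and the choice of exclusive upper bounds are assumptions for illustration.

```python
def ttfb_grade(ms: float) -> str:
    """Map a TTFB measurement in milliseconds to the grade table above.

    Each band's upper limit is treated as exclusive (assumption).
    """
    bands = [(200, "A+"), (350, "A"), (600, "B"), (1000, "C"), (2000, "D")]
    for limit, grade in bands:
        if ms < limit:
            return grade
    return "F"

print(ttfb_grade(180))   # a sub-200ms response earns the top grade
print(ttfb_grade(2500))  # anything over 2,000ms falls to F
```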

Origin-level vs page-level data

The Performance tab shows two views:

Page-level — Performance data specific to this URL. If available, this is the most relevant measurement.

Origin-level — Average performance across your entire domain. This is useful when the specific page does not have enough traffic for CrUX to have data. If the origin average is slow, it often points to a server or hosting infrastructure issue affecting all pages.

If neither page-level nor origin-level field data is available, the tab falls back to a lab measurement.
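The page-level/origin-level distinction maps directly onto the CrUX API's request body: a `url` field queries page-level data, an `origin` field queries origin-level data. The sketch below builds such a request body; the metric identifiers are the ones the public CrUX API uses at the time of writing, but treat the exact field names as an assumption to verify against the API documentation.

```python
import json

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def crux_query(target: str, page_level: bool = True) -> dict:
    """Build a CrUX API request body for page-level ('url') or
    origin-level ('origin') data, covering the three metrics this tab reports."""
    key = "url" if page_level else "origin"
    return {
        key: target,
        "metrics": [
            "experimental_time_to_first_byte",
            "cumulative_layout_shift",
            "interaction_to_next_paint",
        ],
    }

# POSTing this body (with an API key) to CRUX_ENDPOINT returns field data;
# a 404 for the page-level query is the usual signal to fall back to
# the origin-level query, and then to a lab measurement.
print(json.dumps(crux_query("https://example.com/some-page"), indent=2))
```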

CLS (Cumulative Layout Shift)

CLS measures how much the visible page content moves unexpectedly during load. For AI systems, layout shift is a minor concern: it can make content extraction less reliable if the page structure changes substantially after the initial render. For human visitors, it is a more direct usability problem.

Grade | CLS score
Good | Under 0.1
Needs improvement | 0.1–0.25
Poor | Over 0.25

INP (Interaction to Next Paint)

INP measures responsiveness — how quickly the page responds to clicks, taps, or keyboard input. This is primarily a usability metric for human visitors. For AI systems it is a secondary signal, but it contributes to overall page quality signals used by some retrieval systems.

Grade | INP
Good | Under 200ms
Needs improvement | 200–500ms
Poor | Over 500ms
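Both the CLS and INP tables above use the same three-band shape, so they can share one classifier. This is an illustrative sketch; the function name and the decision to make the "Good" cutoff exclusive are assumptions.

```python
def cwv_grade(metric: str, value: float) -> str:
    """Classify CLS (unitless score) or INP (milliseconds) into the
    Good / Needs improvement / Poor bands from the tables above."""
    thresholds = {"CLS": (0.1, 0.25), "INP": (200, 500)}
    good_below, poor_above = thresholds[metric]
    if value < good_below:
        return "Good"
    if value <= poor_above:
        return "Needs improvement"
    return "Poor"

print(cwv_grade("CLS", 0.05))  # well under 0.1
print(cwv_grade("INP", 300))   # inside the 200–500ms band
```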

How to improve TTFB

For hosted platforms (WordPress, Squarespace, Webflow, etc.)
Upgrade your hosting tier or add a caching plugin. Most slow TTFB on shared hosting comes from a slow server response before caching kicks in.

For custom setups
Enable a CDN (Cloudflare, Fastly, etc.) to serve responses from edge nodes close to your visitors. Enable full-page caching where possible. Investigate slow database queries or large synchronous operations on page load.
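One quick way to check whether a CDN or full-page cache is actually serving your pages is to inspect the response headers. The sketch below is a rough heuristic: `Cache-Control` and `Age` are standard headers, `CF-Cache-Status` is Cloudflare-specific, and the function name and wording of the hints are assumptions.

```python
def caching_diagnosis(headers: dict) -> list:
    """Inspect HTTP response headers for signs that full-page caching or a
    CDN is (or is not) active. Heuristic only; header coverage is partial."""
    h = {k.lower(): v for k, v in headers.items()}
    notes = []
    cache_control = h.get("cache-control", "")
    if "no-store" in cache_control or "private" in cache_control:
        notes.append("Cache-Control forbids shared caching of this page.")
    # Age > 0 or a Cloudflare HIT means a cache answered instead of the origin.
    if h.get("cf-cache-status", "").upper() == "HIT" or int(h.get("age", 0)) > 0:
        notes.append("Response appears to have been served from a cache.")
    elif not notes:
        notes.append("No evidence of an edge/page cache; every request may hit origin.")
    return notes

print(caching_diagnosis({"Cache-Control": "no-store"}))
print(caching_diagnosis({"Age": "120", "Cache-Control": "public, max-age=300"}))
```

A cached or edge-served response is exactly what brings TTFB into the A bands: the origin server is taken out of the critical path.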

For Next.js, Nuxt, or similar frameworks
Check whether the page is rendering server-side or client-side. Pages that render entirely client-side deliver an empty HTML shell fast (good TTFB) but then load all content via JavaScript (bad for content visibility). Server-side rendering puts the content in the initial HTML response, addressing content visibility while keeping TTFB fast.
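A rough way to spot the client-side-only case is to look at the raw HTML the server returns: if the body is just an empty mount point, all content arrives via JavaScript. The mount-point IDs below (`__next`, `root`, `app`) and the regex itself are a hypothetical heuristic, not a definitive test.

```python
import re

# Heuristic: an empty framework mount point in the server-delivered HTML
# (e.g. <div id="__next"></div>) suggests client-side-only rendering.
EMPTY_MOUNT = re.compile(r'<div id="(?:__next|root|app)">\s*</div>', re.I)

def looks_client_side_rendered(html: str) -> bool:
    """True if the raw HTML looks like an empty app shell (assumption-laden)."""
    return bool(EMPTY_MOUNT.search(html))

print(looks_client_side_rendered('<body><div id="__next"></div></body>'))
print(looks_client_side_rendered('<div id="root"><h1>Real content</h1></div>'))
```

Fetching the page with a plain HTTP client (no JavaScript execution) and running a check like this approximates what a fast retrieval pipeline sees.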

What the data source means

Next to each metric, the tab shows whether the data comes from the Chrome UX Report (CrUX) or a lab measurement. CrUX data represents aggregated real visits to the page over a period of weeks and is more representative. Lab data is a single synthetic measurement at audit time and can vary depending on server load and network conditions.