Guides, explanations, and troubleshooting for both free audits and the beta monitoring platform.
Paste any public URL you want to check — your homepage, a blog post, a product page, or any publicly accessible page.
Performance, crawlability, and content visibility checks run in parallel and usually complete in 15–30 seconds.
The report is organized into tabs: Overview, Performance, Crawlability, and Content Visibility. Each tab surfaces actionable findings.
Click Export PDF to download a shareable report for developers, clients, or stakeholders.
Sign up at app.beseenby.ai/signup. Access is invite-only during beta — use your invite code at signup.
Projects group pages by site or client. Examples: "Company Website", "Client A", "Client B — E-commerce". One project per website is the recommended pattern.
Add individual URLs one at a time, or bulk-import via CSV for larger sites. You can also trigger a manual check immediately.
Optional but recommended. Set scheduled checks, choose frequency (daily or weekly), and configure alert thresholds for each page.
Projects keep your tracked pages organized. Each project represents one website or client property. Best practices: one project per website, separate projects for separate clients, use descriptive names that make reports easy to identify.
```
url,page_name,frequency
https://example.com,Homepage,daily
https://example.com/about,About Page,weekly
https://example.com/products,Products,weekly
```
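For larger sites, you can generate the import file programmatically rather than by hand. This is a minimal sketch using Python's standard `csv` module, assuming only the three columns shown in the example above (`url`, `page_name`, `frequency`); the page list here is illustrative.

```python
import csv
import io

# Columns match the bulk-import format shown above: url, page_name, frequency.
pages = [
    ("https://example.com", "Homepage", "daily"),
    ("https://example.com/about", "About Page", "weekly"),
    ("https://example.com/products", "Products", "weekly"),
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["url", "page_name", "frequency"])  # header row first
writer.writerows(pages)
print(buf.getvalue())
```

Write `buf.getvalue()` to a `.csv` file and upload it through the bulk-import dialog.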
From the project view you can see all tracked pages at a glance. Filter and sort options include: by status (healthy / needs attention / error), by last check date, and by monitoring status (monitored / manual-only). Bulk actions available: enable or disable monitoring, change check frequency, delete pages, and export the full list as CSV.
When monitoring is enabled, checks run automatically on schedule, results are saved to report history, alerts fire when thresholds are crossed, and no manual action is required between checks.
Best for critical pages. Recommended for: homepage, main product or service pages, top-converting landing pages, and any page you're actively monitoring post-deployment.
Standard cadence for most pages. Good for: blog posts, secondary product pages, support docs, and about or contact pages.
Set any schedule that fits your workflow: every 3 days, every two weeks, monthly, or the first Monday of the month.
You can add a page to a project without enabling scheduled monitoring. In manual-only mode, audits only run when you trigger them. Useful for: staging environments, pre- and post-deployment spot checks, and one-off audits where you don't need ongoing history.
Sent as soon as a change is detected. Used for high-severity changes like a crawlability block or a major TTFB regression.
A summary email covering all changes detected in the past 24 hours. Good for staying informed without inbox noise.
A summary of the past 7 days of changes. Useful for lower-priority pages or weekly review workflows.
Alerts can be configured per-page from page settings, or at the project level to set defaults that apply to all pages in the project. Per-page settings always override project defaults when set.
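The override behavior can be pictured as a simple two-layer merge. This is a hypothetical illustration, not the platform's actual implementation; the setting names (`frequency`, `ttfb_threshold_ms`) are invented for the example, and `None` stands in for "not set on this page".

```python
project_defaults = {"frequency": "weekly", "ttfb_threshold_ms": 800}

def effective_settings(project_defaults, page_overrides):
    # Start from the project-level defaults; any setting explicitly
    # set on the page (non-None) wins over the project value.
    merged = dict(project_defaults)
    merged.update({k: v for k, v in page_overrides.items() if v is not None})
    return merged

# Page overrides frequency but inherits the project's TTFB threshold.
print(effective_settings(project_defaults,
                         {"frequency": "daily", "ttfb_threshold_ms": None}))
```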
Compare Mode lets you view two saved reports side-by-side to see what changed between any two points in time.
Once branding is configured, PDF exports automatically include your logo and colors, shareable report links display your branding, and the "Powered by BeSeenBy.AI" attribution is removed.
Support for multiple branding profiles — one per client — is on the roadmap. This will let you switch branding per project, useful for agencies managing multiple brands. Not yet available.
| Metric | Good | Needs Improvement | Poor |
|---|---|---|---|
| TTFB | ≤ 800ms | 800–1800ms | > 1800ms |
| CLS | ≤ 0.1 | 0.1–0.25 | > 0.25 |
| INP | ≤ 200ms | 200–500ms | > 500ms |
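The table above can be expressed as a small classifier. This is a sketch for interpreting your own measurements against the same bands; the `THRESHOLDS` dictionary and function name are illustrative, with TTFB and INP in milliseconds and CLS unitless.

```python
# (good_ceiling, poor_floor) per metric, taken from the table above.
# Values at or below good_ceiling rate "good"; above poor_floor rate "poor".
THRESHOLDS = {
    "ttfb": (800, 1800),   # milliseconds
    "cls": (0.1, 0.25),    # unitless score
    "inp": (200, 500),     # milliseconds
}

def rate(metric, value):
    good_ceiling, poor_floor = THRESHOLDS[metric]
    if value <= good_ceiling:
        return "good"
    if value <= poor_floor:
        return "needs improvement"
    return "poor"

print(rate("ttfb", 700))   # within the good band
print(rate("cls", 0.2))    # between the two cutoffs
print(rate("inp", 600))    # beyond the poor floor
```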
Check for these common causes:
A `User-agent: *` group with `Disallow: /` blocks all bots, including AI crawlers. A broad pattern such as `User-agent: Bot` can inadvertently catch GPTBot, ClaudeBot, and similarly named crawlers. Example robots.txt that blocks all bots:
```
User-agent: *
Disallow: /
```
We parse robots.txt according to RFC 9309 to determine the effective policy for each bot.
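You can reproduce a basic version of this check yourself with Python's standard `urllib.robotparser`. Note this module implements the original robots.txt convention rather than every detail of RFC 9309, so treat it as a rough approximation of the audit's policy check; the robots.txt content below is the blanket-block example from above.

```python
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A blanket Disallow: / blocks AI crawlers along with everything else.
for bot in ("GPTBot", "ClaudeBot", "Googlebot"):
    print(bot, "allowed:", parser.can_fetch(bot, "https://example.com/"))
```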
See the Content Visibility features page for a deeper technical walkthrough.
The extension cannot audit chrome:// pages, extension settings pages, local files (file://), or pages whose strict Content Security Policy headers block extension scripts. For these cases, use the web app at beseenby.ai instead.

After updating robots.txt, confirm the change is live by loading yourdomain.com/robots.txt in your browser, then run the audit again.

Time to First Byte. Measures how long a server takes to respond to a request. The most critical metric for AI crawler visibility, since crawlers requesting from data center IPs cannot rely on CDN caching.
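For a quick local approximation of TTFB, you can time how long a server takes to return response headers. This sketch uses only the Python standard library; it measures a single plain-HTTP request from your machine, so the numbers will differ from field data such as CrUX and from what a crawler sees. The function name and signature are invented for the example.

```python
import time
from http.client import HTTPConnection

def measure_ttfb(host, path="/", port=80):
    # Time from sending the request until the status line and
    # headers arrive -- a simple single-sample TTFB approximation.
    conn = HTTPConnection(host, port, timeout=10)
    start = time.monotonic()
    conn.request("GET", path)
    resp = conn.getresponse()  # returns once headers are received
    ttfb_ms = (time.monotonic() - start) * 1000
    resp.read()  # drain the body before closing
    conn.close()
    return resp.status, ttfb_ms
```

Usage would look like `status, ms = measure_ttfb("example.com")`; compare `ms` against the 800 ms / 1800 ms bands in the table above.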
Cumulative Layout Shift. Measures visual stability — how much page content unexpectedly moves during load. A Core Web Vital.
Interaction to Next Paint. Measures responsiveness to user interactions. Replaced FID as a Core Web Vital in 2024.
Chrome UX Report. Google's dataset of real-user performance metrics collected from Chrome users in the field, aggregated at the URL and origin level.
A plain-text file at the root of a domain that instructs crawlers which paths they are (or are not) permitted to access. Advisory, not enforced technically.
An HTML meta tag or HTTP response header that instructs search engines and crawlers not to index a page, even if they can access it.
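Both noindex signals are easy to check for programmatically. This is a minimal sketch using Python's standard `html.parser`; the class and function names are invented for the example, and real-world handling (e.g. per-bot directives like `googlebot: noindex`, or multiple `X-Robots-Tag` headers) is out of scope here.

```python
from html.parser import HTMLParser

class NoindexMetaParser(HTMLParser):
    # Looks for <meta name="robots" content="... noindex ...">.
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if (attrs.get("name", "").lower() == "robots"
                and "noindex" in attrs.get("content", "").lower()):
            self.noindex = True

def has_noindex(html, headers):
    # Either the X-Robots-Tag response header or the meta tag
    # can mark a page as noindex.
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    parser = NoindexMetaParser()
    parser.feed(html)
    return parser.noindex
```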
Origin-level data is aggregated across an entire domain. URL-level data is specific to one page. Both can appear in a single report when page-level data exists and is supplemented with domain-level context.
A tracked page is any URL added to a project. A monitored page is a tracked page with scheduled automatic checks enabled. All monitored pages are tracked, but not all tracked pages are monitored.
Have a question not answered here? Reach out and we'll get back to you.
Get in Touch

Detailed documentation on each diagnostic check: crawlability, performance, content visibility, and authority.

View Features

Access monitoring, white-label reports, compare mode, and team features in the full platform.
Get Beta Access