A diagnostic tool for teams who need to know whether technical issues are preventing AI systems from accessing their content.
AI systems like ChatGPT, Claude, and Perplexity are answering millions of questions every day. Businesses invest in content strategy, SEO, and AI-specific content, but many skip the fundamental question: Can AI systems actually reach your content in the first place?
This tool was built to answer that question with real data.
We don't track rankings, monitor citations, or promise AI optimization. We diagnose technical barriers—the kind that prevent access before optimization even matters.
We rely on authoritative sources and direct checks: Google's CrUX API, the Robots Exclusion Protocol (RFC 9309), HTTP status probes with browser, GPTBot, ClaudeBot, and PerplexityBot user agents, and page-level noindex detection. When real data isn't available (like CrUX coverage for low-traffic sites), we say so clearly instead of guessing.
No synthetic tests. No assumptions. No made-up metrics.
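To make the checks concrete, here is a minimal sketch of the kind of probes involved, written in Python using only the standard library. It is illustrative, not the tool's actual code: the bot user-agent strings are simplified, and the stdlib robots.txt parser predates RFC 9309, so a production check would use a fully compliant parser.

```python
# Illustrative sketch only: simplified bot user agents, stdlib robots.txt parser.
import urllib.error
import urllib.request
import urllib.robotparser

AI_BOTS = {
    "GPTBot": "GPTBot/1.0",
    "ClaudeBot": "ClaudeBot/1.0",
    "PerplexityBot": "PerplexityBot/1.0",
}

def robots_allows(site: str, path: str = "/") -> dict:
    """Check robots.txt rules for each AI crawler (advisory only, see below)."""
    parser = urllib.robotparser.RobotFileParser()
    parser.set_url(f"{site}/robots.txt")
    parser.read()
    return {bot: parser.can_fetch(bot, f"{site}{path}") for bot in AI_BOTS}

def status_probe(url: str) -> dict:
    """Fetch the URL once per user agent and record the HTTP status code."""
    agents = {"Browser": "Mozilla/5.0", **AI_BOTS}
    results = {}
    for name, ua in agents.items():
        request = urllib.request.Request(url, headers={"User-Agent": ua})
        try:
            with urllib.request.urlopen(request, timeout=10) as response:
                results[name] = response.status
        except urllib.error.HTTPError as err:
            results[name] = err.code   # e.g. 403 from a bot-blocking firewall
        except urllib.error.URLError:
            results[name] = None       # DNS, TLS, or timeout failure
    return results
```

Comparing the browser result with the bot results is the point: a page that returns 200 to a browser but 403 to GPTBot has an access problem no amount of content optimization will fix.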
This tool doesn't optimize anything. It doesn't promise rankings, citations, or traffic. It shows you AI readiness status—what's working, what's not, and what might need attention.
Think of it as the technical pre-flight check before optimization strategies.
We're explicit about what we check and what we don't:
robots.txt is advisory, not access control. Noindex directives can suppress indexing even when crawl access is allowed. CrUX data may be unavailable for low-traffic sites. Performance thresholds affect the likelihood of successful fetches, not certainty.
We communicate limitations clearly because overselling tools breaks trust.
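The noindex caveat above is checkable in the same spirit. A rough sketch, again standard-library Python only, that looks for the two common signals; the regex is naive, and a real check would use an HTML parser and handle attribute order and per-bot meta names:

```python
# Sketch of page-level noindex detection; not the tool's actual implementation.
import re
import urllib.request

def noindex_signals(url: str) -> dict:
    """Look for noindex in the X-Robots-Tag header and the robots meta tag."""
    request = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(request, timeout=10) as response:
        header = response.headers.get("X-Robots-Tag", "") or ""
        html = response.read().decode("utf-8", errors="replace")
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html,
        re.IGNORECASE,
    )
    return {
        "header_noindex": "noindex" in header.lower(),
        "meta_noindex": bool(meta and "noindex" in meta.group(1).lower()),
    }
```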
If you need traditional SEO audits, content optimization, or competitive analysis—this isn't the tool for that.
If you need to know whether technical issues could be preventing AI access—this is exactly the tool for that.
This tool was created by Andre Guelmann, a Website Growth & Technical SEO Consultant with 15+ years of experience helping companies improve organic visibility and website operations.
After working with dozens of sites that were losing traffic to preventable technical issues, and watching the same patterns emerge with AI visibility, it became clear that most teams needed better diagnostics, not more optimization advice.
This tool is that diagnostic layer: real data about whether AI crawlers can access your site, whether your content is visible in raw HTML, and whether performance issues might be causing fetch failures.
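As a rough illustration of the raw-HTML visibility idea, here is a sketch under the same stdlib-only assumption; the GPTBot user-agent string and the text-ratio heuristic are illustrative, not how the tool actually scores pages:

```python
# Heuristic sketch: how much visible text exists before any JavaScript runs?
import re
import urllib.request

def raw_text_ratio(url: str, user_agent: str = "GPTBot/1.0") -> float:
    """Rough share of the raw HTML response that is visible text."""
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(request, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    # Drop scripts and styles, strip remaining tags, collapse whitespace.
    body = re.sub(r"(?is)<(script|style)[^>]*>.*?</\1>", "", html)
    text = re.sub(r"(?s)<[^>]+>", " ", body)
    text = re.sub(r"\s+", " ", text).strip()
    return len(text) / max(len(html), 1)
```

A ratio near zero usually means the initial response is a JavaScript shell, so crawlers that don't render JavaScript would see little or no content; where to draw the threshold is a judgment call.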