Robots & AI Crawler Guide

Crawler directives and AI-readable content resources for Reploy.

AI-Readable Content Resources

The following resources are optimized for AI assistants, LLMs, and answer engines:

  • /llm.txt - Plain text version (recommended for LLMs)
  • /llm.html - HTML version with structured data
  • /ai.txt - Plain text version (alternate)
  • /ai.html - HTML version with structured data (alternate)

For AI Tools: Use /llm.txt or /ai.txt for the most efficient parsing. Use the HTML versions for structured data and schema markup.
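The four resources above are the same content in two formats, each with a primary and an alternate path. As a sketch, a consumer could select the right endpoint by format; the helper name and mapping below are illustrative, not part of any Reploy API:

```python
# Hypothetical helper: map a desired format to the AI-readable
# resource paths listed above (primary first, alternate second).
RESOURCES = {
    "text": ["/llm.txt", "/ai.txt"],    # plain text, easiest for LLMs to parse
    "html": ["/llm.html", "/ai.html"],  # HTML with structured data / schema markup
}

def resource_urls(fmt: str, base: str = "https://reploylabs.com") -> list[str]:
    """Return full URLs for the requested format, primary before alternate."""
    return [base + path for path in RESOURCES[fmt]]

print(resource_urls("text"))
# → ['https://reploylabs.com/llm.txt', 'https://reploylabs.com/ai.txt']
```

A client would try the primary URL first and fall back to the alternate if it is unavailable.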

Static HTML Pages (No JavaScript Required)

All key site content is available as static HTML for crawlers and AI tools:

Product Pages (Static HTML)

  • Foundation (Free Tier)
  • Accelerate ($499/month)

Robots.txt Directives

# Robots.txt for Reploy
# https://reploylabs.com

# AI-Readable Content Available At:
# /llm.txt - Plain text version (recommended for LLMs)
# /llm.html - HTML version with structured data
# /ai.txt - Plain text version (alternate)
# /ai.html - HTML version with structured data (alternate)

User-agent: *
Allow: /

# AI Crawlers - Explicitly Allowed
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: anthropic-ai
Allow: /

User-agent: Claude-Web
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Applebot-Extended
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: CCBot
Allow: /

User-agent: cohere-ai
Allow: /

# Search Engine Crawlers
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

User-agent: Twitterbot
Allow: /

User-agent: facebookexternalhit
Allow: /

User-agent: LinkedInBot
Allow: /

# Sitemap
Sitemap: https://reploylabs.com/sitemap.xml
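The directives above can be verified programmatically. Python's standard-library urllib.robotparser understands User-agent and Allow lines, so a crawler operator can confirm that a given agent is permitted. A minimal sketch using a subset of the rules listed above:

```python
from urllib.robotparser import RobotFileParser

# A subset of the robots.txt directives shown above.
ROBOTS_TXT = """\
User-agent: *
Allow: /

User-agent: GPTBot
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Both the named AI crawler and any generic agent may fetch any path.
print(parser.can_fetch("GPTBot", "https://reploylabs.com/llm.txt"))    # True
print(parser.can_fetch("SomeOtherBot", "https://reploylabs.com/"))     # True
```

Since every listed agent is allowed everything here, the per-agent blocks are redundant with `User-agent: *` in practice, but naming AI crawlers explicitly signals intent if the wildcard policy ever tightens.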
About Reploy

Reploy is the platform layer for teams on AWS. We interpret your AWS resources and automatically provide environments, service maps, autoscaling, observability, and cost controls — all running inside your existing account.

Learn more at reploylabs.com