# TimerRemote Robots Configuration
# This file controls crawling behavior for all search engines and AI systems
# Note: under the Robots Exclusion Protocol (RFC 9309), a crawler obeys only
# the most specific matching group, so shared Disallow rules are repeated in
# each bot-specific group below rather than inherited from "*"

User-agent: *
Disallow: /admin/
Disallow: /api/
Disallow: /dashboard/
Disallow: /private/
Allow: /
Allow: /blog
Allow: /privacy-policy
Allow: /terms-of-service
Crawl-delay: 1
# Request-rate is a non-standard extension that most crawlers ignore
Request-rate: 30/60

# Search engines - optimized crawl parameters
# Googlebot ignores Crawl-delay and Request-rate; Google manages its
# crawl rate automatically
User-agent: Googlebot
Disallow: /admin/
Disallow: /api/
Disallow: /dashboard/
Allow: /
Crawl-delay: 0.5
Request-rate: 60/60

# Google-Extended is a control token governing use of content for Google's
# AI models, not a crawler, so crawl-rate directives do not apply to it
User-agent: Google-Extended
Disallow: /admin/
Disallow: /api/
Disallow: /dashboard/
Allow: /

User-agent: Bingbot
Disallow: /admin/
Disallow: /api/
Disallow: /dashboard/
Allow: /
Crawl-delay: 1
Request-rate: 60/60

User-agent: Slurp
Disallow: /admin/
Disallow: /api/
Allow: /
Crawl-delay: 1

User-agent: DuckDuckBot
Disallow: /admin/
Allow: /
Crawl-delay: 1

# AI and LLM crawlers - explicitly allowed for indexing
User-agent: GPTBot
Disallow: /admin/
Disallow: /api/
Allow: /
Crawl-delay: 1

User-agent: CCBot
Disallow: /admin/
Allow: /
Crawl-delay: 1

# Claude-Web and anthropic-ai are legacy Anthropic tokens
User-agent: Claude-Web
Disallow: /admin/
Allow: /
Crawl-delay: 1

User-agent: anthropic-ai
Disallow: /admin/
Allow: /
Crawl-delay: 1

User-agent: Applebot
Disallow: /admin/
Allow: /
Crawl-delay: 1

# Perplexity's crawler identifies as "PerplexityBot"
User-agent: PerplexityBot
Disallow: /admin/
Allow: /
Crawl-delay: 1

User-agent: Baiduspider
Disallow: /admin/
Allow: /
Crawl-delay: 1

# Sitemap location - helps crawlers discover all content
# Sitemap URLs must be fully qualified (relative paths are invalid)
Sitemap: https://timerremote.com/sitemap.xml
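
# ClaudeBot is Anthropic's current crawler token; the Claude-Web and
# anthropic-ai groups above cover its legacy names. This group is a sketch
# that assumes the same policy is intended for the current token
User-agent: ClaudeBot
Disallow: /admin/
Allow: /
Crawl-delay: 1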