User Agent Directory
A comprehensive database of 124 verified user agents crawling the web. Identify AI bots, SEO crawlers, and search engine spiders, and understand their behavior patterns.
- Total User Agents: 124
- AI Crawlers: 17
- Search Engines: 36
- SEO Tools: 24
Category: seo
Vendor: ContentKing
User Agent: Mozilla/5.0 (compatible; ContentKing/1.0; +https://www.contentkingapp.com)

Category: seo
Vendor: SISTRIX
User Agent: Mozilla/5.0 (compatible; SISTRIX Crawler; +https://crawler.sistrix.net/)

Category: search
Vendor: Google
User Agent: Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

Category: search
Vendor: Google
User Agent: Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; AdsBot-Google-Mobile; +http://www.google.com/mobile/adsbot.html)

Category: search
Vendor: Yandex
User Agent: Mozilla/5.0 (compatible; YandexImages/3.0; +http://yandex.com/bots)

Category: seo
Vendor: Semrush
User Agent: Mozilla/5.0 (compatible; SemrushBot-SA/0.97; +http://www.semrush.com/bot.html)

Category: seo
Vendor: Webmaster Brain
User Agent: Mozilla/5.0 (compatible; WBSearchBot/1.1; +http://www.webmasterbrain.com/bot/)

Category: monitoring
Vendor: Datadog
User Agent: Mozilla/5.0 (X11; Linux x86_64; DatadogSynthetics) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36

Category: monitoring
Vendor: New Relic
User Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/100.0.4896.60 Safari/537.36 NewRelicSynthetics/1.0

Category: security
Vendor: Censys
User Agent: Mozilla/5.0 (compatible; CensysInspect/1.1; +https://about.censys.io/)

Category: security
Vendor: Verisign
User Agent: Mozilla/5.0 (compatible; Verisign Spider; +http://www.verisign.com/)
What Are User Agents?
User agents are strings that identify the software making requests to your website. They help servers understand what type of client is accessing the content: a browser, a search engine crawler, an SEO tool, or an AI bot.
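In practice, a server reads the User-Agent request header and matches it against known bot tokens. Here is a minimal Python sketch of that lookup; the token table (drawn from a few entries above) and the classify_user_agent helper are illustrative assumptions, not an official API.

```python
# Minimal sketch: match an incoming User-Agent header against known bot
# tokens from a directory like this one. The token list is illustrative.
KNOWN_BOT_TOKENS = {
    "Googlebot": ("search", "Google"),
    "AdsBot-Google-Mobile": ("search", "Google"),
    "YandexImages": ("search", "Yandex"),
    "SemrushBot": ("seo", "Semrush"),
    "SISTRIX": ("seo", "SISTRIX"),
    "CensysInspect": ("security", "Censys"),
}

def classify_user_agent(ua):
    """Return (category, vendor) for a known bot, or None for other clients."""
    for token, meta in KNOWN_BOT_TOKENS.items():
        if token in ua:
            return meta
    return None

ua = ("Mozilla/5.0 (compatible; SemrushBot-SA/0.97; "
      "+http://www.semrush.com/bot.html)")
print(classify_user_agent(ua))  # ('seo', 'Semrush')
```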
Why This Matters
- Control which bots can access your content
- Identify AI crawlers harvesting data
- Monitor SEO tools analyzing your site
- Understand your traffic sources better
How to Use This Data
- Create robots.txt rules: Block or allow specific bots (see the robots.txt sketch after this list)
- Server configuration: Set up rate limiting for aggressive crawlers (rate-limiting sketch below)
- Analytics filtering: Exclude bot traffic from reports (log-filtering sketch below)
- Security monitoring: Identify suspicious crawler activity (DNS verification sketch below)
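For the first item, a minimal sketch that renders robots.txt groups from a list of bot tokens. The blocked tokens and the blanket Disallow are illustrative choices; the User-agent/Disallow directives themselves are standard robots.txt syntax.

```python
# Minimal sketch: render robots.txt rules that block selected bots.
# The bot tokens and the disallowed path are illustrative choices.
BLOCKED_BOTS = ["SemrushBot-SA", "SISTRIX Crawler", "WBSearchBot"]

def render_robots_txt(blocked, disallow="/"):
    lines = []
    for bot in blocked:
        lines.append(f"User-agent: {bot}")
        lines.append(f"Disallow: {disallow}")
        lines.append("")  # blank line separates robots.txt groups
    lines.append("User-agent: *")  # everyone else stays allowed
    lines.append("Disallow:")      # empty Disallow means allow all
    return "\n".join(lines) + "\n"

print(render_robots_txt(BLOCKED_BOTS))
```

Keep in mind that robots.txt matches the bot's self-declared token, so it only restrains crawlers that honor the protocol; hard blocking belongs at the server layer.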
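For server configuration, rate limiting is usually done at the proxy layer (nginx's limit_req, for example), but an in-process sketch shows the mechanics. The 60-second window and 30-request limit are arbitrary assumptions.

```python
import time
from collections import defaultdict, deque

WINDOW = 60.0  # seconds; arbitrary illustrative value
LIMIT = 30     # max requests per crawler token per window; also arbitrary

_hits = defaultdict(deque)  # crawler token -> timestamps of recent requests

def allow_request(bot_token):
    """Return False once a crawler exceeds LIMIT requests in WINDOW seconds."""
    now = time.monotonic()
    q = _hits[bot_token]
    while q and now - q[0] > WINDOW:  # evict hits that left the window
        q.popleft()
    if len(q) >= LIMIT:
        return False  # caller should answer 429 Too Many Requests
    q.append(now)
    return True
```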
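For analytics filtering, one approach is to drop bot lines from the access log before reporting. This sketch filters log lines by User-Agent substring; the regex covers tokens from the directory above, and the log path in the usage comment is an assumption.

```python
import re

# Minimal sketch: drop bot requests from an access log before feeding
# it into analytics. The pattern covers tokens from the directory above.
BOT_PATTERN = re.compile(
    r"Googlebot|AdsBot-Google|YandexImages|SemrushBot|SISTRIX"
    r"|WBSearchBot|CensysInspect|DatadogSynthetics|NewRelicSynthetics",
    re.IGNORECASE,
)

def human_lines(log_path):
    """Yield only log lines whose User-Agent matches no known bot token."""
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            if not BOT_PATTERN.search(line):
                yield line

# Usage (the path is an assumption for illustration):
# for line in human_lines("/var/log/nginx/access.log"): ...
```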
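For security monitoring, remember that User-Agent strings are trivially spoofed. Google documents reverse DNS plus a forward confirmation as the way to verify genuine Googlebot traffic, and the same pattern works for other major crawlers that publish their hostnames. A sketch:

```python
import socket

# Minimal sketch: verify that an IP claiming to be Googlebot belongs to
# Google. Reverse-resolve the IP, check the hostname suffix, then
# forward-resolve the hostname and confirm it maps back to the same IP.
def is_real_googlebot(ip):
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse DNS
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]  # forward DNS
    except OSError:
        return False
    return ip in forward_ips

# A request whose UA says "Googlebot" but fails this check is likely spoofed.
```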
Tip: Use the Robots.txt Generator to easily create rules for these user agents.