
User Agent Directory

A comprehensive database of 124 verified user agents crawling the web. Identify AI bots, SEO crawlers, and search engine spiders, and understand their behavior patterns.

  • 124 total user agents
  • 17 AI crawlers
  • 36 search engines
  • 24 SEO tools
Vendor: ContentKing
Mozilla/5.0 (compatible; ContentKing/1.0; +https://www.contentkingapp.com)
#seo #monitoring #real-time #crawler
robots.txt: ContentKing
Vendor: Lumar
Mozilla/5.0 (compatible; Deepcrawl/3.5; +https://www.lumar.io/)
#seo #technical #audit #crawler
robots.txt: Deepcrawl
Vendor: OnCrawl
Mozilla/5.0 (compatible; OnCrawl/1.0; +https://www.oncrawl.com/)
#seo #technical #data #crawler
robots.txt: OnCrawl
Vendor: Botify
Mozilla/5.0 (compatible; botify; +https://www.botify.com)
#seo #enterprise #analytics #crawler
robots.txt: botify
Vendor: Ryte
Mozilla/5.0 (compatible; RyteBot/1.0; +https://www.ryte.com/)
#seo #website-quality #crawler #german
robots.txt: RyteBot
Vendor: SISTRIX
Mozilla/5.0 (compatible; SISTRIX Crawler; +https://crawler.sistrix.net/)
#seo #visibility #german #crawler
robots.txt: SISTRIX Crawler
Vendor: Xenu
Xenu Link Sleuth/1.3.9
#seo #broken-links #desktop #crawler
robots.txt: Xenu Link Sleuth
Vendor: W3C
Jigsaw/2.3.0 W3C_CSS_Validator_JFouffa/2.0
#validation #css #w3c #standards
robots.txt: W3C_CSS_Validator
Vendor: W3C
W3C-checklink/5.0.0
#validation #links #w3c #checker
robots.txt: W3C-checklink
Vendor: W3C
Mozilla/5.0 (compatible; Validator.nu/LV)
#validation #html #w3c #standards
robots.txt: Validator.nu
Vendor: Google
Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
#search #google #mobile #crawler
robots.txt: Googlebot
Vendor: Google
Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; AdsBot-Google-Mobile; +http://www.google.com/mobile/adsbot.html)
#ads #google #mobile #quality-check
robots.txt: AdsBot-Google-Mobile
Vendor: Yandex
Mozilla/5.0 (compatible; YandexImages/3.0; +http://yandex.com/bots)
#search #yandex #images #russian
robots.txt: YandexImages
Vendor: Baidu
Baiduspider-image+(+http://www.baidu.com/search/spider.htm)
#search #baidu #images #chinese
robots.txt: Baiduspider-image
Vendor: Semrush
Mozilla/5.0 (compatible; SemrushBot-SA/0.97; +http://www.semrush.com/bot.html)
#seo #site-audit #semrush #crawler
robots.txt: SemrushBot-SA
Vendor: Pipl
Mozilla/5.0 (compatible; PiplBot; +http://www.pipl.com/bot/)
#people-search #data-aggregation #identity #crawler
robots.txt: PiplBot
Vendor: Webmaster Brian
Mozilla/5.0 (compatible; WBSearchBot/1.1; +http://www.webmasterbrain.com/bot/)
#seo #search #indexing #crawler
robots.txt: WBSearchBot
Vendor: Turnitin
TurnitinBot (https://turnitin.com/robot/crawlerinfo.html)
#plagiarism #academic #content-checking #crawler
robots.txt: TurnitinBot
Vendor: ZoomInfo
ZoominfoBot (zoominfobot at zoominfo dot com)
#b2b #data-collection #business-intelligence #crawler
robots.txt: ZoominfoBot
Vendor: Datadog
Mozilla/5.0 (X11; Linux x86_64; DatadogSynthetics) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36
#monitoring #synthetics #apm #observability
robots.txt: DatadogSynthetics
Vendor: New Relic
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/100.0.4896.60 Safari/537.36 NewRelicSynthetics/1.0
#monitoring #synthetics #apm #performance
robots.txt: NewRelicSynthetics
Vendor: Censys
Mozilla/5.0 (compatible; CensysInspect/1.1; +https://about.censys.io/)
#security #scanning #research #internet-scanning
robots.txt: CensysInspect
Vendor: Neeva
Mozilla/5.0 (compatible; Neevabot/1.0; +https://neeva.com/neevabot)
#search #privacy #ad-free #crawler
robots.txt: Neevabot
Vendor: Verisign
Mozilla/5.0 (compatible; Verisign Spider; +http://www.verisign.com/)
#security #dns #domain-verification #crawler
robots.txt: Verisign Spider

What are User Agents?

User agents are strings that identify the software making requests to your website. They tell servers what type of client is requesting the content, whether that's a browser, a search engine crawler, an SEO tool, or an AI bot.

Why This Matters

  • Control which bots can access your content
  • Identify AI crawlers harvesting data
  • Monitor SEO tools analyzing your site
  • Understand your traffic sources better

How to Use This Data

  • Create robots.txt rules: Block or allow specific bots
  • Server configuration: Set up rate limiting for aggressive crawlers
  • Analytics filtering: Exclude bot traffic from reports
  • Security monitoring: Identify suspicious crawler activity
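The robots.txt tokens listed with each entry above are what you match against in a `User-agent:` line. A small example combining a few tokens from this directory (the paths are hypothetical placeholders):

```
# Allow Google's smartphone crawler everywhere
User-agent: Googlebot
Allow: /

# Keep Semrush's site-audit crawler out of a staging area
User-agent: SemrushBot-SA
Disallow: /staging/

# Block a data-aggregation bot entirely
User-agent: PiplBot
Disallow: /
```

Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but blocking misbehaving ones requires server-side measures such as rate limiting or IP blocking.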