User Agent Directory | Bot Database | CL SEO

User Agent Directory

A comprehensive database of 124 verified user agents crawling the web. Identify AI bots, SEO crawlers, and search engine spiders, and understand their behavior patterns. The entries below are a sample of the full directory.

  • Total user agents: 124
  • AI crawlers: 17
  • Search engines: 36
  • SEO tools: 24
Category: monitoring
Vendor: Site24x7
User agent: Mozilla/5.0 (compatible; Site24x7/1.0; +https://www.site24x7.com/)
Tags: #monitoring #uptime #performance #apm
robots.txt: Site24x7

Category: monitoring
Vendor: GTmetrix
User agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36 GTmetrix
Tags: #performance #pagespeed #monitoring #testing
robots.txt: GTmetrix

Vendor: ByteDance
User agent: Mozilla/5.0 (compatible; Bytespider; [email protected])
Tags: #bytedance #tiktok #search #chinese
robots.txt: Bytespider

Vendor: You.com
User agent: Mozilla/5.0 (compatible; YouBot/1.0; +https://you.com/bot)
Tags: #ai #search #answer-engine #crawler
robots.txt: YouBot

Vendor: Cohere
User agent: Mozilla/5.0 (compatible; Cohere-AI/1.0; +https://cohere.com/)
Tags: #ai #nlp #training #crawler
robots.txt: Cohere-Ai

Vendor: Google
User agent: GoogleOther
Tags: #google #research #crawler #generic
robots.txt: GoogleOther

Vendor: Google
User agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36 Chrome-Lighthouse
Tags: #performance #audit #pagespeed #google
robots.txt: Chrome-Lighthouse

Vendor: Google
User agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko; Google Page Speed Insights) Chrome/120.0.0.0 Safari/537.36
Tags: #performance #pagespeed #google #testing
robots.txt: Google Page Speed Insights

Vendor: Internet Archive
User agent: ia_archiver (+http://www.alexa.com/site/help/webmasters; [email protected])
Tags: #archive #wayback #preservation #history
robots.txt: ia_archiver

Category: other
Vendor: GNU
User agent: Wget/1.21.3
Tags: #download #cli #tool #gnu
robots.txt: Wget

Category: other
Vendor: curl
User agent: curl/7.81.0
Tags: #download #cli #tool #testing
robots.txt: curl

Vendor: Python
User agent: Python-urllib/3.11
Tags: #python #library #scraping #programming
robots.txt: Python-urllib

Vendor: Python
User agent: python-requests/2.31.0
Tags: #python #library #http #programming
robots.txt: python-requests

Vendor: Go
User agent: Go-http-client/1.1
Tags: #golang #http #programming #client
robots.txt: Go-http-client

Category: other
Vendor: Oracle
User agent: Java/17.0.8
Tags: #java #programming #http #client
robots.txt: Java

Category: other
Vendor: Scrapy
User agent: Scrapy/2.11.0 (+https://scrapy.org)
Tags: #scraping #python #framework #crawler
robots.txt: Scrapy

Vendor: Apache
User agent: Apache-HttpClient/4.5.14 (Java/17.0.8)
Tags: #java #apache #http #client
robots.txt: Apache-HttpClient

Category: other
Vendor: Postman
User agent: PostmanRuntime/7.35.0
Tags: #api #testing #development #tool
robots.txt: PostmanRuntime

Vendor: Kong
User agent: insomnia/2023.5.8
Tags: #api #testing #development #tool
robots.txt: insomnia

Category: other
Vendor: Zoom
User agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2227.1 Safari/537.36 ZoomBot
Tags: #zoom #meeting #preview #chat
robots.txt: ZoomBot

Vendor: Grammarly
User agent: Mozilla/5.0 (compatible; GrammarlyBot/1.0; +https://www.grammarly.com/bot)
Tags: #writing #grammar #ai #assistant
robots.txt: GrammarlyBot

Vendor: Algolia
User agent: Algolia Crawler
Tags: #search #indexing #algolia #crawler
robots.txt: Algolia Crawler

Vendor: Diffbot
User agent: Mozilla/5.0 (compatible; Diffbot/0.1; +http://www.diffbot.com/our-apis/crawler/)
Tags: #ai #extraction #knowledge-graph #crawler
robots.txt: Diffbot

Vendor: Siteimprove
User agent: Mozilla/5.0 (compatible; SITEIMPROVE)
Tags: #accessibility #seo #quality #crawler
robots.txt: SITEIMPROVE
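
The robots.txt value shown for each entry is the token you would target in a robots.txt group. As a minimal sketch (the two bots blocked here are arbitrary examples from the list above, and compliance with robots.txt is voluntary), a file like the following blocks those crawlers while leaving the rest of the site open:

    # Block two crawlers from this directory by their robots.txt tokens
    User-agent: Bytespider
    Disallow: /

    User-agent: Diffbot
    Disallow: /

    # All other user agents may crawl normally
    User-agent: *
    Allow: /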

What are User Agents?

User agents are strings that identify the software making requests to your website. They help servers understand what type of client is accessing the content, whether it's a browser, a search engine crawler, an SEO tool, or an AI bot.
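
For example, a server-side check can match the User-Agent header against known tokens from a directory like this one. The short Python sketch below is only an illustration: the token-to-category mapping is a hand-picked subset of the entries above, and the function name is made up for this example.

    # Minimal sketch: classify a User-Agent header against a few known bot tokens.
    # The mapping below is a small illustrative subset of this directory.
    KNOWN_BOT_TOKENS = {
        "Bytespider": "ai",
        "YouBot": "ai",
        "Diffbot": "ai",
        "GoogleOther": "search",
        "Chrome-Lighthouse": "performance",
        "Scrapy": "scraper",
        "curl": "cli-tool",
    }

    def classify_user_agent(ua_header: str) -> str:
        """Return a rough category for a User-Agent string, or 'unknown'."""
        ua = ua_header.lower()
        for token, category in KNOWN_BOT_TOKENS.items():
            if token.lower() in ua:
                return category
        return "unknown"

    print(classify_user_agent("Mozilla/5.0 (compatible; YouBot/1.0; +https://you.com/bot)"))  # -> ai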

Why This Matters

  • Control which bots can access your content
  • Identify AI crawlers harvesting data
  • Monitor SEO tools analyzing your site
  • Understand your traffic sources better

How to Use This Data

  • Create robots.txt rules: Block or allow specific bots
  • Server configuration: Set up rate limiting for aggressive crawlers (see the sketch after this list)
  • Analytics filtering: Exclude bot traffic from reports
  • Security monitoring: Identify suspicious crawler activity
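
As one way to implement the server-side items above, the nginx snippet below rate-limits a few of the noisier tokens from this directory and hard-blocks one of them. It is a sketch under assumptions, not a recommended policy: the chosen bots, the 1 request/second limit, and the proxy_pass backend are placeholders.

    # http{} context: flag throttled bots by User-Agent, then limit by that flag.
    map $http_user_agent $throttled_bot {
        default            "";
        ~*Bytespider       1;
        ~*Scrapy           1;
        ~*python-requests  1;
    }

    # Requests with an empty key are not rate-limited.
    limit_req_zone $throttled_bot zone=botlimit:10m rate=1r/s;

    server {
        listen 80;

        location / {
            # Hard-block one crawler outright.
            if ($http_user_agent ~* "Diffbot") {
                return 403;
            }

            # Throttle the mapped bots; other clients are unaffected.
            limit_req zone=botlimit burst=5 nodelay;

            # Placeholder upstream for this example.
            proxy_pass http://127.0.0.1:8080;
        }
    }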