User Agent Directory | Bot Database | CL SEO

User Agent Directory

A comprehensive database of 124 verified user agents crawling the web. Identify AI bots, SEO crawlers, and search engine spiders, and understand their behavior patterns.

  • 124 total user agents
  • 17 AI crawlers
  • 36 search engines
  • 24 SEO tools
Vendor: Swiftype
Swiftbot/1.0 (swiftype.com)
#search #enterprise #site-search #crawler
robots.txt: Swiftbot
Vendor: Cliqz
Mozilla/5.0 (compatible; Cliqzbot/3.0; +http://cliqz.com/bot)
#search #privacy #german #browser
robots.txt: Cliqzbot
Vendor: Surdotly
Mozilla/5.0 (compatible; SurdotlyBot/1.0; +http://sur.ly/bot.html)
#link-safety #url-preview #security #crawler
robots.txt: SurdotlyBot
Vendor: MegaIndex
Mozilla/5.0 (compatible; MegaIndex.ru/2.0; +http://megaindex.com/crawler)
#seo #russian #backlinks #crawler
robots.txt: MegaIndex
Vendor: Mail.Ru
Mozilla/5.0 (compatible; Mail.RU_Bot/2.0; +http://go.mail.ru/bot)
#search #russian #mail-ru #crawler
robots.txt: Mail.RU_Bot
Vendor: Sputnik
Mozilla/5.0 (compatible; SputnikBot/2.3; +http://corp.sputnik.ru/webmaster)
#search #russian #sputnik #crawler
robots.txt: SputnikBot
Vendor: ltx71
ltx71 - (http://ltx71.com/)
#hosting #security #monitoring #crawler
robots.txt: ltx71
Vendor: Netcraft
Mozilla/5.0 (compatible; NetcraftSurveyAgent/1.0; [email protected])
#security #survey #web-analysis #anti-phishing
robots.txt: NetcraftSurveyAgent
Vendor: 360 Search
Mozilla/5.0 (compatible; 360Spider/3.0; +http://www.so.com/help/help_3_2.html)
#search #chinese #360 #security
robots.txt: 360Spider
Vendor: Gigablast
Gigabot/3.0 (http://www.gigablast.com/spider.html)
#search #open-source #independent #crawler
robots.txt: Gigabot
Vendor: Sogou
Sogou web spider/4.0(+http://www.sogou.com/docs/help/webmasters.htm#07)
#search #chinese #sogou #crawler
robots.txt: Sogou web spider
Vendor: Exalead
Mozilla/5.0 (compatible; Exabot/3.0; +http://www.exabot.com/go/robot)
#search #enterprise #french #crawler
robots.txt: Exabot
Vendor: Meta
facebookplatform/1.0 (+http://developers.facebook.com)
#social #facebook #platform #apps
robots.txt: facebookplatform
Vendor: Nextcloud
NextCloud Server Crawler
#cloud #preview #self-hosted #crawler
robots.txt: NextCloud Server Crawler
Vendor: Mojeek
Mozilla/5.0 (compatible; MojeekBot/0.11; +https://www.mojeek.com/bot.html)
#search #privacy #independent #uk
robots.txt: MojeekBot
Vendor: Integral Ad Science
IAS Crawler/1.0 (+https://integralads.com/ias-crawler/)
#advertising #brand-safety #verification #crawler
robots.txt: IAS Crawler
Vendor: Awario
Mozilla/5.0 (compatible; AwarioSmartBot/1.0; +https://awario.com/bot.html)
#social-listening #brand-monitoring #mentions #crawler
robots.txt: AwarioSmartBot
Vendor: Buck
Buck/2.3.1; (+https://app.hypefactors.com/media-monitoring/about-buck)
#media-monitoring #pr #news #crawler
robots.txt: Buck
Vendor: Naver
Mozilla/5.0 (compatible; Yeti/1.1; +http://naver.me/spd)
#search #korean #naver #crawler
robots.txt: Yeti
Vendor: Coc Coc
Mozilla/5.0 (compatible; coccoc/1.0; +http://help.coccoc.com/)
#search #vietnamese #coccoc #browser #crawler
robots.txt: coccoc
Vendor: Brave
Mozilla/5.0 (compatible; BraveBot/1.0; +https://search.brave.com/help/bot)
#search #privacy #brave #independent #crawler
robots.txt: BraveBot
Vendor: Microsoft
Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; WOW64; Trident/6.0; AppInsights)
#microsoft #azure #monitoring #performance #crawler
robots.txt: AppInsights
Vendor: WebPageTest
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/125.0.0.0 Safari/537.36 WebPageTest
#performance #testing #webpagetest #open-source #crawler
robots.txt: WebPageTest
Vendor: Feedly
Feedly/1.0 (+https://feedly.com/fetcher.html; 1 subscriber)
#rss #aggregator #news #reader #crawler
robots.txt: Feedly

What are User Agents?

User agents are strings that identify the software making requests to your website. They help servers understand what type of client is accessing the content: a browser, a search engine crawler, an SEO tool, or an AI bot.
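A server-side check can classify a request by matching its User-Agent header against known bot tokens. A minimal sketch in Python, using a few entries from the directory above; the function name and the subset of bots included are illustrative, not a complete detection scheme:

```python
# Map a known bot token to its category.
# Tokens and categories taken from a few directory entries above.
BOT_PATTERNS = {
    "MojeekBot": "search",
    "Sogou web spider": "search",
    "IAS Crawler": "advertising",
    "AwarioSmartBot": "social-listening",
    "Feedly": "rss",
}

def classify_user_agent(ua: str) -> str:
    """Return the category of a known bot, or 'unknown' for anything else."""
    ua_lower = ua.lower()
    for token, category in BOT_PATTERNS.items():
        # Substring match is enough: bot tokens appear verbatim in the UA string.
        if token.lower() in ua_lower:
            return category
    return "unknown"
```

In practice you would load the full directory rather than a hand-picked dictionary, but the matching logic stays the same: bots identify themselves with a stable token inside an otherwise variable UA string.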

Why This Matters

  • Control which bots can access your content
  • Identify AI crawlers harvesting data
  • Monitor SEO tools analyzing your site
  • Understand your traffic sources better

How to Use This Data

  • Create robots.txt rules: Block or allow specific bots
  • Server configuration: Set up rate limiting for aggressive crawlers
  • Analytics filtering: Exclude bot traffic from reports
  • Security monitoring: Identify suspicious crawler activity
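The robots.txt token listed with each directory entry is what goes in the `User-agent` line of your rules. A minimal sketch using tokens from the entries above; which bots you block or restrict is your call, the choices here are purely illustrative:

```
# Block a backlink crawler entirely
User-agent: MegaIndex
Disallow: /

# Allow a search engine, but keep it out of admin pages
User-agent: MojeekBot
Disallow: /admin/

# Everyone else: full access
User-agent: *
Disallow:
```

Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but aggressive ones may not, which is why server-side rate limiting remains a useful complement.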