User Agent Directory | CL SEO

User Agent Directory

A comprehensive database of 124 verified user agents crawling the web. Identify AI bots, SEO crawlers, and search engine spiders, and understand their behavior patterns.

Total User Agents: 124
AI Crawlers: 17
Search Engines: 36
SEO Tools: 24

Found 24 user agents in category "seo"

Vendor: Ahrefs
Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)
#seo #backlinks #analytics #crawler
robots.txt: AhrefsBot
Vendor: Semrush
Mozilla/5.0 (compatible; SemrushBot/7~bl; +http://www.semrush.com/bot.html)
#seo #analytics #marketing #crawler
robots.txt: SemrushBot
Vendor: Majestic
Mozilla/5.0 (compatible; MJ12bot/v1.4.8; http://mj12bot.com/)
#seo #backlinks #majestic #crawler
robots.txt: MJ12bot
Vendor: Moz
Mozilla/5.0 (compatible; DotBot/1.2; +https://opensiteexplorer.org/dotbot; [email protected])
#seo #moz #opensiteexplorer #crawler
robots.txt: DotBot
Vendor: Moz
rogerbot/1.0 (http://moz.com/help/pro/rogerbot-crawler)
#seo #moz #site-audit #crawler
robots.txt: rogerbot
Vendor: Screaming Frog
Screaming Frog SEO Spider/20.0
#seo #audit #desktop #spider
robots.txt: Screaming Frog SEO Spider
Vendor: SEOkicks
Mozilla/5.0 (compatible; SEOkicks; +https://www.seokicks.de/robot.html)
#seo #german #backlinks #crawler
robots.txt: SEOkicks
Vendor: WebMeUp
Mozilla/5.0 (compatible; BLEXBot/1.0; +http://webmeup.com/crawler/)
#seo #backlinks #webmeup #crawler
robots.txt: BLEXBot
Vendor: DataForSEO
Mozilla/5.0 (compatible; DataForSeoBot/1.0; +https://dataforseo.com/dataforseo-bot)
#seo #api #data #crawler
robots.txt: DataForSeoBot
Vendor: GTmetrix
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36 GTmetrix
#performance #pagespeed #monitoring #testing
robots.txt: GTmetrix
Vendor: Google
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36 Chrome-Lighthouse
#performance #audit #pagespeed #google
robots.txt: Chrome-Lighthouse
Vendor: Google
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko; Google Page Speed Insights) Chrome/120.0.0.0 Safari/537.36
#performance #pagespeed #google #testing
robots.txt: Google Page Speed Insights
Vendor: Siteimprove
Mozilla/5.0 (compatible; SITEIMPROVE)
#accessibility #seo #quality #crawler
robots.txt: SITEIMPROVE
Vendor: ContentKing
Mozilla/5.0 (compatible; ContentKing/1.0; +https://www.contentkingapp.com)
#seo #monitoring #real-time #crawler
robots.txt: ContentKing
Vendor: Lumar
Mozilla/5.0 (compatible; Deepcrawl/3.5; +https://www.lumar.io/)
#seo #technical #audit #crawler
robots.txt: Deepcrawl
Vendor: OnCrawl
Mozilla/5.0 (compatible; OnCrawl/1.0; +https://www.oncrawl.com/)
#seo #technical #data #crawler
robots.txt: OnCrawl
Vendor: Botify
Mozilla/5.0 (compatible; botify; +https://www.botify.com)
#seo #enterprise #analytics #crawler
robots.txt: botify
Vendor: Ryte
Mozilla/5.0 (compatible; RyteBot/1.0; +https://www.ryte.com/)
#seo #website-quality #crawler #german
robots.txt: RyteBot
Vendor: SISTRIX
Mozilla/5.0 (compatible; SISTRIX Crawler; +https://crawler.sistrix.net/)
#seo #visibility #german #crawler
robots.txt: SISTRIX Crawler
Vendor: Xenu
Xenu Link Sleuth/1.3.9
#seo #broken-links #desktop #crawler
robots.txt: Xenu Link Sleuth
Vendor: Semrush
Mozilla/5.0 (compatible; SemrushBot-SA/0.97; +http://www.semrush.com/bot.html)
#seo #site-audit #semrush #crawler
robots.txt: SemrushBot-SA
Vendor: Webmaster Brain
Mozilla/5.0 (compatible; WBSearchBot/1.1; +http://www.webmasterbrain.com/bot/)
#seo #search #indexing #crawler
robots.txt: WBSearchBot
Vendor: MegaIndex
Mozilla/5.0 (compatible; MegaIndex.ru/2.0; +http://megaindex.com/crawler)
#seo #russian #backlinks #crawler
robots.txt: MegaIndex
Vendor: Awario
Mozilla/5.0 (compatible; AwarioSmartBot/1.0; +https://awario.com/bot.html)
#social-listening #brand-monitoring #mentions #crawler
robots.txt: AwarioSmartBot

What are User Agents?

User agents are strings that identify the software making requests to your website. They help servers understand what type of client is accessing the content: whether it's a browser, search engine crawler, SEO tool, or AI bot.
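One practical use of the directory above is matching incoming requests against the listed robots.txt tokens. A minimal sketch in Python (the token list is illustrative, drawn from a few entries above, not exhaustive):

```python
# Tokens taken from the "robots.txt" field of entries in the directory above.
# Illustrative subset only; extend with any tokens relevant to your site.
SEO_BOT_TOKENS = ["AhrefsBot", "SemrushBot", "MJ12bot", "DotBot", "rogerbot", "BLEXBot"]

def is_seo_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string contains a known SEO-crawler token."""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in SEO_BOT_TOKENS)

# Example: the Ahrefs user agent from the directory matches its token.
print(is_seo_bot("Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)"))  # True
```

Substring matching on the identifying token is how most server-side bot filters work in practice, since version numbers and surrounding text vary between releases.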

Why This Matters

  • Control which bots can access your content
  • Identify AI crawlers harvesting data
  • Monitor SEO tools analyzing your site
  • Understand your traffic sources better

How to Use This Data

  • Create robots.txt rules: Block or allow specific bots
  • Server configuration: Set up rate limiting for aggressive crawlers
  • Analytics filtering: Exclude bot traffic from reports
  • Security monitoring: Identify suspicious crawler activity
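The robots.txt tokens listed for each bot above plug directly into such rules. A minimal example that blocks two crawlers from the directory, slows a third, and leaves everything else open (the specific bots chosen here are illustrative):

```text
# Block Ahrefs and Majestic backlink crawlers site-wide
User-agent: AhrefsBot
Disallow: /

User-agent: MJ12bot
Disallow: /

# Slow Semrush down instead of blocking it
# (Crawl-delay is non-standard and only honored by some crawlers)
User-agent: SemrushBot
Crawl-delay: 10

# All other bots may crawl everything
User-agent: *
Disallow:
```

Note that robots.txt is advisory: well-behaved crawlers like those cataloged here respect it, but blocking misbehaving bots requires server-side measures such as user-agent filtering or rate limiting.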