UptimeRobot User Agent - UptimeRobot Bot Details | CL SEO

UptimeRobot

Since 2010 · Monitoring · May ignore robots.txt
#monitoring #uptime #health-check #alerts

What is UptimeRobot?

UptimeRobot is one of the most popular website monitoring services, checking websites for uptime from multiple locations worldwide. The bot performs regular HTTP(S) requests to verify site availability and response times, alerting website owners when issues occur. With both free and paid tiers, UptimeRobot monitors millions of websites globally. The service can check different types of monitors including HTTP(S), ping, port, and keyword existence. For website owners, UptimeRobot provides peace of mind with instant notifications about downtime, helping maintain high availability.
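Monitors are normally managed through UptimeRobot's dashboard or its HTTP API. As a rough sketch (the v2 `newMonitor` endpoint and its form fields below follow UptimeRobot's public API documentation, but verify the current parameter names before relying on them), creating an HTTP(S) monitor looks like this:

```python
# Sketch: create an HTTP(S) monitor via UptimeRobot's v2 API.
# Endpoint and field names are assumptions based on the public v2 docs.
import json
from urllib import parse, request

API_URL = "https://api.uptimerobot.com/v2/newMonitor"

def build_new_monitor_payload(api_key: str, name: str, url: str) -> dict:
    """Form fields for a type-1 (HTTP/HTTPS) monitor."""
    return {
        "api_key": api_key,
        "format": "json",
        "type": "1",              # 1 = HTTP(s) monitor
        "friendly_name": name,
        "url": url,
    }

def create_monitor(api_key: str, name: str, url: str) -> dict:
    """POST the form-encoded payload and return the decoded JSON reply."""
    data = parse.urlencode(build_new_monitor_payload(api_key, name, url)).encode()
    with request.urlopen(request.Request(API_URL, data=data)) as resp:
        return json.load(resp)
```

`create_monitor` performs a live request and needs a real API key; `build_new_monitor_payload` can be inspected offline.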

User Agent String

Mozilla/5.0+(compatible; UptimeRobot/2.0; http://www.uptimerobot.com/)

How to Control UptimeRobot

Block Completely

To prevent UptimeRobot from accessing your entire website, add this to your robots.txt file:

# Block UptimeRobot
User-agent: UptimeRobot
Disallow: /

Block Specific Directories

To restrict access to certain parts of your site while allowing others:

User-agent: UptimeRobot
Disallow: /admin/
Disallow: /private/
Disallow: /wp-admin/
Allow: /public/
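The rules above can be sanity-checked with Python's standard-library robots.txt parser before deploying them:

```python
# Verify which paths these robots.txt rules allow for UptimeRobot.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: UptimeRobot
Disallow: /admin/
Disallow: /private/
Disallow: /wp-admin/
Allow: /public/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("UptimeRobot", "/public/status"))  # True  (allowed)
print(rp.can_fetch("UptimeRobot", "/admin/login"))    # False (disallowed)
```

Remember the caveat from this page's header: UptimeRobot may ignore robots.txt entirely, so treat this as documentation of intent rather than enforcement.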

Set Crawl Delay

To slow down the crawl rate (note: not all bots respect this directive):

User-agent: UptimeRobot
Crawl-delay: 10

How to Verify UptimeRobot

Verification Method:
Check UptimeRobot's published IP addresses

Learn more in the official documentation.
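Because any client can fake the user-agent string, IP verification is the reliable check. A minimal sketch, assuming UptimeRobot publishes a plain-text list of monitoring IPs (the exact URL below is an assumption; confirm it in the official documentation):

```python
# Sketch: verify a claimed UptimeRobot request by its source IP.
import ipaddress
from urllib.request import urlopen

# Assumed location of the published IPv4 list; check UptimeRobot's docs.
IP_LIST_URL = "https://uptimerobot.com/inc/files/ips/IPv4.txt"

def load_published_ips(url: str = IP_LIST_URL) -> set:
    """Fetch the published list and return it as a set of address strings."""
    with urlopen(url) as resp:
        return {line.strip() for line in resp.read().decode().splitlines()
                if line.strip()}

def is_uptimerobot_ip(remote_addr: str, published: set) -> bool:
    """True when the remote address appears in the published list."""
    ipaddress.ip_address(remote_addr)  # raises ValueError on malformed input
    return remote_addr in published
```

In production you would cache the fetched list rather than downloading it on every request.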

Detection Patterns

Multiple ways to detect UptimeRobot in your application:

Basic Pattern

/UptimeRobot/i

Strict Pattern

/^Mozilla\/5\.0\+\(compatible; UptimeRobot\/2\.0; http:\/\/www\.uptimerobot\.com\/\)$/

Flexible Pattern

/UptimeRobot[\s\/]?[\d.]*/i

Vendor Match

/uptimerobot\.com/i
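The patterns can be exercised against UptimeRobot's published user-agent string. In Python's regex syntax the `/` delimiters and escaped slashes are unnecessary; note the literal `+` characters the string contains:

```python
# Run the basic and strict detection patterns against the published UA string.
import re

UA = "Mozilla/5.0+(compatible; UptimeRobot/2.0; http://www.uptimerobot.com/)"

basic = re.compile(r"UptimeRobot", re.IGNORECASE)
strict = re.compile(r"^Mozilla/5\.0\+\(compatible; UptimeRobot/2\.0; "
                    r"http://www\.uptimerobot\.com/\)$")

print(bool(basic.search(UA)))   # True
print(bool(strict.match(UA)))   # True
```

The strict pattern breaks silently if UptimeRobot ever bumps its version number, so the basic pattern is the safer default.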

Implementation Examples

// PHP Detection for UptimeRobot
function detect_uptimerobot() {
    $user_agent = $_SERVER['HTTP_USER_AGENT'] ?? '';
    $pattern = '/UptimeRobot/i';
    if (preg_match($pattern, $user_agent)) {
        // Log the detection
        error_log('UptimeRobot detected from IP: ' . $_SERVER['REMOTE_ADDR']);
        // Set cache headers
        header('Cache-Control: public, max-age=3600');
        header('X-Robots-Tag: noarchive');
        // Optional: serve a cached version if one exists
        $cache_file = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
        if (file_exists($cache_file)) {
            readfile($cache_file);
            exit;
        }
        return true;
    }
    return false;
}
# Python/Flask Detection for UptimeRobot
import re
from flask import request, make_response

def detect_uptimerobot():
    user_agent = request.headers.get('User-Agent', '')
    pattern = r'UptimeRobot'
    if re.search(pattern, user_agent, re.IGNORECASE):
        # Create response with caching
        response = make_response()
        response.headers['Cache-Control'] = 'public, max-age=3600'
        response.headers['X-Robots-Tag'] = 'noarchive'
        return True
    return False

# Django Middleware
class UptimeRobotMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def detect_bot(self, request):
        user_agent = request.META.get('HTTP_USER_AGENT', '')
        return re.search(r'UptimeRobot', user_agent, re.IGNORECASE) is not None

    def __call__(self, request):
        if self.detect_bot(request):
            # Handle bot traffic
            pass
        return self.get_response(request)
// JavaScript/Node.js Detection for UptimeRobot
const express = require('express');
const app = express();

// Middleware to detect UptimeRobot
function detectUptimeRobot(req, res, next) {
  const userAgent = req.headers['user-agent'] || '';
  const pattern = /UptimeRobot/i;
  if (pattern.test(userAgent)) {
    // Log bot detection
    console.log('UptimeRobot detected from IP:', req.ip);
    // Set cache headers
    res.set({
      'Cache-Control': 'public, max-age=3600',
      'X-Robots-Tag': 'noarchive'
    });
    // Mark request as bot
    req.isBot = true;
    req.botName = 'UptimeRobot';
  }
  next();
}

app.use(detectUptimeRobot);
# Apache .htaccess rules for UptimeRobot

# Block completely
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} UptimeRobot [NC]
RewriteRule .* - [F,L]

# Or redirect to a static version
RewriteCond %{HTTP_USER_AGENT} UptimeRobot [NC]
RewriteCond %{REQUEST_URI} !^/static/
RewriteRule ^(.*)$ /static/$1 [L]

# Or set an environment variable for PHP
SetEnvIfNoCase User-Agent "UptimeRobot" is_bot=1

# Add cache headers for this bot
<If "%{HTTP_USER_AGENT} =~ /UptimeRobot/i">
    Header set Cache-Control "public, max-age=3600"
    Header set X-Robots-Tag "noarchive"
</If>
# Nginx configuration for UptimeRobot

# Map the user agent to a variable
map $http_user_agent $is_uptimerobot {
    default       0;
    ~*UptimeRobot 1;
}

server {
    # Block the bot completely
    if ($is_uptimerobot) {
        return 403;
    }

    # Or serve cached content (note: try_files is not valid inside "if",
    # so the conditional only switches the document root)
    location / {
        root /var/www/html;
        if ($is_uptimerobot) {
            root /var/www/cached;
        }
        try_files $uri $uri.html $uri/index.html @backend;
    }

    # Add headers for bot requests
    location @backend {
        if ($is_uptimerobot) {
            add_header Cache-Control "public, max-age=3600";
            add_header X-Robots-Tag "noarchive";
        }
        proxy_pass http://backend;
    }
}

Should You Block This Bot?

Recommendations based on your website type:

Site Type        | Recommendation | Reasoning
E-commerce       | Optional       | Allow if it monitors your storefront; otherwise restrict to save bandwidth
Blog/News        | Allow          | Uptime checks are lightweight and confirm your content stays reachable
SaaS Application | Block          | Block unknown monitors on application interfaces; allow only your own checks
Documentation    | Selective      | Allow for public docs, block for internal docs
Corporate Site   | Limit          | Allow for public pages, block sensitive areas like intranets

Advanced robots.txt Configurations

E-commerce Site Configuration

User-agent: UptimeRobot
Crawl-delay: 5
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /api/
Disallow: /*?sort=
Disallow: /*?filter=
Disallow: /*&page=
Allow: /products/
Allow: /categories/

Sitemap: https://example.com/sitemap.xml

Publishing/Blog Configuration

User-agent: UptimeRobot
Crawl-delay: 10
Disallow: /wp-admin/
Disallow: /drafts/
Disallow: /preview/
Disallow: /*?replytocom=
Allow: /

SaaS/Application Configuration

User-agent: UptimeRobot
Disallow: /app/
Disallow: /api/
Disallow: /dashboard/
Disallow: /settings/
Allow: /
Allow: /pricing/
Allow: /features/
Allow: /docs/

Quick Reference

User Agent Match

UptimeRobot

Robots.txt Name

UptimeRobot

Category

monitoring

Respects robots.txt

May not respect