What is Better Uptime Bot?
Better Uptime Bot is the monitoring agent for Better Stack (formerly Better Uptime), a modern uptime monitoring and incident management platform. The bot performs regular health checks on configured URLs to detect outages, SSL certificate issues, and performance degradation. When an issue is detected, it sends alerts through channels such as email, SMS, Slack, and phone calls.
User Agent String
Better Uptime Bot
How to Control Better Uptime Bot

Block Completely

To prevent Better Uptime Bot from accessing your entire website, add this to your robots.txt file:
User-agent: Better Uptime Bot
Disallow: /
Block Specific Directories

To restrict access to certain parts of your site while allowing others:
User-agent: Better Uptime Bot
Disallow: /admin/
Disallow: /private/
Disallow: /wp-admin/
Allow: /public/
Set Crawl Delay

To slow down the crawl rate (note: not all bots respect this directive):
User-agent: Better Uptime Bot
Crawl-delay: 10
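Since Crawl-delay is advisory only, a hard limit has to be enforced server-side. Below is a minimal sketch of that idea; the UserAgentThrottle class is a hypothetical helper, not part of Better Stack or any web framework:

```python
import time

class UserAgentThrottle:
    """Enforce a minimum interval between requests per user agent.
    Hypothetical helper, not part of Better Stack's tooling."""

    def __init__(self, min_interval_seconds=10.0):
        self.min_interval = min_interval_seconds
        self.last_seen = {}

    def allow(self, user_agent, now=None):
        # "now" is injectable for testing; defaults to a monotonic clock
        if now is None:
            now = time.monotonic()
        last = self.last_seen.get(user_agent)
        if last is not None and now - last < self.min_interval:
            return False  # too soon: reject, or serve a cached copy instead
        self.last_seen[user_agent] = now
        return True
```

A request handler would call allow() with the incoming User-Agent header and return 429 (or a cached page) when it comes back False.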
How to Verify Better Uptime Bot
Verification Method:
Check the user agent string for the Better Uptime Bot identifier
Learn more in the official documentation.
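The check above can be automated with a small helper; this is a sketch (the function name is an assumption, not an official Better Stack API):

```python
import re

# The identifier is matched case-insensitively anywhere in the header
BOT_RE = re.compile(r'Better Uptime Bot', re.IGNORECASE)

def is_better_uptime_bot(user_agent):
    # A missing User-Agent header counts as "not the bot"
    return bool(BOT_RE.search(user_agent or ''))
```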
Detection Patterns

Multiple ways to detect Better Uptime Bot in your application:
Basic Pattern
/Better Uptime Bot/i
Strict Pattern
/^Better Uptime Bot$/
Flexible Pattern
/Better Uptime Bot[\s\/]?[\d.]*/i
Vendor Match
/.*Better Stack.*Better/i

Implementation Examples
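To see how the patterns above differ in strictness, here is a quick Python comparison (the sample user-agent strings are illustrative, not observed values):

```python
import re

# The basic, strict, and flexible patterns from above, compiled once
PATTERNS = {
    'basic':    re.compile(r'Better Uptime Bot', re.IGNORECASE),
    'strict':   re.compile(r'^Better Uptime Bot$'),
    'flexible': re.compile(r'Better Uptime Bot[\s/]?[\d.]*', re.IGNORECASE),
}

def classify(user_agent):
    # Report which patterns match a given user-agent string
    return {name: p.search(user_agent) is not None
            for name, p in PATTERNS.items()}
```

An exact "Better Uptime Bot" header matches all three; a versioned header such as "Better Uptime Bot/1.0" fails the strict pattern because of the `$` anchor.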
PHP
function detect_better_uptime_bot() {
    $user_agent = $_SERVER['HTTP_USER_AGENT'] ?? '';
    $pattern = '/Better Uptime Bot/i';
    if (preg_match($pattern, $user_agent)) {
        // Log the detection
        error_log('Better Uptime Bot detected from IP: ' . $_SERVER['REMOTE_ADDR']);
        // Set cache headers
        header('Cache-Control: public, max-age=3600');
        header('X-Robots-Tag: noarchive');
        // Optional: serve a cached version if one exists
        $cache_file = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
        if (file_exists($cache_file)) {
            readfile($cache_file);
            exit;
        }
        return true;
    }
    return false;
}
Python
import re
from flask import request

BOT_PATTERN = re.compile(r'Better Uptime Bot', re.IGNORECASE)

def detect_better_uptime_bot():
    # Flask view helper: check the incoming request's User-Agent
    user_agent = request.headers.get('User-Agent', '')
    return bool(BOT_PATTERN.search(user_agent))

def add_bot_cache_headers(response):
    # Flask after_request hook: attach cache headers for bot traffic
    if detect_better_uptime_bot():
        response.headers['Cache-Control'] = 'public, max-age=3600'
        response.headers['X-Robots-Tag'] = 'noarchive'
    return response

# Django-style middleware equivalent
class BetterUptimeBotMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        if self.detect_bot(request):
            # Handle bot traffic (e.g. skip analytics, serve cached pages)
            pass
        return self.get_response(request)

    def detect_bot(self, request):
        user_agent = request.META.get('HTTP_USER_AGENT', '')
        return bool(BOT_PATTERN.search(user_agent))
JavaScript
const express = require('express');
const app = express();

// Middleware to detect Better Uptime Bot
function detectBetterUptimeBot(req, res, next) {
    const userAgent = req.headers['user-agent'] || '';
    const pattern = /Better Uptime Bot/i;
    if (pattern.test(userAgent)) {
        // Log bot detection
        console.log('Better Uptime Bot detected from IP:', req.ip);
        // Set cache headers
        res.set({
            'Cache-Control': 'public, max-age=3600',
            'X-Robots-Tag': 'noarchive'
        });
        // Mark request as bot
        req.isBot = true;
        req.botName = 'Better Uptime Bot';
    }
    next();
}

app.use(detectBetterUptimeBot);
.htaccess
RewriteEngine On

# Option 1: block Better Uptime Bot entirely.
# The pattern must be quoted because it contains spaces.
RewriteCond %{HTTP_USER_AGENT} "Better Uptime Bot" [NC]
RewriteRule .* - [F,L]

# Option 2: serve static cached copies instead (remove Option 1 if used)
RewriteCond %{HTTP_USER_AGENT} "Better Uptime Bot" [NC]
RewriteCond %{REQUEST_URI} !^/static/
RewriteRule ^(.*)$ /static/$1 [L]

# Set cache headers for bot requests (the <If> block requires Apache 2.4+)
SetEnvIfNoCase User-Agent "Better Uptime Bot" is_bot=1
<If "%{HTTP_USER_AGENT} =~ /Better Uptime Bot/i">
    Header set Cache-Control "public, max-age=3600"
    Header set X-Robots-Tag "noarchive"
</If>
Nginx
map $http_user_agent $is_better_uptime_bot {
    default 0;
    "~*Better Uptime Bot" 1;
}

server {
    # Option 1: block the bot outright. Uncomment to use; note this
    # makes the cache-serving rules below unreachable for bot traffic.
    # if ($is_better_uptime_bot) {
    #     return 403;
    # }

    # Option 2: route bot traffic to pre-rendered copies. try_files is
    # not allowed inside "if", so a named location is used instead.
    location / {
        error_page 418 = @bot_cache;
        if ($is_better_uptime_bot) {
            return 418;
        }
        try_files $uri @backend;
    }

    location @bot_cache {
        root /var/www/cached;
        try_files $uri $uri.html $uri/index.html @backend;
    }

    location @backend {
        if ($is_better_uptime_bot) {
            add_header Cache-Control "public, max-age=3600";
            add_header X-Robots-Tag "noarchive";
        }
        proxy_pass http://backend;
    }
}
Should You Block This Bot?

Recommendations based on your website type:

E-commerce: Optional. Evaluate based on bandwidth usage vs. benefits.
Blog/News: Allow. Increases content reach and discoverability.
SaaS Application: Block. No benefit for application interfaces; preserves resources.
Documentation: Selective. Allow for public docs, block for internal docs.
Corporate Site: Limit. Allow for public pages, block sensitive areas like intranets.
Advanced robots.txt Configurations

E-commerce Site Configuration
User-agent: Better Uptime Bot
Crawl-delay: 5
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /api/
Disallow: /*?sort=
Disallow: /*?filter=
Disallow: /*&page=
Allow: /products/
Allow: /categories/
Sitemap: https://example.com/sitemap.xml
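Prefix rules like the ones above can be sanity-checked with Python's standard urllib.robotparser. Note that it does not implement the non-standard * wildcard lines, so this sketch exercises only the plain prefix rules:

```python
from urllib import robotparser

# Prefix rules from the e-commerce example; the non-standard
# "*" wildcard lines are omitted because robotparser ignores them.
ROBOTS_TXT = """\
User-agent: Better Uptime Bot
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /api/
Allow: /products/
Allow: /categories/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch('Better Uptime Bot', '/cart/'))       # blocked
print(rp.can_fetch('Better Uptime Bot', '/products/x'))  # allowed
```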
Publishing/Blog Configuration
User-agent: Better Uptime Bot
Crawl-delay: 10
Disallow: /wp-admin/
Disallow: /drafts/
Disallow: /preview/
Disallow: /*?replytocom=
Allow: /
SaaS/Application Configuration
User-agent: Better Uptime Bot
Disallow: /app/
Disallow: /api/
Disallow: /dashboard/
Disallow: /settings/
Allow: /
Allow: /pricing/
Allow: /features/
Allow: /docs/
Quick Reference

User Agent Match: Better Uptime Bot
Robots.txt Name: Better Uptime Bot
Category: Monitoring
Respects robots.txt: May not respect