What is AdsBot-Google?
AdsBot-Google is Google's crawler that checks the quality of landing pages used in Google Ads campaigns. It visits advertiser landing pages to assess their content, loading speed, mobile-friendliness, and overall user experience, and the resulting quality signals directly affect ad rankings and costs in Google Ads auctions. Unlike Googlebot, AdsBot-Google ignores the global wildcard (User-agent: *) rules in robots.txt because it needs to verify the experience real ad visitors receive; to restrict it, you must address it by name. Advertisers should keep their landing pages accessible to AdsBot-Google to maintain good Quality Scores and ad performance.
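Because of this, a robots.txt file that blocks all crawlers with a wildcard group will not stop AdsBot-Google; only a group that names it explicitly applies. A minimal illustration:

```
# This wildcard group does NOT apply to AdsBot-Google:
User-agent: *
Disallow: /

# AdsBot-Google only honors a group that names it:
User-agent: AdsBot-Google
Disallow: /
```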
User Agent String
AdsBot-Google (+http://www.google.com/adsbot.html)
How to Control AdsBot-Google
Block Completely
To prevent AdsBot-Google from accessing your entire website, add this to your robots.txt file (note that blocking it prevents landing-page quality checks and can hurt your Google Ads performance):
User-agent: AdsBot-Google
Disallow: /
Block Specific Directories
To restrict access to certain parts of your site while allowing others:
User-agent: AdsBot-Google
Disallow: /admin/
Disallow: /private/
Disallow: /wp-admin/
Allow: /public/
Set Crawl Delay
To request a slower crawl rate (note: Google's crawlers, including AdsBot-Google, ignore the Crawl-delay directive; only some other bots honor it):
User-agent: AdsBot-Google
Crawl-delay: 10
How to Verify AdsBot-Google
Verification Method:
Confirm requests via reverse and forward DNS lookups, or check the source IP against Google's published crawler IP ranges.
Learn more in the official documentation.
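Google's documented verification approach is a reverse DNS lookup on the requesting IP, followed by a forward lookup to confirm the hostname resolves back to that IP. A minimal Python sketch (the function name is ours; it assumes genuine Google crawler PTR records end in google.com or googlebot.com):

```python
import socket

def verify_adsbot_ip(ip):
    """Verify that an IP claiming to be AdsBot-Google really belongs to Google."""
    try:
        # Step 1: reverse DNS lookup (PTR record) on the requesting IP
        host = socket.gethostbyaddr(ip)[0]
    except OSError:
        return False
    # Step 2: genuine Google crawler hostnames end in google.com or googlebot.com
    if not host.endswith(('.google.com', '.googlebot.com')):
        return False
    try:
        # Step 3: forward-confirm the hostname resolves back to the same IP
        forward_ips = socket.gethostbyname_ex(host)[2]
    except OSError:
        return False
    return ip in forward_ips
```

User-agent string checks alone are spoofable, so this DNS round-trip (or an IP-range check) is the reliable test.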
Detection Patterns
Multiple ways to detect AdsBot-Google in your application:
Basic Pattern
/AdsBot\-Google/i
Strict Pattern
/^AdsBot-Google \(\+http:\/\/www\.google\.com\/adsbot\.html\)$/
Flexible Pattern
/AdsBot-Google[\s\/]?[\d.]*/i
Vendor Match
/.*Google.*AdsBot\-Google/i
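The trade-off between the basic and strict patterns shows up when you run them against real and spoofed user agent strings. A quick Python sketch (the spoofed UA is a made-up example for illustration):

```python
import re

ua = 'AdsBot-Google (+http://www.google.com/adsbot.html)'
spoof = 'Mozilla/5.0 (compatible; AdsBot-Google-impostor)'

basic = re.compile(r'AdsBot-Google', re.IGNORECASE)
strict = re.compile(r'^AdsBot-Google \(\+http://www\.google\.com/adsbot\.html\)$')

print(bool(basic.search(ua)))     # True
print(bool(basic.search(spoof)))  # True -- the loose pattern also matches lookalikes
print(bool(strict.match(ua)))     # True
print(bool(strict.match(spoof)))  # False -- the anchored pattern rejects them
```

Since either pattern can be spoofed by anyone setting a User-Agent header, pair pattern matching with the IP-based verification described in the Verification section above.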
Implementation Examples
PHP
Python
JavaScript
.htaccess
Nginx
function detect_adsbot_google() {
    $user_agent = $_SERVER['HTTP_USER_AGENT'] ?? '';
    $pattern = '/AdsBot-Google/i';

    if (preg_match($pattern, $user_agent)) {
        // Log the detection
        error_log('AdsBot-Google detected from IP: ' . $_SERVER['REMOTE_ADDR']);

        // Set cache headers
        header('Cache-Control: public, max-age=3600');
        header('X-Robots-Tag: noarchive');

        // Optional: serve a cached version if one exists
        $cache_file = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
        if (file_exists($cache_file)) {
            readfile($cache_file);
            exit;
        }
        return true;
    }
    return false;
}
import re
from flask import request

BOT_PATTERN = re.compile(r'AdsBot-Google', re.IGNORECASE)

# Flask: call inside a request context
def detect_adsbot_google():
    user_agent = request.headers.get('User-Agent', '')
    return bool(BOT_PATTERN.search(user_agent))

# Django-style middleware variant
class AdsBotGoogleMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        if self.detect_bot(request):
            # Handle bot traffic (e.g. set cache headers on the response)
            pass
        return self.get_response(request)

    def detect_bot(self, request):
        user_agent = request.META.get('HTTP_USER_AGENT', '')
        return bool(BOT_PATTERN.search(user_agent))
const express = require('express');
const app = express();

// Middleware to detect AdsBot-Google
function detectAdsBotGoogle(req, res, next) {
    const userAgent = req.headers['user-agent'] || '';
    const pattern = /AdsBot-Google/i;

    if (pattern.test(userAgent)) {
        // Log bot detection
        console.log('AdsBot-Google detected from IP:', req.ip);

        // Set cache headers
        res.set({
            'Cache-Control': 'public, max-age=3600',
            'X-Robots-Tag': 'noarchive'
        });

        // Mark request as bot
        req.isBot = true;
        req.botName = 'AdsBot-Google';
    }
    next();
}

app.use(detectAdsBotGoogle);
RewriteEngine On

# Option 1: block AdsBot-Google entirely (403); uncomment to enable
# RewriteCond %{HTTP_USER_AGENT} AdsBot-Google [NC]
# RewriteRule .* - [F,L]

# Option 2: serve a static cached copy to AdsBot-Google
RewriteCond %{HTTP_USER_AGENT} AdsBot-Google [NC]
RewriteCond %{REQUEST_URI} !^/static/
RewriteRule ^(.*)$ /static/$1 [L]

# Option 3: set cache headers on bot requests (Apache 2.4+)
SetEnvIfNoCase User-Agent "AdsBot-Google" is_bot=1
<If "%{HTTP_USER_AGENT} =~ /AdsBot-Google/i">
    Header set Cache-Control "public, max-age=3600"
    Header set X-Robots-Tag "noarchive"
</If>
map $http_user_agent $is_adsbot_google {
    default 0;
    ~*AdsBot-Google 1;
}

server {
    # Option 1: block AdsBot-Google entirely; uncomment to enable
    # if ($is_adsbot_google) {
    #     return 403;
    # }

    location / {
        # Option 2: route bot traffic to a pre-rendered cache.
        # (try_files is not allowed inside "if", so rewrite instead.)
        if ($is_adsbot_google) {
            rewrite ^ /cached$uri last;
        }
        try_files $uri @backend;
    }

    location /cached/ {
        internal;
        root /var/www;
        try_files $uri $uri.html $uri/index.html @backend;
    }

    location @backend {
        # Option 3: add cache headers on bot responses
        if ($is_adsbot_google) {
            add_header Cache-Control "public, max-age=3600";
            add_header X-Robots-Tag "noarchive";
        }
        proxy_pass http://backend;
    }
}
Should You Block This Bot?
Recommendations based on your website type:
Site Type          Recommendation   Reasoning
E-commerce         Allow            Required so Google can check ad landing pages; blocking hurts Quality Score
Blog/News          Allow            Keeps promoted articles eligible for landing-page quality checks
SaaS Application   Block            No benefit for application interfaces; keep marketing pages open if you run Google Ads
Documentation      Allow            Harmless to allow, and needed if docs pages serve as ad landing pages
Corporate Site     Allow            Allow public pages; block sensitive areas like intranets
Advanced robots.txt Configurations
E-commerce Site Configuration
User-agent: AdsBot-Google
Crawl-delay: 5  # ignored by Google's crawlers
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /api/
Disallow: /*?sort=
Disallow: /*?filter=
Disallow: /*&page=
Allow: /products/
Allow: /categories/
Sitemap: https://example.com/sitemap.xml
Publishing/Blog Configuration
User-agent: AdsBot-Google
Crawl-delay: 10  # ignored by Google's crawlers
Disallow: /wp-admin/
Disallow: /drafts/
Disallow: /preview/
Disallow: /*?replytocom=
Allow: /
SaaS/Application Configuration
User-agent: AdsBot-Google
Disallow: /app/
Disallow: /api/
Disallow: /dashboard/
Disallow: /settings/
Allow: /
Allow: /pricing/
Allow: /features/
Allow: /docs/
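Before deploying a configuration like the SaaS example, you can sanity-check it with Python's standard urllib.robotparser. (It applies standard group matching and does not model AdsBot's special wildcard behavior, but it does evaluate rules in a group that names the bot.) A sketch using a trimmed version of the rules above:

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: AdsBot-Google
Disallow: /app/
Disallow: /dashboard/
Allow: /pricing/
Allow: /docs/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Marketing pages stay crawlable; the application itself does not.
print(rp.can_fetch('AdsBot-Google', 'https://example.com/pricing/plans'))  # True
print(rp.can_fetch('AdsBot-Google', 'https://example.com/app/login'))      # False
```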
Quick Reference
User Agent Match: AdsBot-Google
Robots.txt Name: AdsBot-Google
Category: search, other
Respects robots.txt: Partially; it ignores the wildcard (User-agent: *) group but obeys rules that name AdsBot-Google explicitly