What is PageSpeed Insights?
PageSpeed Insights is Google's web performance testing service: it analyzes a web page and suggests ways to make it faster. It reports both lab data, generated by Lighthouse, and field data, drawn from the Chrome User Experience Report (CrUX). The tool matters for SEO because page speed is a ranking factor for Google Search, and PageSpeed Insights measures Core Web Vitals and other performance metrics that directly affect search rankings and user experience. The service is free and widely used by webmasters and developers to optimize their sites.
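Beyond the web UI, the service is also exposed through the PageSpeed Insights API. As a minimal sketch, the Python snippet below queries the v5 runPagespeed endpoint and reads the Lighthouse performance score from the response; the target URL https://example.com is a placeholder, and an API key (not shown) is only needed for higher request volumes.
import json
import urllib.parse
import urllib.request

# Query the PageSpeed Insights v5 API for a lab report on a page
endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = urllib.parse.urlencode({"url": "https://example.com", "strategy": "mobile"})

with urllib.request.urlopen(f"{endpoint}?{params}") as resp:
    report = json.load(resp)

# Lighthouse reports the performance category score on a 0-1 scale
score = report["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Performance score: {score * 100:.0f}/100")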
User Agent String
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko; Google Page Speed Insights) Chrome/120.0.0.0 Safari/537.36
How to Control PageSpeed Insights
Block Completely
To prevent PageSpeed Insights from accessing your entire website, add this to your robots.txt file:
User-agent: Google Page Speed Insights
Disallow: /
Block Specific Directories
To restrict access to certain parts of your site while allowing others:
User-agent: Google Page Speed Insights
Disallow: /admin/
Disallow: /private/
Disallow: /wp-admin/
Allow: /public/
Set Crawl Delay
To slow down the crawl rate (note: not all bots respect this directive, and Google's crawlers in particular ignore Crawl-delay):
User-agent: Google Page Speed Insights
Crawl-delay: 10
How to Verify PageSpeed Insights
Verification Method:
Requests come from Google's infrastructure.
Learn more in the official documentation.
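One hedged approach is the forward-confirmed reverse DNS check Google describes for its crawlers: resolve the requesting IP to a hostname, require a Google-owned domain, then resolve that hostname back and confirm it yields the same IP. Whether every PageSpeed Insights request resolves this way is an assumption to validate against your own logs. A minimal sketch in Python:
import socket

def verify_google_ip(ip):
    # Forward-confirmed reverse DNS check for a request claiming to be Google
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse lookup
        # Hostnames Google uses for its crawlers and fetchers
        if not host.endswith((".google.com", ".googlebot.com", ".googleusercontent.com")):
            return False
        # Forward-confirm: the hostname must resolve back to the same IP
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False

print(verify_google_ip("66.249.66.1"))  # a Googlebot address, for illustration only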
Detection Patterns
Multiple ways to detect PageSpeed Insights in your application. Note that the user agent spells the name "Page Speed Insights" (with spaces), so a pattern written as "PageSpeed" alone will miss it; the patterns below accept both spellings.
Basic Pattern
/Page ?Speed Insights/i
Strict Pattern
/^Mozilla\/5\.0 \(X11; Linux x86_64\) AppleWebKit\/537\.36 \(KHTML, like Gecko; Google Page Speed Insights\) Chrome\/120\.0\.0\.0 Safari\/537\.36$/
Flexible Pattern
/Page ?Speed Insights[\s\/]?[\d.]*/i
Vendor Match
/Google.*Page ?Speed/i
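A quick way to sanity-check these patterns is to run them against the documented user agent string. Python equivalents of the basic, flexible, and vendor patterns are shown below; all three should match.
import re

# The user agent string documented above
UA = ("Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 "
      "(KHTML, like Gecko; Google Page Speed Insights) "
      "Chrome/120.0.0.0 Safari/537.36")

patterns = {
    "basic": r"Page ?Speed Insights",
    "flexible": r"Page ?Speed Insights[\s/]?[\d.]*",
    "vendor": r"Google.*Page ?Speed",
}

for name, pattern in patterns.items():
    print(name, bool(re.search(pattern, UA, re.IGNORECASE)))  # all True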
Implementation Examples
PHP
function detect_pagespeed_insights() {
    $user_agent = $_SERVER['HTTP_USER_AGENT'] ?? '';
    // Accept both "PageSpeed" and "Page Speed" spellings
    $pattern = '/Page ?Speed Insights/i';
    if (preg_match($pattern, $user_agent)) {
        // Log the detection
        error_log('PageSpeed Insights detected from IP: ' . ($_SERVER['REMOTE_ADDR'] ?? 'unknown'));
        // Set cache headers
        header('Cache-Control: public, max-age=3600');
        header('X-Robots-Tag: noarchive');
        // Optional: serve a cached copy if one exists
        $cache_file = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
        if (file_exists($cache_file)) {
            readfile($cache_file);
            exit;
        }
        return true;
    }
    return false;
}
Python
import re
from flask import request

# Compile once; accept both "PageSpeed" and "Page Speed" spellings
BOT_PATTERN = re.compile(r'Page ?Speed Insights', re.IGNORECASE)

def detect_pagespeed_insights():
    # Return True when the current Flask request comes from PageSpeed Insights
    user_agent = request.headers.get('User-Agent', '')
    return bool(BOT_PATTERN.search(user_agent))

def add_bot_headers(response):
    # Register with app.after_request so the cache headers reach the client
    if detect_pagespeed_insights():
        response.headers['Cache-Control'] = 'public, max-age=3600'
        response.headers['X-Robots-Tag'] = 'noarchive'
    return response

class PageSpeedInsightsMiddleware:
    # Django-style middleware that flags bot traffic and adds cache headers
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        user_agent = request.META.get('HTTP_USER_AGENT', '')
        request.is_pagespeed_insights = bool(BOT_PATTERN.search(user_agent))
        response = self.get_response(request)
        if request.is_pagespeed_insights:
            response['Cache-Control'] = 'public, max-age=3600'
            response['X-Robots-Tag'] = 'noarchive'
        return response
JavaScript
const express = require('express');
const app = express();

// Middleware to detect PageSpeed Insights
function detectPageSpeedInsights(req, res, next) {
    const userAgent = req.headers['user-agent'] || '';
    // Accept both "PageSpeed" and "Page Speed" spellings
    const pattern = /Page ?Speed Insights/i;
    if (pattern.test(userAgent)) {
        // Log bot detection
        console.log('PageSpeed Insights detected from IP:', req.ip);
        // Set cache headers
        res.set({
            'Cache-Control': 'public, max-age=3600',
            'X-Robots-Tag': 'noarchive'
        });
        // Mark request as bot for downstream handlers
        req.isBot = true;
        req.botName = 'PageSpeed Insights';
    }
    next();
}

app.use(detectPageSpeedInsights);
.htaccess
RewriteEngine On

# Option 1: block the bot entirely (the pattern is quoted because it contains a space)
RewriteCond %{HTTP_USER_AGENT} "Page ?Speed Insights" [NC]
RewriteRule .* - [F,L]

# Option 2: serve pre-rendered copies from /static/ instead (remove Option 1 first)
RewriteCond %{HTTP_USER_AGENT} "Page ?Speed Insights" [NC]
RewriteCond %{REQUEST_URI} !^/static/
RewriteRule ^(.*)$ /static/$1 [L]

# Flag the request and add cache headers (requires Apache 2.4)
SetEnvIfNoCase User-Agent "Page ?Speed Insights" is_bot=1
<If "%{HTTP_USER_AGENT} =~ /Page ?Speed Insights/i">
    Header set Cache-Control "public, max-age=3600"
    Header set X-Robots-Tag "noarchive"
</If>
Nginx
map $http_user_agent $is_pagespeed_insights {
    default 0;
    "~*Page ?Speed Insights" 1;
}

server {
    # Option 1: block the bot outright
    # if ($is_pagespeed_insights) {
    #     return 403;
    # }

    # Option 2: serve pre-rendered pages to the bot. try_files is not
    # permitted inside "if", so route through a named location instead.
    location / {
        error_page 418 = @cached;
        if ($is_pagespeed_insights) {
            return 418;
        }
        try_files $uri @backend;
    }

    location @cached {
        root /var/www/cached;
        add_header Cache-Control "public, max-age=3600";
        add_header X-Robots-Tag "noarchive";
        try_files $uri $uri.html $uri/index.html @backend;
    }

    location @backend {
        proxy_pass http://backend;
    }
}
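Whichever server rule you deploy, test it from the outside by replaying the documented user agent. The sketch below targets a hypothetical https://example.com/ deployment and prints either the cache header (serving option) or the error code (blocking option):
import urllib.error
import urllib.request

UA = ("Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 "
      "(KHTML, like Gecko; Google Page Speed Insights) "
      "Chrome/120.0.0.0 Safari/537.36")

req = urllib.request.Request("https://example.com/", headers={"User-Agent": UA})
try:
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.headers.get("Cache-Control"))
except urllib.error.HTTPError as err:
    print(err.code)  # expect 403 if the blocking rule is active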
Should You Block This Bot?
Recommendations based on your website type:
Site Type          Recommendation   Reasoning
E-commerce         Optional         Evaluate based on bandwidth usage vs. benefits
Blog/News          Allow            Increases content reach and discoverability
SaaS Application   Block            No benefit for application interfaces; preserve resources
Documentation      Selective        Allow for public docs, block for internal docs
Corporate Site     Limit            Allow for public pages, block sensitive areas like intranets
Advanced robots.txt Configurations
E-commerce Site Configuration
User-agent: Google Page Speed Insights
Crawl-delay: 5
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /api/
Disallow: /*?sort=
Disallow: /*?filter=
Disallow: /*&page=
Allow: /products/
Allow: /categories/
Sitemap: https://example.com/sitemap.xml
Publishing/Blog Configuration
User-agent: Google Page Speed Insights
Crawl-delay: 10
Disallow: /wp-admin/
Disallow: /drafts/
Disallow: /preview/
Disallow: /*?replytocom=
Allow: /
SaaS/Application Configuration
User-agent: Google Page Speed Insights
Disallow: /app/
Disallow: /api/
Disallow: /dashboard/
Disallow: /settings/
Allow: /
Allow: /pricing/
Allow: /features/
Allow: /docs/
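Before deploying any of these files, verify the rules behave as intended. The sketch below feeds a subset of the SaaS configuration to Python's standard urllib.robotparser. Note that the stdlib parser does not understand wildcard rules such as /*?sort= from the e-commerce example, so test those with a wildcard-aware tool instead.
from urllib.robotparser import RobotFileParser

# A subset of the SaaS configuration above
rules = """\
User-agent: Google Page Speed Insights
Disallow: /app/
Disallow: /api/
Allow: /pricing/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

agent = "Google Page Speed Insights"
for path in ("/pricing/plans", "/app/dashboard", "/docs/intro"):
    print(path, parser.can_fetch(agent, path))  # True, False, True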
Quick Reference
User Agent Match: Page Speed Insights
Robots.txt Name: Google Page Speed Insights
Category: monitoring, seo
Respects robots.txt: Yes