Diffbot is an AI-powered web scraping and data extraction service that uses machine learning to understand and structure web content. Unlike traditional crawlers that rely on hand-written selectors or patterns, Diffbot's AI interprets pages the way a human reader would, automatically identifying articles, products, discussions, and other content types. The service builds a large knowledge graph from its crawled data and exposes it through structured-data APIs. Businesses use Diffbot for market intelligence, news monitoring, and data enrichment.
⚠️ AI Training Notice
This bot may collect and use your website content for AI model training. Consider whether you want your content used for this purpose before allowing access.
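Before reaching for server-side rules, the lightest-touch opt-out is robots.txt. This sketch assumes Diffbot's crawler identifies itself with the same "Diffbot" user-agent token matched by the detection patterns below; pair it with one of the server-side measures if you need a hard guarantee:

```
# Opt out of Diffbot crawling site-wide
# (assumes the crawler honors this token; enforce server-side if in doubt)
User-agent: Diffbot
Disallow: /
```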
Detection Patterns
Multiple ways to detect Diffbot in your application:
```php
// PHP Detection for Diffbot
function detect_diffbot() {
    $user_agent = $_SERVER['HTTP_USER_AGENT'] ?? '';

    if (preg_match('/Diffbot/i', $user_agent)) {
        // Log the detection
        error_log('Diffbot detected from IP: ' . $_SERVER['REMOTE_ADDR']);

        // Set cache headers
        header('Cache-Control: public, max-age=3600');
        header('X-Robots-Tag: noarchive');

        // Optional: serve a cached version if one exists
        $cache_file = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
        if (file_exists($cache_file)) {
            readfile($cache_file);
            exit;
        }

        return true;
    }

    return false;
}
```
```python
# Python/Flask Detection for Diffbot
import re

from flask import Flask, request

app = Flask(__name__)

def detect_diffbot():
    user_agent = request.headers.get('User-Agent', '')
    return bool(re.search(r'Diffbot', user_agent, re.IGNORECASE))

@app.after_request
def add_bot_headers(response):
    # Apply cache headers to responses actually served to the bot
    if detect_diffbot():
        response.headers['Cache-Control'] = 'public, max-age=3600'
        response.headers['X-Robots-Tag'] = 'noarchive'
    return response

# Django Middleware
class DiffbotMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        if self.detect_bot(request):
            # Handle bot traffic (log, throttle, or serve cached content)
            pass
        return self.get_response(request)

    def detect_bot(self, request):
        user_agent = request.META.get('HTTP_USER_AGENT', '')
        return bool(re.search(r'Diffbot', user_agent, re.IGNORECASE))
```
```javascript
// JavaScript/Node.js Detection for Diffbot
const express = require('express');
const app = express();

// Middleware to detect Diffbot
function detectDiffbot(req, res, next) {
    const userAgent = req.headers['user-agent'] || '';

    if (/Diffbot/i.test(userAgent)) {
        // Log bot detection
        console.log('Diffbot detected from IP:', req.ip);

        // Set cache headers
        res.set({
            'Cache-Control': 'public, max-age=3600',
            'X-Robots-Tag': 'noarchive'
        });

        // Mark request as bot
        req.isBot = true;
        req.botName = 'Diffbot';
    }

    next();
}

app.use(detectDiffbot);
```
```apache
# Apache .htaccess rules for Diffbot

# Block completely
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Diffbot [NC]
RewriteRule .* - [F,L]

# Or redirect to a static version
RewriteCond %{HTTP_USER_AGENT} Diffbot [NC]
RewriteCond %{REQUEST_URI} !^/static/
RewriteRule ^(.*)$ /static/$1 [L]

# Or set an environment variable for PHP
SetEnvIfNoCase User-Agent "Diffbot" is_bot=1

# Add cache headers for this bot (Apache 2.4+ <If> syntax)
<If "%{HTTP_USER_AGENT} =~ /Diffbot/i">
    Header set Cache-Control "public, max-age=3600"
    Header set X-Robots-Tag "noarchive"
</If>
```
```nginx
# Nginx configuration for Diffbot

# Map user agent to variable
map $http_user_agent $is_diffbot {
    default   0;
    ~*Diffbot 1;
}

server {
    # Block the bot completely
    if ($is_diffbot) {
        return 403;
    }

    # Or serve cached content: swap the document root for bot requests
    # (try_files itself is not valid inside an "if" block)
    location / {
        if ($is_diffbot) {
            root /var/www/cached;
        }
        try_files $uri $uri.html $uri/index.html @backend;
    }

    # Add headers for bot requests
    location @backend {
        if ($is_diffbot) {
            add_header Cache-Control "public, max-age=3600";
            add_header X-Robots-Tag "noarchive";
        }
        proxy_pass http://backend;
    }
}
```
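Whichever layer you deploy, a quick way to verify the rules is to spoof the user agent with curl (example.com is a placeholder for your own host). Expect a 403 if you block, or the cache headers if you limit:

```bash
curl -I -A "Diffbot" https://example.com/
```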
Should You Block This Bot?
Recommendations based on your website type:
| Site Type | Recommendation | Reasoning |
| --- | --- | --- |
| E-commerce | Limit Access | Protect pricing and inventory data from AI training |
| Blog/News | Consider Blocking | Your content may be used for AI training without compensation |
| SaaS Application | Block | No benefit for application interfaces; preserve resources |
| Documentation | Selective | Allow for public docs, block for internal docs |
| Corporate Site | Limit | Allow for public pages, block sensitive areas like intranets |
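For the "Selective" and "Limit" rows, a path-based gate can sit on top of the detection helpers above. This is a minimal sketch building on the Flask example's detect_diffbot(); the /docs/public/ prefix is a placeholder for your own URL structure:

```python
# Hypothetical allowlist; adjust the prefixes to your site layout
ALLOWED_PREFIXES = ('/docs/public/',)

@app.before_request
def gate_diffbot():
    # Humans pass through; Diffbot is restricted to the allowlist
    if detect_diffbot() and not request.path.startswith(ALLOWED_PREFIXES):
        return 'Forbidden', 403
```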