Python-requests User Agent - Python Bot Details | CL SEO

Python-requests

Language: Python · Since 2011
Category: Other · May ignore robots.txt
#python #library #http #programming

What is Python-requests?

Python-requests is the default user agent string sent by the popular Python Requests library, known for its human-friendly HTTP interface. It appears in logs more often than urllib's agent because Requests has become the de facto standard for HTTP in Python, and it turns up in traffic from web scrapers, API clients, and monitoring tools alike. The library's ease of use has made it a favorite of developers, data scientists, and researchers. While much of this traffic is legitimate, many websites monitor the user agent as a signal of automated access.
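For context, this is all it takes to produce one of these log entries. A minimal sketch (the URL is a placeholder):

import requests

# A plain GET with Requests; no User-Agent header is set explicitly,
# so the library sends its default "python-requests/<version>".
response = requests.get("https://example.com/api/data", timeout=10)
print(response.status_code)
print(response.request.headers["User-Agent"])  # e.g. python-requests/2.31.0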

User Agent String

python-requests/2.31.0

The version segment tracks the installed Requests release, so the exact string varies from client to client.

How to Control Python-requests

Block Completely

To ask Python-requests clients to stay off your entire website, add this to your robots.txt file. Keep in mind that the Requests library does not read robots.txt on its own; these rules only affect clients that choose to honor them:

# Block Python-requests
User-agent: python-requests
Disallow: /

Block Specific Directories

To restrict access to certain parts of your site while allowing others:

User-agent: python-requests
Disallow: /admin/
Disallow: /private/
Disallow: /wp-admin/
Allow: /public/
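If you operate a Requests-based client rather than the site, you can honor these rules yourself with the standard library's urllib.robotparser. A minimal sketch, assuming a placeholder domain and paths:

from urllib import robotparser

import requests

# Load and parse the target site's robots.txt (placeholder domain).
rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

for path in ("/public/page.html", "/admin/users"):
    url = "https://example.com" + path
    # can_fetch() applies the User-agent, Disallow, and Allow rules above.
    if rp.can_fetch("python-requests", url):
        response = requests.get(url, timeout=10)
        print(path, "->", response.status_code)
    else:
        print(path, "-> skipped (disallowed by robots.txt)")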

Set Crawl Delay

To slow down the crawl rate (note: not all bots respect this directive):

User-agent: python-requests
Crawl-delay: 10
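On the client side, a well-behaved script can read this value back and pause between requests. A sketch under the same placeholder-domain assumption; note that robotparser's crawl_delay() returns None when no directive is present:

import time
from urllib import robotparser

import requests

rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# Crawl-delay value for this agent, or 0 if the site sets none.
delay = rp.crawl_delay("python-requests") or 0

for url in ("https://example.com/a", "https://example.com/b"):
    requests.get(url, timeout=10)
    time.sleep(delay)  # honor the site's requested pacing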

How to Verify Python-requests

Verification Method:
Python-requests is simply the Requests library's default user agent. Unlike major search engine crawlers, it has no published IP ranges and no reverse-DNS check, so the string cannot be verified; any client can send it, change it, or drop it entirely.

Learn more in the official documentation.
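To see why the string alone proves nothing, here is how any Requests client can replace it (the alternate agent string below is made up):

import requests

# Default: the request goes out as "python-requests/<version>".
resp = requests.get("https://example.com")
print(resp.request.headers["User-Agent"])

# Overridden: the same library now appears as a browser in your logs.
resp = requests.get(
    "https://example.com",
    headers={"User-Agent": "Mozilla/5.0 (compatible; ExampleBot/1.0)"},
)
print(resp.request.headers["User-Agent"])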

Detection Patterns

Multiple ways to detect Python-requests in your application:

Basic Pattern

/Python\-requests/i

Strict Pattern

/^python\-requests\/2\.31\.0$/

Flexible Pattern

/Python\-requests[\s\/]?[\d.]*/i

Vendor Match

/python[-\s]?requests/i
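Before deploying any of these, it is worth sanity-checking them against real strings. A small self-contained test harness (the sample agents are illustrative):

import re

PATTERNS = {
    "basic":    re.compile(r"python-requests", re.IGNORECASE),
    "strict":   re.compile(r"^python-requests/2\.31\.0$"),
    "flexible": re.compile(r"python-requests[\s/]?[\d.]*", re.IGNORECASE),
    "vendor":   re.compile(r"python[-\s]?requests", re.IGNORECASE),
}

SAMPLES = [
    "python-requests/2.31.0",
    "python-requests/2.28.1",
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
]

for ua in SAMPLES:
    hits = [name for name, pat in PATTERNS.items() if pat.search(ua)]
    print(f"{ua!r} -> {hits or 'no match'}")

The strict pattern matches only the pinned release, which is why the second sample slips past it.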

Implementation Examples

// PHP Detection for Python-requests
function detect_python_requests() {
    $user_agent = $_SERVER['HTTP_USER_AGENT'] ?? '';
    $pattern = '/Python\-requests/i';

    if (preg_match($pattern, $user_agent)) {
        // Log the detection
        error_log('Python-requests detected from IP: ' . $_SERVER['REMOTE_ADDR']);

        // Set cache headers
        header('Cache-Control: public, max-age=3600');
        header('X-Robots-Tag: noarchive');

        // Optional: serve a cached version if one exists
        $cache_file = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
        if (file_exists($cache_file)) {
            readfile($cache_file);
            exit;
        }

        return true;
    }

    return false;
}
# Python/Flask Detection for Python-requests
import re

from flask import request, make_response

def detect_python_requests():
    user_agent = request.headers.get('User-Agent', '')
    pattern = r'Python-requests'

    if re.search(pattern, user_agent, re.IGNORECASE):
        # Create response with caching
        response = make_response()
        response.headers['Cache-Control'] = 'public, max-age=3600'
        response.headers['X-Robots-Tag'] = 'noarchive'
        return True

    return False

# Django Middleware
class PythonRequestsMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        if self.detect_bot(request):
            # Handle bot traffic
            pass
        return self.get_response(request)

    def detect_bot(self, request):
        # Check the raw User-Agent header from the WSGI environ
        user_agent = request.META.get('HTTP_USER_AGENT', '')
        return bool(re.search(r'Python-requests', user_agent, re.IGNORECASE))
// JavaScript/Node.js Detection for Python-requests
const express = require('express');
const app = express();

// Middleware to detect Python-requests
function detectPythonRequests(req, res, next) {
    const userAgent = req.headers['user-agent'] || '';
    const pattern = /Python-requests/i;

    if (pattern.test(userAgent)) {
        // Log bot detection
        console.log('Python-requests detected from IP:', req.ip);

        // Set cache headers
        res.set({
            'Cache-Control': 'public, max-age=3600',
            'X-Robots-Tag': 'noarchive'
        });

        // Mark request as bot
        req.isBot = true;
        req.botName = 'Python-requests';
    }

    next();
}

app.use(detectPythonRequests);
# Apache .htaccess rules for Python-requests

# Block completely
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Python\-requests [NC]
RewriteRule .* - [F,L]

# Or redirect to a static version
RewriteCond %{HTTP_USER_AGENT} Python\-requests [NC]
RewriteCond %{REQUEST_URI} !^/static/
RewriteRule ^(.*)$ /static/$1 [L]

# Or set an environment variable for PHP
SetEnvIfNoCase User-Agent "Python\-requests" is_bot=1

# Add cache headers for this bot
<If "%{HTTP_USER_AGENT} =~ /Python\-requests/i">
    Header set Cache-Control "public, max-age=3600"
    Header set X-Robots-Tag "noarchive"
</If>
# Nginx configuration for Python-requests

# Map the user agent to a flag
map $http_user_agent $is_python_requests {
    default             0;
    ~*python\-requests  1;
}

server {
    # Option 1: block the bot completely (remove if serving cached content instead)
    if ($is_python_requests) {
        return 403;
    }

    # Option 2: serve cached content. try_files is not allowed inside "if",
    # so bot traffic is rerouted to a named location via error_page.
    location / {
        error_page 418 = @cached;
        if ($is_python_requests) {
            return 418;
        }
        try_files $uri @backend;
    }

    location @cached {
        root /var/www/cached;
        try_files $uri $uri.html $uri/index.html @backend;
    }

    # Add headers for bot requests
    location @backend {
        if ($is_python_requests) {
            add_header Cache-Control "public, max-age=3600";
            add_header X-Robots-Tag "noarchive";
        }
        proxy_pass http://backend;
    }
}

Should You Block This Bot?

Recommendations based on your website type:

Site Type | Recommendation | Reasoning
E-commerce | Optional | Evaluate based on bandwidth usage vs. benefits
Blog/News | Allow | Increases content reach and discoverability
SaaS Application | Block | No benefit for application interfaces; preserve resources
Documentation | Selective | Allow for public docs, block for internal docs
Corporate Site | Limit | Allow for public pages, block sensitive areas like intranets

Advanced robots.txt Configurations

E-commerce Site Configuration

User-agent: python-requests
Crawl-delay: 5
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /api/
Disallow: /*?sort=
Disallow: /*?filter=
Disallow: /*&page=
Allow: /products/
Allow: /categories/
Sitemap: https://example.com/sitemap.xml

Publishing/Blog Configuration

User-agent: python-requests
Crawl-delay: 10
Disallow: /wp-admin/
Disallow: /drafts/
Disallow: /preview/
Disallow: /*?replytocom=
Allow: /

SaaS/Application Configuration

User-agent: python-requests
Disallow: /app/
Disallow: /api/
Disallow: /dashboard/
Disallow: /settings/
Allow: /
Allow: /pricing/
Allow: /features/
Allow: /docs/

Quick Reference

User Agent Match

Python-requests

Robots.txt Name

python-requests

Category

Other

Respects robots.txt

May not respect