
W3C-checklink

Vendor: W3C · Since 1998
Category: Other · Respects robots.txt: Yes
#validation #links #w3c #checker

What is W3C-checklink?

W3C Link Checker is a tool provided by the W3C that crawls a site's pages to find broken links, problematic redirects, and other link-related issues. It helps maintain website quality by identifying broken or misdirected links that harm user experience and SEO. It checks both internal and external links, reporting link status, redirect chains, and broken fragment anchors. The tool is particularly useful for large websites where manual link checking would be impractical.
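The checker is available as a hosted service at https://validator.w3.org/checklink, and the same tool is distributed on CPAN as W3C-LinkChecker, which installs a checklink command. A typical local run might look like the following (a sketch based on the tool's documented options; consult checklink --help for your installed version):

# Check a site recursively, two levels deep, with a summary-only report
checklink --recursive --depth 2 --summary https://example.com/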

User Agent String

W3C-checklink/5.0.0

How to Control W3C-checklink

Block Completely

To prevent W3C-checklink from accessing your entire website, add this to your robots.txt file:

# Block W3C-checklink
User-agent: W3C-checklink
Disallow: /

Block Specific Directories

To restrict access to certain parts of your site while allowing others:

User-agent: W3C-checklink
Disallow: /admin/
Disallow: /private/
Disallow: /wp-admin/
Allow: /public/

Set Crawl Delay

To slow down the crawl rate (note: not all bots respect this directive):

User-agent: W3C-checklink
Crawl-delay: 10
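You can confirm what your deployed robots.txt actually tells this bot with Python's standard-library robotparser (a minimal sketch; the URL is a placeholder):

from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

# Expect False for any path you disallowed above
print(rp.can_fetch("W3C-checklink", "https://example.com/private/"))
# Returns the matched Crawl-delay value, or None if no rule applies
print(rp.crawl_delay("W3C-checklink"))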

How to Verify W3C-checklink

Verification Method:
User-Agent string match: check for the W3C-checklink token in the User-Agent header.

Learn more in the official documentation.
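Note that the User-Agent header is trivially spoofed, so a matching string is a hint, not proof. You can also impersonate the bot yourself to test your blocking rules; a minimal sketch with Python's standard library (the URL is a placeholder):

import urllib.error
import urllib.request

# Send a HEAD request that claims to be W3C-checklink
req = urllib.request.Request(
    "https://example.com/",
    headers={"User-Agent": "W3C-checklink/5.0.0"},
    method="HEAD",
)
try:
    with urllib.request.urlopen(req) as resp:
        print(resp.status)   # 200 means the UA is not blocked
except urllib.error.HTTPError as err:
    print(err.code)          # expect 403 if your server blocks this UA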

Detection Patterns

Multiple ways to detect W3C-checklink in your application:

Basic Pattern

/W3C\-checklink/i

Strict Pattern

/^W3C\-checklink\/5\.0\.0$/

Flexible Pattern

/W3C\-checklink[\s\/]?[\d.]*/i

Vendor Match

/W3C.*checklink/i
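To sanity-check these patterns before relying on them, run them against a few sample UA strings (a minimal sketch; the samples are illustrative, not captured traffic):

import re

samples = [
    "W3C-checklink/5.0.0",
    "W3C-checklink",
    "Mozilla/5.0 (compatible; SomeOtherBot/1.0)",
]

patterns = {
    "basic":    re.compile(r"W3C-checklink", re.I),
    "strict":   re.compile(r"^W3C-checklink/5\.0\.0$"),
    "flexible": re.compile(r"W3C-checklink[\s/]?[\d.]*", re.I),
}

for ua in samples:
    matched = [name for name, p in patterns.items() if p.search(ua)]
    print(f"{ua!r} -> {matched}")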

Implementation Examples

// PHP Detection for W3C-checklink
function detect_w3c_checklink() {
    $user_agent = $_SERVER['HTTP_USER_AGENT'] ?? '';
    $pattern = '/W3C\-checklink/i';

    if (preg_match($pattern, $user_agent)) {
        // Log the detection
        error_log('W3C-checklink detected from IP: ' . $_SERVER['REMOTE_ADDR']);

        // Set cache headers
        header('Cache-Control: public, max-age=3600');
        header('X-Robots-Tag: noarchive');

        // Optional: serve a cached version if one exists
        $cache_file = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
        if (file_exists($cache_file)) {
            readfile($cache_file);
            exit;
        }

        return true;
    }

    return false;
}
# Python/Flask Detection for W3C-checklink
import re

from flask import Flask, request

app = Flask(__name__)
BOT_PATTERN = re.compile(r'W3C-checklink', re.IGNORECASE)

def detect_w3c_checklink():
    user_agent = request.headers.get('User-Agent', '')
    return bool(BOT_PATTERN.search(user_agent))

# Set cache headers on the actual outgoing response; building a throwaway
# response with make_response() would discard them.
@app.after_request
def add_bot_headers(response):
    if detect_w3c_checklink():
        response.headers['Cache-Control'] = 'public, max-age=3600'
        response.headers['X-Robots-Tag'] = 'noarchive'
    return response

# Django Middleware
class W3CchecklinkMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def detect_bot(self, request):
        user_agent = request.META.get('HTTP_USER_AGENT', '')
        return bool(BOT_PATTERN.search(user_agent))

    def __call__(self, request):
        if self.detect_bot(request):
            # Handle bot traffic (e.g. log it or serve a cached page)
            pass
        return self.get_response(request)
// JavaScript/Node.js Detection for W3C-checklink
const express = require('express');
const app = express();

// Middleware to detect W3C-checklink
function detectW3Cchecklink(req, res, next) {
    const userAgent = req.headers['user-agent'] || '';
    const pattern = /W3C-checklink/i;

    if (pattern.test(userAgent)) {
        // Log bot detection
        console.log('W3C-checklink detected from IP:', req.ip);

        // Set cache headers
        res.set({
            'Cache-Control': 'public, max-age=3600',
            'X-Robots-Tag': 'noarchive'
        });

        // Mark request as bot
        req.isBot = true;
        req.botName = 'W3C-checklink';
    }
    next();
}

app.use(detectW3Cchecklink);
# Apache .htaccess rules for W3C-checklink

# Block completely
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} W3C\-checklink [NC]
RewriteRule .* - [F,L]

# Or redirect to a static version
RewriteCond %{HTTP_USER_AGENT} W3C\-checklink [NC]
RewriteCond %{REQUEST_URI} !^/static/
RewriteRule ^(.*)$ /static/$1 [L]

# Or set an environment variable for PHP
SetEnvIfNoCase User-Agent "W3C\-checklink" is_bot=1

# Add cache headers for this bot (the <If> directive requires Apache 2.4+)
<If "%{HTTP_USER_AGENT} =~ /W3C\-checklink/i">
    Header set Cache-Control "public, max-age=3600"
    Header set X-Robots-Tag "noarchive"
</If>
# Nginx configuration for W3C-checklink

# Map user agent to variable
map $http_user_agent $is_w3c_checklink {
    default 0;
    ~*W3C\-checklink 1;
}

server {
    # Block the bot completely
    if ($is_w3c_checklink) {
        return 403;
    }

    # Or serve cached content. Note: try_files is not allowed inside "if",
    # so bot traffic is rewritten to an internal cached location instead.
    location / {
        if ($is_w3c_checklink) {
            rewrite ^ /cached$uri last;
        }
        try_files $uri @backend;
    }

    location /cached/ {
        internal;
        root /var/www;
        try_files $uri $uri.html $uri/index.html @backend;
    }

    # Add headers for bot requests
    location @backend {
        if ($is_w3c_checklink) {
            add_header Cache-Control "public, max-age=3600";
            add_header X-Robots-Tag "noarchive";
        }
        proxy_pass http://backend;
    }
}

Should You Block This Bot?

Recommendations based on your website type:

Site Type        | Recommendation | Reasoning
E-commerce       | Optional       | Evaluate based on bandwidth usage vs. benefits
Blog/News        | Allow          | Helps keep post and archive links from breaking silently
SaaS Application | Block          | No benefit for application interfaces; preserve resources
Documentation    | Selective      | Allow for public docs, block for internal docs
Corporate Site   | Limit          | Allow for public pages, block sensitive areas like intranets

Advanced robots.txt Configurations

E-commerce Site Configuration

User-agent: W3C-checklink
Crawl-delay: 5
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /api/
Disallow: /*?sort=
Disallow: /*?filter=
Disallow: /*&page=
Allow: /products/
Allow: /categories/

Sitemap: https://example.com/sitemap.xml

Publishing/Blog Configuration

User-agent: W3C-checklink
Crawl-delay: 10
Disallow: /wp-admin/
Disallow: /drafts/
Disallow: /preview/
Disallow: /*?replytocom=
Allow: /

SaaS/Application Configuration

User-agent: W3C-checklink
Disallow: /app/
Disallow: /api/
Disallow: /dashboard/
Disallow: /settings/
Allow: /
Allow: /pricing/
Allow: /features/
Allow: /docs/
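Any of these rulesets can be unit-tested offline before deployment: Python's robotparser accepts the file's lines directly via parse(). A minimal sketch using a subset of the SaaS rules above:

from urllib.robotparser import RobotFileParser

rules = """\
User-agent: W3C-checklink
Disallow: /app/
Disallow: /api/
Allow: /docs/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

assert not rp.can_fetch("W3C-checklink", "https://example.com/app/")
assert rp.can_fetch("W3C-checklink", "https://example.com/docs/")
print("rules behave as expected")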

Quick Reference

User Agent Match: W3C-checklink
Robots.txt Name: W3C-checklink
Category: Other
Respects robots.txt: Yes