
SEOkicks

Since 2011 · SEO · Respects robots.txt
#seo #german #backlinks #crawler

What is SEOkicks?

SEOkicks is a German SEO service offering free and paid backlink-analysis tools. Its crawler builds a comprehensive backlink database with particular strength in German and European websites. Because SEOkicks provides substantial backlink data free of charge, it is popular among budget-conscious SEO professionals and small businesses. The crawler respects robots.txt while building its link index, and for German-language websites and European SEO it provides valuable localized insights.

User Agent String

Mozilla/5.0 (compatible; SEOkicks; +https://www.seokicks.de/robot.html)

How to Control SEOkicks

Block Completely

To prevent SEOkicks from accessing your entire website, add this to your robots.txt file:

# Block SEOkicks
User-agent: SEOkicks
Disallow: /
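If you want to sanity-check a rule like this before deploying it, Python's standard-library robotparser can evaluate it locally. A quick sketch (the rule text is the example above):

import urllib.robotparser

# The "Block Completely" rules from above
rules = """\
User-agent: SEOkicks
Disallow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("SEOkicks", "/any/page"))      # False: blocked everywhere
print(rp.can_fetch("SomeOtherBot", "/any/page"))  # True: other agents unaffected

The same check works for the directory-level rules in the next example.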

Block Specific Directories

To restrict access to certain parts of your site while allowing others:

User-agent: SEOkicks
Disallow: /admin/
Disallow: /private/
Disallow: /wp-admin/
Allow: /public/

Set Crawl Delay

To slow down the crawl rate (note: not all bots respect this directive):

User-agent: SEOkicks
Crawl-delay: 10
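Python's robotparser also exposes the parsed delay, which is a convenient way to confirm the directive is well-formed (whether the crawler honors it is a separate question):

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.parse(["User-agent: SEOkicks", "Crawl-delay: 10"])

print(rp.crawl_delay("SEOkicks"))  # 10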

How to Verify SEOkicks

Verification Method:
Check user agent string

Learn more in the official documentation.
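Since only user-agent matching is documented for SEOkicks here (no published IP ranges or reverse-DNS hostnames), verification reduces to an exact match against the documented string. A minimal Python sketch:

import re

# Documented SEOkicks user-agent string (see "User Agent String" above)
SEOKICKS_UA = re.compile(
    r"^Mozilla/5\.0 \(compatible; SEOkicks; "
    r"\+https://www\.seokicks\.de/robot\.html\)$"
)

def is_seokicks(user_agent: str) -> bool:
    # Exact match on the documented string; anything else is treated
    # as not (verifiably) SEOkicks
    return bool(SEOKICKS_UA.match(user_agent))

Keep in mind that user-agent strings can be spoofed, so this identifies traffic that claims to be SEOkicks rather than proving its origin.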

Detection Patterns

Multiple ways to detect SEOkicks in your application:

Basic Pattern

/SEOkicks/i

Strict Pattern

/^Mozilla\/5\.0 \(compatible; SEOkicks; \+https:\/\/www\.seokicks\.de\/robot\.html\)$/

Flexible Pattern

/SEOkicks[\s\/]?[\d.]*/i

Vendor Match

/SEOkicks.*seokicks\.de/i
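As a quick sanity check, all four patterns can be run against the documented user-agent string; a small Python harness (illustrative only):

import re

ua = ("Mozilla/5.0 (compatible; SEOkicks; "
      "+https://www.seokicks.de/robot.html)")

patterns = {
    "basic":    r"SEOkicks",
    "strict":   (r"^Mozilla/5\.0 \(compatible; SEOkicks; "
                 r"\+https://www\.seokicks\.de/robot\.html\)$"),
    "flexible": r"SEOkicks[\s/]?[\d.]*",
    "vendor":   r"SEOkicks.*seokicks\.de",
}

for name, pattern in patterns.items():
    # Every pattern should report a match against the documented string
    print(name, bool(re.search(pattern, ua, re.IGNORECASE)))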

Implementation Examples

// PHP Detection for SEOkicks
function detect_seokicks() {
    $user_agent = $_SERVER['HTTP_USER_AGENT'] ?? '';
    $pattern = '/SEOkicks/i';

    if (preg_match($pattern, $user_agent)) {
        // Log the detection
        error_log('SEOkicks detected from IP: ' . $_SERVER['REMOTE_ADDR']);

        // Set cache headers
        header('Cache-Control: public, max-age=3600');
        header('X-Robots-Tag: noarchive');

        // Optional: serve a cached version
        $cache_file = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
        if (file_exists($cache_file)) {
            readfile($cache_file);
            exit;
        }

        return true;
    }

    return false;
}
# Python/Flask Detection for SEOkicks
import re

from flask import request

def detect_seokicks():
    user_agent = request.headers.get('User-Agent', '')
    pattern = r'SEOkicks'
    # Caller is responsible for setting Cache-Control / X-Robots-Tag
    # on the actual response it returns
    return bool(re.search(pattern, user_agent, re.IGNORECASE))

# Django Middleware
class SEOkicksMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        response = self.get_response(request)
        if self.detect_bot(request):
            # Handle bot traffic: add cache headers
            response['Cache-Control'] = 'public, max-age=3600'
            response['X-Robots-Tag'] = 'noarchive'
        return response

    def detect_bot(self, request):
        user_agent = request.META.get('HTTP_USER_AGENT', '')
        return bool(re.search(r'SEOkicks', user_agent, re.IGNORECASE))
// JavaScript/Node.js Detection for SEOkicks
const express = require('express');
const app = express();

// Middleware to detect SEOkicks
function detectSEOkicks(req, res, next) {
    const userAgent = req.headers['user-agent'] || '';
    const pattern = /SEOkicks/i;

    if (pattern.test(userAgent)) {
        // Log bot detection
        console.log('SEOkicks detected from IP:', req.ip);

        // Set cache headers
        res.set({
            'Cache-Control': 'public, max-age=3600',
            'X-Robots-Tag': 'noarchive'
        });

        // Mark request as bot
        req.isBot = true;
        req.botName = 'SEOkicks';
    }

    next();
}

app.use(detectSEOkicks);
# Apache .htaccess rules for SEOkicks

# Block completely
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} SEOkicks [NC]
RewriteRule .* - [F,L]

# Or redirect to a static version
RewriteCond %{HTTP_USER_AGENT} SEOkicks [NC]
RewriteCond %{REQUEST_URI} !^/static/
RewriteRule ^(.*)$ /static/$1 [L]

# Or set an environment variable for PHP
SetEnvIfNoCase User-Agent "SEOkicks" is_bot=1

# Add cache headers for this bot
<If "%{HTTP_USER_AGENT} =~ /SEOkicks/i">
    Header set Cache-Control "public, max-age=3600"
    Header set X-Robots-Tag "noarchive"
</If>
# Nginx configuration for SEOkicks

# Map user agent to variable
map $http_user_agent $is_seokicks {
    default    0;
    ~*SEOkicks 1;
}

server {
    # Block the bot completely
    if ($is_seokicks) {
        return 403;
    }

    # Or serve cached content (note: try_files is not allowed inside
    # "if", so only the document root is switched for bot requests)
    location / {
        if ($is_seokicks) {
            root /var/www/cached;
        }
        try_files $uri $uri.html $uri/index.html @backend;
    }

    # Add headers for bot requests
    location @backend {
        if ($is_seokicks) {
            add_header Cache-Control "public, max-age=3600";
            add_header X-Robots-Tag "noarchive";
        }
        proxy_pass http://backend;
    }
}
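Whichever server-level approach you choose, it can be smoke-tested by sending a request with a spoofed user agent (for example with curl's -A option set to the documented SEOkicks string) and checking that the expected status code and headers come back.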

Should You Block This Bot?

Recommendations based on your website type:

Site Type        | Recommendation | Reasoning
E-commerce       | Optional       | Evaluate based on bandwidth usage vs. benefits
Blog/News        | Allow          | Increases content reach and discoverability
SaaS Application | Block          | No benefit for application interfaces; preserve resources
Documentation    | Selective      | Allow for public docs, block for internal docs
Corporate Site   | Limit          | Allow for public pages, block sensitive areas like intranets

Advanced robots.txt Configurations

E-commerce Site Configuration

User-agent: SEOkicks
Crawl-delay: 5
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /api/
Disallow: /*?sort=
Disallow: /*?filter=
Disallow: /*&page=
Allow: /products/
Allow: /categories/
Sitemap: https://example.com/sitemap.xml
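Note that wildcard rules such as Disallow: /*?sort= are a widely supported extension rather than part of the original robots.txt standard; Googlebot and Bingbot honor them, but whether SEOkicks does is not documented here, so avoid relying on them alone for sensitive URLs.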

Publishing/Blog Configuration

User-agent: SEOkicks
Crawl-delay: 10
Disallow: /wp-admin/
Disallow: /drafts/
Disallow: /preview/
Disallow: /*?replytocom=
Allow: /

SaaS/Application Configuration

User-agent: SEOkicks
Disallow: /app/
Disallow: /api/
Disallow: /dashboard/
Disallow: /settings/
Allow: /
Allow: /pricing/
Allow: /features/
Allow: /docs/
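As with the earlier examples, the stdlib parser can confirm the intended split between public and application paths:

import urllib.robotparser

# The SaaS/application rules from above
rules = """\
User-agent: SEOkicks
Disallow: /app/
Disallow: /api/
Disallow: /dashboard/
Disallow: /settings/
Allow: /
Allow: /pricing/
Allow: /features/
Allow: /docs/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("SEOkicks", "/pricing/"))    # True:  public page
print(rp.can_fetch("SEOkicks", "/dashboard/"))  # False: application UI

One caveat: Python's parser applies rules in file order, while some crawlers use longest-match precedence; for this rule set the two interpretations happen to agree.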

Quick Reference

User Agent Match:    SEOkicks
Robots.txt Name:     SEOkicks
Category:            SEO
Respects robots.txt: Yes