Nessus User Agent - Tenable Bot Details | CL SEO

Nessus

Operated by: Tenable
Since: 1998
Category: Security
Robots.txt: May ignore
#security #vulnerability-scanner #tenable

What is Nessus?

Nessus is a widely used vulnerability scanner developed by Tenable that detects security issues, misconfigurations, and compliance violations in web applications and servers. The scanner performs both authenticated and unauthenticated scans to identify known vulnerabilities, default credentials, and configuration weaknesses. Because it is a security auditing tool rather than a crawler, Nessus does not respect robots.txt. Website owners typically see Nessus traffic when their own security teams or third-party auditors perform vulnerability assessments.
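One practical way to spot such assessment traffic is to scan your access logs for the Nessus user agent. The sketch below assumes combined-log-format lines; the sample entries and IP addresses are illustrative only.

```python
import re

# Hypothetical access-log lines in combined log format; the IPs and
# field layout are assumptions for illustration.
log_lines = [
    '10.0.0.5 - - [01/Jan/2025:12:00:00 +0000] "GET /login HTTP/1.1" 200 512 "-" "Nessus SOAP"',
    '203.0.113.9 - - [01/Jan/2025:12:00:01 +0000] "GET / HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]

nessus_ua = re.compile(r"Nessus", re.IGNORECASE)

# Collect source IPs for any line mentioning Nessus (a coarse check:
# it matches anywhere in the line, not just the user-agent field)
scanner_ips = {line.split()[0] for line in log_lines if nessus_ua.search(line)}
print(scanner_ips)  # {'10.0.0.5'}
```

In practice you would read the real log file and cross-check the resulting IPs against the scans your security team has scheduled.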

User Agent String

Nessus SOAP

How to Control Nessus

Block Completely

To request that Nessus not access your website, add this to your robots.txt file. Note that Nessus, as a security scanner, typically ignores robots.txt, so treat these directives as advisory and use server-level blocking (see the Apache and Nginx examples below) for actual enforcement:

# Block Nessus
User-agent: Nessus
Disallow: /

Block Specific Directories

To restrict access to certain parts of your site while allowing others:

User-agent: Nessus
Disallow: /admin/
Disallow: /private/
Disallow: /wp-admin/
Allow: /public/

Set Crawl Delay

To slow down the crawl rate (note: not all bots respect this directive):

User-agent: Nessus
Crawl-delay: 10
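You can sanity-check rules like these with Python's standard-library robots.txt parser before deploying them; the snippet below parses the directives above plus a sample Disallow line and queries them the way a compliant client would.

```python
from urllib.robotparser import RobotFileParser

# The rules from the sections above, combined into one group
rules = """\
User-agent: Nessus
Crawl-delay: 10
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant client identifying as "Nessus" would be blocked from /admin/
print(parser.can_fetch("Nessus", "/admin/page"))   # False
print(parser.can_fetch("Nessus", "/public/page"))  # True
print(parser.crawl_delay("Nessus"))                # 10
```

This only verifies the file is well-formed; it does not make Nessus obey it, since the scanner does not consult robots.txt.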

How to Verify Nessus

Verification Method:
Nessus is a scanner, not a search-engine crawler, so there is no published IP range or reverse-DNS check. Instead, verify that the scan is authorized by the website owner — for example, by confirming the source IP belongs to a host your security team or a contracted auditor has registered.

Learn more in the official documentation.
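One simple way to implement that check is an allowlist of authorized scanner source ranges. This is a minimal sketch; the CIDR blocks below are placeholders for ranges your own team would register, not real Tenable addresses.

```python
import ipaddress

# Placeholder allowlist of scanner source ranges authorized by the
# site owner; substitute the ranges your security team actually uses.
AUTHORIZED_SCANNER_RANGES = [
    ipaddress.ip_network("10.20.0.0/24"),
    ipaddress.ip_network("192.0.2.0/28"),
]

def is_authorized_scan(remote_ip: str) -> bool:
    """Return True if the requesting IP falls in an authorized range."""
    ip = ipaddress.ip_address(remote_ip)
    return any(ip in net for net in AUTHORIZED_SCANNER_RANGES)

print(is_authorized_scan("10.20.0.15"))    # True
print(is_authorized_scan("198.51.100.7"))  # False
```

Requests matching the Nessus user agent from addresses outside the allowlist are candidates for blocking or alerting.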

Detection Patterns

Multiple ways to detect Nessus in your application:

Basic Pattern

/Nessus/i

Strict Pattern

/^Nessus SOAP$/

Flexible Pattern

/Nessus[\s\/]?[\d.]*/i

Vendor Match

/.*Tenable.*Nessus/i
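The four patterns above behave differently on the same input; a quick way to compare them is to run each against sample user-agent strings. The snippet below uses the documented "Nessus SOAP" string and an ordinary browser string as test cases.

```python
import re

# The four detection patterns from above, translated to Python
patterns = {
    "basic":    re.compile(r"Nessus", re.IGNORECASE),
    "strict":   re.compile(r"^Nessus SOAP$"),
    "flexible": re.compile(r"Nessus[\s/]?[\d.]*", re.IGNORECASE),
    "vendor":   re.compile(r".*Tenable.*Nessus", re.IGNORECASE),
}

samples = ["Nessus SOAP", "Mozilla/5.0"]
for ua in samples:
    hits = [name for name, pat in patterns.items() if pat.search(ua)]
    print(f"{ua!r} -> {hits}")
```

Note that the vendor pattern requires "Tenable" to appear before "Nessus" in the string, so it does not match the plain "Nessus SOAP" user agent.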

Implementation Examples

// PHP Detection for Nessus
function detect_nessus() {
    $user_agent = $_SERVER['HTTP_USER_AGENT'] ?? '';
    $pattern = '/Nessus/i';
    if (preg_match($pattern, $user_agent)) {
        // Log the detection
        error_log('Nessus detected from IP: ' . $_SERVER['REMOTE_ADDR']);
        // Set cache headers
        header('Cache-Control: public, max-age=3600');
        header('X-Robots-Tag: noarchive');
        // Optional: Serve cached version
        $cache_file = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
        if (file_exists($cache_file)) {
            readfile($cache_file);
            exit;
        }
        return true;
    }
    return false;
}
# Python/Flask Detection for Nessus
import re
from flask import request, make_response

def detect_nessus():
    user_agent = request.headers.get('User-Agent', '')
    pattern = r'Nessus'
    if re.search(pattern, user_agent, re.IGNORECASE):
        # Create response with caching
        response = make_response()
        response.headers['Cache-Control'] = 'public, max-age=3600'
        response.headers['X-Robots-Tag'] = 'noarchive'
        return True
    return False

# Django Middleware
class NessusMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        if self.detect_bot(request):
            # Handle bot traffic
            pass
        return self.get_response(request)
// JavaScript/Node.js Detection for Nessus
const express = require('express');
const app = express();

// Middleware to detect Nessus
function detectNessus(req, res, next) {
    const userAgent = req.headers['user-agent'] || '';
    const pattern = /Nessus/i;
    if (pattern.test(userAgent)) {
        // Log bot detection
        console.log('Nessus detected from IP:', req.ip);
        // Set cache headers
        res.set({
            'Cache-Control': 'public, max-age=3600',
            'X-Robots-Tag': 'noarchive'
        });
        // Mark request as bot
        req.isBot = true;
        req.botName = 'Nessus';
    }
    next();
}

app.use(detectNessus);
# Apache .htaccess rules for Nessus

# Block completely
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Nessus [NC]
RewriteRule .* - [F,L]

# Or redirect to a static version
RewriteCond %{HTTP_USER_AGENT} Nessus [NC]
RewriteCond %{REQUEST_URI} !^/static/
RewriteRule ^(.*)$ /static/$1 [L]

# Or set environment variable for PHP
SetEnvIfNoCase User-Agent "Nessus" is_bot=1

# Add cache headers for this bot
<If "%{HTTP_USER_AGENT} =~ /Nessus/i">
    Header set Cache-Control "public, max-age=3600"
    Header set X-Robots-Tag "noarchive"
</If>
# Nginx configuration for Nessus

# Map user agent to variable
map $http_user_agent $is_nessus {
    default 0;
    ~*Nessus 1;
}

server {
    # Block the bot completely
    if ($is_nessus) {
        return 403;
    }

    # Or serve cached content
    location / {
        if ($is_nessus) {
            root /var/www/cached;
            try_files $uri $uri.html $uri/index.html @backend;
        }
        try_files $uri @backend;
    }

    # Add headers for bot requests
    location @backend {
        if ($is_nessus) {
            add_header Cache-Control "public, max-age=3600";
            add_header X-Robots-Tag "noarchive";
        }
        proxy_pass http://backend;
    }
}

Should You Block This Bot?

Recommendations based on your website type:

Site Type        | Recommendation | Reasoning
E-commerce       | Optional       | Permit only scheduled, authorized scans; weigh scan load against audit value
Blog/News        | Allow          | Small attack surface; periodic vulnerability scans are generally low-risk
SaaS Application | Block          | Block unauthorized scanning of application interfaces; allow only your own audits
Documentation    | Selective      | Allow for public docs, block for internal docs
Corporate Site   | Limit          | Allow for public pages, block sensitive areas like intranets

Advanced robots.txt Configurations

E-commerce Site Configuration

User-agent: Nessus
Crawl-delay: 5
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /api/
Disallow: /*?sort=
Disallow: /*?filter=
Disallow: /*&page=
Allow: /products/
Allow: /categories/
Sitemap: https://example.com/sitemap.xml

Publishing/Blog Configuration

User-agent: Nessus
Crawl-delay: 10
Disallow: /wp-admin/
Disallow: /drafts/
Disallow: /preview/
Disallow: /*?replytocom=
Allow: /

SaaS/Application Configuration

User-agent: Nessus
Disallow: /app/
Disallow: /api/
Disallow: /dashboard/
Disallow: /settings/
Allow: /
Allow: /pricing/
Allow: /features/
Allow: /docs/

Quick Reference

User Agent Match

Nessus

Robots.txt Name

Nessus

Category

security

Respects robots.txt

May not respect