Zoom User Agent - Zoom Bot Details | CL SEO

Zoom

Active since: 2020
Category: Other
Respects robots.txt: Yes
#zoom #meeting #preview #chat

What is Zoom?

ZoomBot generates link previews when URLs are shared in Zoom Chat, whether during meetings or in persistent chat channels. As remote work and virtual meetings have become standard, Zoom's chat functionality has grown in importance for sharing resources and collaborating. The bot extracts page metadata to build previews that let participants understand shared content at a glance, providing context for shared links directly within the Zoom interface.
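Zoom does not document exactly which fields ZoomBot reads, but link-preview bots generally look at Open Graph and standard meta tags. As an illustrative sketch (not Zoom's confirmed behavior), a minimal extractor using only Python's standard library might look like this:

```python
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Collects the metadata fields a link-preview bot typically reads."""
    def __init__(self):
        super().__init__()
        self.meta = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta":
            # Open Graph tags use "property"; classic tags use "name"
            key = attrs.get("property") or attrs.get("name")
            if key in ("og:title", "og:description", "og:image", "description"):
                self.meta[key] = attrs.get("content", "")
        elif tag == "title":
            self._in_title = True

    def handle_data(self, data):
        if self._in_title:
            self.meta.setdefault("title", data.strip())
            self._in_title = False

html = ('<html><head><title>Example</title>'
        '<meta property="og:title" content="Shared Page">'
        '<meta property="og:image" content="https://example.com/a.png">'
        '</head></html>')
parser = MetaExtractor()
parser.feed(html)
print(parser.meta)
```

A real preview bot would fetch the page over HTTP first; this sketch only shows the parsing step.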

User Agent String

Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2227.1 Safari/537.36 ZoomBot

How to Control Zoom

Block Completely

To prevent Zoom from accessing your entire website, add this to your robots.txt file:

# Block Zoom
User-agent: ZoomBot
Disallow: /

Block Specific Directories

To restrict access to certain parts of your site while allowing others:

User-agent: ZoomBot
Disallow: /admin/
Disallow: /private/
Disallow: /wp-admin/
Allow: /public/

Set Crawl Delay

To slow down the crawl rate (note: not all bots respect this directive):

User-agent: ZoomBot
Crawl-delay: 10
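Before deploying rules like these, you can sanity-check them with Python's built-in robots.txt parser. This is a sketch against example rules; the standard parser also exposes Crawl-delay via crawl_delay(), though whether a given bot honors it is up to the bot:

```python
import urllib.robotparser

# Example ruleset combining a directory block with a crawl delay
rules = """
User-agent: ZoomBot
Disallow: /admin/
Crawl-delay: 10
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("ZoomBot", "https://example.com/admin/page"))  # False
print(rp.can_fetch("ZoomBot", "https://example.com/blog/post"))   # True
print(rp.crawl_delay("ZoomBot"))                                  # 10
```

In production you would point the parser at your live file with set_url() and read() instead of an inline string.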

How to Verify Zoom

Verification Method:
Check for the "ZoomBot" token in the User-Agent header. Note that user agent strings can be spoofed, so this check identifies traffic that claims to be ZoomBot rather than guaranteeing its origin.

Learn more in the official documentation.
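Since verification comes down to the user agent string, a minimal check looks like the sketch below (an illustration, not an official Zoom-provided verification method):

```python
import re

# The user agent string documented above for ZoomBot
ZOOMBOT_UA = ("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) "
              "AppleWebKit/537.36 (KHTML, like Gecko) "
              "Chrome/41.0.2227.1 Safari/537.36 ZoomBot")

def is_zoombot(user_agent: str) -> bool:
    # Substring check only; user agents can be spoofed
    return bool(re.search(r"ZoomBot", user_agent, re.IGNORECASE))

print(is_zoombot(ZOOMBOT_UA))                                      # True
print(is_zoombot("Mozilla/5.0 (Windows NT 10.0) Firefox/120.0"))   # False
```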

Detection Patterns

Multiple ways to detect Zoom in your application:

Basic Pattern

/Zoom/i

Strict Pattern

/^Mozilla\/5\.0 \(Macintosh; Intel Mac OS X 10_10_1\) AppleWebKit\/537\.36 \(KHTML, like Gecko\) Chrome\/41\.0\.2227\.1 Safari\/537\.36 ZoomBot$/

Flexible Pattern

/Zoom[\s\/]?[\d.]*/i

Vendor Match

/ZoomBot/i
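The patterns above can be sanity-checked against the documented user agent string. The sketch below translates them from PCRE-style literals into Python syntax:

```python
import re

# The documented ZoomBot user agent string
UA = ("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) "
      "AppleWebKit/537.36 (KHTML, like Gecko) "
      "Chrome/41.0.2227.1 Safari/537.36 ZoomBot")

patterns = {
    "basic":    re.compile(r"Zoom", re.IGNORECASE),
    "strict":   re.compile(
        r"^Mozilla/5\.0 \(Macintosh; Intel Mac OS X 10_10_1\) "
        r"AppleWebKit/537\.36 \(KHTML, like Gecko\) "
        r"Chrome/41\.0\.2227\.1 Safari/537\.36 ZoomBot$"),
    "flexible": re.compile(r"Zoom[\s/]?[\d.]*", re.IGNORECASE),
}

for name, pattern in patterns.items():
    print(name, bool(pattern.search(UA)))  # all three match
```

The strict pattern breaks the moment Zoom changes any token in the string, so the basic or flexible forms are usually safer in practice.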

Implementation Examples

// PHP Detection for Zoom
function detect_zoom() {
    $user_agent = $_SERVER['HTTP_USER_AGENT'] ?? '';
    $pattern = '/Zoom/i';
    if (preg_match($pattern, $user_agent)) {
        // Log the detection
        error_log('Zoom detected from IP: ' . $_SERVER['REMOTE_ADDR']);
        // Set cache headers
        header('Cache-Control: public, max-age=3600');
        header('X-Robots-Tag: noarchive');
        // Optional: serve a cached version if one exists
        $cache_file = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
        if (file_exists($cache_file)) {
            readfile($cache_file);
            exit;
        }
        return true;
    }
    return false;
}
# Python/Flask Detection for Zoom
import re
from flask import request, make_response

def detect_zoom():
    user_agent = request.headers.get('User-Agent', '')
    pattern = r'Zoom'
    if re.search(pattern, user_agent, re.IGNORECASE):
        # Create response with caching
        response = make_response()
        response.headers['Cache-Control'] = 'public, max-age=3600'
        response.headers['X-Robots-Tag'] = 'noarchive'
        return True
    return False

# Django Middleware
class ZoomMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        if self.detect_bot(request):
            # Handle bot traffic
            pass
        return self.get_response(request)

    def detect_bot(self, request):
        user_agent = request.META.get('HTTP_USER_AGENT', '')
        return bool(re.search(r'Zoom', user_agent, re.IGNORECASE))
// JavaScript/Node.js Detection for Zoom
const express = require('express');
const app = express();

// Middleware to detect Zoom
function detectZoom(req, res, next) {
  const userAgent = req.headers['user-agent'] || '';
  const pattern = /Zoom/i;
  if (pattern.test(userAgent)) {
    // Log bot detection
    console.log('Zoom detected from IP:', req.ip);
    // Set cache headers
    res.set({
      'Cache-Control': 'public, max-age=3600',
      'X-Robots-Tag': 'noarchive'
    });
    // Mark request as bot
    req.isBot = true;
    req.botName = 'Zoom';
  }
  next();
}

app.use(detectZoom);
# Apache .htaccess rules for Zoom

# Block completely
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Zoom [NC]
RewriteRule .* - [F,L]

# Or redirect to a static version
RewriteCond %{HTTP_USER_AGENT} Zoom [NC]
RewriteCond %{REQUEST_URI} !^/static/
RewriteRule ^(.*)$ /static/$1 [L]

# Or set environment variable for PHP
SetEnvIfNoCase User-Agent "Zoom" is_bot=1

# Add cache headers for this bot
<If "%{HTTP_USER_AGENT} =~ /Zoom/i">
    Header set Cache-Control "public, max-age=3600"
    Header set X-Robots-Tag "noarchive"
</If>
# Nginx configuration for Zoom

# Map user agent to variable
map $http_user_agent $is_zoom {
    default 0;
    ~*Zoom  1;
}

server {
    # Block the bot completely
    if ($is_zoom) {
        return 403;
    }

    # Or serve cached content; the root is switched per-request
    # because try_files itself is not valid inside an if block
    location / {
        if ($is_zoom) {
            root /var/www/cached;
        }
        try_files $uri $uri.html $uri/index.html @backend;
    }

    # Add headers for bot requests
    location @backend {
        if ($is_zoom) {
            add_header Cache-Control "public, max-age=3600";
            add_header X-Robots-Tag "noarchive";
        }
        proxy_pass http://backend;
    }
}

Should You Block This Bot?

Recommendations based on your website type:

Site Type        | Recommendation | Reasoning
E-commerce       | Optional       | Evaluate based on bandwidth usage vs. benefits
Blog/News        | Allow          | Increases content reach and discoverability
SaaS Application | Block          | No benefit for application interfaces; preserves resources
Documentation    | Selective      | Allow for public docs, block for internal docs
Corporate Site   | Limit          | Allow for public pages, block sensitive areas like intranets

Advanced robots.txt Configurations

E-commerce Site Configuration

User-agent: ZoomBot
Crawl-delay: 5
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /api/
Disallow: /*?sort=
Disallow: /*?filter=
Disallow: /*&page=
Allow: /products/
Allow: /categories/

Sitemap: https://example.com/sitemap.xml

Publishing/Blog Configuration

User-agent: ZoomBot
Crawl-delay: 10
Disallow: /wp-admin/
Disallow: /drafts/
Disallow: /preview/
Disallow: /*?replytocom=
Allow: /

SaaS/Application Configuration

User-agent: ZoomBot
Disallow: /app/
Disallow: /api/
Disallow: /dashboard/
Disallow: /settings/
Allow: /
Allow: /pricing/
Allow: /features/
Allow: /docs/

Quick Reference

User Agent Match

Zoom

Robots.txt Name

ZoomBot

Category

other

Respects robots.txt

Yes