SnapchatBot User Agent - Snap Inc. Bot Details | CL SEO

SnapchatBot

Snap Inc. · Since 2017
Respects robots.txt
#social #snapchat #link-preview #fetcher

What is SnapchatBot?

SnapchatBot fetches web page metadata to generate link previews when URLs are shared within Snapchat. The bot retrieves Open Graph and meta tag information to display rich link cards including titles, descriptions, and thumbnail images. It respects robots.txt directives. Website owners who want their content to display well when shared on Snapchat should ensure proper Open Graph tags are in place.
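As a sketch, these are minimal Open Graph tags a page might include so the Snapchat preview can render a title, description, and thumbnail (all values and URLs below are placeholders, not real endpoints):

```html
<!-- Minimal Open Graph tags for rich link previews; values are placeholders -->
<meta property="og:title" content="Example Article Title" />
<meta property="og:description" content="A one-line summary shown under the shared link." />
<meta property="og:image" content="https://example.com/thumbnail.jpg" />
<meta property="og:url" content="https://example.com/article" />
```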

User Agent String

Mozilla/5.0 (compatible; Snapchat/1.0; +https://www.snapchat.com)

How to Control SnapchatBot

Block Completely

To prevent SnapchatBot from accessing your entire website, add this to your robots.txt file:

# Block SnapchatBot
User-agent: Snapchat
Disallow: /

Block Specific Directories

To restrict access to certain parts of your site while allowing others:

User-agent: Snapchat
Disallow: /admin/
Disallow: /private/
Disallow: /wp-admin/
Allow: /public/

Set Crawl Delay

To slow down the crawl rate (note: not all bots respect this directive):

User-agent: Snapchat
Crawl-delay: 10

How to Verify SnapchatBot

Verification Method:
Check user agent string for Snapchat identifier

Learn more in the official documentation.
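The verification method above is a User-Agent check only, and UA headers are trivially spoofed, so a match should be treated as a hint rather than proof. A minimal Python sketch (the function name is illustrative; the pattern accepts both the "Snapchat" token from the published UA string and the "SnapchatBot" name):

```python
import re

# UA-only verification sketch. User-Agent headers can be forged, so a
# match here is a hint, not proof of a genuine Snapchat fetcher.
SNAPCHAT_UA = re.compile(r'\bSnapchat(Bot)?\b', re.IGNORECASE)

def looks_like_snapchatbot(user_agent: str) -> bool:
    """Return True when the UA string carries a Snapchat identifier."""
    return bool(SNAPCHAT_UA.search(user_agent or ''))

print(looks_like_snapchatbot(
    'Mozilla/5.0 (compatible; Snapchat/1.0; +https://www.snapchat.com)'))  # True
```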

Detection Patterns

Multiple ways to detect SnapchatBot in your application:

Basic Pattern

/SnapchatBot/i

Strict Pattern

/^Mozilla\/5\.0 \(compatible; Snapchat\/1\.0; \+https:\/\/www\.snapchat\.com\)$/

Flexible Pattern

/SnapchatBot[\s\/]?[\d.]*/i

Vendor Match

/.*Snap Inc\..*SnapchatBot/i

Implementation Examples

// PHP Detection for SnapchatBot
function detect_snapchatbot() {
    $user_agent = $_SERVER['HTTP_USER_AGENT'] ?? '';
    $pattern = '/SnapchatBot/i';

    if (preg_match($pattern, $user_agent)) {
        // Log the detection
        error_log('SnapchatBot detected from IP: ' . $_SERVER['REMOTE_ADDR']);

        // Set cache headers
        header('Cache-Control: public, max-age=3600');
        header('X-Robots-Tag: noarchive');

        // Optional: serve a cached version if one exists
        $cache_file = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
        if (file_exists($cache_file)) {
            readfile($cache_file);
            exit;
        }
        return true;
    }
    return false;
}
# Python/Flask Detection for SnapchatBot
import re
from flask import request, make_response

def detect_snapchatbot():
    user_agent = request.headers.get('User-Agent', '')
    pattern = r'SnapchatBot'

    if re.search(pattern, user_agent, re.IGNORECASE):
        # Create response with caching
        response = make_response()
        response.headers['Cache-Control'] = 'public, max-age=3600'
        response.headers['X-Robots-Tag'] = 'noarchive'
        return True
    return False

# Django Middleware
class SnapchatBotMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        if self.detect_bot(request):
            # Handle bot traffic
            pass
        return self.get_response(request)

    def detect_bot(self, request):
        user_agent = request.headers.get('User-Agent', '')
        return bool(re.search(r'SnapchatBot', user_agent, re.IGNORECASE))
// JavaScript/Node.js Detection for SnapchatBot
const express = require('express');
const app = express();

// Middleware to detect SnapchatBot
function detectSnapchatBot(req, res, next) {
  const userAgent = req.headers['user-agent'] || '';
  const pattern = /SnapchatBot/i;

  if (pattern.test(userAgent)) {
    // Log bot detection
    console.log('SnapchatBot detected from IP:', req.ip);

    // Set cache headers
    res.set({
      'Cache-Control': 'public, max-age=3600',
      'X-Robots-Tag': 'noarchive'
    });

    // Mark request as bot
    req.isBot = true;
    req.botName = 'SnapchatBot';
  }
  next();
}

app.use(detectSnapchatBot);
# Apache .htaccess rules for SnapchatBot

# Block completely
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} SnapchatBot [NC]
RewriteRule .* - [F,L]

# Or redirect to a static version
RewriteCond %{HTTP_USER_AGENT} SnapchatBot [NC]
RewriteCond %{REQUEST_URI} !^/static/
RewriteRule ^(.*)$ /static/$1 [L]

# Or set an environment variable for PHP
SetEnvIfNoCase User-Agent "SnapchatBot" is_bot=1

# Add cache headers for this bot
<If "%{HTTP_USER_AGENT} =~ /SnapchatBot/i">
    Header set Cache-Control "public, max-age=3600"
    Header set X-Robots-Tag "noarchive"
</If>
# Nginx configuration for SnapchatBot

# Map user agent to a flag (http context)
map $http_user_agent $is_snapchatbot {
    default 0;
    ~*SnapchatBot 1;
}

# Pick a document root per visitor type; nginx does not allow
# try_files inside an "if" block, so select the root via a map instead
map $is_snapchatbot $site_root {
    0 /var/www/html;    # normal visitors (adjust to your docroot)
    1 /var/www/cached;  # pre-rendered pages for the bot
}

server {
    # Block the bot completely
    if ($is_snapchatbot) {
        return 403;
    }

    # Or serve cached content
    location / {
        root $site_root;
        try_files $uri $uri.html $uri/index.html @backend;
    }

    # Add headers for bot requests
    location @backend {
        if ($is_snapchatbot) {
            add_header Cache-Control "public, max-age=3600";
            add_header X-Robots-Tag "noarchive";
        }
        proxy_pass http://backend;
    }
}

Should You Block This Bot?

Recommendations based on your website type:

Site Type        | Recommendation | Reasoning
E-commerce       | Optional       | Evaluate based on bandwidth usage vs. benefits
Blog/News        | Allow          | Increases content reach and discoverability
SaaS Application | Block          | No benefit for application interfaces; preserve resources
Documentation    | Selective      | Allow for public docs, block for internal docs
Corporate Site   | Limit          | Allow for public pages, block sensitive areas like intranets

Advanced robots.txt Configurations

E-commerce Site Configuration

User-agent: Snapchat
Crawl-delay: 5
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /api/
Disallow: /*?sort=
Disallow: /*?filter=
Disallow: /*&page=
Allow: /products/
Allow: /categories/
Sitemap: https://example.com/sitemap.xml

Publishing/Blog Configuration

User-agent: Snapchat
Crawl-delay: 10
Disallow: /wp-admin/
Disallow: /drafts/
Disallow: /preview/
Disallow: /*?replytocom=
Allow: /

SaaS/Application Configuration

User-agent: Snapchat
Disallow: /app/
Disallow: /api/
Disallow: /dashboard/
Disallow: /settings/
Allow: /
Allow: /pricing/
Allow: /features/
Allow: /docs/
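To sanity-check rules like these before deploying, Python's standard-library robots.txt parser can evaluate paths against the `Snapchat` user-agent token. A sketch using a subset of the SaaS rules above (`example.com` is a placeholder host):

```python
from urllib.robotparser import RobotFileParser

# Subset of the SaaS configuration, fed to the stdlib parser directly
rules = """\
User-agent: Snapchat
Disallow: /app/
Disallow: /api/
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Public marketing page is fetchable; application paths are not
print(rp.can_fetch("Snapchat", "https://example.com/pricing/"))      # True
print(rp.can_fetch("Snapchat", "https://example.com/app/dashboard")) # False
```

Note that `urllib.robotparser` applies the first matching rule in file order, so keeping the specific `Disallow` lines ahead of the broad `Allow: /` matters here.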

Quick Reference

User Agent Match

SnapchatBot

Robots.txt Name

Snapchat

Category

social

Respects robots.txt

Yes