AI Crawler Guard – Documentation

Follow these steps to install, configure, and troubleshoot the plugin. Each section includes anchor links so you can jump directly to the part you need.

1. Installation

From WordPress admin

Go to Plugins → Add New, search for "AI Crawler Guard", click Install, then Activate.

Manual upload

Download the ZIP from aicrawlerguard.com, visit Plugins → Add New → Upload Plugin, choose the ZIP, and activate.

2. Basic setup

Finding the settings

Navigate to Settings → AI Crawler Guard. All controls live in a single, lightweight settings screen.

Block known AI crawlers vs Allow all crawlers

Choose Block to automatically deny the AI list, or Allow to permit all crawlers. You can still override individual bots either way.

3. Dashboard overview

Real-time status

See total AI hits, blocked vs allowed requests, and your current global mode (Block known AI / Allow all).

Recent activity

A mini log shows the last 10 AI requests so you can confirm the plugin is working.

4. Crawlers page

Status definitions

Allowed = always permitted; Blocked = always denied. Bots without an override follow your global mode setting.
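The interaction between per-bot overrides and the global mode can be sketched as a simple lookup with a fallback. This is an illustrative model, not the plugin's actual code; the names `resolve_status`, `GLOBAL_MODE`, and the example bot entries are assumptions.

```python
# Hypothetical sketch: per-bot overrides win, everything else falls back
# to the global mode chosen on the settings screen.
GLOBAL_MODE = "block"  # "block" = deny known AI crawlers, "allow" = permit all

# Overrides set on the Crawlers page; bots absent from this map
# simply follow GLOBAL_MODE.
overrides = {
    "GPTBot": "allow",   # explicitly permitted
    "CCBot": "block",    # explicitly denied
}

def resolve_status(bot_name: str) -> str:
    """Return 'allow' or 'block' for a known AI crawler."""
    return overrides.get(bot_name, GLOBAL_MODE)
```

With `GLOBAL_MODE = "block"`, a bot with no override resolves to "block", while `resolve_status("GPTBot")` returns "allow" because of its explicit override.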

Changing status

Click Allow to permit a blocked bot, or Block to deny an allowed bot. Changes take effect immediately.

5. Logs page

Logging toggle

Turn logging on if you want to store detailed entries. Turn it off for maximum performance.

Retention

Set how many days of logs to keep. Older entries are pruned automatically.
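Time-based pruning like this usually keeps only entries newer than a cutoff. A minimal sketch, assuming each log entry carries a timestamp; the field name `"time"` and the function name `prune_logs` are illustrative, not the plugin's internals.

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 30  # value configured on the Logs page

def prune_logs(entries, now=None):
    """Keep only entries newer than the retention window."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [e for e in entries if e["time"] >= cutoff]
```

Running this on a list with one recent and one two-month-old entry keeps only the recent one.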

Clearing logs

Use the Clear Logs button to wipe data instantly before sharing screenshots or handing access to clients.

6. Custom bot rules

Adding a rule

Click Add Rule, give the rule a label, enter the pattern to match, and choose its category and default action.

Rule order

Custom rules run before the core list. Disable that behavior with "Allow core list to override" if needed.
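The precedence described above amounts to evaluation order: whichever list is checked first wins on a conflicting match. A sketch under the assumption that each rule is a (pattern, action) pair; the function name, rule entries, and `core_overrides` flag are illustrative stand-ins for the plugin's "Allow core list to override" toggle.

```python
import re

# Hypothetical rule lists; patterns are matched against the User-Agent.
custom_rules = [
    (re.compile(r"MyInternalBot", re.I), "allow"),
]
core_rules = [
    (re.compile(r"GPTBot", re.I), "block"),
    (re.compile(r"MyInternalBot", re.I), "block"),  # imagine a core conflict
]

def match_user_agent(ua: str, core_overrides: bool = False):
    """Return the first matching action, or None if no rule matches.

    By default custom rules are evaluated first; with core_overrides=True
    the core list is checked first instead.
    """
    ordered = (core_rules + custom_rules) if core_overrides \
        else (custom_rules + core_rules)
    for pattern, action in ordered:
        if pattern.search(ua):
            return action
    return None
```

Here `match_user_agent("MyInternalBot/1.0")` returns "allow" because the custom rule is checked first, but with `core_overrides=True` the conflicting core entry wins and it returns "block".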

7. Troubleshooting

Social previews broken

Ensure FacebookBot, Twitterbot, and LinkedInBot are set to Allow. These are allowed by default but can be changed accidentally.

No crawlers detected

Confirm logging is enabled and give it time. Some sites only get AI hits every few hours.

Performance concerns

Disable logging or shorten retention. Detection itself is a single user-agent check and adds virtually zero overhead.
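A single user-agent check is cheap because it reduces to one case-insensitive substring scan per request. A minimal sketch of that idea; the signature list and function name are illustrative, not the plugin's actual detection table.

```python
# Hypothetical subset of AI crawler user-agent signatures.
AI_SIGNATURES = ("gptbot", "ccbot", "claudebot", "bytespider")

def is_ai_crawler(user_agent: str) -> bool:
    """One lowercase substring scan of the User-Agent header."""
    ua = user_agent.lower()
    return any(sig in ua for sig in AI_SIGNATURES)
```

This runs in time proportional to the (short) signature list and allocates almost nothing, which is why detection alone adds negligible overhead compared to logging.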