Block AI bots from crawling your 11ty website. Powered by ai.robots.txt. https://www.npmjs.com/package/@awmottaz/eleventy-plugin-block-ai

@awmottaz/eleventy-plugin-block-ai

This plugin uses the data from ai.robots.txt to add files to your site that will block AI bots.

Installation

npm install @awmottaz/eleventy-plugin-block-ai

Internally, this plugin uses 11ty/eleventy-fetch to fetch the ai.robots.txt files. You may want to add the .cache/ directory to your .gitignore file to avoid committing the cached assets.
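For example, a minimal .gitignore entry (assuming eleventy-fetch's default cache location at the project root):

```
# Ignore the eleventy-fetch cache directory
.cache/
```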

Usage

Add the plugin to your 11ty configuration file.

import BlockAI from '@awmottaz/eleventy-plugin-block-ai';

export default async function(eleventyConfig) {
    eleventyConfig.addPlugin(BlockAI);
}

By default, this plugin will create only a robots.txt file in your build output. You must explicitly opt in to create other files.

Configuration

Here is the plugin with default configuration:

import BlockAI from '@awmottaz/eleventy-plugin-block-ai';

export default async function(eleventyConfig) {
    eleventyConfig.addPlugin(BlockAI, {
        robots: true,
        htaccess: false,
        nginx: false,
        caddyfile: false,
        haproxy: false,
        cacheDuration: '1d',
    });
}

The boolean-valued options control whether the corresponding file is generated. Each generated file contains the exact contents provided by the ai.robots.txt repository, written to its prescribed file path.
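For example, to also emit an .htaccess file for Apache servers, you might enable that option (a sketch; all other options keep their defaults):

```js
import BlockAI from '@awmottaz/eleventy-plugin-block-ai';

export default async function (eleventyConfig) {
    eleventyConfig.addPlugin(BlockAI, {
        robots: true,
        // Also generate an .htaccess file with the ai.robots.txt rules
        htaccess: true,
    });
}
```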

Caching

As noted above, this plugin uses 11ty/eleventy-fetch internally to fetch the ai.robots.txt files. You can adjust how long the fetched files are cached with the cacheDuration option, which accepts eleventy-fetch's cache duration syntax (for example, '2h', '1d', or '1w').
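For example, to re-fetch the ai.robots.txt files at most once per week instead of once per day (a sketch; only the changed option is shown):

```js
eleventyConfig.addPlugin(BlockAI, {
    // eleventy-fetch duration syntax: '1w' = one week
    cacheDuration: '1w',
});
```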

Advanced control with the data cascade

If you need extra control over the generated files, you can disable them in the plugin options and build your own instead. The plugin adds the raw text contents of each ai.robots.txt file to the global data cascade, so you can create your own templates from that data.

The data is namespaced under ai_robots:

  • ai_robots.robots
  • ai_robots.htaccess
  • ai_robots.nginx
  • ai_robots.caddyfile
  • ai_robots.haproxy

For example, if you want to generate a custom .htaccess file that includes 404 redirects, you could use an 11ty.js template like so:

class HTAccess {
	data() {
		return {
			permalink: ".htaccess",
		};
	}

	async render(data) {
		return `ErrorDocument 404 /404.html\n\n${data.ai_robots.htaccess}`;
	}
}

export default HTAccess;