
Protect Your Site with a Blackhole for Bad Bots

One of my favorite security measures here at Perishable Press is the site’s virtual Blackhole trap for bad bots. The concept is simple: include a hidden link to a robots.txt-forbidden directory somewhere on your pages. Bots that ignore or disobey your robots rules will crawl the link and fall into the honeypot trap, which then performs a WHOIS lookup and records the event in the blackhole data file. Once added to the blacklist, bad bots are immediately denied access to your site.


Tip: For a shorter version of this tutorial, check out the Quick Start Guide »
WordPress user? Check out the free Blackhole plugin and Blackhole Pro »
Important: The article below is for the standalone PHP version of Blackhole for Bad Bots. For information about the Blackhole WordPress plugins, visit WordPress.org (free version) and/or Plugin Planet (pro version).

Intro

I call it the “one-strike” rule: bots have one chance to follow the robots.txt protocol, check the site’s robots.txt file, and obey its directives. Failure to comply results in immediate banishment. The best part is that the Blackhole only affects bad bots: normal users never see the hidden link, and good bots obey the robots rules in the first place. So the percentage of false positives is extremely low to non-existent. It’s an ideal way to protect your site against bad bots silently, efficiently, and effectively.

With a few easy steps, you can set up your own Blackhole to trap bad bots and protect your site from evil scripts, bandwidth thieves, content scrapers, spammers, and other malicious behavior.

The Blackhole is built with PHP, and uses a bit of .htaccess to protect the blackhole directory. Refined over the years and completely revamped for this tutorial, the Blackhole consists of a plug-&-play /blackhole/ directory that contains the following three files:

  • .htaccess – protects the log file
  • blackhole.dat – log file
  • index.php – blackhole script

These three files work together to create the Blackhole for Bad Bots. If you are running WordPress, the Blackhole plugin is recommended instead of this standalone PHP version.
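
For reference, the kind of rule used to protect the log file looks something like the following sketch (hypothetical Apache 2.4 syntax; the bundled .htaccess may differ):

# Sketch only: deny all direct web access to the Blackhole log file
<Files "blackhole.dat">
	Require all denied
</Files>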

Note: By default, .htaccess files are hidden on Windows and OS X, so to view them you need to enable “Show hidden files” on your machine, or use any FTP or code-editing app that is capable of displaying them. It’s a common feature.

Overview

The Blackhole is designed to make implementation as easy as possible. Here is an overview of the steps:

  1. Upload the /blackhole/ directory to your site
  2. Edit the four variables in the “EDIT HERE” section of index.php
  3. Ensure writable server permissions for the blackhole.dat file
  4. Add a single line to the top of your pages to include the index.php file
  5. Add a hidden link to the /blackhole/ directory in the footer
  6. Forbid crawling of /blackhole/ by adding a line to your robots.txt

So installation is straightforward, but there are many ways to customize functionality. For complete instructions, jump ahead to the installation steps. For now, I think a good way to understand how it works is to check out a demo.

Live Demo

I have set up a working demo of the Blackhole for this tutorial. It works exactly like the download version, but it’s set up as a sandbox, so when you trigger the trap, it blocks you only from the demo itself. Here’s how it works:

  1. First visit to the Blackhole demo loads the trap page, runs the WHOIS lookup, and adds your IP address to the blacklist data file
  2. Once your IP is added to the blacklist, all future requests for the Blackhole demo will be denied access

So you get one chance (per IP address) to see how it works. Once you visit the demo, your IP address will be blocked from the demo only — you will still have full access to this tutorial (and everything else at Perishable Press). So with that in mind, here is the demo link (opens new tab):

Visit once to see the Blackhole trap, and then again to observe that you’ve been blocked. Again, even if you are blocked from the demo page, you will continue to have access to everything else here at Perishable Press.

Tip: You can visit the Blackhole Demo via any free proxy service if you want to try again and do another test.

How to Install

Here are complete instructions for implementing the PHP/standalone version of Blackhole for Bad Bots. Note that these steps are written for Apache servers running PHP. The steps are the same for other PHP-enabled servers (e.g., Nginx, IIS), but you will need to replace the .htaccess file and rules with whatever works for your particular server environment. Note: for a concise summary of these steps, check out this tutorial.
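
For example, on Nginx (which does not read .htaccess files), the equivalent protection for the log file might look something like this hypothetical sketch (adapt to your own configuration):

# Sketch only: Nginx equivalent of the .htaccess log-file protection
location = /blackhole/blackhole.dat {
	deny all;
}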

Step 1: Download the Blackhole zip file, unzip it, and upload the /blackhole/ directory to your site’s root directory. This location is not required, but it enables everything to work out of the box. To use a different location, edit the include path in Step 4.

Step 2: Edit the four variables in the “EDIT HERE” section of index.php.

Step 3: Change file permissions for blackhole.dat to make it writable by the server. The required permission settings may vary depending on server configuration. If you are unsure about this, ask your host. Note that the blackhole script needs to be able to read and write the blackhole.dat file.
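
If you want to verify writability programmatically, here is a minimal PHP sketch (assuming the default /blackhole/ location from Step 1); upload it as a temporary test script and remove it when done:

<?php
// Temporary test sketch: check whether the Blackhole log file is writable
$log = realpath(getenv('DOCUMENT_ROOT')) . '/blackhole/blackhole.dat';
echo is_writable($log) ? 'blackhole.dat is writable.' : 'blackhole.dat is NOT writable.';
?>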

Step 4: Include the Blackhole script by adding the following line to the top of your pages (e.g., header.php):

<?php include(realpath(getenv('DOCUMENT_ROOT')) . '/blackhole/index.php'); ?>

The Blackhole script checks the bot’s IP address against the blacklist data file. If a match is found, the request is blocked with a customizable message. View the source code for more information.
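
To give a rough idea of what happens behind the scenes, the core blocking check boils down to something like the following simplified sketch (the actual index.php does more, including the WHOIS lookup and logging):

<?php
// Simplified sketch of the blocking logic; not the actual Blackhole source
$log = __DIR__ . '/blackhole.dat';
$ip  = isset($_SERVER['REMOTE_ADDR']) ? $_SERVER['REMOTE_ADDR'] : '';
if ($ip && is_readable($log) && strpos(file_get_contents($log), $ip) !== false) {
	header('HTTP/1.0 403 Forbidden');
	exit('You have been banned from this site.'); // customizable message
}
?>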

Step 5: Add a hidden link to the /blackhole/ directory in the footer of your site’s web pages (replace “Your Site Name” with the name of your site):

<a rel="nofollow" style="display:none" href="https://example.com/blackhole/" title="Do NOT follow this link or you will be banned from the site!">Your Site Name</a>

This is the hidden trigger link that bad bots will follow. It’s hidden with CSS, so 99.999% of visitors won’t ever see it. Alternatively, to hide the link from users without relying on CSS, replace the anchor text with a transparent 1-pixel GIF image. For example:

<a rel="nofollow" style="display:none" href="http://example.com/blackhole/" title="Do NOT follow this link or you will be banned from the site!"><img src="/images/1px.gif" alt=""></a>

Remember to edit the link href value and the image src to match the correct locations on your server.

Step 6: Finally, add a Disallow directive to your site’s robots.txt file:

User-agent: *
Disallow: /blackhole/

This step is pretty important. Without the proper robots directives, all bots would fall into the Blackhole because they wouldn’t know any better. If a bot wants to crawl your site, it must obey the rules! The robots rule that we are using basically says, “All bots: DO NOT visit the /blackhole/ directory or anything inside of it.” So make sure your robots rules are correct.

Step 7: Done! Remember to test thoroughly before going live. Also check out the section on customizing for more ideas.

Testing

You can verify that the script is working by visiting the hidden trigger link (added in step 5). That should take you to the Blackhole warning page for your first visit, and then block you from further access on subsequent visits. To verify that you’ve been blocked entirely, try visiting any other page on your site. To restore site access at any time, you can clear the contents of the blackhole.dat log file.
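
You also can test from a script. Here is a hypothetical sketch that requests the trap URL twice and prints the HTTP status line each time; note that it bans the IP of the machine running it, so clear blackhole.dat afterward (replace example.com with your own domain):

<?php
// Test sketch: hit the trap twice; the second request should be denied
$trap = 'https://example.com/blackhole/';
foreach (array(1, 2) as $attempt) {
	$headers = @get_headers($trap); // first element is the HTTP status line
	echo 'Attempt ' . $attempt . ': ' . ($headers ? $headers[0] : 'no response') . "\n";
}
?>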

Important: Make sure that all of the rules in your robots.txt file are correct and use proper syntax. For example, you can test your rules with the free robots.txt validator in Google Search Console (formerly Google Webmaster Tools; requires a Google account).

Tip: Make sure to check the list of whitelisted user agents. For example, the Chrome browser is whitelisted. So if you want to test that Blackhole is working, either use a non-Chrome browser or remove chrome from the whitelist.
Tip: To reset the list of blocked bots at any time, simply clear the contents of the blackhole.dat file.

Customize

The previous steps will get the Blackhole set up with default configuration, but there are some details that you may want to customize:

  • index.php (lines 25–28): Edit the four variables as needed
  • index.php (lines 140–164): Customize markup of the warning page
  • index.php (line 180): Customize the list of whitelisted bots

These are the recommended changes, but the PHP is clean and generates valid HTML, so feel free to modify the markup or anything else as needed.

Troubleshoot

If you get an error letting you know that a file cannot be found, it could be an issue with how the script specifies the absolute path, using getenv('DOCUMENT_ROOT'). That function works on a majority of servers, but if it fails on your server for whatever reason, you can simply replace it with the actual path. From Step 4, the include script looks like this:

<?php include(realpath(getenv('DOCUMENT_ROOT')) . '/blackhole/index.php'); ?>

So if you are getting not-found or similar errors, try replacing the getenv call with the actual absolute path, like so:

<?php include('/var/www/httpdocs/blackhole/index.php'); ?>

Here, /var/www/httpdocs/blackhole/index.php is the actual absolute path to the blackhole index.php file on your server (your path will differ). As long as you get the path correct, this will fix any “file can’t be found” type errors you may be experiencing.

If in doubt about the actual full absolute path, consult your web host or use a PHP function or constant such as __DIR__ to obtain the correct information. And check out my tutorial over at WP-Mix for more information about including files with PHP and WordPress.
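
For example, a quick way to discover the correct absolute path is to run a throwaway script like this sketch from your site root (remove it when done):

<?php
// Throwaway sketch: print candidate absolute paths
echo 'DOCUMENT_ROOT: ' . realpath(getenv('DOCUMENT_ROOT')) . "\n";
echo '__DIR__: ' . __DIR__ . "\n";
?>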

Tip: I wrote an in-depth guide on how to verify that Blackhole is working. It is written for users of the WordPress plugin, but the general steps show how to test the PHP/standalone version as well. Long story short: use a proxy service.

Caveat Emptor

Blocking bots is serious business. Good bots obey robots.txt rules, but there may be potentially useful bots that do not. Yahoo is the perfect example: it’s a valid search engine that sends some traffic, but sadly the Yahoo Slurp bot is too stupid to follow the rules. Since setting up the Blackhole several years ago, I’ve seen Slurp disobey robots rules hundreds of times.

By default, the Blackhole DOES NOT BLOCK any of the big search engines. So Google, Bing, and company always will be allowed access to your site, even if they disobey your robots.txt rules. See the next section for more details.

Whitelist Good Bots

In order to ensure that all of the major search engines always have access to your site, Blackhole whitelists the following bots:

  • AOL.com
  • Baidu
  • Bing/MSN
  • DuckDuckGo
  • Google
  • Teoma
  • Yahoo!
  • Yandex

Additionally, popular social media services are whitelisted, as well as some other known “good” bots. To whitelist these bots, the Blackhole script uses regular expressions to ensure that all possible name variations are allowed access. For each request made to your site, Blackhole checks the User Agent and always allows anything that contains any of the following strings:

a6-indexer, adsbot-google, ahrefsbot, aolbuild, apis-google, baidu, bingbot, bingpreview, butterfly, chrome, cloudflare, duckduckgo, embedly, facebookexternalhit, facebot, googlebot, google page speed, ia_archiver, linkedinbot, mediapartners-google, msnbot, netcraftsurvey, outbrain, pinterest, quora, rogerbot, showyoubot, slackbot, slurp, sogou, teoma, tweetmemebot, twitterbot, uptimerobot, urlresolver, vkshare, w3c_validator, wordpress, wp rocket, yandex

So any bot that reports a user agent that contains any of these strings will NOT be blocked and always will have full access to your site. To customize the list of whitelisted bots, open index.php and locate the function blackhole_whitelist(), where you will find the list of allowed bots.
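
To illustrate the technique, here is a simplified sketch of such a check (the actual blackhole_whitelist() function uses regular expressions and the full list above):

<?php
// Simplified sketch of a user-agent whitelist check
function whitelist_sketch($ua) {
	$allowed = array('googlebot', 'bingbot', 'slurp', 'duckduckgo', 'yandex');
	foreach ($allowed as $bot) {
		if (stripos($ua, $bot) !== false) return true; // whitelisted: allow access
	}
	return false; // not whitelisted: continue with Blackhole checks
}
?>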

The upside of whitelisting these user agents is that anything claiming to be a major search engine is allowed open access. The downside is that user-agent strings are easily spoofed, so a bad bot could crawl along and say, “Hey look, I’m teh Googlebot!” and the whitelist would grant access. It is your decision where to draw the line.

With PHP, it is possible to verify the true identity of each bot, but doing so consumes significant resources and could overload the server. Avoiding that scenario, the Blackhole errs on the side of caution: it’s better to allow a few spoofs than to block any of the major search engines and other major web services.
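
For the curious, here is a hypothetical sketch of that verification (a forward-reverse DNS lookup) for a single IP; the cost of doing this on every request is why the Blackhole skips it:

<?php
// Sketch only: forward-reverse DNS check for an IP claiming to be Googlebot
$ip   = '66.249.66.1'; // example IP to verify
$host = gethostbyaddr($ip); // reverse lookup
$real = $host
	&& preg_match('/\.(googlebot|google)\.com$/i', $host) // expected host
	&& gethostbyname($host) === $ip; // forward lookup must match
echo $real ? 'Verified Googlebot' : 'Unverified or spoofed';
?>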

Tip: Check out CLI Forward-Reverse Lookup for how to verify bot identity.

License & Disclaimer

Terms of Use: Blackhole for Bad Bots is released under the GNU General Public License. By downloading the Blackhole, you agree to accept full responsibility for its use. In no way shall the author be held accountable for anything that happens after the file has been downloaded.

Questions & Feedback

Questions? Comments? Send ’em via my contact form. Thanks!

Download

Here you can download the latest version of Blackhole for Bad Bots. By downloading, you agree to the terms.

Standalone PHP version, last updated: 2024/11/06
Download Blackhole for Bad Bots (Version 4.7, 4.43 KB ZIP)


244 responses to “Protect Your Site with a Blackhole for Bad Bots”

  1. Jeff Starr 2011/07/26 6:23 pm

    @Jack A: Great question. The goal of this script is to catch bad bots. I.e., the ones that don’t obey your robots.txt directives, such as the one that disallows access to the blackhole directory.

    Yes you should include the blackhole.php script on any page that you want to protect against bad bots. An easy place for this using WordPress is the header.php file.

    “The next time the bot visits my site is when I catch him, right?”

    Here’s basically how it works:

    1. all bots have access to everything EXCEPT the blackhole
    2. if a bot visits the blackhole they are denied access to any protected page on your site

    I hope this helps, and refer back to the article and the comment thread for more info.

  2. TheNightOwl 2011/08/26 10:14 pm

    Hi, Jeff

    Firstly, thank you for sharing this great little tool.

    I had some problems setting it up and had to read through the whole thread to fix them. So thanks, also, to everyone who has contributed to making the script even better.

    The particular issue I had was that before my server would stop throwing include and fopen errors, I had to alter the filepaths in blackhole.php and index.php to:

    /home/xxxxxx/public_html/renamed-blackhole/blackhole.dat

    …and, change /renamed-blackhole/ in the header include, of course.

    ————-

    Thank you to people who posted a reminder that GoogleBot will use “display:none” as a red flag. It will also do so for certain security scan plugins for WP.

    So now I’ve just set a 1px transparent gif before the closing body tag.

    Is this good enough?

    A couple of people mentioned using the footer include to link to yet another page where the link to the trap resides.

    Maybe I didn’t read clearly enough, but I don’t really understand the need for this.

    Can someone tell me the advantage of doing this, please?

    ————

    Jeff, a couple of lingering questions/concerns:

    1. Two people (Slava and Alex) have called attention to the wildcard asterisk in the robots.txt file.

    Are they right or wrong?

    Should I put THIS in my robots.txt file:

    Disallow: /*/renamed-blackhole/*

    Or THIS:

    Disallow: /renamed-blackhole/

    Or THIS:

    Disallow: /renamed-blackhole

    2. Setting writable permissions to the dat file and having it “in the open” for anyone who knows (or guesses) the directory name.

    Jeff, what are the minimum permissions I can set?

    At the moment, I have 755 but that makes me nervous. Perhaps I’m just paranoid. If so, feel free to tell me so ;)

    Also on this note, fWolf suggested renaming the dat file to avoid having it read from outside (and someone else further down referred to this, too).

    Is this a legitimate concern? And if so, how might we mitigate against it?

    Is there a way (like with htaccess files) of “locking down” this file and not making it readable?

    Thanks again to you and everyone else contributing to this post!

  3. TheNightOwl 2011/08/26 10:30 pm

    Oh, and one more:

    3. There haven’t been any comments on the mods posted here:

    https://sites.google.com/site/phpblackholemods/

    @Dave: What do these changes do? Are they only for use in certain circumstances? What potential conflicts are there? Et cetera.

  4. PhotoshopWarrior 2011/09/20 9:29 am

    The question is: how can I manually add IP addresses to my blackhole.dat file so they are blocked?

  5. Works like a charm!! Many thanks. Every day more than 4 bad bots are captured ;)

    Keep going

    Seymour

  6. Derick2012 2011/10/23 5:42 am

    How do I get this to work on a vBulletin forum?

    Everything works fine except for the banning part. I can’t seem to get that to work.

    I get a syntax error no matter where I put this.

    I put it everywhere possible. Error, error, error. I’ve been at this for hours.

    I tried my header, skin, html, include.php, everything. It doesn’t seem to go well with vBulletin. Is there a different way that I could do this that might work?

  7. Derick2012 2011/10/23 5:44 am

    It doesn’t show for some reason, but I’m referring to the step about what is supposed to be put at the top of each page, the step that was recommended for the header.

    Can’t get that to work on vBulletin. I really want to ban these bots automatically.

  8. Derick2012 2011/10/23 4:40 pm

    Never mind, I got it to work after hours of testing.

  9. Hey, I just downloaded this, and after a couple of touch-ups it works great. A couple of things I did were to move blackhole.dat and blackhole.php out of my www directory. Then I made a php file called footer.php that just inserts the link for bad bots to follow.

    Then I edited my php.ini and added blackhole.php to auto-prepend and footer to auto-append so it covers my whole site. Then you can put index.php and the .htaccess file wherever browsing shouldn’t happen.

    I’ve had a problem with people getting a link to parts of my site that don’t exist anymore, which gets a hit every minute. So I slapped the index.php in there and also a mod_rewrite rule so that if they try to access anything, it puts them at index.php.

    Just some ideas for other people.

    I also think that renaming the blackhole directory and blackhole.php to something random is a good idea, so that bots can’t filter for it. Also, in blackhole.php there were a couple of errors (uninitialized variables) that were propagating through to my apache error log.

    Anyways, great idea and thanks.

  10. Derick2012 2011/10/24 12:46 pm

    I just want to say thanks a ton for this, Jeff. This has really helped. My site was getting killed by bots, and on the first day of installing this I caught over 40 of them.

  11. Jeff Starr 2015/01/24 1:37 pm

    Update: just received the following email, which looks like it may be useful until I get a chance to update the script again. Here it is:

    “I think I found the mistake, in the blackhole.php file, at the end there is a die() command after the closing bracket for the previous if condition (valuates if it has been included), after I passed that die command before the closing if bracket everything worked fine.”

  12. Why is your Black Hole for Bad Bots set up to catch people looking for tools to defend their sites against bad bots? That is not very friendly.

    • Jeff Starr 2016/01/28 5:49 pm

      As explained in the tutorial, the “Blackhole” at PerishablePress.com is just a single-page DEMO that blocks repeat visitors from itself (one page only). The only thing the demo blocks on my site is access to the trap itself, to repeat visitors – all other content at Perishable Press remains freely accessible to all. Keep in mind the following key points:

      • When the trap says “you have been banned”, don’t take it too seriously; it’s just a one-page demo (see article).
      • If you do happen to visit the trap and get banned, the rest of the site and all other sites will still be accessible to you (the demo only blocks repeat visitors from itself, one page, and nothing else).
      • If you do get banned from the single-page demo, just drop a line and I will be glad to reset the file (see article for more info). The trap page provides information for contacting the administrator.

      I hope this makes sense, let me know if I can elaborate on anything.

