
Block Spam by Denying Access to No-Referrer Requests

What we have here is an excellent method for preventing a great deal of blog spam. With a few strategic lines placed in your .htaccess file, you can prevent spambots from dropping spam bombs by denying access to all requests that do not originate from your domain.

Block comment spam

Here is the script to add to your site’s root .htaccess file:

# block comment spam by denying access to no-referrer requests
RewriteEngine On
RewriteCond %{REQUEST_URI} wp-comments-post\.php
RewriteCond %{HTTP_REFERER} !example\.com [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^-?$
RewriteRule .* http://the-site-where-you-want-to-send-spammers.com/ [R=301,L]

Note that you need to edit the following lines according to your specific setup:

- wp-comments-post.php: this is the default comment-processing script for WordPress. If you are not running WordPress, you will need to determine the corresponding file and enter its name here.
- example.com: change this value to that of your own domain.
- The RewriteRule target URL: because spambots typically ignore redirects, this may not accomplish much, but go ahead and enter the URL of your least-favorite website anyway. Another option here is to simply bounce the spambot back to where it came from by replacing the last line with this: RewriteRule .* http://%{REMOTE_ADDR}/ [R=301,L]

For more awesome anti-spam techniques, check out How to Block Bad Bots and Stupid .htaccess Tricks.

How does it work?

When a legitimate user (i.e., not a robot) decides to leave a comment on your blog, they have (hopefully) read the article on which they wish to comment, and have subsequently loaded your blog’s comment template (e.g., comments.php), which is most likely located on the same domain as the article (i.e., your domain).

So, after filling out the comment form via comments.php, the user clicks the “submit” button, which then initiates the PHP file/script that actually processes the comment for the world to see. For WordPress users, the comment processing file is wp-comments-post.php.

Therefore, the HTTP referrer for all legitimate (user-initiated) comments will be your domain (or the domain in which the comments.php file is located). Automated spam robots typically target the comment-processing script directly, bypassing your comments.php form altogether. Such activity results in HTTP referrers that are not from your domain.

Thus, by blocking all requests for the comments-processing script (wp-comments-post.php) that are not sent directly from your domain (comments.php), you immediately eliminate a large portion of blog spam.
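
The decision that the rewrite conditions make can be sketched in Python as follows. This is a hypothetical illustration of the logic only, not part of the .htaccess technique itself; the domain and script name mirror the placeholders used above.

```python
# Sketch of the same decision logic in Python. The domain and the
# script name mirror the .htaccess placeholders; adjust for your site.

def is_spam_request(request_uri: str, referrer: str, user_agent: str,
                    domain: str = "example.com",
                    script: str = "wp-comments-post.php") -> bool:
    """Return True when a comment-post request should be blocked."""
    # Only requests for the comment-processing script are examined.
    if script not in request_uri:
        return False
    # Block if the referrer does not contain our domain,
    # or if the user agent is empty or just "-".
    wrong_referrer = domain not in referrer
    blank_agent = user_agent in ("", "-")
    return wrong_referrer or blank_agent

# A legitimate comment submitted from the site's own comment form:
print(is_spam_request("/wp-comments-post.php",
                      "http://example.com/some-post/", "Mozilla/5.0"))  # False

# A spambot hitting the script directly with no referrer:
print(is_spam_request("/wp-comments-post.php", "", ""))  # True
```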

And that is all there is to it! Bye bye spambots!

About the Author
Jeff Starr = Web Developer. Security Specialist. WordPress Buff.
The Tao of WordPress: Master the art of WordPress.

44 responses to “Block Spam by Denying Access to No-Referrer Requests”

  1. (beausoleilm) Mathieu Beausoleil 2008/05/09 5:05 pm

    What about proxies? I know that some proxy servers will erase the referrer header. Do you know if this solution will block those visitors? Would it be better to store a referrer address in the session, or to use another way, like an empty hidden text input (display: none), and verify that the input is still empty before using the data?

  2. Yes, that would be one way to do it. If you are allowing visitors to comment via proxy, you may want to test the method before implementation. It is really a double-edged sword, dealing with no-referrer requests: it is nearly impossible to avoid false positives and false negatives. Frankly, I have been contemplating removing the method described in this article. If so, it will be done as a test and I will report the results in a future post.
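
     For reference, the hidden-field honeypot described in the question can be sketched like this (a minimal illustration in Python; the field name is hypothetical):

```python
# Minimal honeypot check: the form includes a hidden text input
# (e.g., styled with display:none) that humans never fill in.
# The field name "website_url" is hypothetical; any name works
# as long as it looks tempting to a bot.

HONEYPOT_FIELD = "website_url"

def passes_honeypot(form_data: dict) -> bool:
    """Return True if the hidden field is still empty (likely human)."""
    return form_data.get(HONEYPOT_FIELD, "") == ""

print(passes_honeypot({"comment": "Nice post!", "website_url": ""}))     # True
print(passes_honeypot({"comment": "Buy pills", "website_url": "spam"}))  # False
```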

  3. Web Designer 2008/09/25 8:40 pm

    I was using a manual posting technique, but I am not able to post a comment on any site.

    Maybe my URL “gigaturn” has been listed in a block-list by wp-comments-post.

    Any solution for this problem would be appreciated.

    Thanks in advance!

  4. Jeff Starr 2008/09/27 7:20 pm

    Hi Web Designer, I have never heard of the wp-comments-post.php file having any inherent blacklisting capabilities, but I have not investigated the file in newer versions of WordPress, so it may be the case. Another thing you could check is whether or not your “gigaturn” URL has been flagged as spam within the Akismet database. If you go to the site or Google for something like “remove site from Akismet” (or similar), you should find all the information needed to investigate and possibly remedy the issue. Good luck!

  5. Web Designer 2008/09/29 3:03 am

    Thanks Jeff,
    But you can try it for yourself:
    just try it with gigaturn.com.
    You will not be able to post.

    Still looking for the right solution.

  6. Web Designer 2008/09/29 3:06 am

    Tried again with the site URL and got this:

    Not able to post a comment with this URL (gigaturn.com); however, I can post with other URLs.

  7. Perhaps I am confused as to what you are trying to do. Are you trying to post comments at gigaturn.com? Or are you trying to post comments on other sites using gigaturn.com as the commentator link? Or something else..? I guess I need more information as to what’s going on and where..

  8. balisugar 2008/11/19 8:12 am

    Hi, sorry to bother you, I need help.

    I think I have a few pages with strange URLs, which I can see in my WassUp stats. That xxx is a porn site, and Google crawls it all the time. I never linked to it in the first place. Please help: how do I remove and block it? Because it’s not only one page.
    eg :

    I’m very sad, I don’t know much about this :cry:

  9. Jeff Starr 2008/11/24 6:03 pm

    @balisugar: don’t cry! You should be able to deny requests for that specific query string by adding the following directives to your root htaccess file:

    <IfModule mod_rewrite.c>
       RewriteCond %{QUERY_STRING} xxx\.com [NC]
       RewriteRule .* - [F,L]
    </IfModule>

    Once in place, this code should block any query-string requests containing the character string “xxx.com”.

    For more information on this technique, check out my article, Improving Site Security by Preventing Malicious Query-String Exploits.
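
     For readers who want to apply the same test at the application level instead of in .htaccess, the substring check can be sketched in Python (the function name and pattern list are illustrative, not from the article):

```python
# Application-level equivalent of the query-string filter above:
# deny any request whose query string contains a blocked pattern.
# Patterns and function name are illustrative assumptions.

BLOCKED_PATTERNS = ("xxx.com",)

def query_string_allowed(query_string: str) -> bool:
    """Return False if the query string contains any blocked pattern."""
    qs = query_string.lower()
    return not any(pattern in qs for pattern in BLOCKED_PATTERNS)

print(query_string_allowed("p=123&ref=http://xxx.com/page"))  # False
print(query_string_allowed("p=123"))                          # True
```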

  10. balisugar 2008/11/25 5:50 am

    Thank you for your help. I will link to you so I don’t forget your site.

  11. @balisugar: My pleasure — happy to help! :)

  12. balisugar 2008/11/28 7:58 am

    Hi, Mr Jeff. How are you? :smile:

    After what happened to me, I’m still sometimes worried that someone is redirecting bad content to my site. Is that possible, and if so, how can I stop them? And which is the better way to block bad bots: .htaccess or robots.txt?

    I feel more “comfortable” modifying robots.txt rather than .htaccess.
    Thank you for all your help.

Comments are closed for this post. Something to add? Let me know.
Perishable Press is operated by Jeff Starr, a professional web developer and book author with two decades of experience. Here you will find posts about web development, WordPress, security, and more »