Disobedient Robots and Company
In our never-ending battle against spammers, leeches, scrapers, and other online undesirables, we have implemented several powerful security measures to improve the operational integrity of our perpetual virtual existence. Here is a rundown of the new behind-the-scenes security features of Perishable Press.
- Automated spambot trap, designed to identify bots (and/or stupid people) that disobey rules specified in the site’s robots.txt file.
- Automated disobedient-robot identification (via reverse IP lookup), admin notification (via email), and blacklist inclusion (via htaccess).
- Automated listing of identified disobedient robots on our now-public Disobedient Robots page.
- Improved htaccess rules, designed to eliminate scum-sucking worms and other useless vermin.
- Automated tracking tools, designed to keep a close eye on any suspicious or questionable activity.
- Automated 404-error statistics, designed to help eliminate or resolve 404 errors as efficiently as possible.
- Plus a few other secret-agent tricks that we are not at liberty to discuss ;)
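For the curious, a spambot trap along these lines is typically built from two pieces: a robots.txt rule that forbids a directory, plus a hidden link pointing into it. Compliant crawlers obey the Disallow and never request the trap URL, so any client that does request it has ignored the rules and can be blacklisted via htaccess. A minimal sketch, assuming a hypothetical /blackhole/ trap directory and a documentation-range example IP (not the site's actual configuration):

```
# robots.txt — well-behaved robots never request this directory
User-agent: *
Disallow: /blackhole/

# .htaccess — deny clients already caught by the trap
<Limit GET POST>
Order Allow,Deny
Allow from all
Deny from 192.0.2.1
</Limit>
```

In a setup like this, a script inside the trap directory would log the offender's IP, fire off the admin-notification email, and append the new Deny line automatically.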
As you can see, we have been pretty busy around here — fortunately, the new security features have been working flawlessly, reducing stolen bandwidth, potential spam, disobedient robots, and 404 errors. Hopefully, the end result of these new features will involve smoother site functionality and better browsing for everyone.
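As for the 404 statistics, one common approach is simply to tally 404 responses per requested URL from the server's access log, so the most frequently requested dead links can be redirected or restored first. Here is a rough sketch in Python, assuming Apache combined-format log lines (the sample entries below are fabricated for illustration):

```python
import re
from collections import Counter

# Matches the fields we need from an Apache combined-format log line.
LOG_LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+'
)

def tally_404s(log_lines):
    """Return a Counter mapping request paths to their 404 hit counts."""
    counts = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and m.group("status") == "404":
            counts[m.group("path")] += 1
    return counts

# Fabricated sample entries for demonstration:
sample = [
    '192.0.2.1 - - [01/Jan/2008:00:00:00 +0000] "GET /old-page HTTP/1.1" 404 512',
    '192.0.2.1 - - [01/Jan/2008:00:00:05 +0000] "GET /feed HTTP/1.1" 200 2048',
    '198.51.100.7 - - [01/Jan/2008:00:01:00 +0000] "GET /old-page HTTP/1.1" 404 512',
]
print(tally_404s(sample).most_common(1))  # the most-requested missing URL
```

Sorting the tally by count makes it obvious which missing URLs deserve a redirect and which are just bots probing for holes.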
About the Author
Jeff Starr = Fullstack Developer. Book Author. Teacher. Human Being.