Just a reminder to keep your backup files offline. Do not store them in any publicly accessible space. It’s just not worth the risk, man. And if you’re working online, you should know this already. If not, read on to learn why it’s absolutely mission critical. Continue reading »
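As a stopgap for backups that do end up in the web root, a few lines of .htaccess can at least deny direct requests for common backup extensions. This is only a sketch (the extension list is an assumption; match it to your own naming scheme) and no substitute for keeping backups off the server entirely:

    # Sketch only: deny direct requests for common backup-file extensions
    # (the extension list is an example, not a complete set)
    <FilesMatch "\.(sql|bak|old|tar|gz|zip)$">
        Order Allow,Deny
        Deny from all
    </FilesMatch>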
I’ve been noticing a new strategy for brute-force login attacks: the slow, incremental “drip” attack. Instead of slamming a login page with hundreds or thousands of brute-force login attempts all within a few minutes, some attackers have been taking a more low-key approach by slowing down the rate of login attempts in order to bypass security measures. The “drip” brute-force attack is extremely annoying, and possibly dangerous if any of your registered users are using weak login credentials. This article […] Continue reading »
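One way to keep slow drip attacks from ever reaching the login form is a second layer of HTTP authentication in front of it. A minimal sketch for WordPress, assuming an existing .htpasswd file (the path shown is a placeholder), not the technique covered in the full article:

    # Assumes mod_auth and an existing .htpasswd file; path is a placeholder
    <Files "wp-login.php">
        AuthType Basic
        AuthName "Restricted"
        AuthUserFile /full/path/to/.htpasswd
        Require valid-user
    </Files>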
A user of my 6G Firewall recently asked how to block the “baidu” bot from accessing their site. This post explains why Baidu is not blocked in 6G and provides a quick .htaccess technique to deny it (or anything claiming to be it) access to your site. Continue reading »
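For reference, a simple user-agent match is one way to do it; this is an illustrative sketch and may differ from the exact technique given in the article:

    # Deny anything whose user agent claims to be Baidu (illustrative sketch)
    <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} baidu [NC]
        RewriteRule .* - [F,L]
    </IfModule>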
Some douchebag has been scanning my sites for a variety of potential database exploits. My sites are secure, so there is no real security threat, but the scans are extremely annoying and waste my server resources. Resources like bandwidth and memory that I would rather use for legitimate visitors. So after collecting some data and experimenting a bit, I wrote a simple .htaccess snippet to block a vast majority of these pathetic database-exploit scans. Continue reading »
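To give a flavor of the approach (the target strings below are examples, not the actual snippet from the article), a single RedirectMatch can cover the usual database-admin probes:

    # Example patterns only, not the article's actual snippet
    <IfModule mod_alias.c>
        RedirectMatch 403 (?i)/(phpmyadmin|myadmin|mysqladmin|sqlbuddy)
    </IfModule>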
Update: Pro version now available! Check out Blackhole Pro » Finally translated my Blackhole Spider Trap into a FREE WordPress plugin. It’s fun, fast, flexible, and works silently behind the scenes to protect your WordPress-powered site from malicious bots. Here are some of the features: Continue reading »
After three years of development, testing, and feedback, I’m pleased to announce the official launch version of the 6G Firewall (aka the 6G Blacklist). This version of the nG Firewall is greatly refined, heavily tested, and better than ever. Fine-tuned to minimize false positives, the 6G Firewall protects your site against a wide variety of malicious URI requests, bad bots, spam referrers, and other attacks. Blocking bad traffic improves site security, reduces server load, and conserves precious resources. The 6G […] Continue reading »
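To illustrate one of those categories (a simplified example, not code from the 6G itself), a spam-referrer rule looks roughly like this, with placeholder domains standing in for real offenders:

    # Simplified example of a referrer rule; not actual 6G code,
    # and the domains are placeholders for known referrer spammers
    <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{HTTP_REFERER} (spam-domain-one\.com|spam-domain-two\.net) [NC]
        RewriteRule .* - [F,L]
    </IfModule>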
Among the most annoying, persistent scans I’ve seen in a long time are those hunting for the revslider vulnerability. In the five or so months since the exploit was discovered, many sites have been compromised. And based on what I’ve been seeing in my traffic logs, the risk is far from over. Apparently every 2-bit script kiddie and their pet hamster wants a piece of the “revslider action”. Continue reading »
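For a rough idea of how these scans can be shut down (the pattern is an assumption based on typical scan traffic, not the snippet from the article):

    # Illustrative sketch: deny any request that references revslider
    # in the path or query string
    <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{REQUEST_URI} revslider [NC,OR]
        RewriteCond %{QUERY_STRING} revslider [NC]
        RewriteRule .* - [F,L]
    </IfModule>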
BBQ Firewall is a lightweight, super-fast plugin that protects your site against a wide range of threats. BBQ checks all incoming traffic and quietly blocks bad requests containing nasty stuff like eval(, base64_, and excessively long request-strings. This is a simple yet solid solution for sites that are unable to use a strong Apache/.htaccess firewall. Continue reading »
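For sites that can use Apache, a rough .htaccess equivalent of those same checks looks something like the following; BBQ itself performs them in PHP, and the 255-character threshold here is an arbitrary example:

    # Rough Apache-level equivalent of the checks described above;
    # BBQ does this in PHP, and the length limit here is arbitrary
    <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{QUERY_STRING} (eval\(|base64_) [NC,OR]
        RewriteCond %{QUERY_STRING} ^.{255,}$
        RewriteRule .* - [F,L]
    </IfModule>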
In this tutorial you will learn how to integrate Google’s new reCAPTCHA in WordPress login, comment, registration, and lost-password forms. Continue reading »
Whether you like it or not, there are scripts and bots out there hammering away at your sites with endless HTTP “POST” requests. POST requests are sort of the opposite of GET requests. Instead of getting some resource or file from the server, data is being posted or sent to it. To illustrate, normal surfing around the Web involves your browser making a series of GET requests for all the resources required for each web page. HTML, JavaScript, CSS, images, et […] Continue reading »
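As a hypothetical illustration of why the distinction matters for security (an example only, not the technique from the article), a rule like this denies POST requests that arrive with a blank user agent, a fingerprint common to automated spam scripts:

    # Hypothetical example: deny POST requests sent with a blank user agent
    <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{REQUEST_METHOD} POST
        RewriteCond %{HTTP_USER_AGENT} ^-?$
        RewriteRule .* - [F,L]
    </IfModule>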
I woke up this morning to the sound of thousands of 404 requests hitting the server. It’s sad that there are kiddies out there who have nothing better to do than buy some pathetic $50 script and then sit there like an imbecile harassing people for hours on end. But alas, that is the world we live in — fortunately it’s trivial to block the entire scan with just a few lines of good old .htaccess. Continue reading »
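Once the scan’s signature is clear from the logs, one line of mod_alias can cover the whole thing; the target paths below are made up for illustration:

    # Placeholder paths; replace with the actual patterns from your logs
    <IfModule mod_alias.c>
        RedirectMatch 403 (?i)/(some-scanned-path|another-scanned-path)
    </IfModule>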
Over the past several months, I’ve assembled a “micro” blacklist to keep some recent threats at bay. Eventually, this will be integrated into the next nG Blacklist, but for now I just wanted to post and share with anyone else who is actively monitoring their server logs and aware of the recent spike in malicious activity. Continue reading »
The 2013 User Agent Blacklist blocks hundreds of the worst bots while ensuring open access for normal traffic, major search engines (Google, Bing, et al), good browsers (Chrome, Firefox, Opera, et al), and everyone else. Compared to blocking threats by IP, blocking by user agent is more effective as a general security strategy. Although it’s trivial to spoof any user agent, many bad requests continue to report user-agent strings that are known to be associated with malicious activity. For example, the notorious […] Continue reading »
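The general shape of a user-agent blacklist rule looks like this (the agents shown are placeholders, not entries from the 2013 list):

    # Placeholder user agents; not the actual 2013 blacklist entries
    SetEnvIfNoCase User-Agent (badbot-one|badbot-two|evil-scanner) bad_bot
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot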
When time allows, I like to post my collections of the worst IP addresses for the current year. Certainly, there are pros and cons to using an IP blacklist. In general, IPs are easily spoofed, change frequently, and are therefore unreliable as a general security strategy. But as a short-term solution, IP blacklists serve as an excellent method for dealing with specific and/or ongoing threats and attacks. Continue reading »
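As a quick illustration of the short-term approach (the addresses below are documentation placeholders, not entries from any actual list):

    # Placeholder IPs from documentation ranges; substitute real offenders
    Order Allow,Deny
    Allow from all
    Deny from 192.0.2.1
    Deny from 198.51.100.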
Following up on much feedback (and this post), here is an update for the 5G Blacklist for 2013. As explained in the 2012 article (and elsewhere), the 5G Blacklist helps reduce the number of malicious URL requests that hit your website. It’s one of many ways to improve the security of your site and protect against evil exploits, bad requests, and other nefarious garbage. If your site runs on Apache and you’re familiar with .htaccess, the 5G is an effective […] Continue reading »
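To give a sense of the kind of rule such a blacklist contains (a simplified example, not actual 5G code):

    # Simplified example of a request-URI rule; not actual 5G code
    <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{REQUEST_URI} (\.\./|/etc/passwd|boot\.ini) [NC]
        RewriteRule .* - [F,L]
    </IfModule>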
Just as there are specifications for designing with CSS, HTML, and JavaScript, there are specifications for working with URIs/URLs. The Internet Engineering Task Force (IETF) clearly defines these specifications in RFC 3986: Uniform Resource Identifier (URI): Generic Syntax. Within that document, there are guidelines regarding which characters may be used safely within URIs. This post summarizes the information, and encourages developers to understand and implement accordingly. Continue reading »
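One practical application of those guidelines (a hedged sketch; the character list is illustrative, not a complete implementation of the spec) is to deny query strings containing characters that RFC 3986 never allows unencoded:

    # Illustrative sketch: block query strings containing a few characters
    # that RFC 3986 does not permit unencoded (list is not exhaustive)
    <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{QUERY_STRING} [{}<>|\\^`]
        RewriteRule .* - [F,L]
    </IfModule>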