Tag Archive: websites

Provide a Link for Visitors to Verify Your Feedburner Subscriber Count

Recently, I received a bizarre email accusing me of calling someone out on their fake Feedburner subscriber count. Apparently, some desperate blogger had been claiming to have something like 30,000 Feedburner subscribers when in reality they only had around 700. From what I could tell, the fraudulent site was displaying a counterfeit Feedburner subscriber-count badge using some fancy CSS image-replacement or something. Whatever. I couldn't care less, but the information contained in the email got me thinking: providing an easy way for visitors to verify your subscriber count is a good idea. Enabling visitors to quickly and easily verify […] Read more »

Does Google Hate Web Standards?

Consider the Google home page, arguably the most popular, most visited web page in the entire world. Such a simple page, right? You would think that such a simple design would fully embrace Web Standards. I mean, think about it for a moment: how would you or I throw down a few lists, a search field, and a logo image? Something like this, maybe: Read more »

How to Cache Mint JavaScript

NOTE: This post was written many months ago under the erroneous assumption that caching Mint's JavaScript was a good idea (for YSlow compliance, performance, etc.); however, after a brief chat with the man himself, Shaun Inman, I was quickly informed that this was a bad idea: caching Mint's JavaScript files will cause Mint to stop functioning. But, for what it's worth, and for the sake of retaining potentially useful information, I present the original article here for your amusement. Recently, I spent some time addressing a few of the performance issues pointed out by Yahoo!'s very useful YSlow extension […] Read more »
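For context, the caching mentioned above is typically implemented with mod_expires. Here is a generic sketch of far-future JavaScript caching, not Mint-specific configuration, and per the note above, a bad idea to apply to Mint's own files:

# Generic far-future caching for JavaScript via mod_expires
# WARNING: as explained above, caching Mint's JavaScript breaks Mint
<IfModule mod_expires.c>
	ExpiresActive On
	ExpiresByType application/javascript "access plus 1 year"
</IfModule>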

Set the Browser Access Point Profile for the AT&T 8525

This unfeatured post provides information for (re)establishing wireless Internet access from AT&T/Cingular Data Service (via WAP or MEdia Net) for the HTC/AT&T 8525 mobile device. Or something. This information is useful if you are unable to connect to the Internet and are receiving error messages similar to the following: Error: Your Internet connection is not configured properly. Please verify your settings in Data Connections. Note: Use of the following settings enables the AT&T 8525 to access MEdia Net (WAP) pages using the MEdia Net connection profile. Additional features such as Wi-Fi and device-based e-mail along with other third-party applications […] Read more »

Unexplained Crawl Behavior Involving Tagged Query Strings

I need your help! I am losing my mind trying to solve another baffling mystery. For the past three or four months, I have been recording many 404 errors generated from msnbot, Yahoo-Slurp, and other spider crawls. These errors result from invalid requests for URLs containing query strings such as the following:

https://perishablepress.com/press/page/2/?tag=spam
https://perishablepress.com/press/page/3/?tag=code
https://perishablepress.com/press/page/2/?tag=email
https://perishablepress.com/press/page/2/?tag=xhtml
https://perishablepress.com/press/page/4/?tag=notes
https://perishablepress.com/press/page/2/?tag=flash
https://perishablepress.com/press/page/2/?tag=links
https://perishablepress.com/press/page/3/?tag=theme
https://perishablepress.com/press/page/2/?tag=press

…plus hundreds and hundreds more. The URL pattern is always the same: a different page number followed by a query string containing one of the tags used here at Perishable Press, for example: "/?tag=something". The problem is that there are […] Read more »
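For what it's worth, if the goal were simply to stop the 404s, one hypothetical remedy is a mod_rewrite rule that redirects the malformed requests to the corresponding tag archive. This sketch assumes a root-level .htaccess file and a /press/tag/ archive structure, which is inferred from the URLs above:

<IfModule mod_rewrite.c>
	RewriteEngine On
	# Catch /press/page/N/?tag=whatever and redirect to the tag archive
	RewriteCond %{QUERY_STRING} ^tag=([a-zA-Z0-9-]+)$ [NC]
	# The trailing "?" drops the offending query string from the target
	RewriteRule ^press/page/[0-9]+/?$ https://perishablepress.com/press/tag/%1/? [R=301,L]
</IfModule>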

Series Summary: Building the 3G Blacklist

In the now-complete series, Building the 3G Blacklist, I share insights and discoveries concerning website security and protection against malicious attacks. Each article in the series focuses on unique blacklist strategies designed to protect sites transparently, effectively, and efficiently. The five articles culminate in the release of the next generation 3G Blacklist. For the record, here is a quick summary of the entire Building the 3G Blacklist series: Read more »

Perishable Press 3G Blacklist

After much research and discussion, I have developed a concise, lightweight security strategy for Apache-powered websites. Prior to the development of this strategy, I relied on several extensive blacklists to protect my sites against malicious user agents and IP addresses. Over time, these mega-lists became unmanageable and ineffective. As increasing numbers of attacks hit my server, I began developing new techniques for defending against external threats. This work soon culminated in the release of a “next-generation” blacklist that works by targeting common elements of decentralized server attacks. Consisting of a mere 37 lines, this “2G” Blacklist provided enough protection to […] Read more »
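To give a flavor of the element-targeting approach, here is a minimal sketch using mod_alias; the strings shown are generic illustrations, not actual lines from the 2G or 3G Blacklist:

# Illustrative only -- not the actual Blacklist directives
<IfModule mod_alias.c>
	# Deny any request whose URL contains a common attack-string element
	RedirectMatch 403 \.\./
	RedirectMatch 403 /etc/passwd
	RedirectMatch 403 base64_encode
</IfModule>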

Building the 3G Blacklist, Part 5: Improving Site Security by Selectively Blocking Individual IPs

In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. Wrapping up the series with this article, I provide the final key to our comprehensive blacklist strategy: selectively blocking individual IPs. Previous articles focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. In the next article, the series culminates in the release of the next generation 3G Blacklist. Improving Site Security by Selectively Blocking Individual IPs The final component of the 3G Blacklist establishes a vehicle through which individual IPs may be blocked. As […] Read more »
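A minimal sketch of IP-based blocking in the Apache 2.2 idiom of the period; the addresses shown are documentation-range placeholders, not IPs from the article:

# Block individual problem IPs (placeholder addresses shown)
<Limit GET POST PUT>
	Order Allow,Deny
	Allow from all
	Deny from 192.0.2.10
	Deny from 198.51.100.25
</Limit>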

Building the 3G Blacklist, Part 4: Improving the RedirectMatch Directives of the Original 2G Blacklist

In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this fourth article, I build upon previous ideas and techniques by improving the directives contained in the original 2G Blacklist. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist. Improving the RedirectMatch Directives of the Original 2G Blacklist In the first version (2G) of the next-generation blacklist, a collection of malicious attack strings […] Read more »
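As a hypothetical illustration of the kind of refinement involved, multiple single-string directives can be consolidated into a single alternation-grouped pattern (the strings here are generic examples, not the article's):

# Before: one RedirectMatch directive per attack string
RedirectMatch 403 \.\./
RedirectMatch 403 /etc/passwd

# After: the same strings consolidated via regex alternation
RedirectMatch 403 (\.\./|/etc/passwd)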

Building the 3G Blacklist, Part 3: Improving Site Security by Selectively Blocking Rogue User Agents

In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this third article, I discuss targeted user-agent blacklisting and present an alternate approach to preventing site access for the most prevalent and malicious user agents. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist. Improving Site Security by Selectively Blocking Rogue User Agents Several months ago, while developing improved methods for protecting websites […] Read more »
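A sketch of selective user-agent blocking via SetEnvIfNoCase; the agent strings below are common scraper examples, not necessarily those targeted by the article:

# Flag and deny requests from unwanted user agents (examples only)
<IfModule mod_setenvif.c>
	SetEnvIfNoCase User-Agent "libwww-perl" keep_out
	SetEnvIfNoCase User-Agent "HTTrack" keep_out
	<Limit GET POST>
		Order Allow,Deny
		Allow from all
		Deny from env=keep_out
	</Limit>
</IfModule>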

Building the 3G Blacklist, Part 2: Improving Site Security by Preventing Malicious Query-String Exploits

In this continuing five-article series, I share insights and discoveries concerning website security and protecting against malicious attacks. In this second article, I present an incredibly powerful method for eliminating malicious query-string exploits. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist. Improving Site Security by Preventing Malicious Query String Exploits The vast majority of website attacks involve appending malicious query strings onto legitimate, indexed URLs. Any webmaster serious about site […] Read more »
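A generic sketch of query-string filtering with mod_rewrite; the patterns are common exploit fragments, not the actual 3G rules:

<IfModule mod_rewrite.c>
	RewriteEngine On
	# Deny any request whose query string contains a typical exploit fragment
	RewriteCond %{QUERY_STRING} (\.\./|etc/passwd|boot\.ini) [NC,OR]
	RewriteCond %{QUERY_STRING} (<|%3C).*script.*(>|%3E) [NC]
	RewriteRule .* - [F,L]
</IfModule>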

Building the 3G Blacklist, Part 1: Improving Site Security by Recognizing and Exploiting Server Attack Patterns

In this series of five articles, I share insights and discoveries concerning website security and protecting against malicious attacks. In this first article of the series, I examine the process of identifying attack trends and using them to immunize against future attacks. Subsequent articles will focus on key blacklist strategies designed to protect your site transparently, effectively, and efficiently. At the conclusion of the series, the five articles will culminate in the release of the next generation 3G Blacklist. Improving Site Security by Recognizing and Exploiting Server Attack Patterns Crackers, spammers, scrapers, and other attackers are getting smarter and more […] Read more »
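As a hypothetical example of such immunization: if the access logs show repeated probes for some nonexistent script, the pattern can be denied outright. The filename below is a stand-in for whatever your own logs reveal:

# Hypothetical: block a probe pattern observed in the logs
<IfModule mod_alias.c>
	RedirectMatch 403 /badscript\.php
</IfModule>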

2G Blacklist: Closing the Door on Malicious Attacks

Since posting the Ultimate htaccess Blacklist and then the Ultimate htaccess Blacklist 2, I find myself dealing with a new breed of malicious attacks. It is no longer useful to simply block nefarious user agents because they are frequently faked. Likewise, blocking individual IP addresses is generally a waste of time because the attacks are coming from a decentralized network of zombie machines. Watching my error and access logs very closely, I have observed the following trends in current attacks:

User agents are faked, typically using something generic like "Mozilla/5.0"
Each attack may involve hundreds of compromised IP addresses
Attacks […] Read more »

Over 150 of the Worst Spammers, Scrapers and Crackers from 2007

Update 2010/07/07: Please visit the 2010 IP Blacklist for more current information. Over the course of each year, I blacklist a considerable number of individual IP addresses. Every day, Perishable Press is hit with countless numbers of spammers, scrapers, crackers and all sorts of other hapless turds. Weekly examinations of my site’s error logs enable me to filter through the chaff and cherry-pick only the most heinous, nefarious attackers for blacklisting. Minor offenses are generally dismissed, but the evil bastards that insist on wasting resources running redundant automated scripts are immediately investigated via IP lookup and denied access via simple […] Read more »

Laser-Focused Feeds and Smarter Feed Management

My current adventure into the fascinating realms of site redesign and optimization has yielded several chunks of fruit related to managing and delivering feed content. One of my primary concerns regarding the overhaul of Perishable Press is streamlined content delivery and rights management. An important area of convergence for these two factors involves the management and delivery of a site’s syndicated content. In this article, I explain the shortcomings of many default feed configurations and present an effective overall strategy for better feed management. When it comes to managing syndicated content, most blogging platforms enable bloggers to provide a multitude […] Read more »
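One common piece of such a strategy, sketched here hypothetically with mod_rewrite, is consolidating the default WordPress feed formats into a single canonical feed (the URLs are examples, not the site's actual configuration):

<IfModule mod_rewrite.c>
	RewriteEngine On
	# Redirect alternate feed formats to the one canonical feed
	RewriteRule ^feed/(rss|rss2|rdf|atom)/?$ https://perishablepress.com/feed/ [R=301,L]
</IfModule>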

Important Note for Your Custom Error Pages

Just a note to web designers and code-savvy bloggers: make sure your custom error pages are big enough for the ever-amazing <cough> Internet Explorer browser. If your custom error pages are too small, IE will take the liberty of serving its own proprietary web page, replete with corporate linkage and poor grammar. How big, baby? Well, that's a good question. In order for users of Internet Explorer to enjoy your carefully crafted custom error pages, the pages need to exceed 512 bytes in size. Using proper doctype markup, your custom pages should include roughly 10 or more lines of […] Read more »
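For reference, custom error pages are typically assigned via ErrorDocument directives in .htaccess (the paths here are hypothetical); just make sure each target file exceeds the 512-byte threshold:

# Assign custom error pages (each file should exceed 512 bytes for IE)
ErrorDocument 403 /errors/403.html
ErrorDocument 404 /errors/404.html
ErrorDocument 500 /errors/500.html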
