Web Dev + WordPress + Security

Cleaning Up Google Search Results

[ Drawing: Abstract Entity Pursuing Clean Search Results ]

This post is about how I cleaned up an incorrect URL in the Google search results. My business site is basically a one-page portfolio site, located at the URL https://monzillamedia.com/. But in the Google search results, the URL was showing as https://monzilla.biz/, which did not exist. So all potential customers were getting an error page. Fortunately I was able to re-acquire the monzilla.biz domain and redirect all traffic to monzillamedia.com. Continue reading »
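Redirecting all traffic from a re-acquired domain to the live site is typically done with a 301 redirect. Here is a minimal sketch of how that might look in an Apache .htaccess file; the exact rules used for monzilla.biz are not shown in the excerpt, so this is illustrative only:

```apache
# Illustrative .htaccess for the old domain: permanently redirect
# every request to the corresponding URL on the live site
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?monzilla\.biz$ [NC]
RewriteRule (.*) https://monzillamedia.com/$1 [R=301,L]
```

The `R=301` flag tells search engines the move is permanent, which is what prompts Google to update the URL shown in search results.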

Example of a Spoofed Search Engine Bot

While solving the recent search engine spoofing mystery, I came across two excellent examples of spoofed search engine bots. This article uses the examples to explain how to identify any questionable bots hitting your site. Continue reading »
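The standard way to unmask a spoofed search engine bot is reverse DNS plus forward confirmation: resolve the visiting IP to a hostname, check that the hostname belongs to the engine's official domain, then resolve that hostname back and confirm it includes the original IP. A minimal Python sketch (the domain list is a small illustrative subset):

```python
import socket

# Official crawler hostnames end with these domains; a spoofed bot sends
# a crawler user agent but its IP fails this check. (Subset for illustration.)
BOT_DOMAINS = ("googlebot.com", "google.com", "search.msn.com")

def hostname_is_official(hostname):
    """Return True if a reverse-DNS hostname belongs to a known crawler domain."""
    return hostname.rstrip(".").endswith(tuple("." + d for d in BOT_DOMAINS))

def verify_bot(ip):
    """Reverse-resolve the IP, check the domain, then forward-confirm the IP."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]          # reverse DNS lookup
    except socket.herror:
        return False                                    # no PTR record: fail
    if not hostname_is_official(hostname):
        return False                                    # wrong domain: spoofed
    return ip in socket.gethostbyname_ex(hostname)[2]   # forward confirmation
```

A bot that claims to be Googlebot but reverse-resolves to some random hosting provider is exactly the kind of questionable visitor the article describes.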

List of All User Agents for Top Search Engines

Here is a working list of all user agents for the top search engines. I use this information frequently for my plugins such as Blackhole for Bad Bots and BBQ Pro, so I figured it would be useful to post the information online for the benefit of others. Having the user agents for these popular bots all in one place helps to streamline my development process. Each search engine includes references and a regex pattern to match all known […] Continue reading »
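As a sketch of how such regex patterns get used in practice, here is a condensed Python example matching a few of the best-known crawler tokens. The pattern is illustrative; the article's actual patterns cover many more engines and variants:

```python
import re

# Condensed pattern for a handful of well-known crawler user-agent tokens;
# the full list covers many more engines and known variants.
BOT_PATTERN = re.compile(
    r"(Googlebot|bingbot|Slurp|DuckDuckBot|Baiduspider|YandexBot)",
    re.IGNORECASE,
)

def is_known_bot(user_agent):
    """Return True if the user-agent string matches a known crawler token."""
    return bool(BOT_PATTERN.search(user_agent))
```

A plugin can run each request's user agent through a check like this to decide whether the visitor claims to be a major search engine.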

Tell Google NOT to Index Certain Parts of Your Web Pages

There are several ways to instruct Google to stay away from various pages on your site: robots.txt directives, nofollow attributes on links, meta noindex/nofollow directives, X-Robots noindex/nofollow directives, and so on. These directives all function in different ways, but they all serve the same basic purpose: control how Google crawls the various pages on your site. For example, you can use meta noindex to instruct Google not to index your sitemap, RSS feed, or any other page you wish. This […] Continue reading »
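Of the methods listed above, robots.txt is the simplest to illustrate. A minimal example, with an illustrative path, keeping crawlers out of one directory:

```text
# robots.txt — placed at the site root; the path is illustrative
User-agent: *
Disallow: /private/
```

Note that robots.txt blocks crawling, while the meta and X-Robots directives block indexing; the article covers when each is appropriate.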

SEO Experiment: Let Google Sort it Out

One way to prevent Google from crawling certain pages is to use <meta> elements in the <head> section of your web documents. For example, if I want to prevent Google from indexing and archiving a certain page, I would add the following code to the head of my document: Continue reading »
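The excerpt cuts off before showing the snippet, but based on standard meta robots syntax, the code in question would look something like this (a plausible reconstruction, not the article's exact markup):

```html
<head>
	<!-- illustrative: tell Googlebot not to index or cache this page -->
	<meta name="googlebot" content="noindex,noarchive">
</head>
```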

Best Practices for Error Monitoring

Given my propensity to discuss matters involving error log data (e.g., monitoring malicious behavior, setting up error logs, and creating extensive blacklists), I am often asked about the best way to go about monitoring 404 and other types of server errors. While I consider myself to be a novice in this arena (there are far brighter people with much greater experience), I do spend a lot of time digging through log entries and analyzing data. So, when asked recently about […] Continue reading »
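One simple way to start monitoring 404s is to tally the missing URLs straight from the access log. Here is a minimal Python sketch, assuming Apache/NGINX combined-format log lines (this is my own illustration, not a method from the article):

```python
import re
from collections import Counter

# Pull the request URI and status code out of a combined-format log line
LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<uri>\S+) [^"]*" (?P<status>\d{3})')

def count_404s(lines):
    """Tally how many times each URI returned a 404."""
    hits = Counter()
    for line in lines:
        m = LINE.search(line)
        if m and m.group("status") == "404":
            hits[m.group("uri")] += 1
    return hits
```

Sorting the resulting counts reveals which missing resources are requested most often, which is usually where redirects or blacklist rules pay off first.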

Custom OpenSearch for Your Website

I recently added OpenSearch functionality to Perishable Press. Now, OpenSearch-enabled browsers such as Firefox and IE 7 offer users the option to customize their browser’s built-in search feature with an exclusive OpenSearch-powered search option for Perishable Press. The autodiscovery feature of supporting browsers detects the custom search protocol and enables users to easily add it to their collection of readily available site-specific search options. Now, users may search the entire Perishable Press domain with the click of a button. […] Continue reading »
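For reference, an OpenSearch setup consists of a small XML description document; this is a minimal sketch following the OpenSearch 1.1 format, with illustrative file names and URLs rather than the site's actual configuration:

```xml
<!-- opensearch.xml — search description document (URLs illustrative) -->
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
	<ShortName>Perishable Press</ShortName>
	<Description>Search the Perishable Press domain</Description>
	<Url type="text/html" template="https://perishablepress.com/?s={searchTerms}"/>
</OpenSearchDescription>
```

Autodiscovery works by linking the document from the page head with `<link rel="search" type="application/opensearchdescription+xml" title="Perishable Press" href="/opensearch.xml">`, which is what supporting browsers detect.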

Taking Advantage of the X-Robots Tag

Controlling the spidering, indexing and caching of your (X)HTML-based web pages is possible with meta robots directives such as these: <meta name="googlebot" content="index,archive,follow,noodp"/> <meta name="robots" content="all,index,follow"/> <meta name="msnbot" content="all,index,follow"/> I use these directives here at Perishable Press and they continue to serve me well for controlling how the “big bots”1 crawl and represent my (X)HTML-based content in search results. For other, non-(X)HTML types of content, however, using meta robots directives to control indexing and caching is not an option. An […] Continue reading »
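For non-(X)HTML content such as PDFs and images, the X-Robots-Tag is sent as an HTTP header instead of a meta element. A minimal .htaccess sketch, assuming Apache with mod_headers enabled (file types and directive values are illustrative):

```apache
# Send noindex/noarchive as an HTTP header for file types that
# cannot carry meta robots tags (requires mod_headers)
<FilesMatch "\.(pdf|doc|jpe?g|png)$">
	Header set X-Robots-Tag "noindex, noarchive"
</FilesMatch>
```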

Seven Ways to Beef Up Your Best Pages for the Next Google PR Update

[ Image: Grotesquely muscular older man ]

Time is running out! Soon, it will be time for the next Google PageRank (PR) update. While it is difficult to predict how your site will perform overall, it seems likely that your highest ranking pages will continue to rank well. The idea behind this article is to improve your site’s overall PageRank by totally beefing up your most popular pages. Of course, every page on your site is important. Ideally, you would want to apply these techniques to every […] Continue reading »

SEO 101: Best Practices

[ Image: Abstracted Documents ]

After studying Peter Kent’s excellent book, Search Engine Optimization for Dummies, several key methods emerged for optimizing websites for the search engines. Although the book is written for people who are new to the world of search engine optimization (SEO), many of the principles presented throughout the book remain important, fundamental practices even for the most advanced SEO-wizards. This article divulges these very useful SEO practices and organizes them into manageable chunks. Continue reading »

Harvesting cPanel Raw Access Logs

[ Image: Harvesting the Land ]

For those of you using cPanel as the control panel for your websites, a wealth of information is readily available via cPanel ‘Raw Access Logs’. The cPanel log files are updated continually with fresh data. Each logged visit includes information about the user agent, IP address, HTTP response, request URI, request size, and a whole lot more. To help you make use of this potentially valuable information, here is a quick tutorial on accessing and interpreting your cPanel raw access logs. […] Continue reading »
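To give a flavor of what interpreting these logs involves, here is a Python sketch that parses one combined-format line into the named fields mentioned above. It assumes the common combined log format, which cPanel raw access logs typically follow:

```python
import re

# Break a combined-format access log line into its component fields
COMBINED = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Return a dict of fields (ip, time, request, status, size,
    referer, agent), or None if the line does not match the format."""
    m = COMABINED = COMBINED.match(line)
    return m.groupdict() if m else None
```

Once each line is a dictionary, filtering by status code, grouping by IP, or scanning user agents becomes a matter of ordinary data wrangling.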

Search Engine Registration Notes

In his excellent book, Search Engine Optimization for Dummies, Peter Kent explains that many search engines actually get their search results from one (or more) of the larger search engines, such as Google or The Open Directory Project. Therefore, the author concludes that it may not be necessary to spend endless hours registering with thousands of the smaller search sites. Rather, the author provides a brief list of absolutely essential search sites with which it is highly recommended to register. […] Continue reading »

SEO 101: Establishing and Evolving an Effective Link Strategy

Optimizing your website for the search engines involves many important aspects including keyword development, search engine registration, and SEO logging. This Perishable Press tutorial scopes yet another critical weapon in the SEO wars: establishing and evolving an effective link campaign. We will begin our article by focusing on incoming and outgoing link strategies, proceed with a few tips for internal links, and then conclude with some ideas for getting links. Continue reading »

Roll Your Own SEO Log

Search engine optimization (SEO) is the business of every serious webmaster. The process of optimizing a website for the search engines involves much more than properly constructed document headers and anchor tags. Websites are like trees: their roots are the growing collection of content presented through the branching universe of the World Wide Web. Or something. The point is that optimizing a website requires nurturing the site itself while also ensuring proper exposure to the requisite elements of the internet. Continue reading »

Welcome
Perishable Press is operated by Jeff Starr, a professional web developer and book author with two decades of experience. Here you will find posts about web development, WordPress, security, and more »
Digging Into WordPress: Take your WordPress skills to the next level.
Thoughts
W3C.org has a very thorough list of accessibility tools.
The more you wake up, the more you realize you are still asleep.
7G Firewall v1.4 now available!
I would pay twice as much for a shorter/smaller/lighter phone.
Taking a much needed break in August :)
The Web was better before social media.
WP 5.8 Gutenberg/Block Widgets is breaking many sites. Fortunately Disable Gutenberg makes it easy to restore Classic Widgets with a click.