
Humans.txt

One thing I love about Twitter is the instant feedback. For the past few weeks I’ve been seeing lots of 404 requests like this:

https://perishablepress.com/humans.txt
https://perishablepress.com/humans.txt
https://perishablepress.com/humans.txt

At first I thought it was some skript kiddie getting creative, you know, as a play on the robots.txt file, which is also located in the root of many websites. So it seemed interesting enough to tweet about:

Fun new 404 requests appearing in the error logs: http://example.com/humans.txt

Almost instantly there were four replies, letting me know what’s up:

[ Screenshot: Tweets about humanstxt.org ]

This would’ve occurred to me eventually, but the point is that Twitter – or better yet, your Twitter followers – are an incredible resource for quickly gathering information that you never knew existed. Google works fine if you know what you’re looking for, but in this case the whole “humans.txt” thing was news to me.

Note: originally this article had three links pointing to the humans.txt website, https://humanstxt.org/. But I’ve had to remove the links because the humans.txt site is offline most of the time. Just FYI.

Robots & Humans

After reading more about humans.txt, I can see it catching on and becoming a companion to the robots.txt file. Both are plain-text files located in the root directory, but instead of telling compliant robots which pages to crawl (or not crawl), humans.txt provides information about the people and techniques behind the site. It’s definitely an interesting idea, and potentially useful if the same information isn’t already available on a “Contact” or “About” page.
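
For contrast, here is the sort of thing a typical robots.txt contains – a couple of crawl directives (the paths below are hypothetical):

# tell all compliant bots to skip these directories
User-agent: *
Disallow: /private/
Disallow: /drafts/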

Create a text file called humans.txt (always in lower-case) and make it UTF-8 encoded to avoid issues with special characters and multiple languages. – humanstxt.org
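
On that note, if your server sends some other default charset for text files, a small slice of .htaccess can serve humans.txt as UTF-8. Here is a minimal sketch, assuming Apache:

# SERVE HUMANS.TXT AS UTF-8
<Files "humans.txt">
 AddDefaultCharset utf-8
</Files>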

To create a humans.txt file for your site, follow these simple steps:

  1. Place a reference to the file in the <head> of the site (shown in context below):
    <link rel="author" href="humans.txt">
  2. Customize the template below with your details
  3. Save the file and place it in your site’s root directory
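
Here is that reference in context, via a minimal sketch of a document head (the title and surrounding markup are just placeholders):

<!DOCTYPE html>
<html lang="en">
<head>
	<meta charset="utf-8">
	<title>Example Site</title>
	<!-- point curious humans to the site credits -->
	<link rel="author" href="humans.txt">
</head>
<body>
	<!-- page content -->
</body>
</html>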

Here’s the template:

/* TEAM */
	Your title: Your name.
	Site: email, link to a contact form, etc.
	Twitter: your Twitter username.
	Location: City, Country.

	[...]

/* THANKS */
	Name: name or url

	[...]

/* SITE */
	Last update: YYYY/MM/DD
	Standards: HTML5, CSS3, etc.
	Components: Modernizr, jQuery, etc.
	Software: Software used for the development

	[...]
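
And here is the sort of thing a finished file might contain – all of the details below are invented for illustration:

/* TEAM */
	Developer: Jane Doe.
	Site: https://example.com/contact/
	Twitter: @janedoe.
	Location: Seattle, USA.

/* THANKS */
	Name: John Smith

/* SITE */
	Last update: 2011/06/01
	Standards: HTML5, CSS3
	Components: Modernizr, jQuery
	Software: WordPress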

The official humanstxt.org guidelines tell you to also “add the humansTXT button to your site and link it to your humansTXT file,” but you’re probably safe just linking to your humans.txt file like you would a sitemap. Even without the button, the information will be there for those who are looking for it.

For further examples, you can inspect the humans.txt file used at the humanstxt.org website (sorry, the site is now offline), and also check out the humans.txt file used here at Perishable Press. Update: here is a humans.txt template file, to give you a starting point for making your own.

Inhuman.txt

So that’s all fine and dandy, but what if you don’t want to provide humans.txt information for your site? You have several options:

  • Do nothing and ignore the countless 404 errors filling your access logs
  • Upload a blank humans.txt file, or one with a simple explanatory message
  • Redirect requests for the humans.txt file (see below for details)

It could be argued that the humans.txt file is opt-out, with inevitable 404 errors for sites that for whatever reason don’t provide one. This may not be by design, and it’s just as easy to argue that nobody is forcing you to do anything. But if/when humans.txt catches on, those without the file may want to do something about the repeat 404 errors. Heck, if I were trying to establish a “pancakes.txt” protocol, a sort of black-hat way to attract attention would be to create a few bots that go around requesting http://yoursite.com/pancakes.txt from jillions of sites. You know, make it impossible for people to ignore. Whatever, it’s all good, but if you do want an easy way to clean up the humans.txt errors, read on.

Redirect requests for humans.txt

To keep needless 404 errors from slowing things down and cluttering up your access logs, either upload a blank humans.txt or add one of these slices to your root .htaccess file:

Using Apache’s mod_alias to redirect all requests for /humans.txt to the URL of your choice:

# REDIRECT HUMANS.TXT
<IfModule mod_alias.c>
 RedirectMatch 301 /humans\.txt$ http://example.com/
</IfModule>

Here is an alternate technique using mod_rewrite:

# REDIRECT HUMANS.TXT
<IfModule mod_rewrite.c>
 RewriteEngine On
 RewriteCond %{REQUEST_URI} /humans\.txt$ [NC]
 RewriteRule . http://example.com/ [R=301,L]
</IfModule>

To use either of these methods, just replace example.com with the URL of your choice. Bada-boom, bada-bing. Being human has never been easier! ;)
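
Alternately, if you would rather not redirect anywhere, you can answer the requests with a “410 Gone” response. Here is a minimal sketch, again using mod_alias (status 410 takes no target URL):

# HUMANS.TXT GONE
<IfModule mod_alias.c>
 RedirectMatch 410 /humans\.txt$
</IfModule>

In theory, well-behaved bots stop asking after enough 410 responses; in practice, your mileage may vary.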


14 responses to “Humans.txt”

  1. There are a few plugins that automatically look for humans.txt on page load. That could be why you see a lot more traffic.

    I enjoy this one for Chrome

  2. The very first thing I thought after learning about this was “Why not humans.html”?

    Then you could, like, have links to the people you credit. Sorta like the rest of the entire internet.

    I do like the idea though, especially on sites where making that information available on the regular public side isn’t practical.

  3. But where isn’t it practical, Chris? If it isn’t practical on a site because of, say, political, commercial, or other sensitivity issues – or just a downright refusal by a client to allow a ‘credits’ page – then why is a humans.txt page really so much different? That’s what I don’t get.

    Humans.txt has been around a while. For those that want it, great; but for those that don’t, it really isn’t worth the effort of creating it… unless of course Google decides that it’ll rate sites better with a humans.txt page – then no doubt we’ll all be forced into it.

  4. I think the analogy between robots.txt and humans.txt is false or flawed. A robots.txt file says, “Robots/crawlers, here’s how/what you can/can’t access and look for.” A humans.txt file says, “Here are some of the people who produced this website.” Why are those two things placed side-by-side, in the same place, with the same naming convention? A credits.txt file might make more sense, though still, less than 1% of the public would look for or find it. A more analogous humans.txt might be a guide for humans on how to view the site, where to look, or something, though in plain text that wouldn’t be very useful. Or a robots.txt that tells you the robots that worked on the website.

    • Totally agree with you, Nathan – credits.txt would be far more appropriate.

      I just can’t see many people visiting a site and thinking, “I’ll look for the humans.txt file to see who created it.”

  5. So is it like Who.is information about our site?

  6. I like the idea. Although it is gimmicky, it gave me a small chuckle and I instantly related to it. But if a picky client refuses a link, they will probably blow a head gasket when they find a dedicated file on the server.

    I much prefer the humans.html idea and may start throwing that one in to see if I get away with it.

  7. I added mine, but the format is very loose, which makes it impossible to parse and… oh, um, never mind.

  8. Yael K. Miller 2011/06/02 8:05 am

    Do redirects in the .htaccess slow down site load time?

    How are you handling this humans.txt issue? Are you ignoring or redirecting?

    Is there a robots.txt method of handling this?

  9. Pali Madra 2011/06/20 8:19 am

    Those who are aware of it should use humans.txt. I do not see a reason not to use it, as the file provides more information for human visitors to my website.

  10. It’s worth noting that you must CHMOD the humans.txt file to 755 or 644. I say this because a humans.txt file on a client’s site kept throwing inexplicable 403 Forbidden errors. After much head-scratching, checking and rechecking .htaccess settings to no avail, the CHMOD check was a last resort – but it solved the problem.

    Cheers
    I

  11. Jeff Starr 2011/07/01 6:16 pm

    Good point – Thanks for the reminder :)
