ref: 2c661674317d980fff4f48cb410b929fa18b0bf2
parent: 01f71babfcc772b34d8a4226f2c2e1e62ef24c8b
author: Anthony Fok <[email protected]>
date: Wed Jan 6 16:11:02 EST 2016
Docs: Copyedit extras/robots-txt.md
--- a/docs/content/extras/robots-txt.md
+++ b/docs/content/extras/robots-txt.md
@@ -1,25 +1,27 @@
---
-lastmod: 2015-12-08
-date: 2013-07-09
+lastmod: 2016-01-06
+date: 2015-12-08
menu:
main:
parent: extras
next: /community/mailing-list
prev: /extras/urls
-title: Table of Contents
+title: Custom robots.txt
weight: 120
---
-Hugo can generated customized [robots.txt](http://www.robotstxt.org/) in the
-[same way than any other template]({{< ref "templates/go-templates.md" >}}).
+Hugo can generate a customized [robots.txt](http://www.robotstxt.org/) in the
+[same way as any other template]({{< ref "templates/go-templates.md" >}}).
-By default it generates a robots.txt which allows everything, it looks exactly
+By default, it generates a robots.txt file that allows everything, with the following content:
- User-agent: *
+```http
+User-agent: *
+```
-To disable it just set `disableRobotsTXT` option to false in the [command line]({{< ref "commands/hugo.md" >}}) or [configuration file]({{< ref "overview/configuration.md" >}}).
+To disable it, set the `disableRobotsTXT` option to `true` on the [command line]({{< ref "commands/hugo.md" >}}) or in the [configuration file]({{< ref "overview/configuration.md" >}}), as sketched below.
-Hugo will use the template `robots.txt` following the list starting with the one with more priority
+Hugo will use the `robots.txt` template it finds first in the following list, in descending order of precedence:
* /layouts/robots.txt
* /themes/`THEME`/layouts/robots.txt
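
A minimal sketch of the configuration-file form described above, assuming a TOML `config.toml` at the site root (a matching `--disableRobotsTXT` command-line flag is assumed for the CLI form):

```toml
# config.toml -- hedged sketch: suppress generation of robots.txt entirely
disableRobotsTXT = true
```
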
@@ -26,10 +28,11 @@
An example of a robots.txt layout is:
- User-agent: *
+```http
+User-agent: *
- {{range .Data.Pages}}
- Disallow: {{.RelPermalink}}{{end}}
-
+{{range .Data.Pages}}
+Disallow: {{.RelPermalink}}{{end}}
+```
This template disallows all the pages of the site, creating one `Disallow` entry for each one.
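
For illustration, assuming a site with only two pages whose relative permalinks are `/post/first/` and `/post/second/` (hypothetical values), the example template above would render roughly as:

```http
User-agent: *

Disallow: /post/first/
Disallow: /post/second/
```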