How to create a robots.txt in Umbraco and edit it from the backoffice
Published 22/10/2016 - Updated 27/07/2023
It’s very easy to create a robots.txt in Umbraco which you can edit from the backoffice. You can achieve this natively without installing any packages or writing custom code with these simple steps:
- Create a “txt file” document type in the backoffice
- Add a line in the template for the “txt file” doctype so that Umbraco serves it as text rather than HTML
- Configure Umbraco’s built-in URL rewriting module to handle a request that ends in “.txt”
Here’s what we’re going to achieve:
This file will be accessible to crawlers at www.mywebsite.co.uk/robots.txt
1. Create a “txt file” document type
We need to create a document type for “.txt” files. I refrain from calling it a “robots.txt doctype” or similar, because there’s no reason this document type couldn’t be reused for another txt file web standard, such as humans.txt.
This is easy. All we need is a text area for the file content. Make sure that the document type is created with accompanying template and that all your permissions are set up correctly.
2. Write the txt file template
Again, this is very simple. We need one line of Razor code to take the string from our text area and render it in the template.
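Here’s a minimal sketch of what the template might look like. Note that the property alias fileContent is an assumption on my part — use whatever alias you gave your text area:

```cshtml
@inherits Umbraco.Web.Mvc.UmbracoTemplatePage
@{
    Layout = null;
    @* Serve this page as plain text rather than HTML *@
    Response.ContentType = "text/plain";
}@Model.Content.GetPropertyValue<string>("fileContent")
```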
The line that sets the content type, however, may take some explaining. It tells Umbraco to set this page’s content type header to plain text. Web servers send these extra bits of information along with the web page to tell the user’s browser what type of content it’s dealing with, so it can be rendered properly. For instance, it would obviously be incorrect for the browser to assume that HTML files and PNG images should be rendered in the same way!
Umbraco’s default content type header is text/html, so we need to change it to text/plain so that our clients know they’re dealing with a plain text file.
Now we can create the robots.txt file in our content tree and add our content to it.
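As an illustrative starting point, the content you paste into the text area might look something like this (the sitemap URL is a placeholder — substitute your own):

```txt
User-agent: *
Disallow: /umbraco/
Sitemap: https://www.mywebsite.co.uk/sitemap.xml
```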
3. Configure Umbraco to recognise the “robots.txt” URL
Once you’ve created a “robots.txt” file with your new document type in the backoffice and you try to access it at www.mywebsite.com/robots.txt, you may see a 404 page, a blank page, or something else, depending on how your web server is configured. This is because Umbraco doesn’t intercept URLs with extensions like .txt by default. You’ll need to configure your site to intercept the request to /robots.txt and send it to your content node.
The good news is that Umbraco provides an out-of-the-box solution for this. Did you know that Umbraco comes with a URL rewriting module? You can easily configure it to intercept a URL with one line of XML configuration.
First, you’ll need to find the URL that Umbraco has generated for you by clicking on your content node and going to the “Properties” tab.
You can see here that Umbraco’s auto-generated URL for me is /robotstxt. You’ll notice that even though there’s a dot in the name of this content node, Umbraco doesn’t include one in the URL.
Now we need to open up Umbraco’s URL rewriting config in our editor and add a line to rewrite the /robots.txt URL.
The config file is located at ~\Config\UrlRewriting.config. You can find an example from a live Umbraco site here.
And here’s the line that we have to add:
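A rule along these lines should work — the rule name and the /robotstxt destination come from my setup above, so adjust them to match yours, and mirror the attribute style of the sample rules already in your UrlRewriting.config:

```xml
<add name="robotstxt"
     virtualUrl="^~/robots\.txt$"
     rewriteUrlParameter="ExcludeFromClientQueryString"
     destinationUrl="~/robotstxt"
     ignoreCase="true" />
```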
name can be set to whatever you want, as long as it’s unique in the file. virtualUrl is the URL people will enter to reach your page (expressed as a regular expression). destinationUrl is the URL we’re rewriting to (Umbraco’s auto-generated URL from the Properties tab in the backoffice).
Et voilà!
You're reading the GL Digital blog, where auto marketing experts share proven tactics to grow your dealership.