Sitemaps and why you should use them

Next stop in our Digital Marketing Strategy series, designed to assist anyone out there with a small business, is sitemaps. Chances are you’ll have used a website’s sitemap during your own time on the internet – they’re a useful, time-saving way to work out whether the site you’re currently browsing can actually supply what you’re looking for. Ecommerce sites in particular will find that their sitemap is consulted frequently by their users. User accessibility is a key reason for having a sitemap, but it is also useful in other ways – namely, as a means for search engine robots to crawl your pages more efficiently, which we’ll cover in more detail in this article.

A sitemap is essentially a list of all the pages that you want to make visible to your users and to search engines, setting out the pathways your visitors can follow. Ideally, the sitemap should flow in a natural way that reflects how a user would ordinarily browse your site; the point is simply to make it easier for them to do so. This benefits search engines as well as people, because it gives crawlers a clear-cut order in which to work through your site more efficiently, and it should ensure that important pages aren’t missed, as you can highlight which pages are high priority.

When sitemaps first came into use, they were just standard HTML files outlining a site’s content purely for visitors. Nowadays their use has evolved to include directions for the web crawlers that index sites. The preferred format for a sitemap is now XML, which speaks directly to search engines rather than to visitors, but it is recommended that you use both if possible (don’t worry about duplicating content here – Google won’t penalise you for it in a sitemap). That way you don’t have to sacrifice usability for SEO, or vice versa.

As with most elements of digital marketing, your website ultimately needs to cater for the people actually trying to use it to get the most out of the product, service or information you offer, not solely for search engines. Bearing in mind what search engine robots will pick up on is obviously important – it will help your site’s visibility on search results pages and garner new interest – but if you over-optimise your site in the name of rankings, you’re unlikely to produce something that is genuinely usable by the general public.

Going back to HTML and XML sitemaps, the reason they suit different audiences is that they are designed to do entirely different things. An HTML sitemap focuses on how the data is presented, and has a more visual, navigational function. An XML sitemap focuses on the data itself and is more informative: it lets you give search engine robots additional information about your pages, which assists them when your site is being crawled.
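To make the contrast concrete, here is what a minimal HTML sitemap might look like – just an ordinary page of links for human visitors (the URLs and page names are placeholders, not from any real site):

```html
<!-- A minimal HTML sitemap: a plain page of links, mirroring the site's structure -->
<ul>
  <li><a href="https://www.example.com/">Home</a></li>
  <li><a href="https://www.example.com/products/">Products</a>
    <ul>
      <li><a href="https://www.example.com/products/widgets/">Widgets</a></li>
    </ul>
  </li>
  <li><a href="https://www.example.com/contact/">Contact</a></li>
</ul>
```

An XML sitemap, by contrast, carries no presentation at all – just data about each URL, as shown in the tag list below.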

There are a number of tags that can appear in your XML sitemap. Some are required and some are optional, so it’s up to you whether to include the extra information. The tags you will always need are:

<urlset> – The root tag that opens and closes your sitemap, wrapping everything else and declaring which version of the sitemap protocol you’re using.

<url> – The ‘parent tag’ for each individual entry in your list of content; you include one <url> block per page. The specific URL within it is given as below:

<loc> – Without this, your sitemap would be pretty useless! Here you include the URL (web address) of the particular page you’re referring to. ‘loc’ stands for location – the absolute URL of the page, beginning with a protocol (for instance, https://).

Then, you have your optional tags:

<lastmod> – The last date/time that the particular page was modified.

<changefreq> – How often the page is likely to change. Valid values are always, hourly, daily, weekly, monthly, yearly and never.

<priority> – The priority of the page relative to the rest of your website, on a scale from 0.0 to 1.0 (the default is 0.5).

Naturally, these aren’t a necessity, but given that Google now looks at how often a website is updated as an indicator of its quality, they’re a useful way to show that your pages are up to date.
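Putting all of those tags together, a small sitemap might look like this (the domain, dates and values here are placeholders for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- A minimal XML sitemap following the sitemaps.org protocol -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2015-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/contact/</loc>
    <lastmod>2015-03-15</lastmod>
    <changefreq>yearly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

Only <urlset>, <url> and <loc> are compulsory; the other three tags can be dropped from any entry where you don’t have anything useful to say.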

If you’ve got a small website, it will be fairly easy to create and maintain your own sitemap, but for anything bigger you can use a sitemap generator to save yourself some time. You also need to think about how many pages you are listing – again, if you’re a small company this probably won’t affect you, but a large company may not be able to list every single page without risking looking like a link spammer. As a general rule, try to keep it to between 25 and 40 pages. Regardless of whether your sitemap has 4 pages on it or 40, it is vital to keep checking that all the links work and that any new pages are added promptly. Think about the anchor text you use, too – utilise keywords where possible.
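That “keep checking that all the links work” step is easy to automate. As a quick sketch (the function name `extract_urls` is my own, not from any particular tool), this pulls every <loc> URL out of a sitemap so you can then request each one and look for errors – the actual fetching is left to whatever HTTP client you prefer:

```python
# Sketch: extract the URLs listed in an XML sitemap so each can be checked.
# Assumes the standard sitemaps.org namespace on the <urlset> root element.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_urls(sitemap_xml: str) -> list[str]:
    """Return every <loc> URL found in the sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

if __name__ == "__main__":
    sample = """<?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>https://www.example.com/</loc></url>
      <url><loc>https://www.example.com/contact/</loc></url>
    </urlset>"""
    for url in extract_urls(sample):
        print(url)  # each URL would then be requested and its status checked
```

Run against your real sitemap file on a schedule, anything that no longer returns a successful response is a broken link to fix or remove.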

One of the handy things about creating a sitemap is that if you are just starting out with your website, it is one of the best things you can do to get your site noticed by search engines. When you’re a newbie on the internet scene, you won’t yet have the backlinks from external websites that help lead web crawlers to your site, so submitting your sitemap to search engines is a brilliant way to make sure those robots are paying attention. You also have the advantage of being able to control, at least in part, how robots access your content, through things such as the <priority> tag and the way you organise your list of pages. This means that from the outset search engines get the best information about your site, and you’ll set yourself up for the best possible rankings (providing you follow the rest of our digital marketing tips, of course!).

Another means you have as a website owner to control what web crawlers do and don’t see is your robots.txt file. Alongside its general use as the place to ‘disallow’ certain pages from being crawled by search engines, you can also include a line that makes sure robots discover your sitemap quickly. Put it in your robots.txt like this (swapping in the absolute URL of your own sitemap):

Sitemap: https://www.example.com/sitemap.xml
The robots.txt file is one of the first places web crawlers look, so having the link to your sitemap in there is a sure-fire way to ensure it gets used.

With these tips you should know where to begin with either creating or improving your sitemap. If you do decide to use a sitemap generator, spend some time researching reputable providers of what you’re after – it may even be worth paying to make sure that what you get is worth it. Always remember to check, check, check – broken links definitely won’t look good when your site is being crawled! If you have a Google account, make the most of their Webmaster Tools: when you submit your sitemap there, they’ll let you know if there are any errors and save you a job.

For more information email  or fill in our Contact Form.

Image by Stephen Coles
