The little things to not forget about during development [Part: 2] | Force 5

Sitemap.xml

What are Sitemaps and why are they important?

Sitemaps are a tool developers use to inform search engines about the website content that is available to be indexed. The sitemap protocol is an XML format that lists your site's URLs along with optional metadata such as last-modified dates, change frequencies, and page priorities.

Here is a minimal sample of a sitemap.xml file:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.discoverforce5.com/</loc>
  </url>
  <url>
    <loc>http://www.discoverforce5.com/About/</loc>
  </url>
</urlset>

For more information on sitemap protocol and XML tag definitions visit: http://www.sitemaps.org/protocol.php#xmlTagDefinitions
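Because a sitemap is plain XML, it is also easy to read back programmatically. As an illustrative sketch (not part of the sitemap protocol itself), here is how Python's standard library could extract the URLs from the sample above; note that the sitemap namespace must be supplied explicitly when querying:

```python
import xml.etree.ElementTree as ET

# The minimal sitemap sample from above, embedded as a string.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.discoverforce5.com/</loc></url>
  <url><loc>http://www.discoverforce5.com/About/</loc></url>
</urlset>"""

# Map a prefix to the sitemap namespace so findall() can match elements.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(SITEMAP)
locs = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(locs)
```

Forgetting the namespace is the most common stumbling block here: `root.findall("url/loc")` without the mapping matches nothing, because every element in the file lives in the sitemap namespace.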

Below is the same sitemap with two optional elements, <changefreq> and <priority>. As described in the sitemap documentation:

  • <changefreq> indicates how frequently the page is likely to change. It is a hint, not a command; search engines may crawl the page more or less often than stated.
  • <priority> indicates the priority of this URL relative to the other URLs on your site. Valid values range from 0.0 to 1.0, with 0.5 as the default.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.discoverforce5.com/</loc>
    <changefreq>monthly</changefreq>
    <priority>1.00</priority>
  </url>
  <url>
    <loc>http://www.discoverforce5.com/About/</loc>
    <changefreq>monthly</changefreq>
    <priority>0.80</priority>
  </url>
</urlset>

For more advanced information about sitemaps, head over to www.sitemaps.org.
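If your site has more than a handful of pages, hand-editing the XML gets tedious. As a hedged sketch (the page list and values here are simply the examples from this post, not a general-purpose tool), a sitemap like the one above could be generated with Python's standard library:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a sitemap.xml string from (loc, changefreq, priority) tuples."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, changefreq, priority in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "changefreq").text = changefreq
        ET.SubElement(url, "priority").text = priority
    # ET.tostring(..., encoding="unicode") omits the XML declaration,
    # so prepend it manually to match the sitemap samples above.
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(urlset, encoding="unicode"))

# The two example pages used throughout this post.
pages = [
    ("http://www.discoverforce5.com/", "monthly", "1.00"),
    ("http://www.discoverforce5.com/About/", "monthly", "0.80"),
]
print(build_sitemap(pages))
```

In a real deployment the `pages` list would come from your CMS or router, but the serialization step stays the same.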

Next steps - Submitting your sitemap to the search engines

Each search engine approaches sitemap submission differently. Most major engines, such as Google and Bing, provide webmaster tools portals where you can submit your sitemap.xml directly; check each engine's webmaster documentation for the current submission process.

Tying sitemap.xml together with robots.txt

As discussed in part 1 of this series, you can declare the location of your sitemap.xml file for web crawlers/bots in your robots.txt file. Here is an example of what a robots.txt would look like:

User-agent: *
Allow: /
Sitemap: http://www.yourdomain.com/sitemap.xml

A quick refresher on the syntax above: "User-agent: *" applies the rules to all bots, and "Allow: /" states that the entire site may be crawled.

If you have multiple sitemaps, you can declare each of them in your robots.txt file. Here is an example of how to do so:

User-agent: *
Allow: /
Sitemap: http://www.yourdomain.com/sitemap-1.xml
Sitemap: http://www.yourdomain.com/sitemap-2.xml
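As a quick way to verify declarations like the ones above, Python's standard library includes a robots.txt parser that can read the file back and report the declared sitemaps (the `site_maps()` method assumes Python 3.8 or newer):

```python
from urllib.robotparser import RobotFileParser

# The multi-sitemap robots.txt example from above, embedded as a string.
robots_txt = """\
User-agent: *
Allow: /
Sitemap: http://www.yourdomain.com/sitemap-1.xml
Sitemap: http://www.yourdomain.com/sitemap-2.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# List every Sitemap URL declared in the file.
print(rp.site_maps())

# Confirm that "Allow: /" lets any bot crawl an arbitrary path.
print(rp.can_fetch("*", "/About/"))
```

In normal use you would call `rp.set_url(...)` and `rp.read()` to fetch a live robots.txt instead of parsing a string, but the checks are identical.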

In conclusion, a sitemap.xml is not required for successful search engine indexing. At Force 5, however, we believe submitting one helps ensure your site is properly indexed. A great example is when a new site goes live and older pages no longer exist at the same locations. Even though search engines use other methods to index your site, a sitemap will ultimately help the indexing process.

If you have any SEO needs or questions please give Force 5 a call.