As the demand for search-engine-optimized websites grows day by day, the importance of sitemaps cannot be ignored.
What are Sitemaps?
Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
For your website's pages to be crawled efficiently by search engines, you should generate a compatible XML sitemap and ping the search engines whenever it is updated.
In this article I will show you how to generate a dynamic sitemap and ping the search engines.
The standard structure of a sitemap, per the sitemaps.org guidelines, is as follows:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
   <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2005-01-01</lastmod>
      <changefreq>monthly</changefreq>
      <priority>0.8</priority>
   </url>
</urlset>
XML tag definitions
<urlset> (required): Encapsulates the file and references the current protocol standard.

<url> (required): Parent tag for each URL entry. The remaining tags are children of this tag.

<loc> (required): URL of the page. This URL must begin with the protocol (such as http) and end with a trailing slash, if your web server requires it. This value must be less than 2,048 characters.

<lastmod> (optional): The date of last modification of the file. This date should be in W3C Datetime format. This format allows you to omit the time portion, if desired, and use YYYY-MM-DD. Note that this tag is separate from the If-Modified-Since (304) header the server can return, and search engines may use the information from both sources differently.

<changefreq> (optional): How frequently the page is likely to change. This value provides general information to search engines and may not correlate exactly to how often they crawl the page. Valid values are: always, hourly, daily, weekly, monthly, yearly, and never. The value “always” should be used to describe documents that change each time they are accessed. The value “never” should be used to describe archived URLs. Please note that the value of this tag is considered a hint and not a command. Even though search engine crawlers may consider this information when making decisions, they may crawl pages marked “hourly” less frequently than that, and they may crawl pages marked “yearly” more frequently than that. Crawlers may periodically crawl pages marked “never” so that they can handle unexpected changes to those pages.

<priority> (optional): The priority of this URL relative to other URLs on your site. Valid values range from 0.0 to 1.0. This value does not affect how your pages are compared to pages on other sites; it only lets the search engines know which pages you deem most important for the crawlers. The default priority of a page is 0.5. Please note that the priority you assign to a page is not likely to influence the position of your URLs in a search engine’s result pages. Search engines may use this information when selecting between URLs on the same site, so you can use this tag to increase the likelihood that your most important pages are present in a search index. Also, please note that assigning a high priority to all of the URLs on your site is not likely to help you. Since the priority is relative, it is only used to select between URLs on your site.
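To make the <lastmod> format concrete, here is a small standalone sketch (not part of the article's project) that formats a UTC timestamp in W3C Datetime format, both with and without the time portion. Note the 24-hour "HH" specifier; lowercase "hh" would give a 12-hour clock instead.

```csharp
using System;
using System.Globalization;

class W3CDatetimeDemo
{
    static void Main()
    {
        DateTime lastMod = new DateTime(2011, 3, 14, 9, 30, 0, DateTimeKind.Utc);

        // Full W3C Datetime with an explicit UTC offset; "HH" is the 24-hour clock
        string full = lastMod.ToString("yyyy-MM-ddTHH:mm:ss+00:00", CultureInfo.InvariantCulture);

        // The time portion may be omitted, leaving just YYYY-MM-DD
        string dateOnly = lastMod.ToString("yyyy-MM-dd", CultureInfo.InvariantCulture);

        Console.WriteLine(full);     // 2011-03-14T09:30:00+00:00
        Console.WriteLine(dateOnly); // 2011-03-14
    }
}
```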
A dynamic sitemap pulls its data from the live database and generates the standard sitemap structure. I have used LINQ with a DBML DataContext to fetch the records. This project requires an empty “sitemap.xml” file to exist, so the records can be filled in at a later stage.
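For readers who have not downloaded the solution, the fragment below sketches roughly the shape of the LINQ-to-SQL classes the code assumes. The names DCSitemap, Posts, url, and created come from the article's project; everything else here (the connection string, column types, primary key) is illustrative, since in the real project these classes are generated by the DBML designer:

```csharp
using System.Data.Linq;
using System.Data.Linq.Mapping;

// Illustrative stand-in for the designer-generated DataContext
public class DCSitemap : DataContext
{
    // Hypothetical connection string; the real one lives in the DBML/web.config
    public DCSitemap() : base(@"Data Source=.;Initial Catalog=MyBlog;Integrated Security=True") { }

    public Table<Post> Posts { get { return GetTable<Post>(); } }
}

[Table(Name = "Posts")]
public class Post
{
    [Column(IsPrimaryKey = true)] public int id;
    [Column] public string url;      // absolute URL of the post
    [Column] public string created;  // creation date, converted with Convert.ToDateTime
}
```

This is a mapping sketch only; it compiles but needs a real database to query.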
Generate an XML node for each unique URL of your website. I have created a function, “CreateSiteMapNode”, which accepts four parameters (loc, lastmod, changefreq, and priority) and returns an XmlNode:
XmlNode CreateSiteMapNode(string strLoc, DateTime dtLastMod, string strChangeFreq, string strPriority) {
    XmlNode nodeUrl = xd.CreateElement("url");
    XmlNode nodeLoc = xd.CreateElement("loc");
    nodeLoc.InnerText = strLoc;
    nodeUrl.AppendChild(nodeLoc);
    XmlNode nodeLastMod = xd.CreateElement("lastmod");
    // use "HH" (24-hour clock) for W3C Datetime; "hh" would give a 12-hour clock
    nodeLastMod.InnerText = dtLastMod.ToString("yyyy-MM-ddTHH:mm:ss+00:00");
    nodeUrl.AppendChild(nodeLastMod);
    XmlNode nodeChangeFreq = xd.CreateElement("changefreq");
    nodeChangeFreq.InnerText = strChangeFreq;
    nodeUrl.AppendChild(nodeChangeFreq);
    XmlNode nodePriority = xd.CreateElement("priority");
    nodePriority.InnerText = strPriority;
    nodeUrl.AppendChild(nodePriority);
    return nodeUrl;
}
Now add each created XmlNode to the XmlDocument and save it as “sitemap.xml” at the root of your website.
DCSitemap DCS = new DCSitemap();
xd = new XmlDocument();
XmlNode rootNode = xd.CreateElement("urlset");
XmlAttribute attrXmlNS = xd.CreateAttribute("xmlns");
attrXmlNS.InnerText = "http://www.sitemaps.org/schemas/sitemap/0.9";
rootNode.Attributes.Append(attrXmlNS);
var items = from i in DCS.Posts select i;
foreach (var i in items)
    rootNode.AppendChild(CreateSiteMapNode(i.url, Convert.ToDateTime(i.created), "hourly", "1.00"));
// append all nodes to the XML document and save it to sitemap.xml
xd.AppendChild(rootNode);
xd.InsertBefore(xd.CreateXmlDeclaration("1.0", "UTF-8", null), rootNode);
xd.Save(Server.MapPath("~/sitemap.xml"));
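After saving, it is worth sanity-checking that the generated file is well-formed and uses the right namespace. The standalone sketch below (not part of the article's download) runs the same check against an in-memory copy; against the real file you would call xd.Load(Server.MapPath("~/sitemap.xml")) instead:

```csharp
using System;
using System.Xml;

class SitemapCheck
{
    static void Main()
    {
        var xd = new XmlDocument();
        // In-memory stand-in for the generated sitemap.xml; LoadXml throws if the XML is malformed
        xd.LoadXml("<?xml version=\"1.0\" encoding=\"UTF-8\"?>" +
                   "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">" +
                   "<url><loc>http://www.example.com/</loc></url></urlset>");

        Console.WriteLine(xd.DocumentElement.Name);              // urlset
        Console.WriteLine(xd.DocumentElement.NamespaceURI);      // http://www.sitemaps.org/schemas/sitemap/0.9
        Console.WriteLine(xd.GetElementsByTagName("url").Count); // 1
    }
}
```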
// Ping the search engines about the updated sitemap on your website
try
{
    string sitemapUrl = HttpUtility.UrlEncode("http://www.shapemyyatra.com/sitemap.xml");
    // resubmit to Google
    System.Net.WebRequest.Create("http://www.google.com/webmasters/tools/ping?sitemap=" + sitemapUrl).GetResponse().Close();
    // resubmit to Ask
    System.Net.WebRequest.Create("http://submissions.ask.com/ping?sitemap=" + sitemapUrl).GetResponse().Close();
    // resubmit to Yahoo
    System.Net.WebRequest.Create("http://search.yahooapis.com/SiteExplorerService/V1/updateNotification?appid=YahooDemo&url=" + sitemapUrl).GetResponse().Close();
    // resubmit to Bing
    System.Net.WebRequest.Create("http://www.bing.com/webmaster/ping.aspx?siteMap=" + sitemapUrl).GetResponse().Close();
}
catch (System.Exception ex)
{
    // a failed ping should not break the page; log ex as needed
}
I have compiled the entire code into an ASP.NET C# 4.0 LINQ solution, which is available for download.