A sitemap... What's the purpose?
-
Hello everybody,
my question is really simple: what's the purpose of a sitemap?
It's to help the robots crawl your website, but if your website has a good architecture, the robots will be able to crawl your site easily anyway!
Am I wrong?
Thank you for your answers,
Jonathan
-
I highly recommend checking out the Webinar Friday Rand did on this very subject: Getting Value from XML Sitemaps, HTML Sitemaps & Feeds.
-
If you have a static site with twenty pages that doesn't get new pages added very often, then yes, a sitemap probably isn't a whole lot of use if your website has good architecture.
However, if your site has 30,000 pages and gets new content added regularly, then an XML sitemap is useful to make sure that the engines know about all of your pages.
Using multiple sitemaps can be useful to help you diagnose what type of content Google is crawling best. A hypothetical example: you have a large site where you a) sell baking supplies, b) have recipes, and c) have user profiles that you want indexed. You could submit a sitemap for each area, plus a master sitemap index that lists each of the sub-sitemaps.
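As a sketch, the master-plus-sub-sitemaps setup described above would look something like this sitemap index file (the domain and file names are hypothetical; the namespace is the standard sitemaps.org one):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one sub-sitemap per content area, so indexation can be tracked separately -->
  <sitemap><loc>https://example.com/sitemap-baking-supplies.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-recipes.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-users.xml</loc></sitemap>
</sitemapindex>
```

You submit just the index file; the engines then fetch each sub-sitemap and report on them individually.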
In Google Webmaster Tools, you get a report that shows how many pages you submitted for each sitemap and how many of those pages are indexed. Using the above setup, you might find something like:
baking supplies has 50 URLs indexed out of 2,000 submitted
recipes has 10,000 URLs indexed out of 11,000 submitted
users has 500 URLs indexed out of 1,000 submitted
At a glance, you can tell that something is up with the products you're trying to sell and that Google isn't indexing that section very well, so you know to focus on that section. Maybe there's a bug in the code that put a noindex on most of those pages by accident.
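That diagnostic check is easy to script. Here's a minimal sketch using the hypothetical numbers above (the section names and the 50% threshold are illustrative, not from any real report):

```python
# Sketch: flag sitemap sections whose indexed/submitted ratio is unusually
# low, using the hypothetical numbers from the example above.
def flag_low_indexation(report, threshold=0.5):
    """Return section names whose indexed fraction is below threshold."""
    return [name for name, (indexed, submitted) in report.items()
            if submitted and indexed / submitted < threshold]

report = {
    "baking-supplies": (50, 2000),
    "recipes": (10000, 11000),
    "users": (500, 1000),
}
print(flag_low_indexation(report))  # → ['baking-supplies']
```

Only the baking-supplies section falls below 50% indexed, which matches the at-a-glance reading above.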
Does that help?
-
A sitemap can help not only Google but also visitors find their way through your site. It's a great way to show the hierarchy and flow of your website. As mentioned, there are a few tools on the web that can make this process pretty painless. At the end of the day, it can only help.
Hope that helps!
-
I agree about the benefits of having a sitemap on any website. Search for Google Webmaster Help on YouTube; you'll find a lot of supporting tutorials.
-
Hey Jonathan
An HTML sitemap can be useful for getting your site indexed, and the XML one can also help with indexation, but there are no guarantees that pages in the XML sitemap will be indexed. I read an article on here showing the indexation benefits of a sitemap, and Google has stated that they like you to have an HTML one for users as well as for SEO. So it's one of those 1% things: it may help a little bit, and it can't hurt, but you still have to do everything else right.
Cheers
Marcus
Related Questions
-
Desktop & Mobile XML Sitemap Submitted But Only Desktop Sitemap Indexed On Google Search Console
Hi! The problem: we have submitted a sitemap index to GSC. Within that index there are 4 XML sitemaps, including one for the desktop site and one for the mobile site. The desktop sitemap has 3,300 URLs, of which Google has indexed (according to GSC) approximately 3,000. The mobile sitemap has 1,000 URLs, of which Google has indexed 74. The pages are crawlable and the site structure is logical, and performing a landing-page URL search on Google Analytics (showing only the google/organic source/medium) I can see that hundreds of those mobile URLs are being landed on. A search on mobile for a long-tail keyword from a (randomly selected) page shows a result in the SERPs for the mobile page that, judging by GSC, has not been indexed. Could this be because we have recently added rel=alternate tags on our desktop pages (and of course corresponding canonical ones on mobile)? Would Google then 'not index' rel=alternate page versions? Thanks for any input on this one. PmHmG
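For reference, the separate-mobile-URL annotations described above look like this (the URLs are hypothetical). Because the mobile pages declare the desktop versions as canonical, Google generally consolidates indexing onto the desktop URLs, which by itself can produce a low "indexed" count for the mobile sitemap without anything being broken:

```html
<!-- In the <head> of the desktop page (www.example.com/page): -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- In the <head> of the corresponding mobile page (m.example.com/page): -->
<link rel="canonical" href="https://www.example.com/page">
```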
Technical SEO | | AlisonMills0 -
Sitemap Contains Blocked Resources
Hey Mozzers, I have several pages on my website that are for user search purposes only. They sort some products by range and answer some direct search queries users type into the site. They are basically just product collections that are elsewhere grouped in different ways. As such, I didn't want the SERPs getting their hands on them, so I blocked them in robots.txt so I could add them worry-free. However, they automatically get pulled into the sitemap by Magento. This has made Webmaster Tools give me a warning that 21 URLs in the sitemaps are blocked by robots. Is this terrible SEO-wise? Should I have opted to NOINDEX these URLs instead? I was concerned about thin content, so I really didn't want Google crawling them.
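As a side note, Python's standard library can cross-check a sitemap URL list against robots.txt rules before the sitemap is submitted, so blocked pages never end up in it. This is a generic sketch with made-up paths, not Magento-specific code:

```python
# Sketch: filter a sitemap URL list through robots.txt rules so that
# disallowed pages are dropped before submission. Rules and URLs are
# illustrative only.
from urllib.robotparser import RobotFileParser

robots_rules = [
    "User-agent: *",
    "Disallow: /collections/",
]
parser = RobotFileParser()
parser.parse(robots_rules)

sitemap_urls = [
    "https://example.com/products/flour",
    "https://example.com/collections/by-range",
]
allowed = [u for u in sitemap_urls if parser.can_fetch("*", u)]
print(allowed)  # only the /products/ URL survives the filter
```

The same check works in reverse as an audit: any sitemap URL that fails `can_fetch` is one that will trigger the "blocked by robots" warning.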
Technical SEO | | ATP0 -
Which Sitemap to keep - Http or https (or both)
Hi, I just finished upgrading my site to the SSL version (like so many other webmasters, now that it may be a ranking factor). Fixed all links, CDN links are now secure, etc., and 301-redirected all pages from http to https. Changed the property in Google Analytics from http to https and added the https version in Webmaster Tools. So far, so good. Now the question is: should I add the https version of the sitemap for the new HTTPS site in Webmaster Tools, or retain the existing http one? Ideally, switching over completely to the https version by adding a new sitemap would make more sense, as the http version of the sitemap would anyway now be redirected to HTTPS. But the last thing I want is to get penalized for duplicate content. Could you please advise, as I am still a rookie in this department? And if I should add the https sitemap version to the new site, should I delete the old http one, or is there no harm in retaining it?
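If you end up regenerating the sitemap rather than editing it by hand, rewriting the `<loc>` entries is mechanical. A minimal sketch with hypothetical URLs (the namespace is the standard sitemaps.org one):

```python
# Sketch: rewrite every <loc> in a sitemap from http:// to https://.
# The URLs are hypothetical; a real run would read/write sitemap.xml files.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep output free of ns0: prefixes

xml_in = (
    f'<urlset xmlns="{NS}">'
    "<url><loc>http://example.com/</loc></url>"
    "<url><loc>http://example.com/about</loc></url>"
    "</urlset>"
)
root = ET.fromstring(xml_in)
for loc in root.iter(f"{{{NS}}}loc"):
    loc.text = loc.text.replace("http://", "https://", 1)

print(ET.tostring(root, encoding="unicode"))
```

With all pages 301-redirecting to https, submitting only the https sitemap (and letting the old http one lapse) avoids the duplicate-signals worry.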
Technical SEO | | ashishb010 -
Creating in-text links with 'target=_blank' - helping/hurting SEO!?!
Good morning Mozzers, I have a question regarding a new linking strategy I'm trying to implement at my organization. We publish 'digital news magazines' that oftentimes have in-text links pointing to external sites. More recently, the editorial department and I (SEO) conferred on some ways to reduce our bounce rate and increase time on page. One of the suggestions I offered was to add the target=_blank attribute to all the links, so that site visitors don't have to leave the site in order to view the link. It has, however, come to my attention that this can have some very negative effects on my SEO program, most notably (fake or inaccurate) time(s) on page. Is this an advisable way to create in-text links? Are there any other negative effects I can expect from implementing such a strategy?
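If you do go with new-tab links, it's generally recommended to pair target="_blank" with rel="noopener" so the opened page cannot reach back into your page via window.opener. A sketch with a hypothetical URL:

```html
<!-- Illustrative external link: opens in a new tab without giving the
     target page a window.opener reference back to your site. -->
<a href="https://example.com/source-article"
   target="_blank" rel="noopener noreferrer">
  Read the full study
</a>
```

Note that rel="noreferrer" additionally suppresses the Referer header, so the linked site will see the visit as direct traffic.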
Technical SEO | | NiallSmith0 -
No Google cached snapshot image... 'Text-only version' working.
We are having an issue with Google's cached image snapshots... Here is an example: http://webcache.googleusercontent.com/search?q=cache:IyvADsGi10gJ:shop.deliaonline.com/store/home-and-garden/kitchen/morphy-richards-48781-cooking/ean/5011832030948+&cd=308&hl=en&ct=clnk&gl=uk I wondered if anyone knows or can see the cause of this problem? Thanks
Technical SEO | | pekler1 -
How to generate a visual sitemap using sitemap.xml
Are there any tools (online preferably) that will take a sitemap.xml file and generate a visual site map? It seems like an obvious thing to do, but I can't find any simple tools for this.
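As a quick-and-dirty alternative to a dedicated tool, the URL paths from a sitemap can be folded into an indented text tree with a short script. A sketch with hypothetical URLs (parsing the sitemap XML itself is left out for brevity):

```python
# Sketch: turn a flat list of sitemap URLs into an indented text tree,
# the simplest possible "visual" sitemap. URLs are hypothetical.
from urllib.parse import urlparse

def url_tree(urls):
    """Nest URL path segments into a dict-of-dicts tree."""
    tree = {}
    for url in urls:
        parts = [p for p in urlparse(url).path.split("/") if p]
        node = tree
        for part in parts:
            node = node.setdefault(part, {})
    return tree

def render(node, depth=0):
    """Flatten the tree into indented lines, sorted alphabetically."""
    lines = []
    for name, child in sorted(node.items()):
        lines.append("  " * depth + name)
        lines.extend(render(child, depth + 1))
    return lines

urls = [
    "https://example.com/recipes/bread",
    "https://example.com/recipes/cake",
    "https://example.com/supplies/flour",
]
print("\n".join(render(url_tree(urls))))
```

This only reflects URL structure, of course; a purpose-built tool can also show link relationships that paths alone don't capture.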
Technical SEO | | k3nn3dy30 -
Can JavaScript affect Google's index/ranking?
We changed our website template about a month ago, and since then we have experienced a huge drop in rankings, especially for our home page. We kept the same URL structure across the entire website, pretty much the same content, and the same on-page SEO. We knew we would see some rank drop, but not one this huge. We used to rank with the homepage at the top of the second page, and now we have lost about 20-25 positions. What we changed is the homepage structure: it is more user-friendly, with much more organized information, and we also have a slider presenting our main services. 80% of our content on the homepage is included inside the slideshow and 3 tabs, but all these elements are JavaScript. The content is unique and SEO-optimized, but when I disable JavaScript, it becomes completely unavailable. Could this be the reason for the huge rank drop? I used the Webmaster Tools' Fetch as Googlebot tool, and it looks like Google reads perfectly what's inside the JavaScript slideshow, so I did not worry until I found this on SEOMoz: "Try to avoid ... using javascript ... since the search engines will ... not indexed them ..." One more weird thing (not sure if this is important or not): although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage) the picture snippet is from the old website. All main URLs are the same; we removed some old ones that we don't need anymore, so we kept all the inbound links. The 301 redirects are properly set. But still, we have a huge rank drop. Also, the robots.txt file is disallowing some folders like images, modules, templates... (Joomla components). We still have some HTML errors and warnings, but way fewer than we had with the old website. Any advice would be much appreciated, thank you!
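One defensive pattern worth knowing here (a sketch, not a diagnosis of this particular site): keep the slide/tab content in plain HTML and let JavaScript only enhance the presentation, so the text is still present when scripts are disabled or a crawler doesn't execute them:

```html
<!-- Content lives in the markup itself; a script can later turn this
     into a slideshow, but crawlers and no-JS visitors still see the text. -->
<div class="slides">
  <section><h2>Our Services</h2><p>Plain-HTML description here.</p></section>
  <section><h2>About Us</h2><p>More crawlable copy here.</p></section>
</div>
<script>
  // Enhancement only: hide all but the first slide, add prev/next
  // controls, start autoplay, etc. Removing this script should leave
  // the content fully readable.
</script>
```

With this structure, "disable JavaScript" becomes a cheap self-test: if the content disappears, it was being injected by script rather than enhanced by it.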
Technical SEO | | echo10 -
What's the website that analyzes all local business submissions?
I was recently looking at a blog post here, or a webinar, and it showed a website where you could see all of the local sites (Yelp, Google Places) where your business has been submitted. It was an automated tool. Does anyone remember the name of the site?
Technical SEO | | beeneeb0