Is it best to create multiple XML sitemaps for different sections of a site?
-
I have a client with a very large site that includes a blog, videos, a photo gallery, etc. Is it best to create a separate XML sitemap for each of these sections? It seems to me that would be the best way to keep everything organized, or at least to separate the blog out from the main site. Does anybody have any tips or recommendations? I'm not finding much good information about this.
-
This is definitely a good idea.
I've just done exactly this for a large ecommerce site, as well as implementing a new URL rewrite, and by adding this granularity to the sitemaps you can really see which pages are being indexed and control it better.
There was a blog post here on SEOmoz which I used for guidance, and it demonstrated how the improved sitemaps actually led to increased site traffic. (http://www.seomoz.org/blog/multiple-xml-sitemaps-increased-indexation-and-traffic)
If you're a WordPress user, check out the WordPress SEO plugin by Yoast, as its sitemaps do exactly this and are set out very nicely.
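For anyone implementing this by hand rather than via a plugin, the standard mechanism is a sitemap index file that points at the per-section sitemaps. A minimal sketch in Python (the domain and file names are placeholders, not from any real site):

```python
# Minimal sketch of a sitemap index pointing at per-section sitemaps.
# The domain and file names are placeholders, not from any real site.
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(base_url, section_files):
    """Return sitemap-index XML listing one <sitemap> per section file."""
    ET.register_namespace("", SITEMAP_NS)  # serialise without a prefix
    root = ET.Element("{%s}sitemapindex" % SITEMAP_NS)
    for filename in section_files:
        entry = ET.SubElement(root, "{%s}sitemap" % SITEMAP_NS)
        loc = ET.SubElement(entry, "{%s}loc" % SITEMAP_NS)
        loc.text = "%s/%s" % (base_url.rstrip("/"), filename)
    return ET.tostring(root, encoding="unicode")

index_xml = build_sitemap_index(
    "https://www.example.com",
    ["sitemap-blog.xml", "sitemap-videos.xml", "sitemap-photos.xml"],
)
```

Each section file is then an ordinary urlset; you submit the index, and Search Console reports indexation per child sitemap, which is where the extra visibility comes from.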
Related Questions
-
General questions about implementing hreflang using XML sitemap
I created another thread regarding hreflang sitemaps. However, this one is more general and doesn't cover multiple sitemaps for different localizations, so I think it's reasonable to create a new thread. We are trying to implement hreflang using an XML sitemap. We have localized content for a few countries, but only 1/3 of the content is 'duplicate' localized content. How should this be presented in the sitemap? Can we have some URLs with hreflang tags and some without? Also, where should this be located? In the usual sitemap file at site.com/sitemap.xml, or should we create a different sitemap, site.com/hreflang.xml, where we just paste all the hreflang info? And if it should be in /hreflang.xml, can we have the same URL twice (in both the current sitemap and the hreflang sitemap)?
Technical SEO | Telsenome
-
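On the mixing question asked above: in Google's documented hreflang-in-sitemap format, alternates go in xhtml:link elements inside each url entry, and entries without localized versions simply omit them, so annotated and unannotated URLs can share one file. A rough sketch of building such entries (the URLs are illustrative):

```python
# Rough sketch of Google's hreflang-in-sitemap format: alternates go in
# xhtml:link elements inside each <url> entry, and entries without
# localized versions simply omit them. URLs here are illustrative.
def url_entry(loc, alternates=None):
    """alternates: list of (hreflang, href) pairs, or None."""
    links = "".join(
        '<xhtml:link rel="alternate" hreflang="%s" href="%s"/>' % (lang, href)
        for lang, href in (alternates or [])
    )
    return "<url><loc>%s</loc>%s</url>" % (loc, links)

# A localized page lists itself plus its alternates...
localized = url_entry(
    "https://site.com/en/page",
    [("en", "https://site.com/en/page"), ("de", "https://site.com/de/page")],
)
# ...while a page with no localized twin is a plain entry in the same file.
plain = url_entry("https://site.com/about")
```

The enclosing urlset must also declare the xhtml namespace; keeping everything in the regular sitemap(s) is generally simpler than maintaining a parallel hreflang.xml with the same URLs duplicated.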
Does anyone know a sitemap generation tool that updates your sitemap based on changes on your website?
We have a massive site with thousands of pages which we update every day. Is there a sitemap generator that can create Google sitemaps on the fly and change only based on changes to the site? Our site is much too large to create new sitemaps on a regular basis. Is there a tool that will run on the server and do this automatically?
Technical SEO | gwynethmarta
-
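Absent a ready-made tool, the usual server-side approach to the question above is a scheduled (e.g. cron) job that rebuilds the sitemap only when content has changed. A minimal sketch, assuming pages live as files whose modification times track updates; the paths and URL list are placeholder assumptions:

```python
# Minimal sketch of a scheduled sitemap rebuild: regenerate only when
# something under the content root has changed since the last build.
# Paths and the URL list are placeholder assumptions.
import os

def changed_since(content_root, last_build_time):
    """True if any file under content_root was modified after last_build_time."""
    for dirpath, _dirs, files in os.walk(content_root):
        for name in files:
            if os.path.getmtime(os.path.join(dirpath, name)) > last_build_time:
                return True
    return False

def build_sitemap(urls):
    """urls: iterable of (loc, lastmod) pairs -> sitemap XML string."""
    entries = "".join(
        "<url><loc>%s</loc><lastmod>%s</lastmod></url>" % (loc, lastmod)
        for loc, lastmod in urls
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            + entries + "</urlset>")
```

A cron job would call changed_since with the timestamp of the last build and only regenerate (and re-ping the search engines) when it returns True.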
Best practice for XML sitemap depth
We run an eCommerce site for education products with 20 or so subject-based catalogues (Maths, Literacy etc), each catalogue having numerous ranges (Counting, Maths Games etc), then products within those. We carry approximately 15,000 products. My question is around the sitemap we submit - nightly - and its depth. It is currently set to cover off home, catalogues and ranges, plus all static content (about us etc). Should we be submitting sitemaps that include product pages as well? Does it matter, or would it not make much difference in terms of search? Thanks in advance.
Technical SEO | TTS_Group
-
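Worth noting when deciding whether to include products: the sitemap protocol caps a single file at 50,000 URLs (and 50 MB uncompressed), so a 15,000-product catalogue fits comfortably in one dedicated product sitemap, and bigger catalogues are simply split into numbered files listed in an index. A chunking sketch (the URL list is illustrative):

```python
# The sitemap protocol caps a single file at 50,000 URLs (and 50 MB
# uncompressed), so a 15,000-product catalogue fits in one dedicated
# product sitemap; bigger catalogues are split into numbered files.
# The URL list below is illustrative.
def chunk_urls(urls, size=50000):
    """Split a flat URL list into sitemap-sized chunks."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

product_urls = ["https://www.example.com/product/%d" % i for i in range(15000)]
sitemap_files = chunk_urls(product_urls)  # one chunk -> one sitemap file
```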
Has my site been penalised?
Hi, I recently hired a link builder (who works for a digital marketing agency and wanted to earn some extra cash) to do some link building for me on the keyword "Carpet Cleaner Hire". Basically, since May my ranking for this keyword has moved from 29 up to 13, then down to 500+, and now up to 67. It's been all over the place and I am worried that maybe Google thinks it's unnatural, or that there are too many links, etc. I looked at some of the work this chap did and there were a lot of bookmarks, apparently to increase social awareness, and some articles on random forums I'd never heard of, but the articles didn't look too good and in my eyes were a little spammy. I don't have any messages in WMT saying I have been penalised or anything, but I am naturally worried that this chap may have caused damage and that I was manually penalised. I have since asked him to stop immediately. Does this sound normal, for rankings to be all over the place in this manner during link building? How can I tell if I have been manually penalised in any way? What's the general consensus? Should I contact Google or leave it and see what happens? Other keywords seem to be okay, though. Any advice much appreciated. Thanks, Sarah.
Technical SEO | SarahCollins
-
Best META Fields to Include on New Site
I am in the process of transitioning sites to a Drupal CMS and am curious to know what META information to provide on each of the new site pages. Currently, this is the set-up I plan on using: My questions to the community are: whether or not I've added all pertinent information, and whether there's anything I'm overlooking.
Technical SEO | NiallSmith
-
Best way to redirect 3 sites to 1 new one.
Hi All, We currently have 3 old sites that have tons of content. Due to brand/business consolidation we have merged all 3 to produce 1 website. The new site contains all the content from the old 3. So, I know I need to 301 redirect all the old content from the previous sites to the equivalent content on the new site, but am confused about how you do this with 3 domains. One of the domains is being replaced with the new site. So I have: www.domain1.co.uk www.domain2.co.uk www.domain3.co.uk All the content from all the sites has been imported into the new site and any duplicate content issues have been resolved. Can anyone point me in the right direction? Thanks
Technical SEO | EclipseLegal
-
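In practice the 301s for the question above live in each old domain's server config (e.g. a rewrite rule per domain), but the mapping logic is the same for all three: match the old host, keep the path, point at the new domain. A sketch of that mapping, where the new domain name is hypothetical:

```python
# Sketch of the 301 mapping for three consolidated domains: match the
# old host, keep the path, point at the new domain. The old domains
# follow the question; the new domain name is hypothetical.
OLD_DOMAINS = {"www.domain1.co.uk", "www.domain2.co.uk", "www.domain3.co.uk"}
NEW_DOMAIN = "www.newsite.co.uk"  # placeholder for the consolidated site

def redirect_target(host, path):
    """Return (status, url) for a 301, or (None, None) if no redirect applies."""
    if host in OLD_DOMAINS:
        return ("301", "https://%s%s" % (NEW_DOMAIN, path))
    return (None, None)
```

This assumes the old paths were preserved on the new site; where a path changed, the same function would consult a per-URL lookup table instead of reusing the path verbatim.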
What's the best way to solve this site's duplicate content issues?
Hi, The site is www.expressgolf.co.uk and is an e-commerce website with lots of categories and brands. I'm trying to achieve one single unique URL for each category / brand page to avoid duplicate content and to get the correct URLs indexed. Currently it looks like this...

Main URL
http://www.expressgolf.co.uk/shop/clothing/galvin-green

Different versions
http://www.expressgolf.co.uk/shop/clothing/galvin-green/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/1
http://www.expressgolf.co.uk/shop/clothing/galvin-green/2
http://www.expressgolf.co.uk/shop/clothing/galvin-green/3
http://www.expressgolf.co.uk/shop/clothing/galvin-green/4
http://www.expressgolf.co.uk/shop/clothing/galvin-green/all
http://www.expressgolf.co.uk/shop/clothing/galvin-green/1/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/2/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/3/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/4/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/all/

Firstly, what is the best course of action to make all versions point to the main URL and keep them from being indexed: a canonical tag, NOINDEX, or blocking them in robots.txt? Secondly, do I just need to 301 the trailing-slash (/) versions of all URLs to the non-slash versions? I'm sure this question has been answered, but I was having trouble coming to a solution for this one site. Cheers, Paul
Technical SEO | paulmalin
-
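Whichever signal is chosen for the question above (rel="canonical" is the usual answer for paginated variants), a rule is needed that maps every variant to the main URL. A sketch of that normalisation, assuming page numbers and "all" are the only trailing segments, as in the URLs listed:

```python
# Sketch: normalise paginated and trailing-slash category-URL variants
# to the single main URL (e.g. for emitting rel="canonical").
# Assumes page numbers and "all" are the only trailing segments.
import re

def canonical_url(url):
    url = re.sub(r"/(?:\d+|all)/?$", "", url)  # drop /2, /all, /2/, /all/ etc.
    return url.rstrip("/")                     # drop a bare trailing slash
```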
Which is the best WordPress sitemap plugin?
Does anyone have a recommendation for the best XML sitemap plugin for WordPress sites, or do you steer clear of plugins and use a sitemap generator, then upload the result to the root manually?
Technical SEO | simoncmason