Should I submit an additional sitemap to speed up indexing?
-
Hi all,
I wondered if anyone could impart some wisdom on this.
I'm moving a set of pages from one area of the site to another, to bring them higher up the folder structure so they generally make more sense. Some of our URLs are very long, so this should help with some rationalisation there too.
We will have redirects in place, but the pages I'm moving are important and I'd like the new paths to be indexed as soon as possible. In such an instance, can I submit an additional sitemap with just these URLs to get them indexed quicker (or to reaffirm that indexing from the initial parse)? The site is thousands of pages.
Any benefits / disadvantages anyone could think of? Any thoughts very gratefully received.
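For what it's worth, a supplemental sitemap like this is just a standard `urlset` file listing only the moved URLs. A minimal sketch of generating one (the URLs and date below are placeholders, not the actual pages being moved):

```python
# Sketch: build a small supplemental sitemap containing only the moved URLs.
# NEW_URLS and the lastmod date are hypothetical placeholders.
from xml.sax.saxutils import escape

NEW_URLS = [
    "https://www.example.com/pages/page-one/",
    "https://www.example.com/pages/page-two/",
]

def build_sitemap(urls, lastmod="2024-01-01"):
    """Return a sitemap.xml string for the given URLs."""
    entries = "\n".join(
        f"  <url>\n    <loc>{escape(u)}</loc>\n    <lastmod>{lastmod}</lastmod>\n  </url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

if __name__ == "__main__":
    print(build_sitemap(NEW_URLS))
```

The resulting file can sit alongside the main sitemap and be submitted separately in Search Console; having a URL in two sitemaps is harmless.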
-
I personally would not. There is no need to submit an additional sitemap when you can simply ask Google to crawl your site at any time.
Google will put you in a queue and crawl you almost immediately.
I would not recommend abusing this, though. Only ask Google to index your website when you have new content or when Google is taking too long to index current content.
The more fresh content you provide, the more Google will revisit your pages and continue to index you.
Hope this helps!
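For reference, the "ping" mentioned above was historically a plain HTTP GET against Google's sitemap-ping endpoint (note: Google has since deprecated this endpoint in favour of submitting sitemaps through Search Console). A sketch, with a placeholder sitemap URL:

```python
# Sketch: build the (now-deprecated) Google sitemap-ping URL.
# The sitemap URL below is a hypothetical placeholder.
from urllib.parse import urlencode
from urllib.request import urlopen

def ping_url(sitemap_url):
    """Return the Google sitemap-ping URL for the given sitemap."""
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

if __name__ == "__main__":
    url = ping_url("https://www.example.com/sitemap-moved.xml")
    print(url)
    # urlopen(url)  # uncomment to actually send the ping (requires network)
```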
-
Yeah, why not an additional sitemap, but for the old pages. I usually send the bot to the old URLs and let it follow the redirects, often via a sitemap created specifically for that purpose. I don't submit the new URLs - that just shows the bot existing content at different addresses, which isn't what I want; I want the bot to see the redirects ASAP. My relaunches and big changes are always done in one day, so I think that's a good way to handle it.
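Either way, the move itself usually boils down to a list of old-to-new pairs. A sketch that emits Apache `Redirect 301` directives from such a mapping (the paths are hypothetical placeholders for the moved pages):

```python
# Sketch: emit Apache 301 redirect directives from an old -> new URL mapping.
# The paths below are hypothetical placeholders.
REDIRECTS = {
    "/old/deep/folder/page-one/": "/pages/page-one/",
    "/old/deep/folder/page-two/": "/pages/page-two/",
}

def apache_rules(mapping):
    """Return .htaccess 'Redirect 301' lines, one per old/new pair."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

if __name__ == "__main__":
    print(apache_rules(REDIRECTS))
```

Generating the rules from one mapping keeps the redirects, the old-URL sitemap, and any internal-link updates in sync from a single source of truth.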