Include or exclude noindex URLs in sitemap?
-
We just added noindex tags to our pages with thin content.
Should we include or exclude those URLs from our sitemap.xml file? I've read conflicting recommendations.
-
Hi vcj and the rest of you guys
I would be very interested in learning which strategy you actually went ahead with, and the results. I have a similar issue as a result of pruning, and removing noindex pages from the sitemap makes perfect sense to me. We set "noindex, follow" on several thousand pages without product descriptions (thin content), and we have set things up so that when we add new descriptions and updated on-page elements, the noindex is automatically reversed. That sounds perfect; however, hardly any of those pages to date (3,000-4,000) are indexed, so I'm looking for a feasible solution for exactly the same reasons as you.
Our metrics and optimization are comparable to or better than much of the competition's, yet our rankings are mediocre, so I'm looking to improve on this.
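The auto-reversal described above can be sketched as a simple conditional on the page template. This is a minimal illustration, not the poster's actual setup, and the `description` field name is hypothetical:

```python
def robots_directive(page):
    # Pages without a unique product description stay "noindex, follow"
    # so crawlers still follow their links; adding a description flips
    # the directive back to "index, follow" automatically.
    description = (page.get("description") or "").strip()
    return "index, follow" if description else "noindex, follow"

print(robots_directive({"description": "Hand-stitched leather wallet."}))  # index, follow
print(robots_directive({"description": ""}))  # noindex, follow
```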
It would be good to hear your views
Cheers
-
I'm aware of the fact Google will get to them sooner or later.
The recommendation from Gary Illyes (from Google), as mentioned in this post, was the reason for my asking the question. Not trying to outsmart Google, just trying to work within their guidelines in the most efficient way possible.
-
Just to put things into perspective,
if these URLs are all already indexed and you have used "noindex" on those pages, sooner or later Google will re-crawl them and they will be removed. You may want to remove them from the index ASAP for some reason, but including them in your sitemap.xml won't really change anything, because Google will not deindex your noindex pages just because they are in your sitemap.xml.
Google deindexes a page only when it is time to re-crawl it. Google never recommends putting noindex pages in sitemaps, and you won't find that suggestion in their guidelines on blocking search indexing results. Google also states the following:
"Google will completely drop the page from search results, even if other pages link to it. If the content is currently in our index, we will remove it after the next time we crawl it. (To expedite removal, use the Remove URLs tool in Google Webmaster Tools.)" But hey, every SEO has their own take. Some try to outsmart Google, some don't.
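For reference, the page-level directive being discussed is the robots meta tag in the page's `<head>`; with `follow`, crawlers still pass through the page's links even though the page itself is dropped from the index:

```html
<!-- Keeps the page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```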
Good luck
-
That opens up other potential restrictions to getting this done quickly and easily. I wouldn't consider it best practice to create what is essentially a spam page full of internal links, and Googlebot will likely not crawl all 4,000 links if you put them all there. So now you'd be talking about making maybe 20 or so thin, spammy-looking pages of 200+ internal links each to hopefully fix the issue.
The quick, easy-sounding options are often not the best. Considering you're doing all of this to fix issues that arose from an algorithmic penalty, I'd suggest following best practices when making these changes. It might not be easy, but it'll lessen the chance that a quick fix becomes the cause, or part of the cause, of a future penalty.
So if Fetch As won't work for you (considering lack of manpower to manually fetch 4000 pages), the sitemap.xml option might be the better choice for you.
-
Thanks, Mike.
What are your thoughts on creating a page with links to all of the pages we've Noindexed, doing a Fetch As and submitting that URL and its linked pages? Do you think Google would dislike that?
-
You could technically add them to the sitemap.xml in the hope that this gets them noticed faster, but the sitemap is commonly used for the things you want Google to crawl and index. Plus, placing them in the sitemap does not guarantee Google will get around to crawling your change or those specific pages. Technically speaking, doing nothing and just waiting is equally valid: Google will recrawl your site at some point, and sitemap.xml only helps if Google is crawling you to see it. Fetch As makes Google see your page as it is now, which is like forcing part of a crawl. So technically Fetch As will be the more reliable, quicker choice, though it will be more labor-intensive. If you don't have the man-hours for a project like that at the moment, then waiting or using the sitemap could work for you. Google even suggests using Fetch As for URLs you want them to see that you have blocked with meta tags: https://support.google.com/webmasters/answer/93710?hl=en&ref_topic=4598466
-
There are too many pages to do that (unless we created a page with links to all of the Noindexed pages, then asked Google to crawl that and all linked pages, though that seems like it might be a bad approach). It's an ecommerce website and we Noindexed nearly 4,000 pages that had thin or duplicate content (manufacturer descriptions, no description on brand page, etc) and had no organic traffic in the past 90 days.
This site was hit by Panda in September 2014 and isn't ranking for things it should be, even though its pages have better backlink profiles, higher DA/PA, better content, etc. than our competitors'. Our thought is we're not ranking because of a penalty against thin/duplicate content. So we decided to Noindex these pages, improve the content on products that are selling and getting traffic, then work on improving the pages we've Noindexed before switching them back to Index.
Basically following recommendations from this article: https://mza.bundledseo.com/blog/pruning-your-ecommerce-site
-
If the pages are in the index and you've recently added a NoIndex tag with the express purpose of getting them removed from the index, you may be better served doing crawl requests in Search Console of the pages in question.
-
Thanks for your response!
I did some more digging. This seems to contradict your suggestion:
https://twitter.com/methode/status/653980524264878080
If the goal is to have these pages removed from the index, and having them in the sitemap means they'll be picked up sooner by Google's crawler, then it seems to make sense that they should be included until they're removed from the index.
Am I misinterpreting this?
-
Hi
The reason you submit a sitemap to a search engine is to ease and aid the crawling process for the pages you want indexed. It speeds up crawling and lets the search engine discover pages that have no internal links pointing to them, etc.
A "noindex" tag does the opposite.
So no, you should not include noindex pages inside your sitemap files.
In general, you should also avoid including pages that don't return a 200 status. Good luck
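Put another way, a sitemap generator should simply skip noindexed and non-200 pages. A minimal sketch, assuming a hypothetical list of page records with `loc`, `status`, and `noindex` fields:

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """Build a sitemap.xml string containing only indexable, 200-status URLs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in pages:
        # Skip anything noindexed or not returning 200: it doesn't belong in the sitemap
        if page["noindex"] or page["status"] != 200:
            continue
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page["loc"]
    return ET.tostring(urlset, encoding="unicode")

pages = [
    {"loc": "https://example.com/", "status": 200, "noindex": False},
    {"loc": "https://example.com/thin-page", "status": 200, "noindex": True},
    {"loc": "https://example.com/gone", "status": 404, "noindex": False},
]
print(build_sitemap(pages))  # only https://example.com/ appears
```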