Deindexing pagination
-
I have a custom-made blog with boatloads of undesirable URLs in Google's index, like these:
.com/resources?start=150
.com/resources?start=160
.com/resources?start=170

I've identified these URLs as a source of duplicate title tags and had my programmer add a noindex tag automatically to all of these undesirable URLs, like this:
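<!-- presumably the standard robots noindex tag, along these lines -->
<meta name="robots" content="noindex">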
However, doing a site: search in Google shows the URLs are still indexed, even though I put the tag up a few weeks ago.
How do I get Google to remove these URLs from the index? I'm aware that Search Console has an answer here (https://support.google.com/webmasters/topic/4598466?authuser=1&authuser=1&rd=1), but it says that blocking with meta tags should work.
Do I just get Google to crawl the URLs again so it sees the tag and then deindexes them? Or is there another way I'm missing?
-
Adding a meta noindex tag can mean it takes a few weeks for a page to fall out of the index. These pages probably aren't doing you much harm, so if you want to just wait for them to fall out, that's probably fine (although I would update the tag content to "noindex, follow" to help Google crawl through to the other noindexed pages). If you really want them out of the index faster, you could use the "Remove URLs" function under Google Index in Google Search Console, which will temporarily remove them from the index while Google registers the noindex tags. Alternatively, you can use the Fetch + Render tool and then Submit URLs in Google Search Console, which will prompt Google to come back, crawl your pages, and find the noindex tag.
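For reference, the updated tag would look like this (standard robots meta tag syntax):
<meta name="robots" content="noindex, follow">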
-
You could use URL parameter settings in Google Search Console and Bing Webmaster Tools - if all ?start= URLs can be treated the same way by Google.
Related Questions
-
If I deindex a page then will Google stop counting those links pointing to it?
Hey everyone, I am deindexing some posts on my website as I think they are not providing any value to users. My question: if I deindex a post that has some good-quality links pointing to it, will Google stop counting those links for my website?
Intermediate & Advanced SEO | Bunnypundir
-
Massive local + national disconnect in rankings (local deindexed)
I originally asked this question on Webmaster Central and tried RickRoll's solutions, but they don't seem to have solved the issue. Problem below:

I've been noticing for some time that certain pages of our site (https://www.renthop.com/boston-ma/apartments-for-rent) have been deindexed locally (or rank very low), but are indexed nationally (and rank well). In fact, it seems that the actual page isn't ranking (but the blog https://www.renthop.com/blog is). This huge mismatch between national and local rankings seems to happen only for Boston & Chicago. Other parts of the country seem unaffected (the national & local rankings are very similar there).

A bit of background (and my personal theory as to what's happening): we used to have subdomains, boston.renthop.com & chicago.renthop.com, for the site. These subdomains stopped working, though, as we moved the site to the directory format (https://www.renthop.com/boston-ma/apartments-for-rent). The subdomain URLs were inactive/broken for roughly 4 months. After the 4 months, we did a 301 from the subdomains to the main pages (because the subdomains had inbound external links). However, this seems to have caused the directory pages to exhibit the national/local mismatch effect instead of helping.

Is there anything I'm doing wrong? I'm not sure if the mismatch is natural, if the pages are getting algorithmically penalized on a local level (I'm negative-SEOing myself), or if they're stuck in some weird state because of the bad subdomain move.

Some things I've tried:
- I've created Webmaster Console (verified) accounts for both subdomains.
- I've asked Google to crawl those links.
- I've done a 1-to-1 mapping between individual pages on the old site and the new directory format.
- I've tried doing a 301, a 302, and a meta-refresh redirect from the subdomains to the directory pages (the meta-refresh variant is sketched below).
- I've made sure the robots.txt on the subdomains is working properly.
- I've made sure the robots.txt on the directory pages is working properly.

The attached screenshot (renthop%2Bboston.png) shows the mismatch & deindexing in local search results (this is using SERPS, but it can be replicated with any location changer). Note the difference between the ranking (and the page) when the search is done nationally vs. in the actual location (Boston, MA).

I'd really appreciate any help. I've been tearing my hair out trying to figure this out (as well as experimenting).
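For reference, the meta-refresh variant mentioned in the list above would take roughly this form (a client-side redirect placed on the subdomain page; the target URL is the one described, but the exact markup used is an assumption):
<!-- hypothetical sketch of the meta-refresh redirect described above -->
<meta http-equiv="refresh" content="0; url=https://www.renthop.com/boston-ma/apartments-for-rent">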
Intermediate & Advanced SEO | lzhou
-
E-Commerce Mobile Pagination Dilemma
Hi everybody, I'm managing the SEO for an e-commerce site with separate desktop and mobile sites (meaning, not responsive). We're changing the way reviews on mobile product pages are displayed, from 'view all' to pagination (due to server load). Basically, the above-the-fold part of the page will always display the product, and below the fold there will be X reviews on each page. But here is where it gets tricky:

1 - A different number of review pages will exist on mobile vs. desktop (due to a different number of reviews per page on each device), so I'm wondering what the solution is regarding canonicals. Usually every mobile page points to its desktop parallel (see the annotation sketch below), but now we'll have non-matching pages.

2 - Users will be able to change the number of reviews displayed on each page, so the number of paginated pages will change accordingly.

I was thinking about a solution where all the reviews would be in the first page's HTML (and only X of them displayed on screen), and all the other paginated pages would be created dynamically (with # and not indexed, so basically no pagination on mobile). Does anyone think this could be seen as cloaking, or have any other thoughts?

Thanks, Sarah
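For reference, the usual bidirectional annotation between separate desktop and mobile URLs looks like this (the example.com URLs are placeholders):
<!-- on the desktop page -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/product-page">
<!-- on the mobile page -->
<link rel="canonical" href="https://www.example.com/product-page">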
Intermediate & Advanced SEO | Don34
-
Should I use different H1 and H2 tags on paginated pages?
Hi all, I use pagination (rel="prev", rel="next") and a canonical tag on my paginated pages and on my View All page, pointing to my root page (page 1); the setup is sketched below. My question: am I missing a trick by having the same H1 and H2 tag on each of my paginated pages? Should I have different ones on, say, pages 2, 3, 4 and View All, to give the collective pages better SEO? Just wondered what people's thoughts were. Thanks, Pete
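For reference, the setup described would look something like this in the head of page 2 (the example.com URLs are placeholders):
<link rel="prev" href="https://www.example.com/category?page=1">
<link rel="next" href="https://www.example.com/category?page=3">
<link rel="canonical" href="https://www.example.com/category">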
Intermediate & Advanced SEO | PeteC12
-
Pagination causing duplicate content problems
Hi, the pagination on our website www.offonhols.com is causing duplicate content problems. Is the best solution adding rel="prev" / rel="next" to the hrefs? As of now, the pagination links at the bottom of the page are just:
http://offonhols.com/default.aspx?dp=1
http://offonhols.com/default.aspx?dp=2
http://offonhols.com/default.aspx?dp=3
etc.
Intermediate & Advanced SEO | offonhols
-
Potential Pagination Issue / Duplicate Content Issue
Hi all, we upgraded our framework, relaunched our site with new URL structures, etc., and redid our sitemap for Google last week. However, it's now come to light that the rel="next" and rel="prev" tags we had in place on many of our pages are missing. We are putting them back in now, but my worry is: as they were missing when we submitted the sitemap, will I have duplicate content issues, or will it resolve itself as Google re-crawls the site over time? Any advice would be greatly appreciated. Thanks, Pete
Intermediate & Advanced SEO | PeteC12
-
How should I handle pagination on an e-commerce site?
I am looking at one of our category pages, and it has 25 additional pages, for a total of 26. The URL for the first page looks good, but the next one ends with ?SearchText=768&SearchType=Category, and all additional pages have that same URL. My first concern was duplicate content, but after looking, no pages after the 1st are even indexed. What is the best way to handle this?
Intermediate & Advanced SEO | EcommerceSite
-
Dilemma: Should we use pagination or a 'Load More' function?
In the interest of pleasing Google, with their recent updates clamping down on duplicate content and giving higher preference to pages with rich data, we have a tiny dilemma that might help others too. We have a directory-like site, very similar to TripAdvisor or Yelp. Would it be best to:

A) have paginated content almost 40 pages deep of data, OR

B) display 20 results per page with a "Load More" function at the bottom, which would feed more data only once it's clicked? (A sketch of this follows below.)

The problem we are having now is that deep pages are getting indexed, and it's doing us no good; most of the juice and page value is on the 1st page, not the inner pages. Wondering what the schools of thought are on this one. Thanks
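For what it's worth, one common way to implement option B without hiding deep content from crawlers is a "Load More" control that degrades to a real paginated URL (a hypothetical sketch; the URL and id are placeholders):
<!-- JavaScript intercepts the click and fetches more results in place;
     crawlers and no-JS visitors can still follow the href -->
<a href="/listings?page=2" id="load-more">Load More</a>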
Intermediate & Advanced SEO | danialniazi