Should We Remove Content Through Google Webmaster Tools?
-
We recently collapsed an existing site in order to relaunch it as a much smaller, much higher-quality site. In doing so, we're facing some indexation issues: a large number of our old URLs (301'd where appropriate) still show up in a site:domain search.
Some relevant notes:
- We transitioned the site from Sitecore to WordPress to allow for greater flexibility
- The WordPress CMS went live on 11/22 (same legacy content, but in the new CMS)
- The new content (and all required 301s) went live on 12/2
- The site's total number of URLs is currently 173 (confirmed by Screaming Frog)
- As of posting this question, a site:domain search shows 6,110 results
While it would be a very large manual effort, is there any reason to believe that submitting removal requests through Google Webmaster Tools would be helpful?
We simply want all indexation of old pages and content to disappear - and for Google to treat the site as a new site on the same old domain.
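For reference, here is a minimal sketch of how one might confirm that every legacy URL really does return a 301 before blaming Google's index lag. It assumes the old URLs can be exported to a plain-text file, one per line; the file name is a placeholder for your own export.

```python
# A minimal sketch for spot-checking 301s, assuming a plain-text export of
# legacy URLs (one per line). The file name is a placeholder.
import requests

OLD_URLS_FILE = "old_urls.txt"  # hypothetical export of legacy URLs

def check_redirect(url):
    """Fetch a URL without following redirects and report status + target."""
    # Some servers mishandle HEAD; switch to requests.get(url, stream=True,
    # allow_redirects=False) if you see odd results.
    resp = requests.head(url, allow_redirects=False, timeout=10)
    return resp.status_code, resp.headers.get("Location", "")

if __name__ == "__main__":
    with open(OLD_URLS_FILE) as f:
        urls = [line.strip() for line in f if line.strip()]
    for url in urls:
        try:
            status, target = check_redirect(url)
        except requests.RequestException as exc:
            print(f"{url} -> ERROR: {exc}")
            continue
        if status == 301:
            print(f"{url} -> 301 to {target}")
        else:
            print(f"{url} -> unexpected status {status}")
```

Anything reporting a 200 or 404 here would explain stale entries lingering in the index.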
-
As Donna pointed out, the delay between the timeline you expect and what Google can actually do is often longer than anyone would wish.
-
I agree with Ray-pp. It can take some time - weeks to months - for Google to catch up with changes made to a site. That said, it sounds like something else might be going on that's causing so many extra pages to be indexed. Can you explain the cause of having ~5,000 extra pages indexed? When did they first start to appear? Are you sure you've configured your WordPress implementation to minimize unnecessary duplicates?
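On that last point: out of the box, WordPress can generate a lot of near-duplicate URLs (tag and date archives, feeds, comment-reply parameters). Here is a rough sketch for flagging the usual suspects in a crawl export; the file name is a placeholder and the pattern list is a set of common examples, not an exhaustive audit.

```python
# A rough sketch that flags common WordPress-generated duplicate URL patterns
# in a crawl export (e.g., a one-URL-per-line list from Screaming Frog).
# The file name and pattern list are illustrative assumptions.
import re

CRAWL_EXPORT = "crawl_urls.txt"  # hypothetical crawl export

# Typical WordPress duplicate/thin-content URL patterns.
DUPLICATE_PATTERNS = [
    (re.compile(r"\?replytocom="), "comment-reply parameter"),
    (re.compile(r"/feed/?$"), "RSS feed URL"),
    (re.compile(r"/tag/"), "tag archive"),
    (re.compile(r"/category/.*/page/\d+"), "paginated category archive"),
    (re.compile(r"/\d{4}/\d{2}/?$"), "date archive"),
]

with open(CRAWL_EXPORT) as f:
    for url in (line.strip() for line in f if line.strip()):
        for pattern, label in DUPLICATE_PATTERNS:
            if pattern.search(url):
                print(f"{url}  <-- possible duplicate ({label})")
                break
```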
-
If you have implemented 301 redirects properly, then the old URLs (the ones redirecting to the new site) will naturally drop from the search engines as Google deems appropriate. There are a number of factors that influence when a page gets deindexed, such as the site's crawl rate and how many links point to a given page.
If you really want the pages removed, then as you've suggested, you can request their removal through GWT. However, there is no harm in allowing them to stay indexed and waiting for Google to adjust appropriately.
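On the crawl-rate point above: one way to see whether Google is actually revisiting the redirected URLs (and therefore able to process the 301s) is to grep your server access logs for Googlebot. A minimal sketch, assuming the common/combined log format; the log path is a placeholder, and since the user-agent string can be spoofed, a production check should verify Googlebot via reverse DNS as well.

```python
# A minimal sketch counting Googlebot requests per URL in an access log,
# assuming the common/combined log format (the request line is the first
# quoted field). The log path is a placeholder; user-agent matching alone
# can be spoofed, so verify with reverse DNS for anything load-bearing.
from collections import Counter

ACCESS_LOG = "/var/log/nginx/access.log"  # placeholder path

hits = Counter()
with open(ACCESS_LOG) as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        parts = line.split('"')
        if len(parts) < 2:
            continue
        request = parts[1].split()  # e.g., ['GET', '/old-page/', 'HTTP/1.1']
        if len(request) >= 2:
            hits[request[1]] += 1

for url, count in hits.most_common(20):
    print(f"{count:5d}  {url}")
```

If the old URLs are barely being recrawled, the slow deindexing is simply a crawl-budget issue, and patience (or removal requests for the worst offenders) is the realistic fix.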