Please help :) Trouble getting 3 types of content de-indexed
-
Hi there,
I know that it takes time, and I already submitted a URL removal request 3-4 months ago.
But I would really appreciate some advice on this topic. Thank you in advance to everyone who contributes!
1) De-indexing archives
Google had indexed all my:
/tag/
/authorname/
archives. I set them to noindex a few months ago, but they still appear in search results.
Is there anything I can do to speed up this de-indexing?
2) De-indexing the /plugins/ folder on my WordPress site
Google has also indexed my entire /plugins/ folder, so I added a Disallow: /plugins/ rule to my robots.txt 3-4 months ago, but /plugins/ URLs still appear in search results.
What can I do to get the /plugins/ folder de-indexed?
Is my Disallow: /plugins/ in robots.txt making it worse, because Google has already indexed the folder and now can't access it? How do you solve this?
3) De-indexing a subdomain
I had created a subdomain containing adult content and completely deleted it from my cPanel 3 months ago, but it still appears in search engines.
Anything else I can do to get it de-indexed?
Thank you in advance for your help!
-
Hi Fabio
If the content is gone, do you get a 404 code when you visit your old URLs? You can plug the old URLs into urivalet.com to see what code is returned. If you do, then you're all set. If you don't, see if you can just upload a robots.txt file to that subdomain and block all search engines. Here's info on how to do that: http://www.robotstxt.org/robotstxt.html
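A blanket block for that subdomain is just a two-line robots.txt uploaded to its root (adult.mywebsite.com/robots.txt in your case):

```txt
# Block all crawlers from every URL on this subdomain
User-agent: *
Disallow: /
```

Keep in mind this only stops crawling - URLs Google has already indexed can linger until a removal request goes through.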
-Dan
-
Hey Dan, there is no content.
The whole website has been deleted, but it still appears in search results. What should I do?
Should I put back some content and then de-index it? Thanks!
Fabio
-
Hi There
You should ensure the content either:
- has meta noindex tags
- or is blocked with robots.txt
- or returns a 404 or 410 (is missing)
And then use the URL removal tool again and see if that works.
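As a sketch of the third option on an Apache server (using mod_alias; the path here is just a hypothetical example), you can make a deleted page return a 410 explicitly:

```apache
# In .htaccess: return "410 Gone" for a deleted page
# (a 410 signals permanent removal more strongly than a 404)
Redirect gone /old-deleted-page/
```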
-
Hey Dan, thanks a lot for all your help!
There still is a problem though. A while ago I had created an adult subdomain: adult.mywebsite.com. Then I completely deleted everything inside it (though I noticed the subfolder is still in my account).
A few days ago, when I started this thread, I also created a GWMT account for adult.mywebsite.com and submitted a removal request for all those URLs (about 15). Now today when I check:
site:mywebsite.com
or
site:adult.mywebsite.com
the URLs still appear in search results.
When I check
cache:adult.mywebsite.com
it sends me to a Google 404 page:
http://webcache.googleusercontent.com/search?/complete/search?client=hp&hl=en&gs_rn=31&gs_ri=hp&cp=26&gs_id=s xxxxxxxxxxxxxxxxxxxxxxxx
So I don't know what this means...
Does it mean google hasn't deindexed them?
How do I get them deindexed?
Is it possible Google is having trouble de-indexing them because they have no content in them, or something like that? What should I do to get rid of them?
Thanks a lot!
Fabio
-
Hey Fabio
Regarding #2, I'd give it a little more time. 301s take a little longer to drop out, so maybe check back in a week or two.
Technically the URL removal tool will mainly work if the content now 404s, is noindexed, or is blocked in robots.txt, but with a redirect you can do none of those, so you just have to wait for Google to pick up on the redirects.
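For context, one of those /go/ redirects typically boils down to a single Apache rule like this (a hypothetical sketch using the example names from this thread, assuming mod_alias):

```apache
# 301 a local /go/ cloaked link to its affiliate destination.
# Google has to recrawl /go/affiliate-product/ and see the 301 before
# the URL drops out of the index, so robots.txt must not block /go/.
Redirect 301 /go/affiliate-product/ http://www.affiliateproductwebsite.com/
```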
-Dan
-
Hi Dan,
1. Ok! I will.
2. When I click the /go/ link in search results, it redirects me to the affiliate website. I asked for the removal of /go/ a few days ago, but they (about 30 results) still appear in Google when I search with the site:mywebsite.com trick.
What should I do about it? How can I get rid of them? They were created with the SimpleUrl plugin, which I deleted about 3 months ago, though.
3. Got it!
Thanks!
Fabio
-
Hi There
1. For the flash file NoReflectLight.swf - I would do a removal request in WMT and maintain the blocking in robots.txt of /plugins/
2. When you do a URL removal in WMT, the files need to either be blocked in robots.txt, have a noindex on them, or 404. Doesn't that sort of link redirect to your affiliate product? In other words, if I were to visit /go/affiliate-product/, would it redirect to www.affiliateproductwebsite.com? Or does /go/affiliate-product/ load its own page on your site?
3. I would maintain the robots.txt blocking on /plugins/ - if no other files from there are indexed, they will not be in the future.
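One more option, not covered above: a .swf file can't carry a meta noindex tag, but the server can send the same directive as an HTTP header. A sketch for Apache, assuming mod_headers is enabled - note this only works if robots.txt is not blocking the files, since Google has to fetch them to see the header:

```apache
# In .htaccess: send a noindex directive for all .swf files
<FilesMatch "\.swf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```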
-Dan
-
Hey Dan,
thanks for the quick reply. I have gone through site:mywebsite.com and found that the tags and categories disappeared, but there still is some content indexed that shouldn't be, like this:
mywebsite.com/wp-content/plugins/wp-flash-countdown/counter_cs3_v2_NoReflectLight.swf
and this:
mywebsite.com/go/affiliate-product/
and I found this:
Disallow: /wp-content/plugins/
in my robots.txt. Thing is:
- I deleted that wp-flash-countdown plugin at least 9 months ago
- I manually removed all the URLs with /go/ from GWMT, and when I search for a cached version of them they are not there
- If I remove Disallow: /wp-content/plugins/ from my robots.txt, won't that get all my plugins' pages indexed? So how do I make sure they are not indexed?
Thank you so much for your help! So far you have been the most helpful answerer in this forum.
-
Hey There
You want to look for the robots meta tag in the page source.
You can just do a Ctrl-F (to search the text in the source) and type in "noindex" - it should be present on the tag archives.
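For reference, the tag (usually emitted by WordPress or an SEO plugin inside the head of the page) looks something like this - a sketch, not your exact markup:

```html
<head>
  <!-- tells crawlers not to index this page but still follow its links -->
  <meta name="robots" content="noindex,follow">
</head>
```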
-Dan
-
Hey Dan, thanks a lot for your help.
I have tried the cache trick on my home page and the cached version was about 4-5 days old.
I then tried cache:mywebsite.com/tag/ and it gives me a Google 404 not found, which I suppose is a good sign.
But if they have been de-indexed why do they appear in search results then?
I am not sure how to check for a duplicate SEO noindex tag in the source code though. How do I do that exactly? What should I look for after right-clicking -> View Source?
Thanks for your help!
My MOZ account ends in two days so I may not be able to reply back next time.
-
Hi There
Sorry, I should have explained better.
If you type cache: in front of any web URL, for example cache:apple.com, you get Google's cached snapshot of the page.
See the "cache" date on it? This is not the same as the crawl date, but it can give you a rough indication of how often Google might be looking at your pages.
So try that on some of your tag archives; if the cache date is, say, 4+ weeks ago, maybe Google isn't looking at the site very often.
But it's odd they haven't been removed yet, especially with the URL removal tool - that tool usually only takes a day. Noindex tags usually only take a week or two.
Have you examined the source code to make sure it does in fact say "noindex" in the robots tag - and that there is not a conflicting duplicate robots tag? Sometimes WordPress themes and plugins both try adding SEO tags, and you can end up with duplicates.
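As an illustration of the kind of conflict to look for (hypothetical markup - say, a theme and an SEO plugin each emitting its own robots tag):

```html
<head>
  <!-- emitted by the theme -->
  <meta name="robots" content="index,follow">
  <!-- emitted by an SEO plugin - contradicts the tag above -->
  <meta name="robots" content="noindex,follow">
</head>
```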
-Dan
-
Hey Dan, thanks!
Well, so Google had indexed all my tags, categories and stuff. The only things I had blocked in my robots.txt were
/go/ for affiliate links
and
/plugins/ for plugins
so I did let Google see that the category and archive pages were noindexed.
I also submitted the removal request many months ago, but I haven't quite understood what you said about the cache dates. What should I check?
Thanks for your help!
-
Hi There
For all these cases above, this may be a situation where you've BOTH blocked these in robots.txt and added noindex tags. You cannot block the directories in robots.txt and get them de-indexed, because Google then cannot crawl the URLs to see the noindex tag.
If this is the case, I would remove any disallows for /tag/ etc. from robots.txt, allow Google to crawl the URLs and see the noindex tags, then wait a few weeks and see what happens.
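As a sketch of that fix (the disallow lines here are hypothetical - adjust to your actual file), you'd remove the archive blocks while keeping anything you never want crawled:

```txt
# Before (Google can't crawl the archives, so it never sees the noindex):
#   Disallow: /tag/
#   Disallow: /category/
# After - archives crawlable (their noindex tags do the work):
User-agent: *
Disallow: /go/
Disallow: /wp-content/plugins/
```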
As far as the URL removal not working, make sure you have the correct subdomain registered - www or non-www etc for the URLs you want removed.
If neither one of those is the issue, please write back so I can try to help you more with that. Google should noindex the pages in a week or two under normal situations. The other thing is, check the cache date of the pages. If the cache dates are prior to the date you added the noindex, Google might not have seen the noindex directives yet.
-Dan