Getting subdomains unindexed
-
If I turn an application off so that it displays a 503 error, will that get my site unindexed from search engines?
-
Subdomains can be verified as their own sites in GWT. Verify the subdomain in GWT, put a robots.txt on that subdomain excluding the entire subdomain, then request removal of the entire subdomain in GWT. I've had to remove staging and dev sites a couple of times myself.
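For reference, the robots.txt that excludes an entire subdomain is just two lines, served from that subdomain's root (the hostname here is illustrative):

```text
# Served at https://dev.example.com/robots.txt
User-agent: *
Disallow: /
```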
A couple of things I've found useful in this situation are to make the robots.txt files for both the dev and live sites read-only, so you don't accidentally overwrite one with the other when pushing a site live, and to sign up for a free tool like Pole Position's Code Monitor, which checks the code of a page (including your robots.txt URL) once a day and emails you if there are any changes, so you can fix the file and then go hunt down whoever changed it.
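If you'd rather not rely on a third-party monitor, a small script run from a daily cron job can do the same check: hash the robots.txt body and compare it to the hash saved on the previous run. This is a minimal sketch; the URL and state-file path are placeholders you'd replace with your own.

```python
import hashlib
import urllib.request

# Placeholders -- point these at your own dev subdomain and a writable path.
ROBOTS_URL = "https://dev.example.com/robots.txt"
STATE_FILE = "/var/tmp/robots.sha256"

def fetch_robots(url):
    """Download the current robots.txt body."""
    return urllib.request.urlopen(url).read().decode("utf-8")

def content_changed(new_content, stored_hash):
    """Compare the current robots.txt body against the hash saved on the
    previous run. Returns (changed, new_hash); persist new_hash for next time.
    A stored_hash of None means this is the first run, so nothing has changed."""
    new_hash = hashlib.sha256(new_content.encode("utf-8")).hexdigest()
    return stored_hash is not None and new_hash != stored_hash, new_hash
```

Wire it up by reading the old hash from `STATE_FILE`, calling `content_changed(fetch_robots(ROBOTS_URL), old_hash)`, writing the new hash back, and emailing yourself whenever `changed` is true.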
-
GWT was the first place I checked; unfortunately, you can only remove directories or pages. I need entire subdomain sites to be removed (in fact, they shouldn't have been indexed in the first place).
We use subdomains as our development testing environment when creating client sites, and once a site is approved we push it live, replacing the old site. Somehow these testing sites are getting indexed, which creates a duplicate-content risk across different domains. So I am trying to find a solution to get the subdomains (hundreds of them) unindexed.
I understand a 301 redirect is best, but that isn't really applicable since these test sites still need to be reachable by clients.
-
With a robots.txt blocking it, you can then go into Google Webmaster Tools and request removal of that particular page or folder from Google's index.
-
A noindex tag on it works, and putting up a robots.txt that disallows everyone should work as well.
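The noindex tag mentioned here goes in the `<head>` of each page you want dropped; a typical form looks like this (for non-HTML files such as PDFs, the `X-Robots-Tag: noindex` HTTP header is the equivalent):

```html
<meta name="robots" content="noindex">
```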
-
Thanks for the quick reply; I will have to try that. Essentially I am trying to get the site unindexed, but I wasn't sure if a 503 would do the trick.
-
Eventually, but that's the code Google recommends returning when your site is having downtime, so I would expect them to be more lenient about not removing things right away. I wouldn't expect it to be as effective as returning a 404 or a 410.
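For anyone who does want to serve a 503 during genuine downtime, a Retry-After header tells crawlers when to come back; a minimal maintenance-mode sketch, assuming an nginx server (directives illustrative of the pattern):

```nginx
location / {
    # "always" ensures the header is sent on error responses too.
    add_header Retry-After 3600 always;  # ask crawlers to retry in an hour
    return 503;
}
```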
If you're really keen on getting content removed immediately, the best way to get it de-indexed is to return a page with a meta noindex tag on it.