Blocking subdomains without blocking sites...
-
So let's say I am working for bloggingplatform.com, and people can create free sites through my tools; those sites show up as myblog.bloggingplatform.com. However, that same site can also be accessed at myblog.com.
Is there a way, separate from editing the myblog.com site code or files, for me to tell Google to stop indexing myblog.bloggingplatform.com while still letting it index myblog.com, without inserting any code into the page load?
This is a simplification of a problem I am running across.
Basically, Google is associating subdomains with my domain that it shouldn't even index, and it is adversely affecting my main domain. Other than contacting the offending subdomain holders (which we do), I am looking for a way to stop Google from indexing those subdomains at all (they are used for technical purposes, not for users to find the sites).
Thoughts?
-
Ah, I see now. Try this out: http://moz.com/community/q/block-an-entire-subdomain-with-robots-txt#reply_26992 - basically, when a subdomain is identified, the server pulls a different file into the robots.txt location (one that contains the Disallow: / directive).
Read the remaining comments about getting the subdomain removed via GWT (Google Webmaster Tools).
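The linked thread describes a rewrite-based approach: intercept requests for robots.txt and serve a different file depending on the hostname. A minimal sketch of what that could look like on Apache with mod_rewrite (the file name robots-blocked.txt and the host pattern are illustrative assumptions, not the platform's actual config):

```apache
# Hypothetical sketch: serve a blocking robots file to every
# *.bloggingplatform.com subdomain, while the main domain (and any
# custom domain like myblog.com) keeps the normal robots.txt.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^([^.]+)\.bloggingplatform\.com$ [NC]
RewriteCond %{HTTP_HOST} !^www\.bloggingplatform\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-blocked.txt [L]
```

Here robots-blocked.txt would contain the "User-agent: *" / "Disallow: /" pair, so crawlers hitting a subdomain's robots.txt see a full block while the main domain is unaffected.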
-
You are correct, but that isn't what I was asking.
user1.bloggingplatform.com and myblog.com point to the same web server files. If I put up a robots.txt on user1.b... I would effectively de-index myblog.com.
The problem we have run across is that user205.bloggingplatform.com might be doing something shady, but instead of de-listing the subdomain, Google kills the primary domain from the index as well.
Because user205.bloggingplatform.com should only be used for technical reasons and shouldn't be in Google's index, I am looking for a way to tell Google not to index the subdomain.
I think the better way to solve the problem, though, would be to change the technical subdomain's domain: change it from user205.bloggingplatform.com to user205.bloggingplatformtesting.com.
Then Google can kill that URL all it wants; I don't care.
-
bloggingplatform.com/robots.txt
and
user1.bloggingplatform.com/robots.txt
can and should be different. If you disallow at the subdomain level, only the subdomain will be affected. You can search around for other examples of this, but I'm certain it works (we have a development domain that is indexed, and we create subdomains for all clients that aren't indexed, done via individual robots.txt files).
-
I don't think that works. Since both URLs point to the same server, the robots.txt file for the test URL would completely kill the main URL.
Or am I missing something?
-
Each subdomain should have a robots.txt file that blocks that specific subdomain. For example, user1.bloggingplatform.com/robots.txt should contain:
User-agent: *
Disallow: /
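Because every hostname resolves to the same document root, a single static file can't differ per host; robots.txt has to be chosen per request based on the Host header. A minimal sketch of that decision in Python (the hostnames and the idea of a Python backend are illustrative assumptions, not the platform's actual setup):

```python
# Hypothetical sketch: pick a robots.txt body from the request's Host
# header so subdomains are blocked while the main/custom domains are not.
MAIN_HOSTS = {"bloggingplatform.com", "www.bloggingplatform.com"}

ALLOW_ALL = "User-agent: *\nDisallow:\n"      # permit crawling
BLOCK_ALL = "User-agent: *\nDisallow: /\n"    # block the whole host

def robots_for_host(host: str) -> str:
    host = host.lower().split(":")[0]  # normalize case, strip any port
    # The platform root and custom domains (e.g. myblog.com) stay
    # crawlable; any *.bloggingplatform.com subdomain gets blocked.
    if host in MAIN_HOSTS or not host.endswith(".bloggingplatform.com"):
        return ALLOW_ALL
    return BLOCK_ALL
```

Wiring robots_for_host into the response for /robots.txt gives each hostname its own robots file even though they share one codebase, which is exactly the separation this thread is after.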