Preventing CNAME Site Duplications
-
Hello fellow mozzers!
Let me see if I can explain this properly.
First, our server admin is out of contact at the moment, so we are having to take this project on somewhat blind (forgive any ignorance of terms).
We have a client that needs a CNAME record set up, because they need sales.DOMAIN.com to go to a different data provider. Their "store" platform is hosted elsewhere, and that provider requires a CNAME pointing to a custom subdomain they have set up on their end.
My question is: how do we prevent the subdomain from being indexed along with the main domain? If we put a redirect on the subdomain, then the site will not be able to go out, grab the other provider's info, and display it. Currently, if you type in sales.DOMAIN.com it shows the main site's homepage. That cannot be allowed to continue because, as we all know, having more than one domain with the exact same content is very bad for SEO, and I'd rather not rely on Google to figure it out.
Should we just have the CNAME host (where it's pointing) add a robots rule and have it set to not index the subdomain? The store does not need to be indexed, as the items change almost daily.
Lastly, is an A record required for this type of situation in any way?
Forgive my ignorance of subdomains, CNAME records and related terms. Our server admin being unavailable is not helping this project move along. Any advice on the best way to handle this would be very helpful!
-
It is pointing to the other server now. We have it blocked from indexing on that end, just wanted to make sure that was enough.
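For what it's worth, one quick way to double-check the block from the outside, assuming curl is available (and adjusting http/https to however the store is actually served), is to fetch the subdomain's robots.txt and its response headers:
curl http://sales.DOMAIN.com/robots.txt
curl -I http://sales.DOMAIN.com/
The first should show a Disallow rule covering the subdomain; the second will reveal an X-Robots-Tag: noindex header if the provider set one.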
-
No,
it is because you are pointing sales to a different server. It seems to me that you don't have your DNS set up correctly; you don't want sales pointing to your main website.
-
So does this work better because Google will not show an IP address in search results?
-
You need to point sales.domain.com at the server that hosts the store, not at your main site. (Strictly speaking, a CNAME points to another hostname; a record that points straight to an IP is an A record.)
Don't leave it pointing at your own webserver. Do:
sales.domain.com > 123.123.123.123
where 123.123.123.123 is the IP of the hosting webserver.
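As a rough sketch of the zone entries (the provider hostname below is a placeholder; use whatever hostname or IP the store provider actually gave you), it would look something like:
sales    IN    CNAME    stores.provider-example.com.
or, if the provider only supplied an IP address:
sales    IN    A        123.123.123.123
Only one of the two should exist for the name, since a CNAME cannot coexist with other records for the same name.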
-
Hello David,
I think a robots rule (there are many examples out there) should be more than enough in your case! Take a look at this helpful article: http://moz.com/community/q/block-an-entire-subdomain-with-robots-txt
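As a minimal sketch, assuming the store provider can serve a robots.txt at the root of the subdomain, sales.DOMAIN.com/robots.txt would only need:
User-agent: *
Disallow: /
A robots.txt only applies to the host it is served from, so this does not affect crawling of the main domain. If you need pages removed from the index entirely rather than just left uncrawled, a noindex meta tag or an X-Robots-Tag: noindex header on the subdomain's pages is the more thorough option.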
I hope that was helpful! Sorry about my English... I'm Spanish
Luis