Preventing CNAME Site Duplications
-
Hello fellow mozzers!
Let me see if I can explain this properly.
First, our server admin is out of contact at the moment, so we are having to take this project on somewhat blind (forgive any ignorance of terms).
We have a client that needs a CNAME record set up: they need sales.DOMAIN.com to go to a different data provider. They have a "store" platform that is hosted elsewhere, and the provider requires a CNAME pointed at a custom subdomain they set up on their end.
My question is: how do we prevent the CNAME subdomain from being indexed along with the main domain? If we process a redirect for the subdomain, then the site will not be able to go out, grab the other provider's info, and display it. Currently, if you type in sales.DOMAIN.com it shows the main site's homepage. That cannot be allowed to take place; as we all know, having more than one domain with the exact same content is very bad for SEO, and I'd rather not rely on Google to figure it out.
Should we just have the CNAME host (where it's pointing) add a robots rule and have it set to not index the subdomain? The store does not need to be indexed, as the items are changed almost daily.
Lastly, is an A record required for this type of situation in any way?
Forgive my ignorance of subdomains, CNAME records, and related terms. Our server admin being unavailable is not helping this project move along. Any advice on the best way to handle this would be very helpful! -
It is pointing to the other server now. We have it blocked from indexing on that end, just wanted to make sure that was enough.
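For reference, a quick way to double-check both pieces from a terminal (a sketch; sales.domain.com stands in for the real subdomain):
# Confirm the subdomain aliases to the provider's hostname via the CNAME
dig +short sales.domain.com CNAME
# Confirm the store host serves a robots.txt that blocks crawling
curl -s https://sales.domain.com/robots.txt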
-
No,
it is because you are pointing sales to a different server. It seems to me that you don't have your DNS set up correctly: you don't want sales pointing at your main website. -
So does this work better because Google will not show an IP address in search results?
-
You need to point sales.domain.com at the IP of the server that hosts it.
Don't point it at your main website's server.
Do:
sales.domain.com > 123.123.123.123
where 123.123.123.123 is the IP of the hosting web server. (Strictly speaking, a direct name-to-IP mapping like this is an A record; a CNAME points at another hostname, which in turn resolves to the IP.)
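In a BIND-style zone file, the two options might look like this (a sketch; store.provider.com is a hypothetical hostname the store platform would supply, and the IP is a placeholder):
; Option 1: CNAME - alias the subdomain to the provider's hostname
sales.domain.com.   3600  IN  CNAME  store.provider.com.
; Option 2: A record - map the subdomain directly to the provider's IP
; (only workable if the provider guarantees a stable IP)
; sales.domain.com. 3600  IN  A      123.123.123.123
So no, an A record is not required if the provider gives you a hostname to CNAME to; most hosted store platforms prefer the CNAME so they can change IPs without you updating DNS.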
-
Hello David,
I think the robots rule (there are many examples out there) should be more than enough in your case! Take a look at this helpful article: http://moz.com/community/q/block-an-entire-subdomain-with-robots-txt
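For example, a minimal robots.txt served at the root of the subdomain (a sketch; the store host would need to serve it at sales.DOMAIN.com/robots.txt) could be:
# Applies only to the host it is served from, so the main domain is unaffected
User-agent: *
Disallow: /
Keep in mind that Disallow stops crawling but does not guarantee de-indexing; if the store URLs pick up external links, a noindex robots meta tag on the store pages is the stronger option.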
I hope that was helpful! Sorry about my English... I'm Spanish
Luis
Related Questions
-
Want to remove a large number of links from spam sites. SEO company says we will lose a lot of link juice?
Hi, we have a lot of links with a spam score above 30% and 60%. I don't know if someone has spammed our website. However, our SEO company has said we should remove these carefully over a period of 3 months while they add new good links. I don't quite trust this advice. Are they trying to get more business? They have put doubt in our minds. Can anyone please shed any light on this? Thank you
White Hat / Black Hat SEO | YvonneDupree -
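A note on the question above: for links that really are spam, the usual mechanism is Google's disavow file rather than slow manual removal. A sketch of the format (the domains and URL are placeholders):
# Disavow everything from these spam domains (hypothetical names)
domain:spamsite1.example
domain:spamsite2.example
# Or disavow a single linking page
http://spammyblog.example/page-with-link.html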
Would this be duplicate content or bad SEO?
Hi guys, we have a blog for our e-commerce store, with a full-time in-house writer producing content. As part of our process we do content briefs, and as part of each brief we analyze competing pieces of content existing on the web. Most of the time the sources are large publications (e.g., HGTV, elledecor, apartmenttherapy, Housebeautiful, NY Times, etc.). The analysis is basically a summary/breakdown of the article and is sometimes 2-3 paragraphs long for longer pieces of content. The competing content analysis is used to create an outline of our article, and incorporates most of the important details/facts from competing pieces, but not all. Most of our articles run 1500-3000 words. Here are the questions: Would it be considered duplicate content, or bad SEO practice, if we list the sources/links we used at the bottom of our blog post, with the summary from our content brief? Could this be beneficial for SEO? If we do this, should we nofollow the links, or use regular dofollow links? For example: "For your convenience, here are some articles we found helpful, along with brief summaries: ..." I want to use as much of the content that we have spent time on as possible. TIA
White Hat / Black Hat SEO | kekepeche -
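On the question above: whether link equity flows to the cited sources is controlled in the HTML, so switching to nofollow is a one-attribute change. A minimal sketch (the URL and anchor text are placeholders):
<!-- hypothetical nofollowed entry in the "sources" list -->
<a href="https://www.example.com/competing-article" rel="nofollow">Example Publication: the original article</a>
Short summaries written in your own words are not duplicate content; the risk would only arise from copying the source text verbatim.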
Linking Authentic Sites Together - Semi-PBN?
Recently I've had a lot of ideas for sites to build that would all have some sort of relevance to each other, and all would be relevant to my current business. For example, say you have sites for bars/clubs, music festivals, cinemas, etc., one site for each. While these aren't all directly related to each other, they all kind of fall within a category of entertainment and having fun. Now, I'm not thinking about this as if I were to build a Private Blog Network; instead, each site would actually be valuable to visitors, be content rich, and have regular updates and thriving social media, as if each were its own individual business. What would be your opinion on actually linking these together at some point down the line? I must stress that these would not be like typical PBN sites where the themes are the same, content is spun or badly written, and there are no human touches, actual value, or anything but spam; these would be authentic, quality sites that you would reasonably expect to have a thriving community. Personally, after changing my ways from blackhat to wary-of-linkbuilding whitehat when Penguin 1 was released, I'm aware of what a bad linkbuilding strategy can do and would rather steer clear. However, when I compare the plan for these authentic sites in my head to the obvious, low-quality PBNs that I find competitors use to rank well all the time, I'm coming around to the idea that they may not pose a threat with the way I intend to implement them. Can I get some thoughts?
White Hat / Black Hat SEO | Leads.Bz -
Should I submit a sitemap for a site with dynamic pages?
I have a coupon website (http://couponeasy.com). Being a coupon website, my content is always changing (as new coupons are added and expired deals are removed) automatically. I wish to create a sitemap, but I realised that there is not much point in creating a sitemap for all pages, as they will be removed sooner or later and/or are canonical. I have about 8-9 pages which are static, and hence I can include them in a sitemap. Now the question is: if I create the sitemap for these 9 pages and submit it to Google Webmaster Tools, will the Google crawlers stop indexing the other pages? NOTE: I need to create the sitemap to get expanded sitelinks. http://couponeasy.com/
White Hat / Black Hat SEO | shopperlocal_DM -
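On the sitemap question above: a sitemap is a discovery hint, not an exclusion list, so submitting one that covers only the static pages will not stop Google from crawling and indexing the rest. A minimal sketch of such a sitemap (the /about URL is a hypothetical example of a static page):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- list each of the 8-9 static pages here -->
  <url><loc>http://couponeasy.com/</loc></url>
  <url><loc>http://couponeasy.com/about</loc></url>
</urlset>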
Would it be a good idea to duplicate a website?
Hello, here is the situation: let's say we have a website, www.company1.com, which is 1 of 3 main online stores catering to a specific market. In an attempt to capture a larger market share, we are considering opening a second website, say www.company2.com. Both websites have a different URL but offer the same products for sale to the same clientele. With this second website, the theory is that instead of operating 1 of 3 stores, we now operate 2 of 4. We see 2 ways of doing this: we launch www.company2.com as a copy of www.company1.com, or we launch www.company2.com as a completely different website. The problem I see with either approach is duplicate content. I think the duplicate content issue would be even more of a problem with the first approach, where the entire site is mostly a duplicate. With the second approach, I think the duplicate content issue can be worked around by having completely different product pages and an overall different website structure. Do you think either of these approaches could result in penalties by the search engines? Furthermore, we all know that higher rankings/increased traffic can be achieved through high-quality unique content, social media presence, ongoing link-building and so on. Now, assuming we have a fixed amount of manpower for these tasks, do you think we have better odds of increasing our overall traffic by splitting the manpower across 2 websites, or putting it all behind a single one? Thanks for your help!
White Hat / Black Hat SEO | yacpro13 -
Penalty for all new sites on a domain?
Hi all, a friend has an interesting problem. He got a manual link penalty at the end of 2011. It is an old domain with domain pop >5000, but with a lot of bad links (widgets, banners, and other SEO domains, but nothing like ScrapeBox etc.). He lost most of his traffic a few days after the notification in WMT (unnatural links), and again after the first Penguin update in April '12. At the end of 2012, after deleting (or nofollowing) and disavowing a lot of links, Google lifted the manual penalty (WMT notification). But nothing happened after the lifting; the rankings didn't improve (after 4 months already!). Almost all money keywords aren't in the top 100, there are no traffic increases, and he has good content on this domain. We built a handful of new trust links to test some pages, but nothing improved. In February we did a test and built a completely new page on this domain; it's in the menu and got some internal links from content. We did it because some pages which weren't optimized before the penalty (no external backlinks) are still ranking on the first Google page for small keywords. After a few days the new page started to rank, with our keyword between positions 40-45. That was OK and as we expected. This page ranked there constantly for almost 6 weeks, and now it has been gone for ten days. We didn't change anything. It's the same phenomenon as with the old pages on this domain: the page doesn't even rank for its title! Could there still be a manual penalty on the whole domain, or what other causes are possible? Looking forward to your ideas and hope you understand the problem! 😉 Thanks!!!
White Hat / Black Hat SEO | TheLastSeo -
Am I Being Penalized For Having My Whole Site In A Subfolder Named With A Keyword?
I inherited a client. For some reason, their previous webmaster set up the site so everything is in a subfolder, /law/. It's an attorney website. All the URLs have the primary domain name, then /law/, then the assigned URL. I can't imagine this is helping, but could the site be penalized for this by Google or Bing? It's set up like this: www.attorneysite.com/law/therestoftheurl. /law/ is included in EVERY page... even the homepage.
White Hat / Black Hat SEO | DeltonChilds -
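On the question above: the subfolder itself is unlikely to trigger a penalty, but if the client ever decides to flatten the structure, the standard approach is a 301 from every old URL to its new location. A sketch for Apache's .htaccess (assuming, hypothetically, the pages are re-homed at the root):
# Redirect every /law/... URL to the same path without the /law/ prefix
RewriteEngine On
RewriteRule ^law/(.+)$ /$1 [R=301,L]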
Getting sites unbanned in bulk
Prior to the Panda update we had 1 main site, and 300 or so satellite sites. The satellite sites all had an identical template with identical content. The satellite sites all got flagged, and the main site persevered. We'd like to TRY to get all of these sites unbanned in bulk. My question is...how 'DIFFERENT' should these sites be? I know that a real google employee will be looking. All of these sites will be in the same industry...so how 'different' can the content really be? I am going to try to do this in sets of 10 and purchase a different template for EACH city/satellite site, as well as having varying categories, but realistically how doomed/successful do you think this endeavor will be? Any advice? realistic timeline?
White Hat / Black Hat SEO | ilyaelbert