SEO impact on archives and mirror services?
-
Hi,
We're looking at setting up an email archive site for certain lists related to our niche. We've been considering using archives.ourdomain.com versus ourdomain.com/archives. Part of the problem is that while the archives won't initially have huge amounts of content, over time they'll outstrip the amount of content we produce, and I was worried about whether this would cause a duplicate content penalty, as the emails will also be archived elsewhere.
If the volume of mails over time, combined with duplication penalties, is going to penalise the site, I'd rather we used a subdomain or a separate domain. If not, then a subfolder might be the way to go.
-
To clarify: I meant place this robots.txt in the subdomain, not include the subdomain in the robots.txt. I was answering an employee at the same time and crossed some wires.
-
You have a few choices here to avoid that.
If you want to avoid it completely, simply block the entire subdomain (subdomain.domain.tld) with a robots.txt containing:
User-agent: *
Disallow: /
Then place it in the root of your subdomain's folder. For example, if your subdomain is at subdomain.domain.tld, you would normally have a folder called subdomain within your normal root folder.
Do not place this robots.txt file in your main folder or the whole site will not be indexed.
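The effect of that robots.txt can be sanity-checked with Python's standard-library robots.txt parser. This is a minimal sketch; the archives.ourdomain.com hostname comes from the question above, and the message path is a hypothetical placeholder.

```python
from urllib.robotparser import RobotFileParser

# The subdomain's robots.txt from the answer above: block everything
# for every crawler.
robots_txt = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Every URL on the subdomain should now be disallowed, for any agent.
print(parser.can_fetch("Googlebot", "https://archives.ourdomain.com/"))          # False
print(parser.can_fetch("*", "https://archives.ourdomain.com/list/msg001.html"))  # False
```

Note that a well-behaved crawler applies this file only to the host it was fetched from, which is why placing it on the subdomain blocks the archives without touching the main site.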
As an extra measure, you can place a robots meta tag:
<meta name="robots" content="noindex">
in the head section of each page.
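A quick way to verify that each archive page actually carries that noindex tag is to scan the HTML with Python's standard-library parser. This is a sketch; the RobotsMetaFinder class name and the sample page are illustrative, not from the original thread.

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Flags whether a page contains <meta name="robots"> with noindex."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

page = '<html><head><meta name="robots" content="noindex"></head><body></body></html>'
finder = RobotsMetaFinder()
finder.feed(page)
print(finder.noindex)  # True
```

Running a check like this over a sample of archive URLs catches templates that dropped the tag before the pages get indexed.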
Related Questions
-
Exchanging Guest Blog Opportunities - SEO Implications?
Haven't found a clear, recent answer on this. What are the SEO implications of exchanging guest blogging opportunities (in other words, we write an article for a partner blog with a backlink, and they write an article on our blog with a backlink)? The partner site has a 57 domain authority and we have a 24 domain authority.
-
Updating blogs - SEO best practice
Thinking of new blog content, and one option obviously is to check out historically popular blog posts via Analytics and do fresh versions of those. So my question is, what is best practice:
1. Copy and paste the old blog post copy, but edit it to be slightly different while still having the old blog post live, or
2. Just update the old one and re-promote it.
I assume it's better to have a new version of the blog post?
-
My Boss tells me personal narrative content isn't read online and bad for SEO, anyone else disagree? b/c I do!
I am in a constant debate that whether content is written in the 1st person or 3rd person doesn't make a difference in terms of SEO and what people on the web want to read. What do you all think? Does it make a difference?
-
SEO, Rel Author and Several Clients
How do you deal with rel=author on blogs for your clients? Suppose I'm a contributor (or rather, the only copywriter) to a client's blog, which happens to be about DIY. The blog articles are written from the business perspective and published by 'the business', not by me. Should I be adding myself as a contributor and using my own personal Google profile on these blogs? If so, then it's my face that appears next to the blog post, and I'm not sure that's what I want for a blog about fixing shelves etc! Am I misunderstanding how I should be using rel=author? Is it for SEOs to work into client strategy, or is it just for our own personal blogs?
-
Magento Multi Stores and seo
Hello all, I'm not that clued up on SEO and only know the basics, so please bear with me! I've just had five different furniture retail websites built, all of which will be selling the same products, but all priced differently, and all of the sites will have different content on each one (four of the five websites are on a multi-store system with one back end on Magento). The only thing the five sites will have in common is the names of the products in the titles, but all the content in the text, prices, etc. will be different on each site, including articles and blogs. How will this sit with Google in terms of rankings for each site? I know a lot of other companies that do this already, and they all have good rankings on each site for their chosen keywords; the only difference between my scenario and theirs is that four of my sites are on the same back end, which is what I'm mainly concerned about. Any thoughts? Thanks, Anthony
-
How much weight does UGC really have in SEO
Hi guys. One of my sites has a blog, and the site does very well overall. However, managing the blog's comments (fighting spam and attacks, general monitoring and approval) is becoming too much trouble. The blog itself really gets fed from Facebook, and most of the blog comments are made there. I am thinking of removing the comment section on the blog and only responding to Facebook comments, but I am concerned about how much weight "user generated content" carries with regard to SEO. It has always been said that it does carry weight, but what if I get traffic and links to these pages yet remove the ability to comment? I just really wanted to hear some opinions on this and how much respect I should give UGC. Thanks for your time.
-
Displaying archive content articles in a writers bio page
My site has writers, and each has their own profile page (accessible when you click their name inside an article). We set up the code so that the bios, in addition to the actual writer photo/bio, dynamically generate links to each article he/she produces. We figured that someone reading something by Bob Smith might want to read other stuff by him. Which was fine, initially. Fast forward, and some of these writers have 3, 4, even 15 pages of archives, as the archive system paginates every 10 articles (so www.example.com/bob-smith/archive-page3, etc.). My thinking is that this is a bad thing. The articles are likely already found elsewhere on the site (under the content landing page each was written for, for example), and I visualize spiders getting sucked into these archive black holes, never to return. I also assume that it is just more internal mass linking (yech) and probably doesn't help the overall time-on-site/bounce/exit metrics. Thoughts?