Up to my you-know-what in duplicate content
-
Working on a forum site that has multiple versions of the URL indexed. The WWW version is a top 3-5 contender in the Google results for the domain keyword. All versions of the forum have the same PR, but the non-WWW version has 3,400 pages indexed in Google while the WWW version has 2,100. Even worse, there's a completely separate domain (PR4) that has the forum as a subdomain, with 2,700 pages indexed in Google.
The dupe content gets completely overwhelming to think about when it comes to the PR4 domain, so I'll just ask what you think I should do with the forum. Should I get rid of the subdomain version and occasionally link between the two obviously related sites, or get rid of the highly targeted keyword domain? Also, which is better: having the targeted keyword on the front page of Google with only 2,100 indexed pages, or having lower rankings with 3,400 indexed pages?
Thanks.
-
You've pretty much confirmed my suspicions. I can set the redirects up myself; it's just been about 5 years since I've done any SEO work. What I meant was: should I use mod_rewrite or "Redirect 301 /oldurl /newurl"? I've forgotten a lot of stuff that I used to do with ease. My own sites were always started off right and weren't as bad as the one I'm working on now, so I'm in unfamiliar territory. Thanks for your advice, I appreciate it.
-
I want to make sure that you are getting the proper advice. Can you provide me the URLs here, or PM them to me to keep them private? Once I see the problem firsthand, I can reply with the answer here for you. I am pretty sure my advice above is the way to go, but it doesn't hurt to double check!
You need to choose ONE domain for going forward. I don't care which one it is, but choose one. It makes sense to choose the one with the better rankings, at least from my perspective.
After that, you 301 redirect all versions of the URLs to the proper URL (which would be WWW if it were my choice).
Yes, mod_rewrite is one way to set up server-side redirects. Make sure whoever sets them up knows what he is doing. Having a ton of server-side redirect rules can increase load times and cause issues with site speed if it is not done properly. Don't be afraid of doing it, but just make sure you know what you are doing, especially since you're dealing with thousands of URLs.
You want to use permanent 301 redirects, yes.
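To make the distinction concrete, here's a minimal sketch of the two approaches in .htaccess, assuming Apache (the paths, patterns, and example.com domain below are made-up illustrations, not your actual URLs):

```apache
# mod_alias: one-to-one moves, one line per URL -- fine for a handful of pages
Redirect 301 /oldurl http://www.example.com/newurl

# mod_rewrite: pattern-based, so one rule can cover many URLs at once
RewriteEngine On
RewriteRule ^topic-([0-9]+)\.html$ http://www.example.com/threads/$1 [R=301,L]
```

Both send a permanent 301 to the browser and to Google; mod_rewrite just scales better when the same transformation applies to thousands of URLs.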
-
Thanks, I appreciate the advice. So you don't think having 2 separate domains pointing (or redirecting) to each other occasionally will hurt anything? I already have 1,000+ URLs I need to redirect on the completely separate domain.com. As for the keyworddomain.com forum, I don't think I need many redirects: just one from seperate.domain.com to keyworddomain.com, and then one there from non-WWW to WWW, should fix all the broken URLs, right? When you say 301, do you mean "Redirect 301" or mod_rewrite? Thanks for the help.
-
I would first choose which version you want to use going forward. You have three versions: subdomain, non-WWW, and WWW. Don't use the subdomain; that is a given. I personally like using WWW instead of non-WWW, though there are reasons to use non-WWW over WWW. Given this scenario, it makes sense to use the WWW version. I know that the non-WWW version has more pages indexed, but pages indexed doesn't mean much in the grand scheme of things. Given that WWW has good rankings and is more identifiable to a user, I would choose that. Of course, if you choose non-WWW, my advice below will remain the same.
Now that you have chosen what version you want to use going forward, you need to do a few things:
-
Implement a .htaccess 301 server-side redirect from non-WWW to WWW (or vice versa if you so choose), and make sure it's permanent. Going forward, this will fix your non-WWW vs. WWW issue.
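As a sketch, assuming Apache with mod_rewrite enabled (swap example.com for your actual domain), the non-WWW to WWW rule in .htaccess would look something like:

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag is what makes the redirect permanent; without it, mod_rewrite defaults to a temporary 302.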
-
Next, you need to redirect all non-WWW indexed pages and URLs to their WWW versions. This is not easy, especially with thousands of pages. However, it must be done to preserve the PR and link juice so as much of it as possible passes through. What I recommend is seeing if there is a plugin or extension for whatever forum software you use that can aid you in this effort, or hiring a programmer to build you one. It's actually not that complex to do; I have done it before in a similar situation and it does work. If you need more advice on that, PM me.
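One way to handle thousands of one-to-one mappings without thousands of individual rules is Apache's RewriteMap: a plain-text lookup file of old-path/new-URL pairs. Note it has to live in the server or vhost config (RewriteMap isn't allowed in .htaccess), and the file path and URLs below are hypothetical:

```apache
# In the vhost config, not .htaccess:
RewriteEngine On
RewriteMap oldtonew txt:/etc/apache2/redirect-map.txt

# Only redirect when the requested path appears in the map
RewriteCond ${oldtonew:%{REQUEST_URI}|NOT_FOUND} !=NOT_FOUND
RewriteRule ^ ${oldtonew:%{REQUEST_URI}} [R=301,L]
```

Each line of redirect-map.txt is just an old path and its new URL separated by whitespace, e.g. `/old-topic.html http://www.example.com/new-topic/`, which a programmer could generate straight from the forum's database.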
-
You need to take care of the subdomain by setting up a permanent redirect to the main WWW version for anyone who goes to the subdomain, and also set up redirects for existing subdomain pages/URLs that have PR/rank/link juice.
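A sketch of the subdomain redirect, using the placeholder hostnames from this thread (seperate.domain.com and keyworddomain.com), placed in the subdomain's .htaccess:

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^seperate\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.keyworddomain.com/$1 [R=301,L]
```

This preserves the path, so existing deep links to the subdomain land on the matching page of the WWW version rather than all dumping onto the homepage.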
-
From there, make sure that you are utilizing sitemaps properly; that can greatly increase your indexing rate and volume.
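For reference, a minimal sitemap entry; the key point is that only the canonical WWW URLs should be listed (the URL and date here are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.keyworddomain.com/forum/example-thread/</loc>
    <lastmod>2013-01-15</lastmod>
  </url>
</urlset>
```

Most forum software has a plugin that regenerates this automatically; submit it in Google Webmaster Tools for the domain you chose.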
I hope this helps. If you need anything further, please do not hesitate to PM me or post here.
Good luck!
-