Up to my you-know-what in duplicate content
-
Working on a forum site that has multiple versions of the URL indexed. The WWW version is a top 3 and 5 contender in the Google results for the domain keyword. All versions of the forum have the same PR, but the non-WWW version has 3,400 pages indexed in Google while the WWW version has 2,100. Even worse, there's a completely separate domain (PR4) that carries the forum as a subdomain, with 2,700 pages indexed in Google.
The duplicate content gets overwhelming to think about when it comes to the PR4 domain, so I'll just ask what you think I should do with the forum. Should I get rid of the subdomain version and occasionally link between the two obviously related sites, or get rid of the highly targeted keyword domain? Also, which is better: having the targeted keyword on the front page of Google with only 2,100 indexed pages, or having lower rankings with 3,400 indexed pages?
Thanks.
-
You've pretty much confirmed my suspicions. I can set the redirects up myself; it's just been about 5 years since I've done any SEO work. What I meant was: should I use mod_rewrite or "redirect 301 /oldurl /newurl"? ...I've forgotten a lot of the stuff I used to do with ease. My own sites were always started off right and weren't as bad as the one I'm working on now, so I'm in unfamiliar territory. Thanks for your advice, I appreciate it.
-
I want to make sure you're getting the proper advice. Can you post the URLs here, or PM them to me to keep them private? Once I see the problem firsthand, I can reply with the answer here for you. I'm pretty sure my advice above is the way to go, but it doesn't hurt to double-check!
You need to choose ONE domain going forward. I don't care which one it is, but choose one. From my perspective, it makes sense to choose the one with the better rankings.
After that, you 301 redirect all versions of the URLs to the proper URL (which would be the WWW version if it were my choice).
Yes, mod_rewrite is one server-side redirect option you can use. Make sure whoever sets the redirects up knows what they're doing. Having a ton of server-side redirects can increase load times and cause site-speed issues if it's not done properly. Don't be afraid of doing it, just make sure you know what you're doing, especially since you're dealing with thousands of URLs.
You want to use permanent 301 redirects, yes.
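To answer the "redirect 301" vs. mod_rewrite part directly: both live in .htaccess and both can return a permanent 301. A rough sketch of each (example.com and the paths are placeholders, and this assumes Apache with mod_alias and mod_rewrite enabled):

# mod_alias form: a simple one-to-one permanent redirect
Redirect 301 /oldurl http://www.example.com/newurl

# mod_rewrite form: pattern-based, better when many URLs share a common structure
RewriteEngine On
RewriteRule ^old-forum/(.*)$ http://www.example.com/forum/$1 [R=301,L]

Either one sends a 301 to browsers and to Googlebot; mod_rewrite just gives you more flexibility (hostname conditions, regex patterns, and so on).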
-
Thanks, I appreciate the advice. So you don't think having two separate domains pointing (or redirecting) to each other occasionally will hurt anything? I already have 1,000+ URLs I need to redirect on the completely separate domain.com. As for the keyworddomain.com forum, I don't think I need many redirects there: one from separate.domain.com to keyworddomain.com, and then one from non-WWW to WWW, should fix all the broken URLs, right? When you say 301, do you mean "redirect 301" or mod_rewrite? Thanks for the help.
-
I would first choose which version you want to use going forward. You have three versions: subdomain, non-WWW, and WWW. Don't use the subdomain; that's a given. I personally like using WWW instead of non-WWW, though there are reasons to go the other way. Given this scenario, it makes sense to use the WWW version. I know the non-WWW version has more pages indexed, but indexed page count doesn't mean much in the grand scheme of things. Since the WWW version has the better rankings and is more recognizable to users, I would choose it. Of course, if you choose non-WWW, my advice below remains the same.
Now that you have chosen which version to use going forward, you need to do a few things:
-
Implement a server-side 301 redirect in .htaccess that sends non-WWW to WWW (or vice versa if you so choose), and make sure it's permanent. Going forward, this takes care of the non-WWW vs. WWW issue.
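As a rough sketch, the rule usually looks something like this (example.com stands in for your forum's domain, and this assumes Apache with mod_rewrite enabled):

RewriteEngine On
# Catch requests for the bare domain and permanently redirect them to the www version,
# keeping the requested path (the query string is carried over automatically)
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]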
-
Next, you need to redirect all the indexed non-WWW pages and URLs to their WWW versions. This is not easy, especially with thousands of pages, but it must be done to preserve the PR and link juice and pass through as much of it as possible. I recommend checking whether there is a plugin or extension for whatever forum software you use that can help with this, or hiring a programmer to build one. It's actually not that complex; I've done it before in a similar situation and it works. If you need more advice on that, PM me.
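If the old and new URLs don't follow a single pattern, the low-tech fallback is a generated list of one-to-one redirects in .htaccess. The paths below are made up for illustration; with a thousand-plus URLs you'd script this list from a database export rather than type it by hand:

# One line per indexed old URL, pointing at its canonical www equivalent
Redirect 301 /old-thread-123 http://www.example.com/forum/thread-123
Redirect 301 /old-thread-456 http://www.example.com/forum/thread-456
# ...and so on for the rest of the indexed URLs

A plugin, or a small script that reads the forum's database and emits these lines, will save you a lot of manual work.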
-
Take care of the subdomain by setting up a permanent redirect to the main WWW version for anyone who lands on it, and also set up redirects for the existing subdomain pages/URLs that carry PR, rankings, or link juice.
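If the subdomain is served by Apache and you can drop an .htaccess file in its document root, a catch-all like this would handle it (forum.separatedomain.com and www.keyworddomain.com are placeholders for your actual hostnames):

RewriteEngine On
# Send every request on the subdomain to the same path on the chosen www domain
RewriteCond %{HTTP_HOST} ^forum\.separatedomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.keyworddomain.com/$1 [R=301,L]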
-
From there, make sure you are using sitemaps properly; that can greatly improve your indexing rate and volume.
I hope these help. If you need anything further, please don't hesitate to PM me or post here.
Good luck!
-