Handling duplicate content whilst making both rank well
-
Hey MOZperts,
I run a marketplace called Zibbet.com and we have 1000s of individual stores within our marketplace. We are about to launch a new initiative giving all sellers their own stand-alone websites.
URL structure:
Marketplace URL: http://www.zibbet.com/pillowlink
Stand-alone site URL: http://pillowlink.zibbet.com (doesn't work yet)

Essentially, their stand-alone website is a duplicate of their marketplace store: same items (item titles, descriptions), same seller bios, same shop introduction content etc., but with a different layout. You can scroll down and see a preview of the different pages (if that helps you visualize what we're doing), here.
My questions: My desire is for both the seller's marketplace store and their stand-alone website to rank well in the SERPs.
- Is this possible?
- Do we need to add any tags (e.g. "rel=canonical") to one of these so that we're not penalized for duplicate content? If so, which one?
- Can we just change the meta data structure of the stand-alone websites to skirt around the duplicate content issue?
Keen to hear your thoughts and if you have any suggestions for how we can handle this best.
Thanks in advance!
-
No. We're actually not launching this initiative for SEO purposes. We just want to create value for our users and having their own stand-alone website is valuable to them.
I just want to make sure we're structured properly from an SEO point of view so that we don't compromise the SEO of our marketplace, or their stand-alone site.
Also, each stand-alone site's content is unique to that seller, but it is identical to the data in their marketplace store. So, every seller has a marketplace store (with items, a profile etc.) AND a stand-alone website (with the same items, same profile etc., just designed differently and accessible via a subdomain).
Hope that makes sense.
-
Thanks so much for your input.
I must admit I'm not too familiar with Panda, so I'll need to do some digging there. We literally launched the new version of Zibbet two months ago, with different metadata etc., so I'm not sure how that affects things.
If we don't add the rel=canonical, do you think we'll get punished by Google?
-
If I understand correctly, you're asking how you can create a business model that fills up the search results with a bunch of sites that all have the same content. I think you're somewhat late to that party. The Google of today doesn't really let you do that; it's pretty good at preventing it. And if you were thinking of maybe linking back to your main site from all those dupes, I'd rethink that strategy as well.
-
Hi,
First of all, I would really take care of that Panda issue you have there: http://screencast.com/t/SzbL6hTFwWr
To answer your questions:
- Is this possible?
They can't both rank. You need to decide on a canonical version - the one to rule them all.
- Do we need to add any tags (e.g. "rel=canonical") to one of these so that we're not penalized for duplicate content? If so, which one?
There's no duplicate content penalty as such, but yes, it's best that you choose the canonical version yourself and don't let Google decide for you. Add rel=canonical.
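For example, if you decide the marketplace store is the canonical version, each stand-alone page could carry a tag like this in its <head> (a sketch using the pillowlink URLs from your question; every seller's subdomain pages would point at their own marketplace URLs):

<!-- Example only: goes in the <head> of the stand-alone page at http://pillowlink.zibbet.com/ -->
<link rel="canonical" href="http://www.zibbet.com/pillowlink" />

The reverse works too if you'd rather the stand-alone sites be the ones that rank; the important thing is to pick one version per page and keep it consistent.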
- Can we just change the meta data structure of the stand-alone websites to skirt around the duplicate content issue?
That won't help.
Overall, you've already been hit by Panda - you should be careful about what content you push into the index, as that's not the way to recover. Before pushing more content into the index, you should clean the site of all Panda-related issues.
Cheers.
Related Questions
-
Duplicated content multi language / regional websites
Hi Guys, I know this question has been asked a lot, but I wanted to double check this since I just read a comment by Gianluca Fiorelli (https://mza.bundledseo.com/community/q/can-we-publish-duplicate-content-on-multi-regional-website-blogs) about this topic which made me doubt my research.

The case: A Dutch website (.nl) wants a .be version for conversion reasons. They want to duplicate the Dutch website since Dutch is spoken in large parts of both countries. They are willing to implement the following changes:
- Hreflang tags
- Possibly a local phone number
- Possibly a local translation of the menu
- Language meta tag (for Bing)

Optionally, they are willing to take the following steps:
- Crosslinking every page through a language flag or similar navigation in the header
- Investing in gaining local .be backlinks
- Changing the server location for both websites so they match their country (not necessary in my opinion, since the ccTLD should make this irrelevant)

The content on the websites will be at least 95% duplicated. They would like to rank with the .be in Belgium and with the .nl in the Netherlands. Are these steps enough to make sure the .be gets shown for queries from Belgium and the .nl for queries from the Netherlands? Or would this cause a duplicate content issue resulting in one version being filtered out? If that's the case, we should use the canonical tag and we can't rank the .be version of the website.

Note: this company is looking for a quick conversion rate win. They won't invest in rewriting every page and/or blog. The less effort they have to put into this the better (I know that's cursing when talking about SEO). Gaining local backlinks would bring a lot of costs with it, for example.

I would love to hear from you guys. Best regards, Bob van Biezen
Intermediate & Advanced SEO | Bob_van_Biezen
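As a rough illustration of the hreflang tags mentioned in the question above (a sketch only, with hypothetical example.nl / example.be URLs standing in for the real domains), each page would reference itself and its counterpart in its <head>:

<!-- Hypothetical URLs for illustration; every page needs its own set, including a self-reference -->
<link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/pagina/" />
<link rel="alternate" hreflang="nl-be" href="https://www.example.be/pagina/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.nl/pagina/" />
-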
Duplicate Page Content
We have different plans that you can sign up for - how can we rectify the duplicate page content and title issue here? Thanks. The affected URLs include:
http://signup.directiq.com/?plan=100
http://signup.directiq.com/?plan=104
http://signup.directiq.com/?plan=116
http://signup.directiq.com/?plan=117
http://signup.directiq.com/?plan=102
http://signup.directiq.com/?plan=119
http://signup.directiq.com/?plan=101
http://signup.directiq.com/?plan=103
http://signup.directiq.com/?plan=5
Intermediate & Advanced SEO | directiq
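One common way to handle parameter-only variations like these (a sketch, assuming the ?plan= parameter merely pre-selects a plan and the page content is otherwise identical) is to point every variant at a single preferred signup URL:

<!-- Placed in the <head> of each /?plan=... variant; the bare signup URL is an assumed choice of preferred version -->
<link rel="canonical" href="http://signup.directiq.com/" />
-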
Will merging sites create a duplicate content penalty?
I have 2 sites that would be better suited being merged to create a more authoritative site. Basically I'd like to merge site A into site B. If I add new pages from site A to site B and create 301 redirects from those pages on site A to the new pages on site B, is that the best way to go about it? As the pages are already indexed, would this create any duplicate content issue or would the redirect solve this?
Intermediate & Advanced SEO | boballanjones
-
Implications of posting duplicate blog content on external domains?
I've had a few questions around the blog content on our site. Some of our vendors and partners have expressed interest in posting some of that content on their domains. What are the implications if we were to post copies of our blog posts on other domains? Should this be avoided, or are there circumstances where this type of program would make sense?
Intermediate & Advanced SEO | Visier
-
Site been plagiarised - duplicate content
Hi, I look after two websites; one sells commercial mortgages, the other sells residential mortgages. We recently redesigned both sites, and one was moved to a new domain name as we rebranded it from being a trading style of the other brand to being a brand in its own right.

I have recently discovered that one of my most important pages on the residential mortgages site is not in Google's index. I did a bit of poking around with Copyscape and found another broker has copied our page almost word-for-word. I then used Copyscape to find all the other instances of plagiarism on the other broker's site and there are a few! It now looks like they have copied pages from our commercial mortgages site as well.

I think the reason our page has been removed from the index is that we relaunched both these sites with new navigation and consequently new URLs. Can anyone back me up on this theory? I am 100% sure that our page is the original version because we write everything in-house and I check it with Copyscape before it gets published. Also, the fact that this other broker has copied from several different sites corroborates this view.

Our legal team has written two letters (not sent yet) - one to the broker and the other to the broker's web designer. These letters ask the recipient to remove the copied content within 14 days. If they do remove our content from their site, how do I get Google to reindex our pages, given that Google thinks OUR pages are the copied ones and not the other way around? Does anyone have any experience with this? Or will it just happen automatically? I have no experience of this scenario!

In the past, where I've found duplicate content like this, I've just rewritten the page and chalked it up to experience, but I don't really want to in this case because, frankly, the copy on these pages is really good! And I don't think it's fair that someone else could potentially be getting customers who were persuaded by OUR copy.

Any advice would be greatly appreciated. Thanks, Amelia
Intermediate & Advanced SEO | CommT
-
Duplicate page content query
Hi forum, for some reason I have recently received a large increase in my Duplicate Page Content issues. Currently it says I have over 7,000 duplicate page content errors! For example, it lists these sample URLs with duplicate page content:
http://dikelli.com.au/accessories/gowns/news.html
http://dikelli.com.au/accessories/news.html
http://dikelli.com.au/gallery/dikelli/gowns/gowns/sale_gowns.html
However, there are no physical links to any of these pages on my site, and even when I look at my FTP files (I am using Dreamweaver) these directories and files do not exist. Can anyone please tell me why the SEOmoz crawl is coming up with these errors and how to solve them?
Intermediate & Advanced SEO | sterls
-
Proper Hosting Setup to Avoid Subfolders & Duplicate Content
I've noticed that when hosting multiple websites on a single account, you end up with your main site in the root public_html folder, but when you create a subfolder for a new website it actually creates a duplicate website: e.g. http://kohnmeat.com/ is being hosted on laubeau.com's server, so you end up with a duplicate website at http://laubeau.com/kohn/. Does anyone know the best way to prevent this from happening (i.e. canonical? 301? robots.txt?)? Also, maybe a specific 'how-to' if you're feeling generous 🙂
Intermediate & Advanced SEO | ATMOSMarketing56
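If the duplicate copy under the other domain can't simply be blocked or 301-redirected at the server level, one stopgap (a sketch, assuming http://kohnmeat.com/ is the version that should rank) is a cross-domain canonical on the duplicated pages:

<!-- Placed in the <head> of the pages served under http://laubeau.com/kohn/, pointing at the preferred domain -->
<link rel="canonical" href="http://kohnmeat.com/" />
-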
Duplicate content on the same page--is this an issue?
We are transitioning to responsive design and some of our pages will not scale properly, so we were thinking of adding the same content twice to the same URL (one version would be simple text for mobile, and the other would include the images etc. for desktop), with the content shown changing based on the size of the screen. I'm not looking for another technical solution (I know Google specifies that you can dynamically serve different content based on user agent) - I am wondering if anyone knows whether having the same exact content appear twice on the same URL will cause a problem with SEO (any historical tests or experience would be great). Thank you in advance.
Intermediate & Advanced SEO | nicole.healthline
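As a rough sketch of the approach described in the question (hypothetical class names, and not a recommendation either way), both versions would sit in the same HTML and CSS media queries would decide which one is displayed:

<style>
  /* Hypothetical class names for illustration */
  .desktop-version { display: block; }
  .mobile-version { display: none; }
  @media (max-width: 767px) {
    .desktop-version { display: none; }
    .mobile-version { display: block; }
  }
</style>
<div class="desktop-version">
  <!-- full version with images, shown on larger screens -->
</div>
<div class="mobile-version">
  <!-- simplified text-only version, shown on small screens -->
</div>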