Best practice for franchise sites with duplicated content
-
I know that duplicate content is a touchy subject, but I work with multiple franchise groups and each franchisee wants their own site; however, almost all of the sites use the same content. I want to make sure that Google sees each of these sites as a unique site and does not penalize them for the following issues:
All sites are hosted on the same server and therefore share the same IP address.
All sites use generally the same content across their product pages (which are very, very important pages) - templated content approved by corporate.
Almost all sites have the same design (a few of the groups we work with have multiple design options).
Any suggestions would be greatly appreciated.
Thanks Again
Aaron
-
I fully agree. We have notified them all and let them know it's in their best interest to modify the content throughout the site. Unfortunately, most of them don't, and the copy remains templated.
Thanks for your answers
-
If the search is for the company's product or service, you can gain a small advantage by setting up a local listing for each franchisee. Beyond that, rewriting the content is the only option as far as I know.
-
Maybe part of the literature describing your program can include the point that, to be really effective, the franchisees will have to write their own content. It all depends on your business model and whether you want to make them aware that they have 100-5,000 competitors from your company alone.
-
I fully agree with you, EGOL: "There is another problem - maybe bigger than Google's desire for unique content."
We give each franchisee the opportunity to expand on the content and make it their own; however, I would say 90% of them don't make any changes.
I don't think that either the franchisee or corporate would want to pay the $$$ it would cost to have our copywriters write unique copy for each site (50-100+ products/services per site or franchisee).
-
I wish we could redo the strategy, but we aren't talking about small franchises here. We are talking franchises anywhere from 100 stores all the way up to 5,000 stores.
The products/services they offer are described very well, and unfortunately the only thing we add to each product page is maybe a few location identifiers and a company name.
I don't want to use the canonical solution because each site has to be seen as a stand-alone site.
-
Each franchisee has their own domain.
Each product/service has a single description, and each franchisee has to use the same corporate-approved logo.
All images are named the same thing, so it can matter.
I like your suggestions, though... you are going the same route we have taken in the past.
-
Information about Google using OCR... Use this link to see an example of how Google extracted and highlighted "wrigley swim" from this newspaper scan.
Google can determine the color of an image... image files are really just data, and Google can extract the colors from them. If you go into image search, there is an option to limit the results by color. Some of that is done via context (such as words in the file name or words near the image); however, some is done by extracting data from the image file itself.
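To illustrate the point that color data can be read straight out of an image file, here is a minimal sketch using plain Python and the Pillow library (the file name is hypothetical, and this is only the general idea, not how Google actually does it):

```python
# Minimal sketch: pull the most common colors straight out of an image file.
# Nothing Google-specific here - just Python and the Pillow library.
from collections import Counter
from PIL import Image

def dominant_colors(path, top_n=3):
    """Return the top_n most common RGB values in an image."""
    img = Image.open(path).convert("RGB")
    img = img.resize((64, 64))        # shrink so counting stays cheap
    pixels = list(img.getdata())      # flat list of (R, G, B) tuples
    return Counter(pixels).most_common(top_n)

# Hypothetical file name, for illustration only:
# print(dominant_colors("corporate-logo.png"))
```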
-
We are all giving advice here based on our own knowledge, so I personally think Google cannot read images or determine what a specific image relates to. If I'm wrong - and I hope I'm not - can I get more details, EGOL?
Thanks.
-
...Google cannot read images or colors...
Are you willing to bet a month's pay on that?
-
I want to make sure that Google sees each one of these sites as unique sites...
I don't think that there is an inexpensive way to get this done and have high-quality results. If you want unique content you gotta pay the price... but you could consider the following:
Hire several writers to reauthor the content - it will cost a lot less than starting from scratch.
Get an article spinner program - that will be cheap, but you will probably not like the results.
Make an enthusiastic sales pitch to each franchisee, with incentives to write their own content.
...templated content approved by corporate...
There is another problem - maybe bigger than Google's desire for unique content.
Good luck.
-
You may want to re-think your strategy of franchising the product and the content. If the content is the same, the only way to eliminate the duplicate content problem is to point to one of the pages as the canonical version, and that would very much impact the performance of the other sites.
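If you want to gauge how close the copies actually are, and whether any of the sites already declare a canonical, something like this rough sketch could help (the domains and product path are hypothetical; it assumes the requests and beautifulsoup4 packages are installed):

```python
# Rough sketch: fetch the same product page on several franchisee domains,
# report any canonical tag each page declares, and score how similar the
# visible text is to the first domain's copy.
import difflib
import requests
from bs4 import BeautifulSoup

FRANCHISE_DOMAINS = ["franchise-a.example.com", "franchise-b.example.com"]  # hypothetical
PRODUCT_PATH = "/products/widget-cleaning-service"                          # hypothetical

def page_info(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    canonical = soup.find("link", rel="canonical")
    text = soup.get_text(" ", strip=True)
    return (canonical["href"] if canonical else None), text

pages = {d: page_info(f"https://{d}{PRODUCT_PATH}") for d in FRANCHISE_DOMAINS}

base_domain = FRANCHISE_DOMAINS[0]
base_text = pages[base_domain][1]
for domain, (canonical, text) in pages.items():
    similarity = difflib.SequenceMatcher(None, base_text, text).ratio()
    print(f"{domain}: canonical={canonical}, similarity to {base_domain}: {similarity:.0%}")
```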
-
I suggest you let the franchisees use their own sort of domain and logo, but add the franchise branding [your logo]:
1. Their Own Domain
2. Their own product description, even if it's the same product (maybe add your logo to make sure people recognize the brand).
3. Design does not matter (URLs, titles, descriptions, content, etc. count), as Google cannot read images or colors.
Hope it helps.