Best practice for franchise sites with duplicated content
-
I know that duplicate content is a touchy subject, but I work with multiple franchise groups and each franchisee wants their own site; however, almost all of the sites use the same content. I want to make sure that Google sees each of these sites as a unique site and does not penalize them for the following issues:
All sites are hosted on the same server and therefore share the same IP address.
All sites use generally the same content across their product pages (which are very important pages) - templated content approved by corporate.
Almost all sites have the same design (a few of the groups we work with have multiple design options).
Any suggestions would be greatly appreciated.
Thanks Again
Aaron
-
I fully agree. We have notified them all and let them know it's in their best interest to modify the content throughout the site. Unfortunately, most of them don't, and the copy remains templated.
Thanks for your answers
-
If the search is for the company's product or service, you can gain a small advantage by creating a local listing for each franchisee. Beyond that, rewriting the content is the only option as far as I know.
-
Maybe part of the literature describing your program could include the point that, to be really effective, the franchisees will have to write their own content. It all depends on your business model and whether you want to make them aware that they have 100 to 5,000 competitors from your company alone.
-
I fully agree with you, EGOL: "There is another problem - maybe bigger than Google's desire for unique content."
We give each franchisee the opportunity to expand on the content and make it their own; however, I would say 90% of them don't make any changes.
I don't think that either the franchisees or corporate would want to pay the $$$ it would cost to have our copywriters write unique copy for each site (50-100+ products/services per site or franchisee).
-
I wish we could redo the strategy, but we aren't talking about small franchises here. We are talking about franchises anywhere from 100 stores all the way up to 5,000 stores.
The products/services they offer are already described very well, and unfortunately the only thing we add to each product page is maybe a few location identifiers and a company name.
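What that templating looks like in practice can be sketched with a short Python snippet; the product copy, company name, and cities below are invented for illustration:

```python
from string import Template

# Corporate-approved boilerplate; only the location tokens change per site.
# The template text, company, and cities are invented for illustration.
corporate_copy = Template(
    "$company of $city offers professional carpet cleaning "
    "using our patented low-moisture process."
)

denver_page = corporate_copy.substitute(company="Acme", city="Denver")
boise_page = corporate_copy.substitute(company="Acme", city="Boise")

# Mask the substituted city and the two pages are byte-identical;
# that is essentially what a duplicate-content filter sees.
assert denver_page.replace("Denver", "X") == boise_page.replace("Boise", "X")
```

Multiply this by 50-100+ products per franchisee and the near-total overlap across sites is unavoidable.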
I don't want to use the canonical solution because each site has to be seen as a standalone site.
-
Each Franchise has their own domain.
Each product/service has a single description, and each franchisee has to use the same corporate-approved logo.
All images are named the same thing, so that can matter.
I like your suggestions, though... you are suggesting the same route we have taken in the past.
-
Information about Google using OCR... Use this link to see an example of how Google extracted and highlighted "wrigley swim" from this newspaper scan.
Google can determine the color of an image... image files are actually characters, and Google can extract the colors. If you go into image search, there is an option to limit the results by color. Some of that is done via context (such as words in the file name or words near the image); however, some is done by extracting data from the image file.
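The "image files are actually characters" point can be illustrated with a toy sketch: given raw, already-decoded RGB pixel data, the dominant color falls out of a simple byte count. (Real formats such as JPEG would need decoding first; the buffer below is contrived.)

```python
from collections import Counter

def dominant_color(rgb_bytes):
    """Return the most common (r, g, b) triple in a raw RGB pixel buffer."""
    pixels = [tuple(rgb_bytes[i:i + 3]) for i in range(0, len(rgb_bytes), 3)]
    return Counter(pixels).most_common(1)[0][0]

# A contrived 4-pixel "image": three red pixels and one blue pixel.
buf = bytes([255, 0, 0, 255, 0, 0, 255, 0, 0, 0, 0, 255])
print(dominant_color(buf))  # (255, 0, 0)
```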
-
Here we are all giving advice based on our own knowledge. I personally think Google cannot read images or tell what a specific image relates to. If I'm wrong (and I hope I'm not), can I get more details, EGOL?
Thanks.
-
...Google cannot read images or colors...
Are you willing to bet a month's pay on that?
-
I want to make sure that Google sees each one of these sites as unique sites...
I don't think that there is an inexpensive way to get this done and have high-quality results. If you want unique content, you've got to pay the price... but you could consider:
Hire several writers to reauthor the content - this will cost a lot less than writing from scratch.
Get an article spinner program - that will be cheap, but you will probably not like the results.
Make an enthusiastic sales pitch to each franchisee, with incentives to write their own content.
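Whichever option you pick, it helps to measure how similar the pages' copy actually is before and after rewriting. Here is a minimal word-shingle overlap check in Python (the sample blurbs are invented):

```python
def shingles(text, k=5):
    """Split text into overlapping k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a, b, k=5):
    """Jaccard overlap of two pages' shingle sets: 1.0 means identical copy."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

copy_a = ("Acme of Denver offers professional carpet cleaning "
          "using our patented low-moisture process.")
copy_b = ("Acme of Boise offers professional carpet cleaning "
          "using our patented low-moisture process.")
# Most shingles survive the city swap, so the score stays high.
print(jaccard_similarity(copy_a, copy_b))
```

A score near 1.0 between two franchisee pages means the "few location identifiers" approach has not made the copy meaningfully different.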
...templated content approved by corporate...
There is another problem - maybe bigger than Google's desire for unique content.
Good luck.
-
You may want to rethink your strategy of franchising both the product and the content. If the content is the same, the only way to eliminate the duplicate content problem is to point to one of the pages as the canonical version, and that would significantly hurt the performance of all the other sites.
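For completeness, the canonical hint referred to above is a single tag in the head of each duplicate page, pointing at the one version you want indexed (the URL below is a placeholder):

```html
<!-- in the <head> of each duplicate franchisee page -->
<link rel="canonical" href="https://www.franchise-example.com/products/widget-cleaning" />
```

As noted elsewhere in the thread, this concedes the rankings to the canonical target, which is exactly why it doesn't fit a model where every franchisee site must stand on its own.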
-
I suggest you give each franchisee their own branding, but with the franchise [your logo] included:
1. Their own domain.
2. Their own product description, even if it's the same product (maybe add your logo to make sure people recognize the brand).
3. Design does not matter (URLs, titles, descriptions, content, etc. are what count), as Google cannot read images or colors.
Hope it helps.