Best practice for franchise sites with duplicated content
-
I know that duplicate content is a touchy subject, but I work with multiple franchise groups, and each franchisee wants their own site; however, almost all of the sites use the same content. I want to make sure that Google sees each of these sites as unique and does not penalize them for the following issues:
All sites are hosted on the same server and therefore share the same IP address.
All sites use generally the same content across their product pages (which are very, very important pages) - templated content approved by corporate.
Almost all sites have the same design (a few of the groups we work with have multiple design options).
Any suggestions would be greatly appreciated.
Thanks Again
Aaron
-
I fully agree. We have notified them all and let them know it's in their best interest to modify the content throughout the site. Unfortunately, most of them don't, and the copy remains templated.
Thanks for your answers
-
If the search is for the company's product or service, you can gain a small advantage by creating a local listing for each franchisee. Beyond that, rewriting the content is the only option, as far as I know.
-
Maybe part of the literature describing your program can include the point that, to be really effective, the franchisees will have to write their own content. It all depends on your business model - whether you want to make them aware that they have 100-5,000 competitors from your company alone.
-
I fully agree with you, EGOL: "There is another problem - maybe bigger than Google's desire for unique content."
We give each franchisee the opportunity to expand on the content and make it their own, however I would say 90% of them don't make any changes.
I don't think that either the franchisee or corporate would want to pay the $$$ it would cost to have our copywriters write unique copy for each site (50-100+ products/services per site or franchisee).
-
I wish we could redo the strategy but we aren't talking about small franchises here. We are talking franchises anywhere from 100 stores all the way up to 5,000 stores.
The products/services they offer are described very well and unfortunately the only thing we add into each product page is maybe a few location identifiers and a company name.
I don't want to use the canonical solution because each site has to be seen as a stand-alone site.
-
Each Franchise has their own domain.
Each Product/Service has a single description - Each franchisee has to use the same corporate approved logo.
All images are named the same thing, so that could matter.
I like your suggestions, though... you're going the same route we have gone in the past.
-
Information about Google using OCR... Use this link to see an example of how Google extracted and highlighted "wrigley swim" from this newspaper scan.
Google can determine the color of an image... image files are ultimately just data, and Google can extract the colors. If you go into Image Search there is an option to limit the results by color. Some of that is done via context (such as words in the file name or words near the image); however, some is done by extracting data from the image file itself.
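The color point is easy to demonstrate: pixel data is just numbers, so anyone parsing the file (a crawler included) can aggregate it. A minimal sketch, using a hypothetical raw RGB buffer instead of a real decoded JPEG/PNG, just to show the principle:

```python
# Compute the average color of a raw RGB pixel buffer.
# This 2x2 "image" (mostly red) is hypothetical -- a real crawler would
# first decode an actual image file, but the idea is the same: image
# files reduce to numbers you can extract and aggregate.
pixels = bytes([
    255, 0, 0,    # red
    255, 0, 0,    # red
    200, 50, 50,  # reddish
    255, 0, 0,    # red
])

def average_color(buf: bytes) -> tuple:
    """Average (R, G, B) over a flat RGB byte buffer."""
    n = len(buf) // 3
    r = sum(buf[0::3]) // n  # every 3rd byte starting at 0 is red
    g = sum(buf[1::3]) // n  # ...starting at 1 is green
    b = sum(buf[2::3]) // n  # ...starting at 2 is blue
    return (r, g, b)

print(average_color(pixels))  # → (241, 12, 12), a strongly red average
```

An engine that can do this (plus read file names and nearby text) has plenty of signal to bucket images by color.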
-
We are all giving advice here based on our own knowledge. I personally think Google cannot read images or determine what a specific image relates to. If I'm wrong (and I hope I'm not), can I get more details, EGOL?
Thanks.
-
...Google cannot read images or colors...
Are you willing to bet a month's pay on that?
-
I want to make sure that Google sees each one of these sites as unique sites...
I don't think that there is an inexpensive way to get this done and have high-quality results. If you want unique content, you've gotta pay the price... but you could consider:
Hire several writers to reauthor the content - it will cost a lot less than starting from scratch.
Get an article spinner program - that will be cheap, but you probably won't like the results.
Make an enthusiastic sales pitch to each franchisee, with incentives to write their own content.
...templated content approved by corporate...
There is another problem - maybe bigger than Google's desire for unique content.
Good luck.
-
You may want to re-think your strategy of franchising the product and the content. If the content is the same, the only way to eliminate the duplicate content problem is to point to one of the sites as the canonical version, and that would significantly impact the performance of the other sites.
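For reference, pointing duplicates at a canonical version is a one-line tag in the `<head>` of each duplicate page (the domains below are hypothetical):

```html
<!-- On each franchisee's duplicate product page -->
<link rel="canonical" href="https://www.corporate-example.com/products/widget" />
```

This consolidates ranking signals onto the corporate URL, which is exactly why it only fits if you can accept the franchisee pages dropping out of the results.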
-
I suggest you give each franchisee their own identity, but include the franchise branding [your logo]:
1. Their own domain
2. Their own product description, even if it's the same product (maybe add your logo to make sure people recognize the brand)
3. Design does not matter (URLs, titles, descriptions, content, etc. count), as Google cannot read images or colors
Hope it helps.