"Original Content" Dynamic Hurting SEO? -- Strategies for Differentiating Template Websites for a Nationwide Local Business Segment?
-
The Problem
I have a stable of clients spread around the U.S. in the maid service/cleaning industry. Each client is a franchisee; however, their business is truly 'local', with a local service area, local phone/address, a unique business name, and virtually complete control over their web presence (URL, site design, content; apart from a few branding guidelines).
Over time I've developed a website template with a high lead conversion rate, and I've rolled this website out to 3 or 4 dozen clients. Each client has exclusivity in their region/metro area.
Lately my white-hat link-building strategies have not been yielding the results they were a year ago, including legitimate directories, customer blogging (as compelling as maid service/cleaning blogs can really be!), and some article writing. This is expected, or at least reflected in articles on SEO trends and directory/article strategies.
I am writing this question because I see sites with seemingly much weaker backlink profiles outranking my clients (using the SEOmoz toolbar and Site Explorer stats, and factoring in general quality vs. quantity dynamics).
Questions
Assuming general on-page optimization and linking factors are equal:
- Might my clients be suffering because they're using my oft-repeated template website (albeit with some unique 'content' variables)?
- If I choose to differentiate each client's website, how much differentiation makes sense? Specifically:
- Even if primary content (essentially, the copy) is differentiated, will Google still interpret the matching code structure as 'the same website'?
- Are images as important as copy in differentiating content?
- From a 'machine' or algorithm perspective evaluating unique content, would strategies be effective such as saving the images in a different format, altering them slightly in Photoshop, or using unique CSS selectors or slightly different table structures for each site (i.e., differentiating the code)?
Considerations
My understanding of Google's "duplicate content" dynamics is that they mainly apply to de-duping search results at a query-specific level, choosing which result to show from a pool of duplicates. My clients' search terms most often contain client-specific city and state names.
Despite the "original content" mantra, I believe my clients being local businesses who have opted to use a template website (an economical choice), still represent legitimate and relevant matches for their target user searches -- it is in this spirit I ask these questions, not to 'game' Google with malicious intent.
In an ideal world my clients would all have their own unique website developed, but these are Main Street business owners balancing quality with economics, and I'm trying to provide them with scalable solutions.
Thank You!
I am new to this community; thank you for any thoughts, discussion, and comments!
-
Since you're generally doing all the right things, I'd recommend looking at the inbound link quality, volume, and diversity for each of your sites compared to their individual market competitors. Beyond that, it would need to be a case-by-case evaluation to better nail down issues/problems.
On a final note, social has become a big signal and should be highly encouraged as well (Twitter engagement, for example), though I know it's a challenge in that type of market.
-
Hi Alan,
The template site is fairly basic static HTML. Address/contact info is repeated on every page in an 'About Us' sidebar box, phone numbers are prominent throughout, and a 'Service Area' table listing cities appears on every page. The site totals about 27 HTML pages averaging ~25 KB per page.
We could definitely differentiate the image alt tags further.
Geographic information is included in title tags for home page and all service-offered related pages, but not in title tags for pages like 'privacy policy.'
Google Places, Yelp, Yahoo/Bing Local etc. are all in place.
Thank you for your feedback!
-
When you ask about the templatized repetitiveness, I have to wonder how much code exists underneath the visible content. If there is an overwhelming ratio of code to on-page content, this can, by itself, negatively impact a site's uniqueness when there are dozens, hundreds, or thousands of identical templates; however, it should be a minor concern if there's enough unique content specific to the geo-location and the individual site owner.
So, for example: is geographic information included in every page title and within every page's content? Are site owners able to include their own unique image alternate-attribute text? Is their address and contact info on every page? Do they have their own Google Places pages (properly optimized, and pointing back to their site's contact page)? Do they also have Yelp, CitySearch, Bing Local, or Yahoo Local listings similarly set up?
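To illustrate, here is a minimal sketch of what that geo-specific, owner-specific markup might look like on one franchisee's page (the business name 'Sparkle Maids', city, and phone number are all hypothetical):

```html
<head>
  <!-- Geo-targeted title and meta description, unique per franchisee -->
  <title>House Cleaning &amp; Maid Service in Springfield, IL | Sparkle Maids</title>
  <meta name="description"
        content="Sparkle Maids provides house cleaning and maid service in Springfield, IL. Call (217) 555-0134 for a free estimate.">
</head>
<body>
  <!-- Owner-specific alt text instead of a generic template string -->
  <img src="team-photo.jpg"
       alt="Sparkle Maids cleaning team serving Springfield, Illinois">

  <!-- Address and contact info repeated on every page -->
  <address>
    Sparkle Maids<br>
    123 Main St, Springfield, IL 62701<br>
    (217) 555-0134
  </address>
</body>
```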
All of these can help.
As far as the template repetition, if the rest of the above is all properly utilized, it shouldn't be a major problem, so I'd start looking at those considerations and go from there.
Related Questions
-
Buying a disused website and using their content - penalty risk?
Hi all, I'm in the process of setting up a new website. I have found various old websites covering a similar topic, and I'm interested in purchasing two of them for their content, which is very good despite those sites struggling to make ends meet. One of these websites is still live; the other hasn't been live for two years. If I bought these websites for their content, used that content on my new domain, and made sure the two source websites were offline, would I run the risk of getting penalised? Does Google hold onto content from a website even if it is now offline?
-
Google WMT/Search Console: Thousands of "Links to your site" despite only one backlink from a website.
Hi, I can see in my Search Console that a website is reported as giving thousands of links to my site, when there is really only one backlink from one of their pages to one of ours. Why is this happening? Here is a screenshot: http://imgur.com/a/VleUf
-
B2B site targeting 20,000 companies with 20,000 dedicated "target company pages" on own website.
An energy company I'm working with has decided to target 20,000-odd companies on their own B2B website by producing a new dedicated page per target company, each page including unique copy and a sales proposition (20,000-odd new pages to optimize! Yikes!). I've never come across such an approach before. What might the SEO pitfalls be (other than that it's a helluva number of pages to optimize)? Any thoughts would be very welcome.
-
Woocommerce SEO & Duplicate content?
Hi Moz fellows, I'm new to WooCommerce and couldn't find help on Google about certain SEO-related things. All my past projects were simple five-page websites plus a blog, so I would just noindex categories, tags, and archives to eliminate duplicate content errors. But with WooCommerce product categories and tags, I've noticed that many e-commerce websites with high domain authority actually rank for certain keywords just by having their categories/tags indexed. For example: keyword 'hippie clothes' = etsy.com/category/hippie-clothes (fictional example). The problem is that if I have 100 products and 10 categories and tags on my site, it creates THOUSANDS of duplicate content errors; but if I noindex categories and tags, they will never rank well once my domain authority rises... Anyone have experience/comments about this? I use the SEO by Yoast plugin. Your help is greatly appreciated! Thank you in advance. -Marc
-
"noindex, follow" or "robots.txt" for thin content pages
Does anyone have any testing evidence on which is better to use for pages with thin content that are nevertheless important to keep on a website? I am referring to content shared across multiple websites (such as e-commerce, real estate, etc.). Imagine a website with 300 high-quality pages indexed and 5,000 thin product-type pages that would not generate relevant search traffic. The question is: does the interlinking value achieved by "noindex, follow" outweigh the negative of Google having to crawl all those "noindex" pages? With robots.txt, Google's crawling focuses on just the important pages that are indexed, and that may give rankings a boost. Any experiments with insight into this would be great. I do get the story about "make the pages unique", "get customer reviews and comments", etc., but the above is the important question here.
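For reference, here is a minimal sketch of the two options being compared (the /products/thin/ path is hypothetical):

```html
<!-- Option 1, placed in the <head> of each thin page: the page is kept
     out of the index, but Google still crawls it and follows its links -->
<meta name="robots" content="noindex, follow">
```

```
# Option 2, in robots.txt: crawling of the thin pages is blocked entirely,
# preserving crawl budget, but no internal-link value flows through them
User-agent: *
Disallow: /products/thin/
```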
-
SEO Strategy for Microsite
I am working on a project to build a microsite of sorts that will represent a joint program between two large organizations with established web presences and strong domains. Each of the organizations has dedicated sections on their sites speaking to the program, but the leadership has decided the joint program deserves its own site with dedicated content. The two larger sites perform very well for SEO, and I don't necessarily want to jeopardize their rankings by delivering content that competes directly with them. So I am doing some keyword research to find opportunities that will allow me to use the new site to target keywords not yet capitalized on by the larger sites. My grand scheme is to have the three sites targeting the broadest array of keywords possible, thus maximizing exposure and avoiding competition. Here is the rub: the content between the three sites will be different but very similar, and there will be plenty of cross-linking, especially from the existing sites to the new site, as we grow the brand of the joint program. I'm curious to hear some expert opinions on what the pitfalls of this strategy are and what I can do to avoid falling into the black-hat category. I recognize that proliferating sites around a single topic and cross-linking them is black hat. The organizations simply want to build a brand around a joint program, and we are struggling to do that without a dedicated website.
-
Are there any SEO Tips before killing a website?
Hey guys, my company acquired another company, and after a couple of months we decided to completely kill their website. I'm not finding any info about SEO best practices for this type of situation. From the "switching domains" and "new sites" articles and blog posts, I can extrapolate that I should:
- 301 redirect their home page to ours
- Look at specific pages with good authority that relate to our pages and 301 them
- Look at the strongest backlinks to their site and try to change them to point to our site
- Create a 404 page for the rest of their webpages that tells visitors we acquired the company (hopefully with a main menu and search bar)
Any other suggestions?
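For what it's worth, a hedged sketch of the redirect portion, assuming the old site runs on Apache (domains and paths here are hypothetical):

```apache
# .htaccess on the acquired (old) domain
RewriteEngine On

# 301 the old home page to ours
RewriteRule ^$ https://www.new-site.com/ [R=301,L]

# 301 old pages with authority/backlinks to their closest equivalents
RewriteRule ^services/office-cleaning$ https://www.new-site.com/services [R=301,L]
RewriteRule ^about-us$ https://www.new-site.com/about [R=301,L]

# Everything else falls through to the custom 404 page that explains
# the acquisition (with a main menu and search bar)
ErrorDocument 404 /acquired-404.html
```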
-
Local Business schema / markup
What markup should local businesses employ on their website? I'm aware of the newer schema.org markup but does Google still use rich snippets, geotags, etc.?
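As a concrete example, a minimal schema.org LocalBusiness sketch in JSON-LD (all business details are hypothetical; the same properties can also be expressed as microdata inline in the HTML):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Sparkle Maids",
  "telephone": "+1-217-555-0134",
  "url": "https://www.example-maids.com/",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>
```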