"Original Content" Dynamic Hurting SEO? -- Strategies for Differentiating Template Websites for a Nationwide Local Business Segment?
-
The Problem

I have a stable of clients spread around the U.S. in the maid service/cleaning industry. Each client is a franchisee, but each business is truly 'local', with a local service area, local phone and address, a unique business name, and virtually complete control over its web presence (URL, site design, content) apart from a few branding guidelines.
Over time I've developed a website template with a high lead conversion rate, and I've rolled this website out to 3 or 4 dozen clients. Each client has exclusivity in their region/metro area.
Lately my white-hat link-building strategies have not been yielding the results they were a year ago -- legitimate directories, customer blogging (as compelling as maid service/cleaning blogs can really be!), and some article writing. This is expected, or at least reflected in articles on SEO trends and directory/article strategies.
I am writing this question because I see sites with seemingly much weaker back link profiles outranking my clients (using SEOMoz toolbar and Site Explorer stats, and factoring in general quality vs. quantity dynamics).
Questions

Assuming general on-page optimization and linking factors are equal:
- Might my clients be suffering because they're using my oft-repeated template website (albeit with some unique 'content' variables)?
If I choose to differentiate each client's website, how much differentiation makes sense? Specifically:
- Even if primary content (essentially the copy) is differentiated, will Google still interpret the matching code structure as 'the same website'?
- Are images as important as copy in differentiating content?
- From a 'machine' or algorithm perspective evaluating unique content, would strategies be effective such as saving the images in a different format, altering them slightly in Photoshop, or using unique CSS selectors or slightly different table structures for each site (differentiating the code)?
Considerations

My understanding of Google's "duplicate content" dynamics is that they mainly apply to de-duping search results at a query-specific level -- choosing which result to show from a pool of duplicate results. My clients' search terms most often contain client-specific city and state names.
Despite the "original content" mantra, I believe my clients being local businesses who have opted to use a template website (an economical choice), still represent legitimate and relevant matches for their target user searches -- it is in this spirit I ask these questions, not to 'game' Google with malicious intent.
In an ideal world my clients would all have their own unique website developed, but these are Main St business owners balancing solutions with economics and I'm trying to provide them with scalable solutions.
Thank You! I am new to this community -- thanks for any thoughts, discussion, and comments!
-
Since you're generally doing all the right things, I'd recommend looking at the inbound link quality/volume/diversity for each of your sites compared to its individual market competitors. Beyond that, it would need to be a case-by-case evaluation to better nail down issues.
On a final note, social has become a big signal and should be highly encouraged as well (Twitter engagement, for example), though I know it's a challenge in that type of market.
-
Hi Alan,
The template site is fairly basic static HTML. Address/contact info is repeated on every page in an 'About Us' sidebar box, prominent phone numbers appear throughout, and a 'Service Area' table listing cities is on every page. The site totals about 27 HTML pages, averaging ~25 KB per page.
We could definitely differentiate the image alt tags further.
Geographic information is included in title tags for home page and all service-offered related pages, but not in title tags for pages like 'privacy policy.'
Google Places, Yelp, Yahoo/Bing Local etc. are all in place.
Thank you for your feedback!
-
When you ask about the templatized repetitiveness, I have to wonder how much code exists underneath the visible content. If there is an overwhelming ratio of code to on-page content, this can, by itself, hurt a site's uniqueness when there are dozens, hundreds, or thousands of identical templates. However, it should be a minor concern if there's enough unique content specific to the geo-location and the individual site owner.
So, for example: is geographic information included in every page title and within every page's content? Are site owners able to include their own unique image alt attribute text? Is their address and contact info on every page? Do they have their own Google Places pages (properly optimized, and pointing back to their site's contact page)? And do they also have Yelp, CitySearch, Bing Local, or Yahoo Local listings similarly set up?
All of these can help.
As far as the template repetition, if the rest of the above is all properly utilized, it shouldn't be a major problem, so I'd start looking at those considerations and go from there.
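One machine-readable way to reinforce the per-site signals listed above (unique name, address, phone, service area on every page) is LocalBusiness structured data. This is a hedged sketch with entirely placeholder values, not something discussed in the thread -- each franchisee's site would carry its own details:

```html
<!-- Sketch with placeholder values; every name, number, and place
     below is illustrative, not from the thread. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Maid Service of Springfield",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Springfield",
    "addressRegion": "IL"
  },
  "areaServed": "Springfield metro area"
}
</script>
```

Because the values differ per franchisee, this also adds genuinely unique, machine-readable content to otherwise identical templates.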
Related Questions
-
SEO'ing a sports advice website
Hi Team Moz, Despite being in tech/product development for 10+ years, I'm relatively new to SEO (and completely new to this forum), so I was hoping for community advice before I dive in to see how Google likes (or perhaps doesn't) my soon-to-be-built content. I'm building a site (BetSharper, an early-stage work in progress) that will deliver practical, data-oriented predictive advice prior to sporting events commencing. The initial user personas I am targeting would need advice on specific games, so, as an example, I would build a specific page for the upcoming Stanley Cup Game 1 between the Capitals and the Tampa Bay Lightning. I'm in the midst of keyword research and believe I have found some easier-to-achieve initial keywords (I'm realistic -- building my DA will take time!) that include the team names but don't reference dates or the stage of the tournament. The question is: hypothetically, if I ranked this page for this sporting event this year, would it make sense to refresh the same page with 2019 matchup content when they meet again next year, or to create a new page? I am assuming I would be targeting the same intended keywords, but I wonder if I get Google credit for 2018 engagement after a 2019 refresh. Or should I start fresh with a new page and target keywords afresh each time? I read some background info on canonical tags but wasn't sure if it was relevant in my case. I hope I've managed to articulate myself on what feels like an edge case within the wonderful world of SEO. Any advice the community delivers would be much appreciated. Kind Regards, James.
Intermediate & Advanced SEO | JB19770
-
Our company creates cobranded subdomains for our clients; does that hurt our SEO?
We create cobranded websites for local businesses in many towns throughout the United States, always under a subdomain of our main site (e.g., afcreditunion.teachbanzai.com). Does this hurt our SEO rankings? We have a very specific reason for creating these microsites: they're a high selling point. I've read and watched the material here on Moz regarding subdomains and subfolders, but it doesn't quite answer my question, since we create all these microsites not with the intent of passing authority to our main site, but so that each microsite carries the client's branding.
Intermediate & Advanced SEO | teachbanzai0
-
Will changing category URLs on site hurt SEO?
Hi Moz Community, We're looking to replace some URLs on our WordPress site, and I want to make sure we won't hurt our SEO with the changes. The site is lushpalm.com. When we originally launched, we created pages (linked to in our main menu) to essentially display our categories. We did this as a workaround because we didn't like the URL to have the word "category" in it. Now we would like to make some changes, and we want to make sure we won't hurt our SEO in any way by accidentally duplicating content or otherwise. We want to fix our structure and now link to our category pages from our main menu, BUT we want to change the URL of each category page so that it doesn't have "category" in it, essentially renaming it with the name of the page currently linked to in our main menu. So basically, the category lushpalm.com/category/surf-trips would be renamed with the URL lushpalm.com/surf-trips, and the current page at lushpalm.com/surf-trips would therefore be replaced. My questions are: If we did this, would the previous lushpalm.com/category/surf-trips cease to exist, or is there some imprint of it out on the web? And if so, would it redirect to the new page? Would replacing the current page URL with a category hurt our current SEO in any way? Would this change cause any duplicate pages somehow? Thanks so much for your help!
Intermediate & Advanced SEO | TaraLP1
-
Is a One Page Website template bad for SEO?
I have a website for a freelancer who is using a one-page template that includes the following sections: About Him, Portfolio, Resume. I also have 5 separate pages targeting the keywords he wants to rank for. Will this be sufficient, or should I suggest he move to a multi-page website template?
Intermediate & Advanced SEO | iamgaurav12900
-
"noindex, follow" or "robots.txt" for thin content pages
Does anyone have any testing evidence on which is better to use for pages with thin content that are nevertheless important to keep on a website? I am referring to content shared across multiple websites (such as e-commerce, real estate, etc.). Imagine a website with 300 high-quality pages indexed and 5,000 thin product-type pages -- pages that would not generate relevant search traffic. The question is: does the interlinking value achieved by "noindex, follow" outweigh the negative of Google having to crawl all those "noindex" pages? With robots.txt, Google's crawling focus stays on just the important, indexed pages, which may give rankings a boost. Any experiments with insight into this would be great. I do get the story about "make the pages unique", "get customer reviews and comments", etc. -- but the above is the important question here.
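For reference, the first of the two mechanisms being compared looks like this (a sketch; where you place it is simply the head of each thin page):

```html
<!-- Option A ("noindex, follow"): goes in the <head> of each thin page.
     The page must still be crawled for Google to see this tag, but its
     outgoing internal links can still be followed. -->
<meta name="robots" content="noindex, follow">
```

Option B is a robots.txt rule such as `Disallow: /products/` under `User-agent: *` (the path pattern is an assumed example). That saves crawl budget because the pages are never fetched, but it also means crawlers cannot see their internal links at all, and blocked URLs can still surface as URL-only results if linked from elsewhere.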
Intermediate & Advanced SEO | khi50
-
Dynamic 301's causing duplicate content
Hi, wonder if anyone can help? We have just changed our site, which was hosted on IIS with page URLs like this (example.co.uk/Default.aspx?pagename=About-Us). The new page URL is example.co.uk/About-Us/ and the site now runs on Apache. The 301s our developer told us to use were in this format:

RewriteCond %{REQUEST_URI} ^/Default.aspx$
RewriteCond %{QUERY_STRING} ^pagename=About-Us$
RewriteRule ^(.*)$ http://www.domain.co.uk/About-Us/ [R=301,L]

This seemed to work from a 301 point of view; however, it also seems to allow both of the URLs below to serve the same page: example.co.uk/About-Us/?pagename=About-Us and example.co.uk/About-Us/. Webmaster Tools has now picked up on this and is seeing it as duplicate content. Can anyone help explain why it would do this? I'm not totally clued up, and our host/developer can't understand it either. Many thanks.
Intermediate & Advanced SEO | GoGroup51
-
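A likely explanation for the duplicate URLs above: Apache ignores the query string when mapping /About-Us/ to a page, so /About-Us/?pagename=About-Us serves the same content as /About-Us/. One possible fix, sketched here on the assumption that the rules live in the same per-directory .htaccess context as the question's rules, is a second 301 that strips the stray parameter; a trailing "?" in the substitution empties the query string:

```apache
# If the clean URL is requested with the old pagename parameter,
# 301 it to the clean URL; the trailing "?" discards the query string.
RewriteCond %{QUERY_STRING} ^pagename=
RewriteRule ^About-Us/$ /About-Us/? [R=301,L]
```

A complementary safeguard is a rel="canonical" link on each page pointing at its clean URL, so any parameter variant that slips through consolidates to one indexed URL.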
What is the effect of using jQuery sliders for content on SEO?
I know using CSS in subversive ways gets you dinged. I didn't know if JS counted the same, since you are essentially hiding parts of the content and showing it in intervals as slides. The goal would be having key items for a client in divs and rotating those divs via a slider plugin as slides. I was just curious if that affected things in any way. Thanks! ~Paul
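For context on why a slider need not hide anything from crawlers: if every slide is present in the initial HTML and the script only toggles visibility, all the copy stays in the source that search engines fetch. A minimal vanilla-JS sketch (the #slider markup and 4-second interval are illustrative assumptions, not from the thread):

```javascript
// Pure helper: index of the slide to show next (wraps after the last).
function nextSlideIndex(current, slideCount) {
  return (current + 1) % slideCount;
}

// Browser-only wiring; guarded so the helper is reusable elsewhere.
// Assumes markup like <div id="slider"><div>…</div><div>…</div></div>,
// with all slide copy already present in the crawled HTML.
if (typeof document !== "undefined") {
  const slides = Array.from(document.querySelectorAll("#slider > div"));
  let current = 0;
  slides.forEach((slide, i) => {
    slide.style.display = i === 0 ? "block" : "none"; // show first slide
  });
  setInterval(() => {
    slides[current].style.display = "none";
    current = nextSlideIndex(current, slides.length);
    slides[current].style.display = "block";
  }, 4000);
}
```

The key design point is that the rotation is purely presentational: nothing is injected or removed from the DOM, so the visible-on-load question is the only one left for a crawler to judge.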
Intermediate & Advanced SEO | peb72680
-
How do you implement dynamic SEO-friendly URLs using Ajax without using hashbangs?
We're building a new website platform and are using Ajax as the method for allowing users to select filters. We want to dynamically insert elements into the URL as filters are selected so that search engines will index multiple combinations of filters. We're struggling to see how this is possible using the Symfony framework. We've used www.gizmodo.com as an example of how to achieve SEO- and user-friendly URLs, but it is only an example of achieving this for static content. We would prefer not to go down a route that involves hashbangs if possible. Does anyone have any experience using hashbangs, and how did it affect their site? Any advice on the above would be gratefully received.
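One widely used alternative to hashbangs is the HTML5 History API: build a real path for each filter combination and push it with history.pushState while the Ajax request runs. A sketch under stated assumptions -- the function names (buildFilterPath, applyFilters) and the /base/key-value/ URL scheme are illustrative, not Symfony specifics, and the server must also be able to render each such path directly for crawlers:

```javascript
// Pure helper: map a filter selection to one canonical, crawlable path.
// Sorting the keys means {size, colour} and {colour, size} produce the
// same URL, avoiding duplicate-content variants of one filter set.
function buildFilterPath(baseSegment, filters) {
  const segments = Object.keys(filters)
    .sort()
    .map((key) => key + "-" + filters[key]);
  return "/" + [baseSegment].concat(segments).join("/") + "/";
}

// Browser-only wiring (guarded so the helper runs anywhere):
if (typeof window !== "undefined" && window.history) {
  function applyFilters(filters) {
    const path = buildFilterPath("products", filters);
    window.history.pushState({ filters: filters }, "", path); // no reload
    // ...fire the Ajax request and render the filtered results here...
  }
  // Back/forward buttons re-deliver the state saved above:
  window.addEventListener("popstate", function (event) {
    // re-render from event.state.filters
  });
}
```

Because pushState changes only the address bar, the crucial companion requirement is server-side: each generated path must return the same content when fetched cold, or crawlers indexing those URLs will see nothing.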
Intermediate & Advanced SEO | Sayers1