"Original Content" Dynamic Hurting SEO? -- Strategies for Differentiating Template Websites for a Nationwide Local Business Segment?
-
The Problem

I have a stable of clients spread around the U.S. in the maid service/cleaning industry. Each client is a franchisee, but their business is truly 'local': a local service area, local phone/address, unique business name, and virtually complete control over their web presence (URL, site design, content), apart from a few branding guidelines.
Over time I've developed a website template with a high lead conversion rate, and I've rolled this website out to 3 or 4 dozen clients. Each client has exclusivity in their region/metro area.
Lately my white-hat backlinking strategies have not been yielding the results they did a year ago, including legitimate directories, customer blogging (as compelling as maid service/cleaning blogs can really be!), and some article writing. This is expected, or at least reflected in articles on SEO trends and directory/article strategies.
I am writing this question because I see sites with seemingly much weaker backlink profiles outranking my clients (using SEOmoz toolbar and Site Explorer stats, and factoring in general quality-vs.-quantity dynamics).
Questions

Assuming general on-page optimization and linking factors are equal:
- Might my clients be suffering because they're using my oft-repeated template website (albeit with some unique 'content' variables)?
- If I choose to differentiate each client's website, how much differentiation makes sense? Specifically:
  - Even if primary content (the copy, essentially) is differentiated, will Google still interpret the matching code structure as 'the same website'?
  - Are images as important as copy in differentiating content?
  - From a 'machine' or algorithm perspective evaluating unique content, would strategies such as saving the images in a different format, altering them slightly in Photoshop, or using unique CSS selectors or slightly different table structures for each site (differentiating the code) be effective?
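On the algorithm-perspective question: Google's internals aren't public, but a standard textbook technique for detecting near-duplicate text is word shingling plus Jaccard similarity. A minimal illustrative sketch (the sample page copy below is invented, and this is not Google's actual algorithm):

```python
# Minimal sketch of near-duplicate detection via word shingles and
# Jaccard similarity -- illustrative only, not Google's real algorithm.

def shingles(text, w=3):
    """Return the set of w-word shingles in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets (0.0 disjoint, 1.0 identical)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Two hypothetical franchisee pages: identical template copy, only the
# city/state swapped.
page_a = ("Our professional maid service provides weekly, biweekly, and one time "
          "house cleaning for homes and apartments in Denver Colorado. "
          "Call today for a free estimate from our friendly local team.")
page_b = ("Our professional maid service provides weekly, biweekly, and one time "
          "house cleaning for homes and apartments in Austin Texas. "
          "Call today for a free estimate from our friendly local team.")

sim = jaccard(shingles(page_a), shingles(page_b))
print(f"shingle similarity: {sim:.2f}")  # → 0.75
```

The point of the sketch: swapping only the geo terms leaves most shingles intact, so the pages still look highly similar to a content-based comparison. Superficial tricks (re-saving images, tweaking CSS selectors) wouldn't change a text measure like this at all; only genuinely different copy would.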
Considerations My understanding of Google's "duplicate content " dynamics is that they mainly apply to de-duping search results at a query specific level, and choosing which result to show from a pool of duplicate results. My clients' search terms most often contain client-specific city and state names.
Despite the "original content" mantra, I believe my clients being local businesses who have opted to use a template website (an economical choice), still represent legitimate and relevant matches for their target user searches -- it is in this spirit I ask these questions, not to 'game' Google with malicious intent.
In an ideal world my clients would each have their own unique website developed, but these are Main Street business owners balancing economics, and I'm trying to provide them with scalable solutions.
Thank You!

I am new to this community; thank you for any thoughts, discussion, and comments!
-
Since you're generally doing all the right things, I'd recommend comparing the inbound link quality, volume, and diversity of each of your sites against their individual market competitors. Beyond that, it would need to be a case-by-case evaluation to better nail down issues/problems.
On a final note, social has become a big signal and should be highly encouraged as well (Twitter engagement, for example), though I know it's a challenge in that type of market.
-
Hi Alan,
The template site is fairly basic static HTML. Address/contact info is repeated on every page in an 'About Us' sidebar box, with prominent phone numbers throughout; a 'Service Area' table that lists cities is also on every page. The site totals about 27 HTML pages at an average of ~25 KB per page.
We could definitely differentiate the image alt tags further.
Geographic information is included in title tags for home page and all service-offered related pages, but not in title tags for pages like 'privacy policy.'
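For what it's worth, the geo-differentiated title tags described above can be generated from the template variables rather than hand-edited per site. A hypothetical sketch (business names, cities, and page slugs are invented):

```python
# Hypothetical sketch: generate per-client title tags from the shared
# template, so each franchisee's pages carry unique geo-specific strings
# instead of identical boilerplate. Client data below is invented.

clients = [
    {"name": "Sparkle Maids", "city": "Denver", "state": "CO"},
    {"name": "Shine Clean", "city": "Austin", "state": "TX"},
]

pages = ["home", "weekly-cleaning", "move-out-cleaning", "privacy-policy"]

# Geo terms belong on the home and service pages; utility pages such as
# the privacy policy stay generic, matching the setup described above.
GEO_PAGES = {"home", "weekly-cleaning", "move-out-cleaning"}

def title_for(client, page):
    label = page.replace("-", " ").title()
    if page in GEO_PAGES:
        return f"{label} | {client['name']} - {client['city']}, {client['state']}"
    return f"{label} | {client['name']}"

for client in clients:
    for page in pages:
        print(title_for(client, page))
```

The same pattern extends naturally to image alt attributes and meta descriptions, which keeps the per-site differentiation consistent and cheap to maintain.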
Google Places, Yelp, Yahoo/Bing Local etc. are all in place.
Thank you for your feedback!
-
When you ask about the templatized repetitiveness, I have to wonder how much code exists underneath the visible content. An overwhelming ratio of code to on-page content can, by itself, negatively impact a site's uniqueness when there are dozens, hundreds, or thousands of identical templates; however, it should be a minor concern if there's enough unique content specific to the geo-location and the individual site owner.
So, for example: Is geographic information included in every page title and within every page's content? Are site owners able to include their own unique image alt attribute text? Is their address and contact info on every page? Do they have their own Google Places pages (properly optimized, and pointing back to their site's contact page)? Do they also have Yelp, CitySearch, Bing Local, or Yahoo Local listings similarly set up?
All of these can help.
As far as the template repetition, if the rest of the above is all properly utilized, it shouldn't be a major problem, so I'd start looking at those considerations and go from there.