Duplicate Content - Local SEO - 250 Locations
-
Hey everyone,
I'm currently working with a client who has 250 locations across the United States. Each location has its own website, and each website has the same 10 service pages, all with identical content (the same 500-750 words) except for unique meta data and NAP reflecting each location's name, city, state, etc.
I'm unsure how duplicate content works at the local level. I understand that there is no penalty for duplicate content; rather, any negative side effects occur because search engines don't know which page to serve when there are duplicates.
So here's my question:
If someone searches for my client's services in Miami, and my client has only one location in that city, does duplicate content matter? That location isn't competing against any of my client's other locations locally, so search engines shouldn't be confused about which page to serve, correct?
Of course, in other cities, like Phoenix, where they have 5 locations, I'm sure the duplicate content is negatively affecting all 5 locations.
I really appreciate any insight!
Thank you,
-
** I was just curious if anyone knew if the duplicate content would suppress traffic for locations that aren't in the same city.**
If Google sees pages on your site that are substantially duplicate, it will filter all but one of them from the SERPs.
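For what it's worth, you can get a rough sense of how "substantially duplicate" two service pages are with a simple text-similarity check. Here's a minimal sketch using Python's standard-library difflib (the page copy and phone numbers are made up for illustration, and this is obviously not how Google measures duplication):

```python
from difflib import SequenceMatcher

def similarity(page_a: str, page_b: str) -> float:
    """Return a rough 0-1 similarity score between two blocks of page copy."""
    return SequenceMatcher(None, page_a, page_b).ratio()

# Two hypothetical location pages: identical body copy, only the NAP differs.
miami = "We offer full-service HVAC repair in Miami, FL. Call (305) 555-0100."
phoenix = "We offer full-service HVAC repair in Phoenix, AZ. Call (602) 555-0100."

print(similarity(miami, phoenix))  # high score: pages differ only in NAP
```

Running something like this across the 250 sites would show how close to identical the pages really are once the NAP swaps are ignored.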
** is it even possible to re-write the same 750 word service page "uniquely" 250 times? Ha.**
Yes. The reward is enormous. Ha.
-
Hey There!
In my view, the client has 2 options here:
-
Spring for unique content on the 250 sites
-
Reconsider his decision about bringing everything into a single site. The question you've asked (can you really write about the identical service 250 times?) is exactly why he should recognize that his strategy is cumbersome. Ideally, you'd have a good handful of unique pages describing the benefits of the service, and then 250 semi-unique pages on the website, one for each physical location.
-
-
Hi SEO Team @ G5!
Since you are unable to create one large domain that houses all of the locations, I would attempt to make each of the websites as "unique" as possible. But keep in mind that unique content doesn't necessarily mean that you need to completely reword the content in different ways 250 times. Small changes can make a big difference.
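To illustrate the "small changes" point: the per-location variations can even be scripted. Below is a minimal sketch (the business names, copy, and field names are all made up for illustration) that merges each location's NAP into a shared service-page template:

```python
from string import Template

# Hypothetical service-page template; $name, $city, $state, and $phone
# are the per-location NAP fields that get swapped in.
SERVICE_PAGE = Template(
    "<h1>Plumbing Services at $name</h1>\n"
    "<p>Serving $city, $state. Call us at $phone.</p>"
)

locations = [
    {"name": "Acme Plumbing Miami", "city": "Miami", "state": "FL",
     "phone": "(305) 555-0100"},
    {"name": "Acme Plumbing Phoenix", "city": "Phoenix", "state": "AZ",
     "phone": "(602) 555-0100"},
]

# Build one page per location, keyed by city.
pages = {loc["city"]: SERVICE_PAGE.substitute(loc) for loc in locations}
print(pages["Miami"])
```

Of course, the whole point of this thread is that swapping only the NAP still leaves the body copy identical, so a script like this is just a starting point; the real differentiation has to come from genuinely unique copy per location.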
There's a great (and short) video in which Google's Matt Cutts talks about how Google handles duplicate content. There's also another helpful video about it here.
Matt Cutts has said, "Google looks for duplicate content and where we can find it, we often try to group it all together and treat it as if it's just one piece of content. So most of the time, suppose we're starting to return a set of search results and we've got two pages that are actually kind of identical. Typically we would say, 'OK, rather than show both of those pages since they're duplicates, let's just show one of those pages and we'll crowd the other result out,' and then if you get to the bottom of the search results and you really want to do an exhaustive search, you can change the filtering so that you can say, 'OK, I want to see every single page' and then you'd see that other page. But for the most part, duplicate content isn't really treated as spam. It's just treated as something we need to cluster appropriately and we need to make sure that it ranks correctly, but duplicate content does happen."
Read more from this article here: https://searchenginewatch.com/sew/news/2319706/googles-matt-cutts-a-little-duplicate-content-wont-hurt-your-rankings
With this in mind, I do think your assumption is correct. If you make sure that any locations that could be seen as competing in the same area have unique content, they won't necessarily be dinged for duplicated content. Unless you were trying to rank nationally, this shouldn't be a major problem for each individual website targeting a different location.
-
Thanks for your response. We would love to move to a single domain, but unfortunately the client won't allow us to make that change.
I agree that ideally all 250 locations would have unique content, but I was just curious if anyone knew if the duplicate content would suppress traffic for locations that aren't in the same city.
Also, my other concern is: is it even possible to rewrite the same 750-word service page "uniquely" 250 times? Ha.
-
I would also make them into one big website.
But at the same time, I would have full unique content for each of the 250 locations. I know that sounds like a huge expense and a lot of work, but any company who has the resources to support 250 locations can support the small expense of unique content for each of them.
-
I completely understand where you are coming from, but I can only advise that you scrap all of the individual sites and make them into one big website. I know that's easier said than done, and there are most likely complications that prevented them from doing it in the first place, but it really is the best thing to do.
I do believe that the duplication will still matter, even if you only have one office/store in that location.