Dealing with thin content/95% duplicate content - canonical vs 301 vs noindex
-
My client's got 14 physical locations around the country but has a webpage for each "service area" they operate in.
They have a Croydon location, but separate pages for London, Essex, Luton, Stevenage and many other areas near Croydon that the Croydon location serves. Each of these pages is a near duplicate of the Croydon page, with the word "Croydon" swapped for the area name.
I'm told this was an SEO tactic circa 2001. Obviously it's an issue now.
So the question: should I 301 redirect each of these pages to the Croydon page? Or (what I believe to be the better answer) set a rel=canonical tag on the duplicate pages pointing at it?
Creating "real and meaningful content" on each page isn't quite an option, sorry!
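For reference, the 301 option would be something like the following in an Apache .htaccess file. This is only a sketch: the paths are invented for illustration, and the real URLs would be whatever the client's site actually uses.

```apache
# Hypothetical sketch: permanently redirect each near-duplicate
# service-area page to the real Croydon location page.
# All paths here are made up for the example.
Redirect 301 /london-lift-repair/ /croydon-lift-repair/
Redirect 301 /essex-lift-repair/ /croydon-lift-repair/
Redirect 301 /luton-lift-repair/ /croydon-lift-repair/
```

Note that mod_alias `Redirect` matches by URL-path prefix, so each duplicate page needs its own line (or a `RedirectMatch` pattern if the URLs follow a consistent naming scheme).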
-
Hi James,
The way this was done is not good, but the idea is right. If the client has 14 different offices, it is standard best practice to create a unique page for each of them, and, yes, that means unique content, too. Rather than dealing with redirects and rel tags, my advice is to reconsider your stance on creating meaningful content for each page. You just need to find a unique story to tell for each one. If the client is a service business, then doubtless they've got a unique story from each service area to share. Put a minimum of 400 engaging words together, add the complete NAP (name, address, phone), a map of the service area, 5-10 local testimonials, maybe a video of the client at work in the local area and, bingo, you've now got a strong page that's locally optimized for the community it serves.
Are you sure this isn't an option? It will take some effort/funding, but the business needs to understand the importance of taking a high quality approach to presenting themselves on the web, or they are going to be overrun by competitors who are willing to make these efforts.
-
Hi James,
You've in fact come up with the answer I would have suggested.
One thing to think about is why these pages came into existence, and why this was a tactic 10+ years ago...
The reason, I would think, is keywords:
Croydon Lift Repair
Essex Lift Repair
London Lift Repair
Depending on the client's website, and using my example service: if they happen to have a very strong presence for the keyword "lift repair", then you could probably incorporate those longer keywords ("London lift repair") onto the main lift repair page and simply set a canonical on the subpages.
If they don't have a strong presence on the root keyword ("lift repair") but do show good results for the longer keywords ("London lift repair"), then all you really have to worry about is 14 pages of content, which in the scheme of a website is not that much.
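If you do go the canonical route for the first scenario, the tag sits in the head of each near-duplicate subpage and declares the main page as the preferred version to index. A minimal sketch, with a hypothetical URL:

```html
<!-- On the near-duplicate "London" subpage: point search engines at the
     main lift-repair page as the canonical version. URL is invented
     for illustration. -->
<link rel="canonical" href="https://www.example.com/lift-repair/" />
```

Unlike a 301, the subpage stays live for visitors; the canonical only consolidates indexing signals, and search engines treat it as a hint rather than a directive.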
I would think it would be a good idea to include content, the physical address, phone numbers, and map results on each of these pages. Also, if possible, contact some of the client's customers and ask if it's okay to get their opinion and thoughts on the service that was provided, and use that as a "What Others Are Saying" section on these pages. If they're hesitant, maybe offer their business a free followed link, which could sway them in the right direction. From there you could talk about the unique things about each branch: employees, service equipment, local community involvement, clients, high-profile contracts, etc.
The short of it is: you're on the right track, and I hope my thoughts helped you along.
Don
-
Re-reading http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world
I don't think I have any choice except to use rel=canonical tags and work at improving the page content. I'll adapt this question into the more speculative "how much does duplicate content hurt my ranking?"
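For completeness on the third option from the title: a noindex keeps a duplicate page out of the index entirely while (with "follow") still letting crawlers pass through its links. A sketch of the meta tag, which would go in each duplicate page's head:

```html
<!-- Hedged sketch: exclude this page from the index but allow its
     links to be followed. -->
<meta name="robots" content="noindex,follow" />
```

Unlike rel=canonical, this doesn't consolidate the duplicate's signals onto the Croydon page, which is why the canonical is generally the better fit here.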