Canonical for 80-90% duplicate content help
-
Hi. I seem to spend more time asking questions than anything else at the moment.
I have a site I have revamped: www.themorrisagency.co.uk
I am working through sorting out the 80-90% duplicated content, where pages just swap out a smattering of geographical and band-style terms, e.g.:
http://www.themorrisagency.co.uk/band-hire/greater-manchester/ with 'manchester' changed to:
http://www.themorrisagency.co.uk/band-hire/oxfordshire/ and so on.
So I am going through this slow but essential process at the moment.
I have a main page at http://www.themorrisagency.co.uk/band-hire/
My question is:
Would it be sensible (using the Yoast SEO plugin) to add a rel=canonical tag as a temporary solution, pointing these duplicate pages to http://www.themorrisagency.co.uk/band-hire/, rather than removing them?
What are your thoughts? I am aware that misusing rel=canonical could make things worse.
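To be clear, the tag I'm talking about would look something like this in the <head> of each duplicate county page (a rough sketch; Yoast would generate this from a plugin setting rather than it being hand-coded):

    <head>
      <!-- Tells search engines to treat this page as a duplicate of the main band-hire page -->
      <link rel="canonical" href="http://www.themorrisagency.co.uk/band-hire/" />
    </head>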
Thanks as always
Daniel
-
OK, I've seen the video. Thanks, Wiqas, I understand that rel=canonical is a thorny thing, so I am not going to opt for that. Thanks, Brett!
The suggestion is to noindex/nofollow these pages until the content is made unique, concentrate on the most important pages first, and then submit those bit by bit. Is that necessary?
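As I understand it, that setting would have Yoast output something like this on each duplicate page until it is rewritten (a sketch of what the plugin produces, not code I would write by hand):

    <head>
      <!-- Keeps the page out of the index and tells crawlers not to follow its links -->
      <meta name="robots" content="noindex, nofollow" />
    </head>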
Is it more beneficial to remove the crap pages than to leave them there, even though they are indexed, albeit at a low level?
-
Did you see this video by Rand: http://moz.com/blog/handling-duplicate-content-across-large-numbers-of-urls
It will give you quite a good idea of what to do.
-
Hi, I would be mindful of using rel=canonical.
I would take a simpler, more logical approach: restructure the pages with original content and just use 301 redirects. This follows Google's guidelines on redirects, available here:
https://support.google.com/webmasters/answer/93633?hl=en
In particular:
- You're merging two websites and want to make sure that links to outdated URLs are redirected to the correct pages.
which is essentially what you have done / are doing.
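For example, on an Apache server a 301 from one of the duplicate URLs to the main page could be set up in .htaccess like this (purely illustrative; it assumes Apache with mod_alias enabled, and a WordPress redirect plugin would do the same job):

    # Permanently redirect a duplicate county page to the main band-hire page
    Redirect 301 /band-hire/oxfordshire/ http://www.themorrisagency.co.uk/band-hire/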
Hope this helps