Canonical tag on a site with multiple URLs but only one set of pages
-
We have a site, www.mezfloor.com, which has a number of URLs pointing at one site. As the URLs have been in use for many years, there are links from many sources, including good old-fashioned hard-copy advertising. We have now decided it would be better to start porting all sources to the .co.uk version and get that listed as the prime/master site.
A couple of days ago I went through and added canonical tags to all the pages, thinking that would set the priority and also strengthen the pages in terms of trust by reducing duplication. However, when I scanned the site in Moz, a warning came up that the page redirects, and I am beginning to think I need to remove all these canonical tags so that search engines do not get into a confused spiral where we lose the little PageRank we have.
Is there a way that I can redirect everything except the target URL without setting up a separate master site just for all the other pages to point at?
-
Yes, it is good when there is a clear Google guideline to follow. I'm happy for your quick win!
-
Thanks
I am pleased I do not have to go through the whole site again, and even more pleased as I have a number of other sites to work on. These could certainly do with a bit of a boost, and this is a quick win.
-
So you want to put a canonical of www.b.co.uk/index.html on a page that can be reached via www.b.co.uk/index.html, and you are worried that it will become a loop?
Don't worry. Google specifically thought about the possibility that people might use self-referential canonicals (SEO plugins do it all the time) and engineered it so that this does not cause a loop. (See Matt Cutts on the topic.)
I myself inherited some ugly URLs for which I made nice, user-friendly aliases, and I tagged those pages with the friendly canonical. There were no problems, and the pages started doing much better. (In my case it was not cross-domain, but cross-domain canonicals are supposedly supported, and in fact I have successfully used them in other situations.)
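If you want to see exactly which canonical each address is serving before deciding whether to keep the tags, you can check it with a few lines of Python. This is only a rough sketch: it assumes the requests and beautifulsoup4 packages are installed, and the URLs are placeholders based on the example domains in this thread.

```
# Rough check: print the rel="canonical" target each URL actually serves.
# Assumes `requests` and `beautifulsoup4` are installed; the URLs below are
# placeholders taken from the example in this thread.
import requests
from bs4 import BeautifulSoup

urls = [
    "http://www.a.co.uk/index.html",
    "http://www.a.com/index.html",
    "http://www.b.co.uk/index.html",
]

for url in urls:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    if canonical and canonical.get("href"):
        print(f"{url} -> canonical: {canonical['href']}")
    else:
        print(f"{url} -> no canonical tag found")
```

If every address reports the same www.b.co.uk target, the tags are already doing what you intended.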
-
Hi, thanks for the response.
The issue is that we have one set of pages on a server which is addressed through several different URLs.
I never got involved in the server side of things, so I do not know if that was done by redirects at the root URL. Maybe I am just trying to add canonical links that are not required.
If I have www.a.co.uk/index.html, www.a.com/index.html and www.b.co.uk/index.html, and I want them all to point to www.b.co.uk/index.html: as index.html only exists once on the server, my thought was that I should put a canonical link within that page pointing to www.b.co.uk/index.html as the target. This may be right or wrong, but there is the risk that a spider stops when it gets to the link, goes back to the start of the same page, and loops again and again.
You are of course right that Googlebot should be OK with this, but the Moz bot stopped in its tracks and asked if I wanted the page indexed, so I had to do this manually.
Gut feel says I should remove the links for now, but I need to understand what we did server side. Gut feel may be wrong, and I would prefer to do the right thing!
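One thing I could do to see what is happening server side, without touching the configuration, is to request each address directly and look at the raw response. Something like this rough sketch (it assumes Python with the requests package installed, and the domains are just the placeholders from my example above):

```
# Rough check of what the server does for each address, without following
# redirects, so we see the first response the same way a crawler would.
# Assumes `requests` is installed; the domains are placeholders.
import requests

urls = [
    "http://www.a.co.uk/index.html",
    "http://www.a.com/index.html",
    "http://www.b.co.uk/index.html",
]

for url in urls:
    response = requests.get(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "-")
    print(f"{url}: status {response.status_code}, redirects to {location}")
```

If the a.co.uk and a.com addresses come back as 301s pointing at www.b.co.uk, the redirects are already in place server side; if they all come back as 200, the same content really is being served on several hostnames and needs sorting out.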
-
Okay, you lost me a little, but let me see if I can help.
First off, the canonical tag - it's fantastic for duplicate content (even across other sites), but not so useful if you don't have duplicate content.
301s - very similar to the above: they work well with duplicate content but are not essential. You can 301 a few pages into one page, so if a user types a URL in (or even has it as a bookmark, etc.) they will land on the page you want. It's normally a good idea to 301 into similar pages so you don't get users thinking they are going to buy (e.g.) a pair of boots and landing on a page about t-shirts.
Google getting lost - don't worry about Google getting lost; if a user can get around, so can Google. Plan, plan and plan again: map it all out (you can even draw flow diagrams) so you know where everything is going to and from until you are happy. You can also get someone who doesn't know your site to test it and see if they get lost.
Hope that background helps a bit. You lost me here:
"Is there a way that I can redirect everything except the target URL without setting up a separate master site just for all the other pages to point at."
Why can't you redirect all your pages to the target URL?
One helpful tool I recommend is Screaming Frog; it can help you pick up redirects, 404s, etc.
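If you want a quick spot check before running a full crawl, a few lines of Python will report the same basics (status codes, redirect chains and 404s). This is just a sketch, assuming the requests package is installed and swapping in your own list of URLs:

```
# Spot-check a handful of URLs: follow redirects and report the chain and
# the final status code (flags 404s and long redirect chains).
# Assumes `requests` is installed; replace the list with your own URLs.
import requests

urls = [
    "http://www.a.co.uk/index.html",
    "http://www.b.co.uk/index.html",
]

for url in urls:
    response = requests.get(url, timeout=10)
    chain = " -> ".join(str(hop.status_code) for hop in response.history) or "no redirects"
    print(f"{url}: {chain} -> final {response.status_code} at {response.url}")
```

Anything that ends in a 404, or hops through more than one redirect before it settles, is worth cleaning up.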