Many canonical warnings. Is this a problem?
-
My site has over 80 canonical warnings. The report says the URL is, for example, http://www.musicliveuk.com and the 'tag value' column says http://www.musicliveuk.com/. Is that a good thing? I'm new to SEO and run my site on WordPress with the All in One SEO Pack. Does this mean the SEO pack has automatically added canonical tags to my pages? If so, why is it showing as an error? I'm also getting lots of 301 permanent redirects, and I haven't set any up manually. I'm getting them for every page on my site, from the normal URL to a URL with a slash at the end.
-
Pleasure
-
Indeed I did have two plugins running... DOH!
Thanks guys.
-
Do you have multiple SEO plugins running? Maybe the template has canonical settings out of the box?
See lines 65 & 82 of the home page's source: there are two canonical tags. I'm leaning towards the template if I had to guess.
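If you want to check this yourself without counting lines in the source by hand, here's a rough sketch using Python's standard `html.parser`. The HTML below is a placeholder reproducing the duplicate-tag problem, not fetched from the live site:

```python
from html.parser import HTMLParser

class CanonicalCounter(HTMLParser):
    """Collects the href of every <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonicals.append(attrs.get("href"))

# Placeholder page showing the duplicate-canonical problem described above:
html = """
<head>
  <link rel="canonical" href="http://www.musicliveuk.com/" />
  <link rel="canonical" href="http://www.musicliveuk.com/" />
</head>
"""
parser = CanonicalCounter()
parser.feed(html)
print(len(parser.canonicals))  # prints 2 -- a healthy page should have exactly 1
```

Anything other than exactly one canonical per page is what triggers the warning, and two tags usually means two plugins (or a plugin plus the theme) are each emitting one.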
-
Thanks guys. I also get warnings that some pages have more than one canonical tag. I don't add any manually and just use the All in One SEO Pack settings. How can this be, and how do I fix it?
-
Agree with Vahe. Also, warnings are not necessarily errors; they're meant to raise flags for you to check whether everything on the site is as intended.
An incorrect rollout of canonicals (e.g. if every page on your site has the home page as the canonical) can result in a lot of pages being removed from the index.
Regarding the 301s, check all pages for links to other internal pages, look at any links with a trailing slash at the end, and remove the trailing slash. For example, these are two different URLs:
- http://www.musicliveuk.com/category/planning-events
- http://www.musicliveuk.com/category/planning-events/
Yet "/category/planning-events/" 301s to "/category/planning-events".
The non-www version 301s to the www version too, so check whether there are any internal links to:
http://musicliveuk.com
and change them to:
http://www.musicliveuk.com
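If there are a lot of internal links to fix, the rewrite described above (force the www host, drop the trailing slash) can be sketched in Python's standard library. This is a hedged illustration of the rule, not a drop-in tool for editing your pages:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize(url):
    """Rewrite an internal link to the version the site 301s to:
    www host, no trailing slash (the bare root path '/' is left alone)."""
    parts = urlsplit(url)
    host = parts.netloc
    if not host.startswith("www."):
        host = "www." + host
    path = parts.path
    if len(path) > 1 and path.endswith("/"):
        path = path.rstrip("/")
    return urlunsplit((parts.scheme, host, path, parts.query, parts.fragment))

print(normalize("http://musicliveuk.com/category/planning-events/"))
# -> http://www.musicliveuk.com/category/planning-events
```

Linking directly to the final URL saves visitors and crawlers a redirect hop on every internal click.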
-
There's nothing wrong. The WordPress SEO pack is actually doing the right thing: it ensures search engines like Google see only one version (and the right version) of your website, in this case the www version with the trailing slash.
If your site didn't do what you mentioned above, search engines would index whatever they thought was the right version. There are also several other disadvantages to leaving it that way:
(1) PageRank (domain authority) would be split between the www and non-www versions of the site. Some would go even further and say it causes site duplication, which is not favoured by search engines.
(2) People linking to your site would not all link to one URL version. Again, this spreads the link juice.
Since the proper 301 redirects are in place on your site, no matter which version people link to, it will resolve to the www version with the trailing slash. Just make sure that in Bing and Google Webmaster Tools you also set your preferred domain to the www version.
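For reference, the non-www to www redirect described above is typically implemented with mod_rewrite rules along these lines. This is an illustrative .htaccess sketch only; WordPress and the SEO plugin normally generate the redirects for you, so you shouldn't need to add it by hand:

```apache
RewriteEngine On

# Send any request for the bare domain to the www version with a 301
RewriteCond %{HTTP_HOST} ^musicliveuk\.com$ [NC]
RewriteRule ^(.*)$ http://www.musicliveuk.com/$1 [R=301,L]
```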
Hope this helps,
Vahe