How does a sitemap affect the definition of canonical URLs?
-
We are having some difficulty generating a sitemap that includes our SEO-friendly URLs (the ones we want to set as canonical). I was wondering if we could simply use the non-SEO-friendly, non-canonical URLs that the sitemap generator has been producing and then 301-redirect them to the canonical versions. Is there a reason we should not do this? We don't want search engines to think that the sitemap URLs are more important than the pages to which they redirect.
How important is it that the sitemap URLs match the canonical URLs? We would like to find a solution outside of the generation of the sitemap itself as we are locked into using a vendor’s product in order to generate the sitemap.
Thanks!
-
Thank you for your responses.
We use Endeca, and while it has a sitemap generator, for whatever reason it is unable to produce URLs that match our new SEO-friendly vanity URLs. Right now we've had no sitemap for months while we try to find a solution to this problem.
From what I'm gathering, holding off is the right approach? That is, it would do more harm than good to upload a "bad" sitemap. Yes?
Also, there seems to be no way to get around this with a clever redirect scheme. Am I right about that as well?
In that case, it may boil down to choosing between an accurate sitemap and SEO-friendly URLs. I'm not sure which would be more important.
Website's here, if that's useful: www.pli.edu
-
Bing has said that anything over 1% of bad URLs in a sitemap constitutes a dirty sitemap to them, so yes, it is very important.
Are you able to share the system that you're using? Others may have experience in working around this already.
-
It's extremely important that the sitemap URLs match the canonical URLs people arrive at. If they do not match, the search engine will consider the sitemap "dirty" and not valuable, since it does not accurately reflect the actual layout of the website.
Essentially, the search engines consider a sitemap URL that does not return an HTTP 200 status to be a bad URL, and will reject the sitemap. This is absolutely something that you should work to correct.
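Both points in the answers above can be checked mechanically before resubmitting: every sitemap URL should return a plain 200 (no redirect) and should canonicalize to itself, and by Bing's stated threshold more than 1% failures marks the file as dirty. A minimal sketch in Python (the actual fetching is left to the caller so the logic stays self-contained; the regex canonical check is deliberately naive):

```python
import re
from xml.etree import ElementTree

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Pull every <loc> out of a standard sitemap file."""
    root = ElementTree.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def declared_canonical(html, page_url):
    """The page's rel=canonical href, or the page's own URL if none is
    declared. (Naive regex: real markup may order attributes differently.)"""
    match = re.search(r'<link[^>]*rel="canonical"[^>]*href="([^"]+)"', html)
    return match.group(1) if match else page_url

def is_dirty(results, max_bad_ratio=0.01):
    """results: (url, http_status, canonical) triples collected by fetching
    each sitemap URL with redirect-following disabled. An entry is bad if
    it isn't a plain 200 or it canonicalizes somewhere else; more bad
    entries than the threshold marks the whole sitemap as dirty."""
    bad = [u for u, status, canonical in results if status != 200 or canonical != u]
    return len(bad) / max(len(results), 1) > max_bad_ratio
```

In the asker's proposed setup, every sitemap URL would 301 rather than return a 200, so every entry would count as bad, which is why the redirect workaround fails.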
Related Questions
-
Google Webmaster Tools is saying "Sitemap contains urls which are blocked by robots.txt" after Https move...
Hi Everyone, I really don't see anything wrong with our robots.txt file after our https move that just happened, but Google says all URLs are blocked. The only change I know we need to make is changing the sitemap URL to https. Anything you all see wrong with this robots.txt file?

# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts of your
# site by web crawlers and spiders run by sites like Yahoo! and Google. By
# telling these "robots" where not to go on your site, you save bandwidth
# and server resources. This file will be ignored unless it is at the root
# of your host:
#   Used:    http://example.com/robots.txt
#   Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
#   http://www.robotstxt.org/wc/robots.html
# For syntax checking, see:
#   http://www.sxw.org.uk/computing/robots/check.html

# Website Sitemap
Sitemap: http://www.bestpricenutrition.com/sitemap.xml

# Crawlers Setup
User-agent: *

# Allowable Index
Allow: /*?p=
Allow: /index.php/blog/
Allow: /catalog/seo_sitemap/category/

# Directories
Disallow: /404/
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /downloader/
Disallow: /includes/
Disallow: /lib/
Disallow: /magento/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /stats/
Disallow: /var/

# Paths (clean URLs)
Disallow: /index.php/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /control/
Disallow: /contacts/
Disallow: /customer/
Disallow: /customize/
Disallow: /newsletter/
Disallow: /poll/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /tag/
Disallow: /wishlist/
Disallow: /aitmanufacturers/index/view/
Disallow: /blog/tag/
Disallow: /advancedreviews/abuse/reportajax/
Disallow: /advancedreviews/ajaxproduct/
Disallow: /advancedreviews/proscons/checkbyproscons/
Disallow: /catalog/product/gallery/
Disallow: /productquestions/index/ajaxform/

# Files
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /STATUS.txt

# Paths (no clean URLs)
Disallow: /*.php$
Disallow: /*?SID=
disallow: /*?cat=
disallow: /*?price=
disallow: /*?flavor=
disallow: /*?dir=
disallow: /*?mode=
disallow: /*?list=
disallow: /*?limit=5
disallow: /*?limit=10
disallow: /*?limit=15
disallow: /*?limit=20
disallow: /*?limit=250
-
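One way to answer a question like this without guesswork is to test paths against the rules directly. Note that Python's stdlib urllib.robotparser follows the original 1996 draft and treats '*' and '$' literally, so a tiny Googlebot-style matcher is sketched here instead (Disallow rules only; a real crawler also weighs Allow rules by specificity):

```python
import re

def robots_pattern_to_regex(pattern):
    """Translate a Googlebot-style robots.txt path pattern into a regex:
    '*' matches any run of characters, and a trailing '$' anchors the
    match to the end of the path."""
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return regex + ("$" if anchored else "")

def is_blocked(path, disallow_patterns):
    """True if any Disallow pattern matches the path from its start."""
    return any(re.match(robots_pattern_to_regex(p), path)
               for p in disallow_patterns)

rules = ["/checkout/", "/*?SID=", "/*.php$"]
print(is_blocked("/checkout/cart/", rules))  # True  (prefix match)
print(is_blocked("/cron.php", rules))        # True  ('$' anchor matches)
print(is_blocked("/sitemap.xml", rules))     # False (the sitemap stays crawlable)
```

Running the sitemap's URL paths through a check like this shows whether the file above really blocks them, or whether Google's warning stems from something else (such as a stale cached robots.txt after the HTTPS move).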
Removed URLs
Hi all, We have recently removed 200+ articles from our blog. However, those links are still being shown on Google weeks after their removal. Is there a way to speed up the process? What effect will this have on our SEO rankings?
-
Canonical Expert question!
Hello, I am looking for some help with an estate agent property website. I recently finished the Moz crawling report and noticed that Moz sees some pages as duplicates, mainly pages which list properties as page 1, 2, 3, etc. Here is an example:
http://www.xxxxxxxxx.com/property-for-rent/london/houses?page=2
http://www.xxxxxxxxx.com/property-for-rent/london/houses?page=3
and so on. Now I know that best practice says I should set a canonical URL to this page: http://www.xxxxxxxxx.com/property-for-rent/london/houses?page=all, but here is where my problem is. http://www.xxxxxxxxx.com/property-for-rent/london/houses?page=1 contains good written content (around 750 words) before the listed properties are displayed, while the "page=all" page does not have that content, only the properties listed. Also, http://www.xxxxxxxxx.com/property-for-rent/london/houses?page=1 is similar to the originally designed landing page http://www.xxxxxxxxx.com/property-for-rent/london/houses. I would like your advice on the best way to canonicalize this and sort the problem. My original thought was to set the canonical URL to this page, http://www.xxxxxxxxx.com/property-for-rent/london/houses, instead of the "page=all" version, but your opinion will be highly appreciated.
-
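For a listing like the one described above, one widely used pattern at the time of this thread was to let each page in the series keep a self-referencing canonical (so page 1's unique 750-word copy is not folded into a thin "page=all" view) and declare the sequence with rel=prev/next. A sketch with illustrative URLs:

```python
def pagination_head_tags(base_url, page, last_page):
    """Head tags for one page of a paginated listing: each page keeps a
    self-referencing canonical, while rel=prev/next declare the series.
    Page 1 lives at the clean base URL, matching the landing page."""
    url = base_url if page == 1 else f"{base_url}?page={page}"
    tags = [f'<link rel="canonical" href="{url}">']
    if page > 1:
        prev = base_url if page == 2 else f"{base_url}?page={page - 1}"
        tags.append(f'<link rel="prev" href="{prev}">')
    if page < last_page:
        tags.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return tags

for tag in pagination_head_tags(
        "http://www.example.com/property-for-rent/london/houses", 2, 5):
    print(tag)
```

Treating ?page=1 and the clean landing page as the same URL (one 301s or canonicalizes to the other) also resolves the near-duplicate pair mentioned in the question.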
WordPress multilanguage sitemaps
Hi, I have a multilingual WordPress site in Bulgarian and English, translated using qTranslate. The XML sitemap for the two languages is a single file: all the links for the Bulgarian and English versions are in one sitemap. (Our site is using this plugin: http://wordpress.org/extend/plugins/google-xml-sitemaps-v3-for-qtranslate) Do you have any idea how I can make a separate XML sitemap for each language? I ask here because you may have run into the same problem with your own multilanguage WordPress site. You can see the sitemap with both languages' links in one file here: http://cholakovit.com/sitemap.xml I have read in this article that it is better practice and will also help with geo-targeting your website: http://www.seomoz.org/blog/multiple-xml-sitemaps-increased-indexation-and-traffic
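Until the plugin supports this natively, a combined sitemap can be split by language path prefix in a few lines. The prefixes below are assumptions based on qTranslate's pre-path URL mode, so adjust them to your settings; each bucket can then be written out as its own <urlset> file and listed in a sitemap index:

```python
from urllib.parse import urlparse
from xml.etree import ElementTree

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def split_sitemap_by_language(xml_text, lang_prefixes=("/en/",)):
    """Bucket a combined sitemap's URLs by language path prefix.
    URLs whose path starts with a listed prefix go to that bucket;
    everything else (here, the default Bulgarian pages) goes to
    'default'."""
    root = ElementTree.fromstring(xml_text)
    groups = {prefix: [] for prefix in lang_prefixes}
    groups["default"] = []
    for loc in root.iter(SITEMAP_NS + "loc"):
        url = loc.text.strip()
        path = urlparse(url).path or "/"
        bucket = next((p for p in lang_prefixes if path.startswith(p)), "default")
        groups[bucket].append(url)
    return groups
```

A small cron job could fetch the plugin's combined sitemap, run it through a splitter like this, and write sitemap-en.xml and sitemap-bg.xml alongside a sitemap index.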
-
URL paths and keywords
I'm recommending some on-page optimization for a home builder building in several new home communities. The site has been through some changes in the past few months and we're almost starting over. The current URL structure is http://homebuilder.com/oakwood/features, where homebuilder = builder name, oakwood = name of the community (Oakwood Estates), and features = one of several sub-paths including site plan, elevations, floor plans, etc. The most attainable keyword phrases include the words 'home' and 'townname'. I want to change the URL path to http://homebuilder.com/oakwood-estates-townname-homes/features. Is there any problem with doing this? It just seems to make a lot of sense. Any input would be appreciated.
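If the paths do change, each old URL needs a single-hop 301 to its replacement. A sketch of the mapping logic (the community slug and 'townname' are placeholders taken from the question, not real values):

```python
# Hypothetical mapping from old community directory names to the new
# keyword-rich slugs ('townname' is a placeholder, as in the question).
COMMUNITY_SLUGS = {
    "oakwood": "oakwood-estates-townname-homes",
}

def redirect_target(path):
    """Old /community/section path -> new path, or None if the URL is
    not part of the restructure. Serve the result as a 301 so the old
    URLs hand their equity to the new ones in one hop."""
    parts = path.strip("/").split("/", 1)
    new_slug = COMMUNITY_SLUGS.get(parts[0])
    if new_slug is None:
        return None
    rest = "/" + parts[1] if len(parts) > 1 else ""
    return "/" + new_slug + rest

print(redirect_target("/oakwood/features"))  # /oakwood-estates-townname-homes/features
print(redirect_target("/about"))             # None
```

In practice the same table is usually expressed as RewriteRule or Redirect lines in the server config; the point is that the mapping is mechanical, so there is no technical obstacle to the change.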
-
Magento URL Question
Calling all Magento kings out there! I'm working on a client's site, powered by Magento, and I'm looking to rewrite a lot of the URLs. I know there is the URL rewrite tool, but I think what I need to do may go beyond it. A typical example would be: Old URL - http://www.xxxxxxxx.co.uk/fabric/product/product-black-screen-print-and-silver-fabric.html New URL - http://www.xxxxxx.co.uk/fabric/product/silver I know that Magento's URLs are created from categories, so I wanted to check with someone on the best way to do this. Also, I've heard that 301 redirects of non-www to www in the .htaccess can have a knock-on effect on discounts? All comments greatly appreciated.
-
URLs: To Change or Not to Change
Hello, We recently launched a redesigned site in Drupal in December of last year. We are an eco-travel company. My current URLs look like this: /africa-and-middle-east/kenya-tanzania /central-south-america/galapagos-islands My pages have good term targeting grades, and the rankings for the terms we are targeting, "kenya and tanzania safaris" and "galapagos islands cruises", are decent but not great; most are on page 2 or 3. The one URL where I targeted our most important term, "amazon river cruises," is still on page 2. /central-south-america/amazon-river-cruises My questions are: Did I miss an opportunity with the rest of the URLs, and should I consider changing them to more targeted terms with 301s? Since the new site launched in January, perhaps I have not given the new URLs enough time to index and mature. Would it be easier to set up landing pages with unique article content that targets terms such as "galapagos islands cruises" and "kenya and tanzania safaris"? If so, how can I do it in such a way as to not "compete" with the pages I want to drive them to? This also raises the question of redirecting the same URL twice; i.e., I would have two redirects in place for the same URL: one from the former site to the new site, and another to the most recent URL. Is that a problem? Sorry if I've asked too many questions in one post. 😉 Any advice appreciated.
-
Is the full URL necessary for successful Canonical Links?
Hi, my first question and hopefully an easy enough one to answer. Currently in the head element of our pages we have canonical references such as: (Yes, untidy URL...we are working on it!) I am just trying to find out whether this snippet of the full URL is adequate for canonicalization, or if the full domain is needed as well. My reason for asking is that the SEOmoz On-Page Optimization grading tool is 'failing' all our pages on the "Appropriate Use of Rel Canonical" value. I have been unable to find a definitive answer on this, although admittedly most examples do use the full URL. (I am not the site developer, so I cannot simply change this myself, but rather have to advise him in a weekly meeting.) So in short, presumably using the full URL is best practice, but is it essential to its effectiveness when being read by the search engines? Or could there be another reason why the "Appropriate Use of Rel Canonical" value is not being green-ticked? Thank you very much, I appreciate any advice you can give.
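On the technical side, search engines generally resolve a relative rel=canonical href against the page's own URL, the same way browsers resolve any relative link, which is exactly why absolute URLs are the safer practice. A short illustration with made-up URLs:

```python
from urllib.parse import urljoin

# The page being crawled and the relative canonical it declares:
page_url = "http://www.example.com/category/widgets.html?sort=price"
relative_href = "/category/widgets.html"

# A relative rel=canonical is resolved against the page URL, like any
# other relative link:
print(urljoin(page_url, relative_href))
# -> http://www.example.com/category/widgets.html

# The risk: if the same markup is served (or scraped) on another host,
# the relative canonical silently resolves to that host instead:
print(urljoin("http://scraper.example/category/widgets.html", relative_href))
# -> http://scraper.example/category/widgets.html
```

So a site-root-relative path usually "works", but an absolute URL pins the protocol and host and removes that ambiguity, which is also why most tools grade against the full form.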