Query on sitemap.xml Root Path
-
Is it compulsory to have sitemap.xml at this path: abcd.com/sitemap.xml?
My site name is abcd.com. Is it compulsory to host sitemap.xml only at abcd.com/sitemap.xml?
a) If I use a CDN service, where the path could be something like xyz.com/sitemap.xml, and I then submit that sitemap in my robots.txt file, is that fine?
b) What would happen in Webmaster Tools in that case? When we submit a sitemap there, it gives us the domain name (abcd.com) by default and we just have to add /sitemap.xml after it.
-
Hi Ian,
Thanks for your support!
-
Hi,
What is the main reason for you to upload the sitemap to a new location?
I also found an article on sitemaps that might help. It says: "The XML sitemap protocol defines that XML sitemap files cannot contain URLs from different domains. This includes subdomains and other kinds of variations. You have to keep all URLs in a given XML sitemap to a single domain."
Taken from this page: https://www.microsystools.com/products/sitemap-generator/help/multiple-domains-xml-sitemaps/
It also includes tutorials on different types of sitemaps.
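To make that rule concrete, here is a minimal sketch of a sitemap for the main host, using placeholder URLs based on the domain discussed in this thread: every URL in the file stays on www.abcd.com, and anything on data.abcd.com would need its own sitemap published on that subdomain.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- every <loc> stays on the host the sitemap is served from -->
  <url>
    <loc>https://www.abcd.com/example-category/example-product</loc>
  </url>
  <!-- a URL like https://data.abcd.com/page would belong in a separate
       sitemap hosted on data.abcd.com, not in this file -->
</urlset>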
Hope this can help further.
Thanks,
Ian
-
Hi Ian,
I have a large ecommerce website whose XML sitemap is currently located at https://www.abcd.com/sitemap.xml, and I want to move it to a new location, i.e. https://data.abcd.com/sitemap.xml, which is on a subdomain. If it is OK per Google's guidelines to host the sitemap on a subdomain rather than on the main domain, please let me know how I can submit this new sitemap in the Webmaster console, because when I try to submit a sitemap in the console it requires the sitemap to be available at the root of the website. What can I do next? Thanks!
-
Hi,
I believe it's compulsory for the sitemap to be at abcd.com/sitemap.xml.
Here is a guide on sitemaps and their format for future reference: https://www.sitemaps.org/protocol.html
You tend to only have one sitemap; if you have a large site, you will need to divide it across multiple sitemap files, since the rule of thumb (and the protocol limit) is to keep each sitemap below 50,000 URLs. I'd say one sitemap at abcd.com/sitemap.xml should be enough for a standard website.
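When a site does need to be split like that, the usual pattern is a sitemap index file that points at the individual sitemap files. A rough sketch, with hypothetical file names rather than anything from this thread:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- each <sitemap> entry points at one ordinary sitemap file,
       each of which stays under the 50,000-URL limit -->
  <sitemap>
    <loc>https://www.abcd.com/sitemap-products-1.xml</loc>
    <lastmod>2017-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.abcd.com/sitemap-products-2.xml</loc>
  </sitemap>
</sitemapindex>
Submitting just the index file in Search Console is enough; the child sitemaps are discovered from it.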
I'm unsure about question a); could you explain it in a bit more detail, please?
Finally, there are two ways to submit a sitemap:
-
Directly within Google Search Console (previously Webmaster Tools) using the 'Test/Add Sitemap' feature, by adding /sitemap.xml and testing it before submitting it.
-
Insert a Sitemap line anywhere in your robots.txt file, specifying the full path to your sitemap:
Sitemap: http://example.com/sitemap_location.xml
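As a rough sketch of how that directive sits alongside ordinary crawl rules (the disallowed paths below are made-up examples, not taken from this thread):
# robots.txt at https://www.abcd.com/robots.txt
User-agent: *
Disallow: /checkout/
Disallow: /search

# The Sitemap directive is independent of the User-agent groups and can
# appear anywhere in the file; always give the full absolute URL.
Sitemap: https://www.abcd.com/sitemap.xml
Crawlers that support the directive will pick the sitemap up on their next visit, though submitting it in Search Console as well gives you reporting on how it was processed.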
You can also find more information and guidelines on sitemaps here: https://support.google.com/webmasters/answer/156184?hl=en&ref_topic=4581190
Hope that helps.
Ian
-
Related Questions
-
Sitemap submission for site migration?
Hi mozzers, We're about to migrate 4 domains into 1. Is there a particular way I should generate and submit the sitemap, or should I just follow the same protocol as for one domain? Should I even worry about submitting a sitemap when the site has this Drupal module? I have access to the Webmaster Tools accounts of all domains; should I do something specific on the accounts that are migrating besides submitting a sitemap? Thanks for letting me know!
Technical SEO | Ideas-Money-Art
-
Will an XML sitemap override a robots.txt?
I have a client whose robots.txt file is blocking an entire subdomain, entirely by accident. Their original solution, not realizing the robots.txt error, was to submit an XML sitemap to get their pages indexed. I did not think this tactic would work, as the robots.txt would take precedence over the XML sitemap. But it worked... I have no explanation as to how or why. Does anyone have an answer to this, or any experience with a website that has had a clear Disallow: / for months that somehow has pages in the index?
Technical SEO | KCBackofen
-
Subdomain vs. New Root Domain for a New Brand
Would you recommend placing a new brand on a subdomain of the existing parent company's site, or creating a separate root domain for the new brand?
Technical SEO | ScratchMM
-
Generating a Sitemap for websites linked to a WordPress blog
Greetings, I'm looking for a way to generate a sitemap that will include static pages in my home directory as well as my WordPress blog. The site I'm trying to build this for is in a temporary folder and can be accessed at http://www.lifewaves.com/Website 3.0 — I plan on moving the contents of this folder to the root directory of lifewaves.com whenever we are go for launch. What I'm wondering is: is there a way to build a sitemap or sitemap index that points to the static pages of my site as well as the WordPress blog while taking advantage of the built-in WordPress hierarchy? If so, what's an easy way to do this? I have generated a sitemap using Yoast, but I can't seem to find any XML files within the WordPress folder. Within the plugin is a button that I can click to access the sitemap index, but it just brings me to the homepage of my blog. Can I build a sitemap index that points to a sitemap for the static pages as well as the sitemap generated by Yoast? Thank you in advance for your help! P.S. I'm kind of a noob.
Technical SEO | WaveMaker
-
How long does it take for traffic to bounce back from an accidental robots.txt disallow of root?
We accidentally uploaded a robots.txt that disallowed the root for all user agents last Tuesday and did not catch the error until yesterday, so 6 days total of exposure. Organic traffic is down 20%. Google has since indexed the correct version of the robots.txt file. However, we're still seeing awful titles/descriptions in the SERPs and traffic is not coming back. GWT shows that not many pages were actually removed from the index, but we're still seeing drastic rankings decreases. Has anyone been through this? Any sort of timeline for a recovery? Much appreciated!
Technical SEO | bheard
-
URL restructure and phasing out HTML sitemap
Hi SEOMozzies, I love the Q&A resource and have already found lots of useful stuff! I just started as an in-house SEO at a retailer, and my first main challenge is to tidy up the complex URL structures and remove the ugly sub-sitemap approach currently used. I have already found a number of suggestions, but it looks like I am dealing with several challenges that I need to resolve in a single release.
Here is the current setup: The website is an ecommerce site (department store) with around 30k products. We are using multi-select navigation (non-Ajax). The main website uses a third-party search engine to power the multi-select navigation, and that search engine has a very ugly URL structure. For example, www.domain.tld/browse?location=1001/brand=100/color=575&size=1 plus various other URL params, or for multi-select URLs www.domain.tld/browse?location=1001/brand=100,104,506/color=575&size=1 plus various other unused URL params. URLs are easily up to 200 characters long and not at all descriptive to our users. Many of these URLs are indexed by search engines (we currently have 1.2 million of them indexed, including session IDs and all the other nasty URL params). Next to this, the site uses a "sub site" that is sort of optimized for SEO; I'm not 100% sure this is cloaking, but it smells like it. It has a simplified navigation structure and a better URL structure for products. The layout is similar to our main site, but all complex HTML elements like multi-select and the large top navigation menus are removed. Many of these links are indexed by search engines and rank higher than links from our main website. The URL structure is www.domain.tld/1/optimized-url. Currently 64,000 of these URLs are indexed. We have links to this sub site in the footer of every page, but a normal customer would never reach it unless they come from organic search. Once a user lands on one of these pages, we try to push them back to the main site as quickly as possible.
My planned approach to improve this:
1.) Tidy up the URL structure in the main website (e.g. www.domain.tld/women/dresses and www.domain.tld/diesel-red-skirt-4563749). I plan to use Solution 2 as described in http://www.seomoz.org/blog/building-faceted-navigation-that-doesnt-suck to block multi-select URLs from being indexed, and I would like to use the URL param "location" as an indicator for search engines to ignore the link. A risk here is that all my currently indexed URLs (1.2 million) will be blocked immediately after I put this live. I cannot redirect those URLs to the optimized URLs, as the old URLs should still be accessible.
2.) Remove the links to the sub site (www.domain.tld/1/optimized-url) from the footer and redirect (301) all those URLs to the newly created SEO-friendly product URLs. URLs that cannot be matched, because there is no similar catalog location in the main website, will be redirected (301) to our homepage.
I wonder if this is a correct approach, and whether it would be better to do this in a phased way rather than the currently planned big bang. Any feedback would be highly appreciated; also let me know if things are not clear. Thanks! Chris
Technical SEO | eCommerceSEO
-
Which is the best WordPress sitemap plugin?
Does anyone have a recommendation for the best XML sitemap plugin for WordPress sites, or do you steer clear of plugins and use a sitemap generator, then upload the file to the root manually?
Technical SEO | simoncmason
-
More than 1 XML Sitemap
I recently took over administration of my site, and I have 2 XML sitemaps for the main site and 1 XML sitemap for my blog (which is a sub-page of the main site). Don't I only need one sitemap for the site and one for the blog? I don't know which one to delete; they both have the same page authority. Also, only one of them is accessible in a browser: http://www.rmtracking.com/rmtracking-sitemap.xml is accessible in a browser, while http://www.rmtracking.com/sitemap.xml is regularly updated in Google Webmaster Tools but not accessible in a browser. I don't have any error messages in Webmaster Tools.
Technical SEO | BradBorst