Automated XML Sitemap for a BIG site
-
Hi,
I would like to set up an automated sitemap for my site, but it has more than a million pages. It would need to be a sitemap index split across the different parts of the site (i.e. news, video), and I'll want a news sitemap and video sitemap as well (of course). Does anyone have a recommended way of building this, and how often would you recommend updating it? For news and video I would like updates to be pretty immediate if possible, but the static pages don't need to be refreshed as often.
Thanks!
-
Another good reference:
http://googlewebmastercentral.blogspot.com/2014/10/best-practices-for-xml-sitemaps-rssatom.html
that points to how to ping:
http://www.sitemaps.org/protocol.html#submit_ping
which includes examples for specific search engines.
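Per the protocol page above, a ping is just an HTTP GET against the engine's ping endpoint with your (URL-encoded) sitemap URL as a parameter. A minimal sketch in Python, assuming Google's documented ping endpoint:

```python
from urllib.parse import quote
import urllib.request

def build_ping_url(sitemap_url):
    # Build the ping URL per sitemaps.org/protocol.html#submit_ping;
    # the sitemap URL must be fully URL-encoded (safe="" encodes ":" and "/").
    return "http://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")

def ping_google(sitemap_url):
    # Fire the GET request; a 200 response means the ping was received.
    with urllib.request.urlopen(build_ping_url(sitemap_url)) as resp:
        return resp.status == 200
```

Calling `ping_google("http://example.com/newssitemap.xml")` each time that sitemap changes is all "pinging" amounts to.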
-
Excellent. Thank you! How would you ping Google when a sitemap is updated?
-
Yes, split them out. You will need an index sitemap, i.e. a sitemap that links to other sitemaps:
https://support.google.com/webmasters/answer/75712?vid=1-635768989722115177-4024498483&rd=1
Any given sitemap can list up to 50,000 URLs and can be no larger than 50MB uncompressed.
https://support.google.com/webmasters/answer/35738?hl=en&vid=1-635768989722115177-4024498483
Therefore, you could have an index sitemap linking to up to 50,000 other sitemaps, and each of those sitemaps could in turn link to 50,000 URLs on your site.
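To make the index format concrete, here's a sketch that generates one with Python's standard library; the file names are hypothetical, and the namespace is the one from the sitemaps.org protocol:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(sitemap_urls):
    # An index sitemap is a <sitemapindex> root whose children are
    # <sitemap> entries, each holding a <loc> with a child sitemap's URL.
    root = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
    for url in sitemap_urls:
        entry = ET.SubElement(root, "sitemap")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(root, encoding="unicode")

xml = build_sitemap_index([
    "http://example.com/newssitemap1.xml",
    "http://example.com/videositemap1.xml",
])
```

Your generator would write that string out as sitemap_index.xml and submit just that one URL to Google.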
If my math is right, that would be a max of 2,500,000,000 URLs if you have 50,000 sitemaps of 50,000 URLs each.
(Interesting side note: Google allows up to 500 index sitemaps, so 2,500,000,000 pages x 500 = 1,250,000,000,000 URLs that you can submit to Google via sitemaps.)
How you divide your content into sitemaps should follow how you organize the pages on your site, so you are on the right track in breaking out the sitemaps by type of content. Depending on how big any one section of the site is, you may need several sitemaps of that type, i.e. articlesitemap1.xml, articlesitemap2.xml, etc. You get the idea.
It is recommended that you ping Google every time a page in a sitemap is updated so Google will come back and recrawl that sitemap. I don't run any sites with 1M URLs, but I do run several in the tens of thousands; we break them up by type and ping whenever we update a page in that group. You also need to consider your crawl budget with Google: it may not crawl all 1M pages in your sitemaps very often. So for a group like articlesitemap1.xml, articlesitemap2.xml, articlesitemap3.xml, always add your newest URLs to the most recently created sitemap (i.e. articlesitemap3.xml). That way you are generally pinging Google about the update of a single sitemap in the group rather than all three.
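That "append to the newest sitemap in the group" idea can be sketched as below. The 50,000 cap is the protocol limit mentioned earlier; the function name, the file-naming scheme, and the assumption that existing files are numbered prefix1..prefixN are all hypothetical:

```python
MAX_URLS_PER_SITEMAP = 50000  # protocol limit on URLs per sitemap file

def assign_to_sitemaps(existing, new_urls, prefix="articlesitemap",
                       cap=MAX_URLS_PER_SITEMAP):
    # Append new URLs to the newest sitemap in a group, rolling over to a
    # fresh file once it fills up.  `existing` maps file name -> URL list
    # and is assumed to use names prefix1.xml .. prefixN.xml.
    # Returns (updated mapping, names of files that changed and need a ping).
    sitemaps = {name: list(urls) for name, urls in existing.items()}
    if not sitemaps:
        sitemaps[prefix + "1.xml"] = []
    changed = set()
    n = len(sitemaps)  # newest file in the series is prefix + str(n) + ".xml"
    for url in new_urls:
        newest = prefix + str(n) + ".xml"
        if len(sitemaps[newest]) >= cap:  # full: start the next file
            n += 1
            newest = prefix + str(n) + ".xml"
            sitemaps[newest] = []
        sitemaps[newest].append(url)
        changed.add(newest)
    return sitemaps, changed
```

Only the files in `changed` get pinged, so Google usually recrawls one sitemap out of the group instead of all of them.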
My other thought is that, in addition to pinging Google only about the sitemaps you have updated, you can return a 304 server response for all sitemaps that have not been updated. 304 means "not modified" since the last visit. One of your challenges will be your crawl budget with Google, so why make them recrawl a sitemap they have already crawled? You may also want to consider a 304 on any URL on your site that has not changed since Google's last visit.
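The 304 decision itself is just a comparison of the resource's last-modified time against the If-Modified-Since header the crawler sends back. A hypothetical sketch of that check (the helper name is made up; the date format is the standard HTTP date):

```python
from datetime import datetime, timezone

HTTP_DATE = "%a, %d %b %Y %H:%M:%S GMT"  # e.g. "Wed, 21 Oct 2015 07:28:00 GMT"

def response_status(last_modified, if_modified_since):
    # Return 304 if the sitemap is unchanged since the crawler's last visit,
    # otherwise 200.  `if_modified_since` is the raw header value (or None).
    if if_modified_since is None:
        return 200
    try:
        since = datetime.strptime(if_modified_since, HTTP_DATE)
        since = since.replace(tzinfo=timezone.utc)
    except ValueError:
        return 200  # unparseable header: just serve the full response
    return 304 if last_modified <= since else 200
```

A sitemap last touched before the crawler's previous fetch gets a 304 and costs almost nothing from your crawl budget; anything newer gets a full 200.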
All of that said, as I mentioned above, I have not worked at the scale of 1M+ pages and would defer to others on the best approach. The general thought process would be the same, though: figure out the best way to use your sitemaps to manage your crawl budget with Google. Small side note: if any of your 1M+ pages come from things like sorting parameters, duplicate content, or printer-friendly versions, you may want to noindex them, leave them out of the sitemaps, and not let Google crawl them in the first place.