Do we need to manually submit a sitemap every time, or can we host it on our site as /sitemap and Google will see & crawl it?
-
I realized we don't have a sitemap in place, so we're going to get one built. Once we do, I'll submit it manually to Google via Webmaster Tools.
However, we have a very dynamic site with content constantly being added. Will I need to keep manually re-submitting the sitemap to Google? Or could we have the continually updating sitemap live on our site at /sitemap and the crawlers will just pick it up from there? I noticed this is what SEOmoz does at http://www.seomoz.org/sitemap.
-
Thanks, Ryan. I was confusing those.
To use the sitemap index, would I just point the crawlers to the index file? Do you have any links to overviews of how to set that up?
-
There are also scripts you can purchase for very little cost and have installed on your server, with a cron job set up so your sitemap regenerates automatically each week and pings the search engines to come find it.
One such service is xml-sitemaps.com; they can install the script and set up the cron job for you.
Just make sure you are on a good server that can handle the script if your website is large.
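If you'd rather roll your own, here's a minimal sketch of a generator you could run from a cron job (this is not any vendor's actual script; the page list and dates are placeholders for whatever your CMS or database can provide):

```python
# A minimal sketch of a home-grown sitemap generator (not any vendor's
# actual script). The hard-coded page list is a stand-in for however
# your CMS or database lists live URLs and their last-modified dates.
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """pages: iterable of (url, lastmod) tuples -> sitemap XML string."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for url, lastmod in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = lastmod
    return '<?xml version="1.0" encoding="UTF-8"?>' + ET.tostring(
        urlset, encoding="unicode"
    )

# Placeholder data; a real dynamic site would query its database here.
sitemap_xml = build_sitemap([
    ("http://www.example.com/", "2012-01-15"),
    ("http://www.example.com/articles/newest-post", "2012-01-16"),
])
print(sitemap_xml)
```

A weekly crontab entry could then run it and ping the engines, e.g. `0 3 * * 0 python /var/www/regenerate_sitemap.py` (the path and schedule are assumptions); Google and Bing both accept pings of the form `/ping?sitemap=<your-sitemap-url>`.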
-
I think you may be getting XML sitemaps confused with sitemap pages. Your XML sitemap should live at /sitemap.xml, as Alan pointed out. The /sitemap page on SEOmoz and other sites serves a different purpose: it's not your XML file, it's a "topical guide" to your website and all the major sections of your site.
Remember that you can also create an XML sitemap index if you need separate sitemaps (video, news, content); it's a parent file that lists all the individual XML sitemaps underneath it.
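For reference, a sitemap index is itself just a small XML file pointing at your other sitemap files (the example.com URLs below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-content.xml</loc>
    <lastmod>2012-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-video.xml</loc>
    <lastmod>2012-01-10</lastmod>
  </sitemap>
</sitemapindex>
```

You submit only the index file to Webmaster Tools; the crawlers discover the individual sitemaps from it.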
-
Typically I use /sitemap.xml.
I don't think it matters what you call it, as long as it's submitted to Google Webmaster Tools.
Check out sitemaps.org for more info on how to create a quality sitemap.
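For anyone following along, a minimal sitemap per the sitemaps.org protocol looks like this (the example.com URL is a placeholder; lastmod and changefreq are optional):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```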
-
Great, thanks!
Should we also have it live at /sitemap, as SEOmoz does?
-
Yes, they will pick it up. You can put the sitemap address in your robots.txt file (see http://thatsit.com.au/robots.txt for an example); then you don't need to submit it, and all search engines will find it.
Keep it up to date and free from 404s and 301s: all URLs should return status code 200, and the file should stay accurate. Bing, for one, will ignore it if it is not clean and accurate; they allow only a couple of percent error rate.
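The robots.txt reference is just a single `Sitemap:` line with the full URL, e.g. (example.com is a placeholder):

```text
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

The `Sitemap:` directive stands on its own and applies to all crawlers, regardless of which User-agent group it sits near.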