How can I tell Google that a page has not changed?
-
Hello,
we have a website with many thousands of pages. Some of them change frequently, some never. Our problem is that Googlebot is generating way too much traffic: half of our page views come from Googlebot.
We would like to tell Googlebot to stop crawling pages that never change. This one, for instance:
http://www.prinz.de/party/partybilder/bilder-party-pics,412598,9545978-1,VnPartypics.html
As you can see, there is almost no content on the page, and the picture will never change. So I am wondering if it makes sense to tell Google that there is no need to come back.
The following header fields might be relevant. Currently our web server answers with these headers:
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0, public
Pragma: no-cache
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Does Google honor these fields? Should we remove no-cache, must-revalidate, and Pragma: no-cache, and set Expires to, e.g., 30 days in the future?
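To make the question concrete, here is a minimal sketch of what cache-friendly headers for a never-changing page could look like, replacing the current no-cache/Expires-in-the-past combination. This assumes some handler where we control the response headers; the function name is made up.

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

def cacheable_headers(days=30):
    """Build headers for a page that is not expected to change.

    Hypothetical sketch: drops Pragma: no-cache and must-revalidate,
    and sets Expires/max-age `days` into the future instead.
    """
    expires = datetime.now(timezone.utc) + timedelta(days=days)
    return {
        "Cache-Control": f"public, max-age={days * 24 * 3600}",
        "Expires": format_datetime(expires, usegmt=True),
    }

print(cacheable_headers()["Cache-Control"])  # public, max-age=2592000
```

Whether Google actually weighs these fields for crawl scheduling is exactly the open question here.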
I also read that a webpage that has not changed should answer with 304 instead of 200. Does it make sense to implement that? Unfortunately, that would be quite hard for us.
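For reference, the 304 mechanism hinges on conditional requests: the client sends an If-Modified-Since header, and the server compares it against the page's last change date. A minimal sketch of that decision (function and variable names are made up, not our actual stack):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def respond(if_modified_since, page_last_modified):
    """Decide between 304 Not Modified and a full 200 response.

    Hypothetical sketch of conditional-GET handling: if the client's
    cached copy is at least as new as the page, skip the body entirely.
    """
    if if_modified_since is not None:
        client_date = parsedate_to_datetime(if_modified_since)
        if page_last_modified <= client_date:
            return 304  # Not Modified: empty body, minimal traffic
    return 200  # send the full page

last_change = datetime(2011, 1, 1, tzinfo=timezone.utc)
print(respond("Sat, 01 Jan 2011 00:00:00 GMT", last_change))  # 304
```

The hard part in practice is knowing each page's real last-modified date cheaply, which is presumably why this is difficult for a large dynamic site.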
Maybe Google would then spend more time on pages that actually changed, instead of wasting it on unchanged ones.
Do you have any other suggestions for how we can reduce Googlebot's traffic on irrelevant pages?
Thanks for your help
Cord
-
Unfortunately, I don't think there are many reliable options, in the sense that Google will always honor them. I don't think they gauge crawl frequency by the "expires" field - or, at least, it carries very little weight. As John and Rob mentioned, you can set the "changefreq" in the XML sitemap, but again, that's just a hint to Google. They seem to frequently ignore it.
If it's really critical, a 304 probably is a stronger signal, but I suspect even that's hit or miss. I've never seen a site implement it on a large scale (100s or 1000s of pages), so I can't speak to that.
Three broader questions/comments:
(1) If you currently list all of these pages in your XML sitemap, consider taking them out. The XML sitemap doesn't have to contain every page on your site, and in many cases, I think it shouldn't. If you list these pages, you're basically telling Google to re-crawl them (regardless of the changefreq setting).
(2) You may have overly complex crawl paths. In other words, it may not be the quantity of pages that's at issue, but how Google accesses those pages. They could be getting stuck in a loop, etc. It's going to take some research on a large site, but it'd be worth running a desktop crawler like Xenu or Screaming Frog. This could represent a site architecture problem (from an SEO standpoint).
(3) Should all of these pages even be indexed at all, especially as time passes? Increasingly (especially post-Panda), having more indexed pages is often worse. If Googlebot is really hitting you that hard, it might be time to canonicalize some older content or 301-redirect it to newer, more relevant content. If it's not active at all, you could even NOINDEX or 404 it.
-
Thanks for the answers so far. The tips don't really solve my problem yet, though: I don't want to lower the general crawl speed in Webmaster Tools, because pages that change frequently should also be crawled frequently. We do have XML sitemaps, although we did not include these picture pages, as in our example. There are tens, maybe hundreds, of thousands of these pages. If everyone agrees on this, we can of course include them in our XML sitemaps. Using "meta refresh" to indicate that the page never changed seems a bit odd to me, but I'll look into it.
But what about the HTTP headers I asked about? Does anyone have any ideas on that?
-
Your best bet is to build an Excel report using a crawl tool (like Xenu, Screaming Frog, Moz, etc.) and export that data. Then map out the pages you want to log and mark as 'not changing'.
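The export-and-filter step above could be sketched roughly like this, assuming the crawl tool exports a CSV with an "Address" column (column names vary by tool and are an assumption here, as is the filter rule):

```python
import csv

def static_urls(export_path):
    """Collect URLs from a crawl-tool CSV export to mark as 'not changing'.

    Hypothetical sketch: the 'Address' column name and the substring rule
    (party-pics detail pages are treated as static) are both assumptions.
    """
    urls = []
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if "partybilder" in row["Address"]:
                urls.append(row["Address"])
    return urls
```

From there you can paste the resulting list into your sitemap tooling or a spreadsheet.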
Make sure to build (or have) a functioning XML sitemap file for the site, and, as John said, state which URLs NEVER change. Over time, this will tell Googlebot that it isn't necessary to crawl those URLs, as they never change.
You could also place a META REFRESH tag on those individual pages and set that to never as well.
Hope some of this helps! Cheers
-
If you have Google Webmaster Tools set up, go to Site configuration > Settings, and you can set a custom crawl rate for your site. That will change it site-wide, so if you have other pages that change frequently, that might not be so great for you.
Another thing you could try is generating a sitemap and setting a change frequency of never (or yearly) for all of the pages you don't expect to change. That also might slow down Google's crawl rate for those pages.
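For illustration, a sitemap entry with that change frequency could be generated like this (a sketch; the URL and helper name are made up, and "never" is one of the values the sitemap protocol allows for changefreq):

```python
from xml.sax.saxutils import escape

def sitemap_entry(url, changefreq="never"):
    """Emit one <url> element for an XML sitemap.

    Hypothetical sketch: marks a page with changefreq 'never' so crawlers
    treat it as a low-priority recrawl candidate (it's only a hint).
    """
    return (
        "<url>"
        f"<loc>{escape(url)}</loc>"
        f"<changefreq>{changefreq}</changefreq>"
        "</url>"
    )

print(sitemap_entry("http://www.example.com/party-pics-1.html"))
```

As noted above, Google treats changefreq as a hint, not a directive, so this may or may not move the needle.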