Googlebots and cache
-
Our site checks whether visitors are resident in the same country or live abroad.
If it recognises that the visitor comes from abroad, the content is made more appropriate for them. Basically, instead of encouraging the visitor to come and visit a showroom, it tells them that we export worldwide. It does this by IP checking.
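To illustrate the mechanism (a minimal sketch - the real site presumably resolves the visitor's IP to a country with a GeoIP database such as MaxMind; the function name and message strings here are hypothetical):

```javascript
// Hypothetical sketch of the IP-based switch described above. In a real
// setup, visitorCountry would come from a GeoIP lookup on the request IP.
function contentFor(visitorCountry, siteCountry) {
  return visitorCountry === siteCountry
    ? "Come and visit our showroom"   // local visitor
    : "We export worldwide";          // visitor from abroad
}
```

Since Googlebot crawls from US IP addresses, a GeoIP lookup on its requests returns "US", so it only ever takes the export branch.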
So far so good! But I noticed that the cached pages in Google's results are all export pages. I've also used Google Webmaster Tools (Search Console) to fetch and render pages as Google - and they also render as export pages.
Does anybody have a solution to this?
Is it a problem?
Can Google see the proper (local, as in UK) version of the site? -
Google won't see the local version (I assume your site is UK based) - Googlebot visits with an IP from California and will see the "international" version of your site. Google indicates that it has bots visiting sites from other IP addresses (locale-aware crawling) - but to be honest, when I check the server logs of our sites (based in FR & ES) I only find visits from US IPs.
If the international version has only minor differences from the local version it shouldn't be a major problem - if there are major differences it's probably better to find another solution. This could be creating a different version of your site (which could be overkill), or presenting international visitors with a choice on first visit (local version/international version). You store the choice in a cookie and personalise the content on the pages based on the cookie value. This way, Google would see the "local" version of the site.
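A minimal sketch of the cookie approach (the siteVersion cookie name and the helper are hypothetical; the key point is that a visitor with no cookie yet - including Googlebot - gets the local default):

```javascript
// Hypothetical sketch: serve the version stored in the visitor's cookie,
// and fall back to the local version when no cookie has been set - which
// is what crawlers will always get, since they never send the cookie.
function chooseVersion(cookieHeader) {
  const match = /(?:^|;\s*)siteVersion=(local|international)/.exec(cookieHeader || "");
  return match ? match[1] : "local";
}
```

On first visit you would show the local/international chooser and set siteVersion from the visitor's choice; Googlebot never sends it, so it indexes the local pages.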
Hope this helps,
Dirk
Related Questions
-
New theme adds ?v=1d20b5ff1ee9 to all URLs as part of cache. How does this affect SEO?
The new theme I am working in adds ?v=1d20b5ff1ee9 to every URL. The theme developer says it's a server setting issue. GoDaddy support says it's part of caching and is becoming prevalent in new themes. How does this impact SEO?
Technical SEO | DML-Tampa -
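One common mitigation (assuming the ?v= parameter does not change the page content) is a canonical tag on each page pointing at the clean URL, so Google consolidates the parameterised variants - a hedged example with a hypothetical URL:

```html
<!-- Served on http://www.example.com/some-page/?v=1d20b5ff1ee9 -->
<link rel="canonical" href="http://www.example.com/some-page/" />
```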
Some of my pages are not showing as cached in Google - why?
I have the website http://www.vipcollisionlv.com/ and when I check the cache status with the site: operator (site:vipcollisionlv.com), some pages have no cache status. You can see this in the image. How do I resolve this issue? Please help me.
Technical SEO | 1akal -
Mobilegeddon Help - Googlebot Mobile cHTML & Mobile: XHTML/WML
Our website is www.billboard.com and we have a mobile website on a sub-domain (m.billboard.com). We are currently only redirecting Googlebot type "Mobile: Smartphone" to our m.billboard.com domain. We are not redirecting Googlebot Mobile: cHTML & Mobile: XHTML/WML. Using this URL as an example: http://www.billboard.com/articles/news/1481451/ravi-shankar-to-receive-lifetime-achievement-grammy, I fetched the URL via Google Webmaster Tools: http://goo.gl/8m4lQD As you can see, only the 3rd Googlebot mobile was redirected, while the first 2 Googlebot mobile spiders resolved 200 for the desktop page. QUESTION: could this be hurting our domain / any page that is not redirecting properly post-Mobilegeddon?
Technical SEO | Jay-T -
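A rough sketch of a user-agent check that treats all three crawler types the same (the patterns are assumptions based on the published Googlebot user-agent strings - verify against your own server logs before relying on them):

```javascript
// Hypothetical helper: decide whether a request comes from any of Google's
// mobile crawlers (smartphone, cHTML, or XHTML/WML), so all three receive
// the same redirect to the m. sub-domain.
function isMobileGooglebot(userAgent) {
  const ua = userAgent || "";
  // The legacy cHTML and XHTML/WML crawlers identify as "Googlebot-Mobile".
  if (/Googlebot-Mobile/i.test(ua)) return true;
  // The smartphone crawler uses a phone UA with "Googlebot" appended.
  return /Googlebot/i.test(ua) && /Mobile/i.test(ua);
}
```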
Google cache can't keep up with my 403s
Hi Mozzers, I hope everyone is well. I'm having a problem with my website and 403 errors shown in Google Webmaster Tools. The problem arises because we "unpublish" one of the thousands of listings on the site every few days - this then creates a link that returns a 403. At the same time we also run some code that removes any links to these pages. So far so good. Unfortunately Google doesn't notice that we have removed these internal links and so tries to access these pages again. This results in a 403. These errors show up in Google Webmaster Tools, and when I click on "Linked From" I can verify that there are no links to the 403 page - it's just Google's cache being slow. My questions are: a) How much is this hurting me? b) Can I fix it? All suggestions welcome and thanks for any answers!
Technical SEO | HireSpace -
Blocking URLs with specific parameters from Googlebot
Hi, I've discovered that Googlebots are voting on products listed on our website and as a result are creating negative ratings by placing votes from 1 to 5 for every product. The voting function is handled using JavaScript, as shown below, and the script prevents multiple votes, so most products end up with a vote of 1, which translates to "poor". How do I go about using robots.txt to block a URL with specific parameters only? I'm worried that I might end up blocking the whole product listing, which would result in de-listing from Google and the loss of many highly ranked pages. DON'T want to block: http://www.mysite.com/product.php?productid=1234 WANT to block: http://www.mysite.com/product.php?mode=vote&productid=1234&vote=2 JavaScript button code: onclick="javascript: document.voteform.submit();" Thanks in advance for any advice given. Regards,
Asim
Technical SEO | aethereal -
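For what it's worth, a robots.txt sketch for exactly this case (it assumes mode=vote is always the first query parameter, as in the example URLs; Googlebot treats Disallow values as URL prefixes and also supports * wildcards):

```
User-agent: Googlebot
# Blocks /product.php?mode=vote&productid=...&vote=... by prefix match,
# but not /product.php?productid=... (different prefix).
Disallow: /product.php?mode=vote
# Or, if the parameter order can vary:
# Disallow: /*mode=vote
```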
Why is Googlebot indexing one page but not another?
Why does Googlebot index one page and not another under the same conditions - in an HTML sitemap, for example? We have 6 new pages with unique content. Googlebot immediately indexes only 2 of the pages, and then indexes the remaining 4 pages some time later. On what parameters does the crawler decide whether or not to crawl a page?
Technical SEO | ATCnik -
Tool which checks the cache date of pages?
Does anyone know of a tool which can check the cache date of each page of a site? I can get each page of the site into a .csv or XML file.
Technical SEO | Turkey -
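If no off-the-shelf tool fits, the banner on Google's cached copy of a page ("It is a snapshot of the page as it appeared on 12 Jan 2015 10:04:18 GMT.") can be scraped per URL. A hedged sketch of just the date-extraction step - fetching is left out, and Google may block automated requests:

```javascript
// Hypothetical parser: pull the snapshot timestamp out of the banner text
// shown at the top of a Google cache page.
function extractCacheDate(bannerText) {
  const match = /as it appeared on (.+? GMT)/.exec(bannerText || "");
  return match ? match[1] : null; // e.g. "12 Jan 2015 10:04:18 GMT"
}
```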
WordPress cache
I'm looking to install a WordPress caching plugin to help speed up my site. My question is: which plugin or method is best practice to ensure my rankings are not hurt? Thanks
Technical SEO | mmaes -