Googlebots and cache
-
Our site checks whether visitors are resident in the same country or live abroad.
If it recognises that the visitor comes from abroad, the content is adapted for them: instead of encouraging the visitor to come and visit a showroom, it tells them that we export worldwide. It does this by checking the visitor's IP address.
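For readers wondering what such a check looks like in practice, here is a minimal sketch. The lookup table, IP addresses and country codes are illustrative placeholders standing in for a real GeoIP database (such as MaxMind's GeoLite2), not the site's actual implementation:

```python
# Minimal sketch of IP-based content switching.
# GEO_DB is a stand-in for a real GeoIP lookup (e.g. MaxMind GeoLite2);
# the IPs below are from documentation ranges and purely illustrative.
GEO_DB = {
    "203.0.113.7": "GB",   # pretend UK visitor
    "198.51.100.9": "US",  # pretend overseas visitor
}

HOME_COUNTRY = "GB"

def pick_content(visitor_ip: str) -> str:
    """Return which page variant to serve based on the visitor's country."""
    country = GEO_DB.get(visitor_ip, "UNKNOWN")
    if country == HOME_COUNTRY:
        return "local"   # e.g. "come and visit our showroom"
    return "export"      # e.g. "we export worldwide"
```

Note that any visitor whose country can't be determined falls through to the export version - which is exactly what happens to Googlebot crawling from a US IP.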
So far so good! But I've noticed that the cached pages in Google's results are all export pages. I've also used Google Webmaster Tools (Search Console) to fetch and render pages as Google - and those render as export pages too.
Does anybody have a solution to this?
Is it a problem?
Can Google see the proper (local, as in UK) version of the site? -
Google won't see the local version (I assume your site is UK-based) - Googlebot visits with an IP from California & will see the "international" version of your site. Google indicates that it also has bots visiting sites from other IP addresses (locale-aware crawling) - but to be honest, when I check the server logs of our sites (based in FR & ES) I only find visits from US IPs.
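If you want to check your own logs, Google's documented way to confirm that a visit really is Googlebot is a reverse DNS lookup followed by a forward confirmation. A sketch - the live lookups need network access, while the hostname check is the part you can test offline:

```python
import socket

def is_googlebot_hostname(hostname: str) -> bool:
    """Googlebot reverse-DNS names end in googlebot.com or google.com."""
    return hostname.endswith(".googlebot.com") or hostname.endswith(".google.com")

def verify_googlebot(ip: str) -> bool:
    """Reverse-resolve the IP, check the domain, then confirm the
    hostname resolves back to the same IP. Needs network access and
    raises socket.herror/gaierror if a lookup fails."""
    hostname = socket.gethostbyaddr(ip)[0]
    if not is_googlebot_hostname(hostname):
        return False
    # Forward confirmation: the claimed hostname must map back to the IP.
    return ip in {info[4][0] for info in socket.getaddrinfo(hostname, None)}
```

This weeds out crawlers that merely spoof the Googlebot user-agent string.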
If the international version has only minor differences from the local version it shouldn't be a major problem - if the differences are major, it's probably better to find another solution. That could mean creating a separate version of your site (which could be overkill), or presenting international visitors with a choice on their first visit (local version / international version). You store the choice in a cookie & personalise the content on the pages based on the cookie value. This way, Google would see the "local" version of the site.
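A minimal sketch of the cookie approach (the cookie name and values are illustrative): default to the local version, so first-time visitors - and crawlers, which don't carry cookies - get the UK content, and only visitors who explicitly chose "international" see the export version:

```python
def choose_version(cookies: dict) -> str:
    """Serve the export version only when the visitor has opted in.
    Crawlers don't send cookies, so Googlebot always gets 'local'."""
    if cookies.get("site_version") == "international":
        return "export"
    return "local"
```

The key design choice is the default: because the uncustomised state is the local page, that is also what gets crawled, cached and indexed.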
Hope this helps,
Dirk
Related Questions
-
Google Webmaster Tools is not crawling links and the site cache still shows an old date
Hi guys, I have been trying to get my page indexed in Google with a new title and description but it is not getting indexed. I have checked in many tools but with no luck. Can you please tell me what the issue could be? Google Webmaster Tools is also not crawling the links I have built so far - a few links are indexed but the others are not. Why is this happening? My URL is: https://www.paydaysunny.com Thanks
Technical SEO | ksmith88 -
Why Does My Google Web Cache Redirect to My Homepage?
Why does my Google web cache appear for a short period of time and then automatically redirect to my homepage? Is there something wrong with my robots.txt? The only paths I have blocked are below:
User-agent: *
Disallow: /bin/
Disallow: /common/
Disallow: /css/
Disallow: /download/
Disallow: /images/
Disallow: /medias/
Disallow: /ClientInfo.aspx
Disallow: /*affiliateId*
Disallow: /*referral*
Technical SEO | Francis.Magos -
Why are some of my pages not showing as cached in Google?
I have the website http://www.vipcollisionlv.com/ and when I check the cache status with site:vipcollisionlv.com, some pages have no cache status - you can see this in the image. How can I resolve this issue? Please help me.
Technical SEO | 1akal -
Issue with Cached pages
I have a client who has three domains:
budgetkits.co.uk
prosocceruk.co.uk
cheapfootballkits.co.uk
Budget Kits is not active but Pro Soccer and Cheap Football Kits are. The issue is that when you do site:budgetkits.co.uk on Google it brings back results. If you click on a link it goes to a page saying the website doesn't exist, which is correct, but if you click on "cached" it shows you a page from prosocceruk.co.uk or cheapfootballkits.co.uk. The cached pages are very recent - from a couple of days to a week ago. The first result brings up www.budgetkits.co.uk/rainwear but the cached page is www.prosocceruk.co.uk/rainwear. The third result brings up www.budgetkits.co.uk/kids-football-kits but the cached page is http://www.cheapfootballkits.co.uk. The history of this issue is that budgetkits.co.uk was its own website 7 years ago and after that it pointed at prosocceruk.co.uk, but it no longer does as of about two months ago. All files have been deleted from budgetkits.co.uk so it is just a domain. Any help with this would be very much appreciated as I have not seen this kind of issue before.
Technical SEO | paulbaguley -
Googlebot does not obey robots.txt disallow
Hi Mozzers! We are trying to get Googlebot to steer away from our internal search results pages by adding a parameter "nocrawl=1" to facet/filter links and then disallowing all URLs containing that parameter in robots.txt. We implemented this in late August and since then the GWMT message "Googlebot found an extremely high number of URLs on your site" stopped coming. But today we received yet another one. The weird thing is that Google gives many of our now robots.txt-disallowed URLs as examples of URLs that may cause us problems. What could be the reason? Best regards, Martin
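One thing worth double-checking in a setup like this is that the wildcard pattern actually matches the parameter URLs - Google supports "*" and "$" in Disallow paths, e.g. Disallow: /*nocrawl=1. Here is a simplified matcher for sanity-checking patterns locally (a sketch of Google-style matching, not Google's full implementation):

```python
import re

def disallow_matches(pattern: str, url_path: str) -> bool:
    """Simplified Google-style robots.txt path matching: '*' matches
    any sequence of characters, a trailing '$' anchors the URL end,
    and everything else is a literal prefix match."""
    regex = "^" + re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, url_path) is not None
```

Note also that robots.txt blocks crawling, not indexing - Google can still report URLs it discovered before the disallow took effect, which may explain why disallowed URLs keep appearing as examples in the GWMT message.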
Technical SEO | TalkInThePark -
Googlebot takes 5 times longer to crawl each page
Hello all. From about mid-September my GWMT has shown that the average time to crawl a page on my site has shot up from an average of 130ms to an average of 700ms, with peaks at 4000ms. I have checked my server error logs and found nothing there, and I have checked with the hosting company and there are no issues with the server or other sites on the same server. Two weeks after this my rankings fell by about 950 places for most of my keywords. I am really just trying to eliminate this as a possible cause of the ranking drops - or was it the Panda/EMD algo that did it? Many thanks, Si
Technical SEO | spes123 -
Problems with Google cache
Hi, can you please advise if the following website is corrupted in the eyes of Google? It has been written in Umbraco, I have taken over it from another developer, and I am confused as to why it is behaving the way it is. cache:www.tangoholidaysolutions.com - when I run this, all I see is the header, the start of the main content and then the footer. If I view the text-only version, all the content is visible. The second issue I have with this site is as follows. Main page: http://www.tangoholidaysolutions.com/holiday-lettings-spain/ - this page is made up of widgets, i.e. locations, featured villas, content. However, the widgets are web pages in their own right, e.g. http://www.tangoholidaysolutions.com/holiday-lettings-spain/location-picker/ My concern is that these part-pages will affect the SEO performance of the site. In an ideal world I would set the CMS up so these widgets are not classed as pages, but I am working on this. Thanks, Andy
Technical SEO | iprosoftware -
Differences between Lynx Viewer, Fetch as Googlebot and SEOMoz Googlebot Rendering
Three tools to render a site as Googlebot would see it:
SEOmoz toolbar
Lynx Viewer (http://www.yellowpipe.com/yis/tools/lynx/lynx_viewer.php)
Fetch as Googlebot
I have a website where I can see dropdown menus in regular browser rendering, the Lynx Viewer, and Fetch as Googlebot. However, in the SEOmoz toolbar's 'render as Googlebot' tool, I am unable to see these dropdown menus when I have JavaScript disabled. Does this matter? Which of these tools is the best way to see how Googlebot views your site?
Technical SEO | qlkasdjfw -