Why is Google's cache preview showing a different version of our webpage (i.e. not displaying content)?
-
My URL is: http://www.fslocal.com

Recently, we discovered that Google's cached snapshots of our business listings look different from what's displayed to users. The main issue? Our content isn't displayed in cached results (although the content isn't visible on the front-end of cached pages, the text can be found when you view the page source of that cached result).

These listings are structured so that everything is coded and contained within one page (e.g. http://www.fslocal.com/toronto/auto-vault-canada/). Even though the URL stays the same, we've created separate "pages" of content (e.g. "About," "Additional Info," "Contact," etc.) for each listing, and only one "page" of content is ever displayed to the user at a time. This is controlled with JavaScript and display:none in CSS.
- Why do our cached results look different? Why would our content not show up in Google's cache preview, even though the text can be found in the page source?
- Does it have to do with the way we're using display:none? Are there negative SEO effects with regards to how we're using it (i.e. we're employing it strictly for aesthetics, but is it possible Google thinks we're trying to hide text)?
- Google's Technical Guidelines recommend against using "fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash." If we were to separate those business listing "pages" into actual separate URLs (e.g. http://www.fslocal.com/toronto/auto-vault-canada/contact/ would be the "Contact" page) and employ static HTML instead of complicated JavaScript, would that solve the problem?
Any insight would be greatly appreciated. Thanks!
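For illustration, the hidden-but-present situation described above can be reproduced programmatically: text inside an element with an inline display:none style never renders, yet it is plainly there in the page source. This is a minimal sketch using Python's standard-library HTML parser (the markup, IDs, and class below are hypothetical, not fslocal's actual code):

```python
from html.parser import HTMLParser

class HiddenTextFinder(HTMLParser):
    """Collects text inside elements styled with inline display:none,
    i.e. content present in the page source but not rendered on screen."""
    def __init__(self):
        super().__init__()
        self.depth = 0          # nesting depth inside a hidden element
        self.hidden_text = []

    def handle_starttag(self, tag, attrs):
        style = dict(attrs).get("style", "")
        # Once we are inside a hidden element, everything nested is hidden too.
        if self.depth or "display:none" in style.replace(" ", ""):
            self.depth += 1

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth and data.strip():
            self.hidden_text.append(data.strip())

# Hypothetical listing markup: one visible "page", one hidden "page".
html = (
    '<div id="about">Visible About text</div>'
    '<div id="contact" style="display: none">Hidden Contact text</div>'
)
finder = HiddenTextFinder()
finder.feed(html)
print(finder.hidden_text)  # ['Hidden Contact text']
```

A crawler that executes no JavaScript sees exactly what this parser sees: the "Contact" text exists in the source, but nothing on the rendered page ever exposes it.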
-
FYI, in this screenshot of the Google cached version of the site, I can see the "About," "Additional Info," "Contact," and "Media" pages, but I do need to click on those pages to make the content appear.
To Google and other search engines, these are not separate pages, but content that is served within the same page; the URL doesn't change at all. If you want those pages indexed, I'd recommend creating them as actual separate pages, with links that open each one as its own page.
That said, you might get penalized for duplicate content if the new pages repeat all of the same content that is still listed on the main page.
Another idea would be to keep the left hand navigation for the About, Additional Info, Contact and Media, but have all of the content display on the page; just link to the content from the top.
The way you have it built does limit the page length, but the user experience may be confusing to some, especially on a touchscreen tablet.
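The second suggestion above, kept on one URL, might look something like this (a sketch only; the class names and anchors are illustrative, not taken from the site):

```html
<!-- Keep the left-hand navigation, but link to in-page anchors so that
     every section renders in the page rather than being display:none'd. -->
<nav class="listing-nav">
  <a href="#about">About</a>
  <a href="#additional-info">Additional Info</a>
  <a href="#contact">Contact</a>
  <a href="#media">Media</a>
</nav>

<section id="about"><!-- About content --></section>
<section id="additional-info"><!-- Additional Info content --></section>
<section id="contact"><!-- Contact content --></section>
<section id="media"><!-- Media content --></section>
```

All of the text is visible in both the source and the rendered page, so the cache preview and the user-facing page should finally match.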
Related Questions
-
Google Indexing Desktop & Mobile Versions
We have a relatively new site and I have noticed recently that Google seems to be indexing both the mobile and the desktop version of our site. There are some queries where the mobile version will show up and sometimes both mobile and desktop show up. This can't be good. I would imagine that what is supposed to happen is that the desktop version is the one that should be indexed (always) and browser detection will load the mobile version where appropriate once the user is on the site. Do you have any advice on what we should do to solve this problem as we are a bit stuck?
Technical SEO | simonukss
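For separate mobile and desktop URLs, one widely documented pattern is to annotate the relationship in both directions so that the two versions are treated as one page rather than indexed independently. A sketch (example.com is a placeholder, not the asker's site):

```html
<!-- On the desktop page (www.example.com/page): point to the mobile version. -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- On the mobile page (m.example.com/page): canonicalize to the desktop version. -->
<link rel="canonical" href="http://www.example.com/page">
```

With these annotations in place, the mobile URL should stop competing with the desktop URL in results, while browser detection still serves mobile users the right version.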
Duplicate Content on a Page Due to Responsive Version
What are the implications if a web designer codes the content of the site twice into the page in order to make the site responsive? I can't add the url I'm afraid but the H1 and the content appear twice in the code in order to produce both a responsive version and a desktop version. This is a Wordpress site. Is Google clever enough to distinguish between the 2 versions and treat them individually? Or will Google really think that the content has been repeated on the same page?
Technical SEO | Wagada
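The usual alternative to coding the content twice is a single copy of the markup restyled per viewport with CSS media queries. A minimal sketch (selectors and breakpoints are illustrative):

```css
/* One copy of the H1 and content, styled for desktop by default… */
.main-content {
  width: 960px;
  margin: 0 auto;
}

/* …and restyled for small screens, with no duplicated markup. */
@media (max-width: 640px) {
  .main-content {
    width: 100%;
    padding: 0 1em;
  }
}
```

Because the H1 and body copy then exist only once in the source, the duplicate-content question never arises.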
Google Cache can't keep up with my 403s
Hi Mozzers, I hope everyone is well. I'm having a problem with my website and 403 errors shown in Google Webmaster Tools. The problem comes because we "unpublish" one of the thousands of listings on the site every few days - this then creates a link that gives a 403. At the same time we also run some code that takes away any links to these pages. So far so good. Unfortunately Google doesn't notice that we have removed these internal links and so tries to access these pages again. This results in a 403. These errors show up in Google Webmaster Tools and when I click on "Linked From" I can verify that that there are no links to the 403 page - it's just Google's Cache being slow. My question is a) How much is this hurting me? b) Can I fix it? All suggestions welcome and thanks for any answers!
Technical SEO | HireSpace
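One commonly suggested refinement for unpublished pages is to serve 410 Gone rather than 403 Forbidden, since 410 signals permanent removal and tends to be retried less. This sketch only illustrates the status-code decision in the abstract (the slugs and function are hypothetical, not HireSpace's code):

```python
# Hypothetical set of listings that have been unpublished.
UNPUBLISHED = {"old-listing-123"}

def status_for(slug):
    """Return the HTTP status to serve for a listing slug.

    403 reads as "forbidden, possibly temporarily", so crawlers may keep
    retrying; 410 reads as "gone for good" and is dropped sooner.
    """
    if slug in UNPUBLISHED:
        return 410
    return 200

print(status_for("old-listing-123"))  # 410
print(status_for("live-listing"))     # 200
```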
Google showing wrong title
Hi, Can anyone assist a newbie please? My keyword 'Security Systems' is giving me position 1 on page 1, but the title Google is using is not the page title. I am assuming it has made one up for some reason; please see below. The actual title tag says:

| Security systems | wireless | battery powered | Police Approved | CSS |

Google is showing:

Compound Security Systems: Wireless Security Systems | Battery ...
www.compoundsecurity.co.uk/
Manufacturers & suppliers of The Mosquito Device & Professional industry compliant and Police recommended battery powered wireless security systems. Contact us - Mosquito Anti-Loitering Devices - Security Equipment - Installers

If anyone can tell me how to correct this, I would very much appreciate it. Regards, Si
Technical SEO | DaddySmurf
SEOMoz is indicating I have 40 pages with duplicate content, yet it doesn't list the URLs of the pages???
When I look at the Errors and Warnings on my Campaign Overview, I have a lot of "duplicate content" errors. When I view the errors/warnings SEOMoz indicates the number of pages with duplicate content, yet when I go to view them the subsequent page says no pages were found... Any ideas are greatly welcomed! Thanks Marty K.
Technical SEO | MartinKlausmeier
How does Google determine freshness of content?
With the changes in the Google algorithm emphasizing freshness of content, I was wondering how they determine freshness and what constitutes new content. For instance, if I write a major update to a story I published last July, is the amended story fresh? Is there anything I can do in addition to publishing brand new content to make Google sure they see all my new content?
Technical SEO | KnutDSvendsen
Rel canonical on all my URLs
Hi, sorry if this question has already been asked, but I can't seem to find the correct answer. In my crawling report for the domain http://www.wellbo.de I get rel canonical notices. I have redirected all pages of http://wellbo.de to http://www.wellbo.de with a 301 redirect. Where is my error? Why do I get these notices? I hope the image helps. Ep7Rw.jpg
Technical SEO | wellbo
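The non-www to www redirect described above is typically done with a single rewrite rule. A minimal sketch, assuming an Apache .htaccess file (the host names match the question; everything else is the standard pattern):

```apache
RewriteEngine On

# Redirect every request for wellbo.de to the canonical www host with a 301.
RewriteCond %{HTTP_HOST} ^wellbo\.de$ [NC]
RewriteRule ^(.*)$ http://www.wellbo.de/$1 [R=301,L]
```

If a rule like this is already in place, the rel canonical notices are usually informational rather than errors: the crawler is simply reporting which URL each page declares as canonical.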
URLs for news content
We have made modifications to the URL structure for a particular client who publishes news articles in various niche industries. In line with SEO best practice we removed the article ID from the URL - an example is below:

http://www.website.com/news/123/news-article-title
http://www.website.com/news/read/news-article-title

Since this has been done we have noticed a decline in traffic volumes (we have not as yet assessed the impact on the number of pages indexed). Google have suggested that we need to include unique numerical IDs in the URL somewhere to aid spidering. Firstly, is this policy for news submissions? Secondly (if the previous answer is yes), is this to overcome the obvious issue with the velocity and trend-based nature of news submissions resulting in false duplicate URL/title tag violations? Thirdly, do you have any advice on the way to go? Thanks. P.S. One final one (you can count this as two question credits if required): is it possible to check the volume of pages indexed at various points in the past, i.e. if you think that the number of pages being indexed may have declined, is there any way of confirming this after the event? Thanks again! Neil
Technical SEO | mccormackmorrison
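Combining a unique numeric ID with a readable slug, as Google suggested, can be sketched like this (the URL format and function are illustrative, not the client's actual code):

```python
import re

def news_url(article_id, title):
    """Build a news URL that keeps a unique numeric ID alongside a
    human-readable slug, so every article URL is distinct even when
    titles collide or trend."""
    # Lowercase the title and collapse anything non-alphanumeric into hyphens.
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return f"/news/{article_id}/{slug}"

print(news_url(123, "News Article Title!"))  # /news/123/news-article-title
```

The ID guarantees uniqueness; the slug carries the keywords, so neither duplicate-URL collisions nor unreadable URLs are a worry.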