Hiding content until user scrolls - Will Google penalize me?
-
I've used: "opacity:0;" to hide sections of my content, which are triggered to show (using Javascript) once the user scrolls over these sections.
I remember reading a while back that Google essentially ignores content that is hidden on your page (the article said they don't index it, so it's close to impossible to rank for it).
Is this still the case?
Thanks,
Sam
-
Hi,
An alternative approach would be to use the AOS library (http://michalsnik.github.io/aos/). It does not set visibility: hidden to hide the content; instead, it applies the animation once the element is within the viewport. Make sure to test AOS thoroughly though, because it does initially set the opacity to 0, so try it in a development environment and use Fetch as Google in Webmaster Tools to confirm the content is rendered.
If you don't want to use the AOS library, you can write your own JavaScript (JS) to detect whether an element is within the viewport and add a CSS class from the animate.css library (https://daneden.github.io/animate.css/) as needed; a sketch of that approach follows.
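Here is a minimal sketch of that do-it-yourself approach using IntersectionObserver. The '.animate-on-scroll' selector is just an example, and the 'animated' / 'fadeIn' classes assume the classic animate.css naming, so adjust both to your markup and library version.

// Watch for elements entering the viewport and add animate.css classes
// so the browser runs the animation. Selector and class names are examples.
var observer = new IntersectionObserver(function (entries) {
  entries.forEach(function (entry) {
    if (entry.isIntersecting) {
      entry.target.classList.add('animated', 'fadeIn'); // animate.css classes
      observer.unobserve(entry.target); // animate each element only once
    }
  });
}, { threshold: 0.1 }); // fire when roughly 10% of the element is visible

document.querySelectorAll('.animate-on-scroll').forEach(function (el) {
  observer.observe(el);
});

Writing it yourself also gives you control over the initial state, so you can decide whether the content starts hidden at all. Either way, fetch the page as Google to confirm the rendered content looks the way you expect.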
-
Interesting, fair enough I suppose. It would certainly hold me back, though, and make my webpages a lot less visually appealing.
-
Thanks Kane,
Yes, this is a visual feature to make content appear as the user scrolls.
Would love to hear if there is a better way.
Sam
-
Hey Sam.
Is this for a visual feature, like making the content "appear" as the user scrolls? While Google is doing a great job of reading JS, my concern would be that this looks like cloaking or hidden text if the purpose is misinterpreted.
There may be safer ways to do this depending on what your goal is. Let me know and I can go from there.
-
John Mueller addressed a similar question in a recent Google Webmaster Central office-hours hangout, and he was pretty definitive. The question was about text that's hidden behind tabs. He said that they do see the hidden content but won't give it as much weight.
Here's the link - https://www.youtube.com/watch?v=zZAY-BwL6rU. The question starts at 6:45.
Google does read JavaScript and CSS, and that's why they send warnings to webmasters if such files are blocked from Googlebot.
-
True, but that won't easily tell me whether it's being given less weighting.
-
Grab a few unique phrases from the content that isn't shown immediately to the visitor, then search for them in quotes.
That should answer the indexing question fast.
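For example, with a made-up phrase and domain, a search like:
"one exact sentence from your hidden section" site:yourdomain.com
should only return the page if Google has indexed that hidden text.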
-
Is Google really clever enough to look into my scripts folder and see that the content is actually shown on scroll? Probably not, so I'm guessing, as you've both suggested, it may not be worth it.
I wonder if there's a better way of doing this other than using opacity.
-
This is my understanding too, Laura. It has proven frustratingly difficult to find a definitive answer to this question!
-
Google will probably index it, but it won't be given the same weight as content that's immediately visible.
Related Questions
-
Why would a developer build all page content in PHP?
Picked up a new client. The site is built on WordPress. The previous developer built nearly all page content in their custom theme's PHP files. In other words, the theme's "page.php" file contains virtually all the HTML for each of the site's pages. Each individual page's back-end page editor appears blank, except for some of the page text. No markup, no widgets, no custom fields. And no dedicated, page-specific PHP files either. Pages are differentiated within page.php using: elseif (is_page("27") Has anyone ever come across this approach before? Why might someone do this?
Web Design | mphdavidson0
-
Recovering organic traffic and Google rankings post-site-crash
Hi everyone, we had a client's WordPress website go down about 2 weeks ago and since then organic traffic has basically plummeted. We haven't identified exactly what caused the crash, but it happened twice in one week. We spent a lot of time optimizing the site for organic SEO, improving load times, improving user experience, improving the website content, improving CTR, etc. Then one morning we got a notification from our uptime monitoring service that the site was down, and upon further inspection we believe it may have been compromised. All of the files in the child theme the website was using were deleted and/or blank. We reverted the website to a previous backup, which fixed the problem. Then, a few days later, the same exact thing happened, only this time the child theme files were missing after the backup was restored. We've since re-installed and reconfigured the child theme, changed all passwords (WordPress, FTP, hosting, etc.), and we're looking into changing hosting providers in the very near future. The site uses the Yoast WordPress SEO plugin, which has recently been reported as having some security flaws. Maybe that was the cause of the problem. Regardless, the primary focus right now is to recover the organic traffic and Google rankings that we've worked so hard to improve over the past few months, up until this disaster occurred. The client is in a very competitive niche and market, so I'm pretty frustrated that this has happened after we were making such great progress. Since the website went down, organic search traffic has decreased by 50%. The site and all internal pages are loading properly again (and have been since the second time the website went down), but Google Webmaster Tools is still reporting a number of pages as "not found" with crawl dates as recent as this past weekend. We've marked all errors as "fixed" and also re-submitted the sitemaps in Google Webmaster Tools. The website passes the "mobile-friendly" tests, received A and B grades in GTmetrix (for whatever that's worth), and still has the same original Google Maps rankings as before. The organic traffic and organic rankings on Google, however, have seen a pretty dramatic decrease. Does anyone have any recommendations when it comes to recovering a website's authority and organic traffic after it's experienced some downtime?
Web Design | georgetsn0
-
In Google Analytics, does the iPad always report portrait orientation?
I find I have a disproportionate amount of visitors viewing at 768x1024, which coincides with the high iOS visitor rate. However, does this mean the visitors are all viewing in portrait orientation, or does the report show portrait regardless of orientation?
Web Design | gotomarketers0
-
Using content from other sites without duplicate content penalties?
Hi there, I am setting up a website where I believe it would substantially benefit users' experience if I set up a database of information on artists. I am torn because, to feasibly do this correctly, I would have content that is built from multiple sources but has no real unique content. It would have parts from Wikipedia, parts from other websites, etc. All would be sourced, of course. My concern is that if I do this, am I risking devaluing my website? Is there a way I can handle this without taking a hit?
Web Design | BorisD0
-
Avoiding duplicate content with a multi-language site
Hi, We have a client in China that is looking to create three versions of the same website: English, Chinese, and Korean. They do not want to use a translation plugin like Google Translate, preferring to have the pages duplicated. What is the best way to do this, bearing in mind that the site needs to be found in all three languages? Would also appreciate if anyone knows of a good hosting company that has English support on the Chinese mainland. Thanks Fraser
Web Design | fraserhannah0
-
Getting tons of duplicate content and title errors on my ASP.NET shopping cart; is there a way to resolve this?
The problem I am having is that the web crawlers are seeing all my category pages as the same page, thus creating duplicate content and duplicate title errors. At this time I have 270 of these critical errors to deal with. Here is an example:
http://www.baysidejewelry.com/category/1-necklaces.aspx
http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=1
http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=2
http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=3
http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=4
All of these pages are seen as the exact same page by the crawlers. Because these pages are generated from a SQL database, I don't have a way I know of to fix it.
Web Design | bsj20020
-
Two URLs with same content
We recently had a client who owns multiple brands switch from having multiple URLs to having a single domain with multiple subdomains. I've posted an example below to better explain. My question is: the original URL is still functional, so there are two URLs with identical content, yet I haven't been getting a duplicate content error. Also, would a rel canonical link be beneficial in this case, since the duplicate content is on two separate domains? My thoughts were to put a 301 redirect on the original pages so they permanently forward to the new sub-domain format. Is this the best course of action? If not, what would you recommend? Example:
Original URLs
www.example1.com
www.example2.com
www.example3.com
www.parentcompany.com
New URLs
example1.parentcompany.com
example2.parentcompany.com
example3.parentcompany.com
www.parentcompany.com
Let me know if I need to clarify anything in better detail. Thanks in advance!
Web Design | BluespaceCreative0
-
Redirecting 301 Redirects -- Will Search Engines Notice?
Hello Mozzers, We're currently evaluating a client site where the previous web developer redesigned the site and got lazy, 301 redirecting hundreds of pages to the home page instead of to their respective new URLs. Ugh. In any case, we will probably fix this for the sake of implementing best practices. But I am curious how search engines treat 301'd URLs, as they are supposed to be permanent redirects. Will search crawlers ever visit the old URLs again to find that we've re-redirected them? Or have they written them off as moved to the home page for good, meaning that there's no way to direct the authority of the previous URLs to their rightful targets? Thanks!
Web Design | SEOTeamSF0