Content Below the Fold
-
Hi
I wondered what the view is on content below the fold?
We have the H1, product listings & then some written content under the products - will Google just ignore this?
I can't hide it under a tab or put a lot of content above products - so I'm not sure what the other option is?
Thank you
-
Hi Becky,
Here is what I found:
The pros and cons of hiding content using JavaScript and CSS (display: none) have been a topic of some debate within the SEO industry, and Google’s comments over time have somewhat added to the confusion.
- **November 2014** – Google’s John Mueller stated that Google _“may not”_ index or rank hidden content. In a Google+ Hangout the following month, John repeated this, stating that hidden content would be _“discounted”_ and has been for a number of years
- **21 July 2015** – Google’s Gary Illyes, contributing to a Stack Overflow forum thread, provided clarification of this by stating that this type of content is given “way less weight in ranking”
- **27 July 2015** – In a separate Stack Overflow thread on the same topic, Gary Illyes again confirmed that _“[Google] will index that but the content’s weight will be lower since it’s hidden”_
So the content will still be indexed, but deemed less important by the crawlers.
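For reference, the pattern being debated is content that is present in the HTML source but hidden with CSS until the user clicks a tab. A minimal, hypothetical sketch (element names and copy are made up for illustration):

```html
<!-- Hypothetical tabbed layout: the panel is in the source, so Google can
     index it, but it is hidden with display: none until the tab is clicked,
     which is why it may be given "way less weight" in ranking. -->
<button onclick="document.getElementById('details').style.display = 'block'">
  Product details
</button>
<div id="details" style="display: none;">
  <p>Long-form written content about the product range…</p>
</div>
```

Because the text is in the initial HTML, it will be indexed; the devaluation described above applies because it is not visible on page load.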
-
Yeah, it's disappointing.
I've tried having some content behind a tab and some under the products and I am not seeing either one as having much of an effect.
Unless I remove it altogether, I'm not sure what else I can do with it?
-
Hi
Yes, I tried different pages and it's still the same. I think it's to do with things we have blocked in robots.txt...
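If blocked resources are the suspect, one quick sanity check is to run the robots.txt rules against the paths Googlebot needs to render the page. A sketch using Python's standard library — the rules and paths below are hypothetical placeholders, not Becky's actual site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules; substitute the site's real file.
robots_txt = """\
User-agent: *
Disallow: /assets/css/
Disallow: /assets/js/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# If Googlebot can't fetch the CSS/JS, it can't render the page the way
# visitors see it -- which matches the "Fetch as Google" symptom here.
for path in ("/assets/css/main.css", "/assets/js/products.js", "/products"):
    print(path, parser.can_fetch("Googlebot", path))
```

Blocked CSS or JavaScript would explain Googlebot seeing "a few big images" instead of the rendered product listings.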
-
I'm not seeing a problem in my GoogleBot simulators, Becky, but the one within your Google Search Console is still the best judge. Have you tried reloading the Fetch as... a couple of times? And tried it on different pages?
-
Yup - Google still says content that can only be seen after a user interaction is given less importance. Kinda stupid, given that things like tabs/accordions are a major usability enhancement, but that's still where we are.
P.
-
Hi
So I did Fetch as Google - and I'm seeing the page quite differently as Googlebot vs. as a visitor.
It just sees a few big images - I can't see it rendering any product listings or content. Do I now have a bigger problem?
Thank you
-
Hi
Thank you for the replies. I don't want to hide it, I just can't have it pushing products down the page so they can't be seen.
I thought Google's webmaster guidelines included a comment saying they will ignore content behind tabs?
Becky
-
Any content below the fold will still be read. Are you trying to hide it but still get the SEO value? If that's the case, I would create a collapsible tab to keep the content on the page but hidden. If you want it to be visible, leave it as is and don't worry about Google not reading it—it will be read.
-
While theoretically logical, Google's own John Mueller stated last week that code to text ratio has absolutely no effect on crawling of a site, and in a followup question, he directly told me text/code ratio has no effect as a ranking factor either.
These used to be very minor considerations back when the search engine crawlers weren't as powerful, but no longer.
Fully agree with Pia that the idea of "above the fold" influencing ranking is nonsense as well. Given that the sweet spot for consistently high-ranking pages is ~2200 words, the idea that only the first paragraph or two are more important is unsupportable.
Hope that helps?
Paul
-
Additionally, do check the content using Fetch as Google in Google Search Console / Webmaster Tools. It will really help you see how spiders see the content compared to users. This is an important aspect of SEO which a lot of people ignore: you want to confirm that the spiders see a structured view of the content, not a messy one. I hope this helps; if you have further questions, please feel free to ask. Regards, Vijay
-
There's no manipulation whatsoever. In fact, Google encourage website developers and SEOs to optimise/tidy their code and keep a good code-to-content ratio. This is why Google gives us so many tools in order to do so. It makes our sites easier to crawl for Google, and in return Google may even like us more for it!
Just found an article that sums it up quite nicely:
"Essentially what is being stated is a fairly logical conclusion: reduce the amount of code on your page and the content (you know, the place where your keywords are) takes a higher priority. Additionally compliance will, by necessity, make your site easy to crawl and, additionally, allow you greater control over which portions of your content are given more weight by the search engines. The thinking behind validating code for SEO benefits is that, once you have a compliant site, not only will your site be a better user experience on a much larger number of browsers, but you’ll have a site with far less code, that will rank higher on the search engines because of it."
- http://www.hobo-web.co.uk/official-google-prefers-valid-html-css/
But going back to your original post, "above-the-fold is dead", yadda yadda... So long as your content in the source is metaphorically "above the fold" and not drowning in heavy code, on the page itself just worry about giving your users the "experience" that they're looking for. And not how many pixels from the top of the browser your content is. Hope that makes more sense!
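For what it's worth, the "code-to-content ratio" debated above is easy to measure yourself — a rough sketch using Python's standard-library HTML parser (and remember John Mueller's point earlier in the thread: this is a tidiness diagnostic, not a ranking factor):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.chunks.append(data)

def text_to_code_ratio(html: str) -> float:
    """Visible-text characters as a fraction of total page bytes."""
    if not html:
        return 0.0
    parser = TextExtractor()
    parser.feed(html)
    visible = "".join(parser.chunks).strip()
    return len(visible) / len(html)

sample = "<html><body><p>Hello world</p></body></html>"
print(text_to_code_ratio(sample))  # 0.25
```

A page drowning in markup scores low; the point of the hobo-web article quoted above is simply that leaner code gets the crawler to the content sooner.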
-
Great, thank you - you read so many conflicting articles that it's difficult to know what to believe.
I'll see if we can look at our code, but I'd want to be mindful of not manipulating Google.
Thank you!
-
I feel prioritising elements to be "above the fold" is a bit of an outdated concept these days.
Where is the fold? Different devices and screen resolutions will have different folds, and more websites are being designed now to make the traditional "above the fold" section more visually interesting and designed for user experience, rather than packed full of content.
The higher the content is in the source code itself, the more weight it will have on the page. This doesn't necessarily translate to the "visually higher the content is on the page". Google is going to be reading from top to bottom of your code, so naturally you want the most important content/links to be found first. As long as you meet (or exceed!) the user's expectation of the content upon arrival, and you keep the code tidy in terms of how much Google has to read before it gets to the real valuable content, I doubt Google's going to worry about whether users have to scroll a little to get to it.
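One hypothetical way to decouple source order from visual order is CSS flexbox ordering: the written content comes first in the markup that crawlers read top to bottom, but renders below the product grid. A sketch (class-free inline styles for brevity; whether this actually helps ranking is exactly the debate in this thread):

```html
<!-- Hypothetical layout: content sits high in the source (read first by
     crawlers) but is displayed below the products via the CSS order property. -->
<div style="display: flex; flex-direction: column;">
  <section style="order: 2;">
    <h2>About our product range</h2>
    <p>The long-form written content…</p>
  </section>
  <ul style="order: 1;">
    <li>Product one</li>
    <li>Product two</li>
  </ul>
</div>
```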
-
Hi Becky,
As far as I understand, Google will not ignore it; however, Google does treat some parts of the page as more important than others. For instance, suppose you have written a description of the product and part of that description is hidden.
Google will take the content displayed to the user as the important piece, and the hidden part as the least important.
I do not see any reason for Google to ignore below-the-fold content entirely.
-
Content below the fold is still read; however, less value is placed on it. So it is still worth producing content for below the fold, as it will still help that page rank.
Show the user what they want to see when they land on the page; the majority of the time, in doing this you will actually show Google what it needs to rank you.