Content Below the Fold
-
Hi
I wondered what the view is on content below the fold?
We have the H1, product listings & then some written content under the products - will Google just ignore this?
I can't hide it under a tab or put a lot of content above products - so I'm not sure what the other option is?
Thank you
-
Hi Becky,
Here is what I found:
The pros and cons of hiding content using JavaScript and CSS (`display: none`) have been a topic of some debate within the SEO industry, and Google’s comments over time have only added to the confusion.
- **November 2014** – Google’s John Mueller stated that Google _“may not”_ index or rank hidden content. In a Google+ Hangout the following month, John repeated this, stating that hidden content would be _“discounted”_ and has been for a number of years.
- **21 July 2015** – Google’s Gary Illyes, contributing to a Stack Overflow thread, clarified this by stating that this type of content is given _“way less weight in ranking”_.
- **27 July 2015** – In a separate Stack Overflow thread on the same topic, Gary Illyes again confirmed that _“[Google] will index that but the content’s weight will be lower since it’s hidden”_.
So the content will still be indexed, but deemed less important by the crawlers.
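For anyone unfamiliar with the pattern under discussion, here's a minimal sketch of tab content hidden with `display: none` (the markup, class names, and copy are made up for illustration) — the kind of content Google says it indexes but weights lower:

```html
<!-- Hypothetical tab markup: the second panel is in the DOM but hidden
     until a user clicks its tab, so it counts as "hidden content". -->
<div class="tabs">
  <button data-tab="description" class="active">Description</button>
  <button data-tab="specs">Specifications</button>

  <div id="description" class="tab-panel">
    Visible by default, so given full weight in ranking.
  </div>
  <div id="specs" class="tab-panel" style="display: none;">
    Hidden until clicked: still indexed, but given "way less weight".
  </div>
</div>
```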
-
Yeah, it's disappointing.
I've tried having some content behind a tab and some under the products and I am not seeing either one as having much of an effect.
Unless I remove it altogether, I'm not sure what else I can do with it?
-
Hi
Yes I tried different pages and it's still the same. I think it's to do with things we have blocked in robots.txt...
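If robots.txt is the culprit, the usual cause is blocked CSS/JS directories, which stops Googlebot from rendering the page the way visitors see it. A hypothetical before-and-after (the paths here are invented — check the actual rules in your own file):

```
# Before (hypothetical): everything under /assets/ is blocked,
# so Googlebot can't fetch the CSS/JS it needs to render the page.
User-agent: *
Disallow: /assets/

# After: keep blocking what you must, but explicitly allow the
# resources required for rendering.
User-agent: *
Disallow: /assets/private/
Allow: /assets/css/
Allow: /assets/js/
```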
-
I'm not seeing a problem in my GoogleBot simulators, Becky, but the one within your Google Search Console is still the best judge. Have you tried reloading the Fetch as... a couple of times? And tried it on different pages?
-
Yup - Google still says content that can only be seen after a user interaction is given less importance. Kinda stupid, given that things like tabs/accordions are a major usability enhancement, but that's still where we are.
P.
-
Hi
So I did Fetch as Google - and the page renders quite differently for Googlebot vs. visitors.
Googlebot just sees a few big images; I can't see it rendering any product listings or content - do I now have a bigger problem?
Thank you
-
Hi
Thank you for the replies. I don't want to hide it, I just can't have it pushing products down the page so they can't be seen.
I thought the Google webmaster guidelines included a comment saying they will ignore content behind tabs?
Becky
-
Any content below the fold will still be read. Are you trying to hide it but still get the SEO value? If that's the case, I would create a collapsible tab to keep the content on the page but hidden. If you want it to be visible, leave it as is and don't worry about Google not reading it—it will be read.
-
While theoretically logical, Google's own John Mueller stated last week that code-to-text ratio has absolutely no effect on the crawling of a site, and in a follow-up question, he directly told me text/code ratio has no effect as a ranking factor either.
These used to be very minor considerations back when search engine crawlers weren't as powerful, but no longer.
Fully agree with Pia that the idea of "above the fold" influencing ranking is nonsense as well. Given that the sweet spot for consistently high-ranking pages is ~2,200 words, the idea that only the first paragraph or two matters is unsupportable.
Hope that helps?
Paul
-
Additionally, do check the content using Fetch as Google in Google Search Console / Webmaster Tools. It will really help you see how spiders see the content compared to users. This is an important aspect of SEO that a lot of people ignore: you want to confirm that the spiders see a structured view of the content, not a messy one. I hope this helps; if you have further questions, please feel free to ask. Regards, Vijay
-
There's no manipulation whatsoever. In fact, Google encourages website developers and SEOs to optimise/tidy their code and keep a good code-to-content ratio. This is why Google gives us so many tools to do so. It makes our sites easier for Google to crawl, and in return Google may even like us more for it!
Just found an article that sums it up quite nicely:
"Essentially what is being stated is a fairly logical conclusion: reduce the amount of code on your page and the content (you know, the place where your keywords are) takes a higher priority. Additionally compliance will, by necessity, make your site easy to crawl and, additionally, allow you greater control over which portions of your content are given more weight by the search engines. The thinking behind validating code for SEO benefits is that, once you have a compliant site, not only will your site be a better user experience on a much larger number of browsers, but you’ll have a site with far less code, that will rank higher on the search engines because of it."
- http://www.hobo-web.co.uk/official-google-prefers-valid-html-css/
But going back to your original post, "above-the-fold is dead", yadda yadda... So long as your content in the source is metaphorically "above the fold" and not drowning in heavy code, on the page itself just worry about giving your users the "experience" that they're looking for. And not how many pixels from the top of the browser your content is. Hope that makes more sense!
-
Great, thank you. You read so many conflicting articles that it's difficult to know what to believe.
I'll see if we can look at our code, but I'd want to be mindful of not manipulating Google.
Thank you!
-
I feel prioritising elements to be "above the fold" is a bit of an outdated concept these days.
Where is the fold? Different devices and screen resolutions will have different folds, and more websites are being designed now to make the traditional "above the fold" section more visually interesting and designed for user experience, rather than packed full of content.
The higher the content is in the source code itself, the more weight it will have on the page. This doesn't necessarily translate to the "visually higher the content is on the page". Google is going to be reading from top to bottom of your code, so naturally you want the most important content/links to be found first. As long as you meet (or exceed!) the user's expectation of the content upon arrival, and you keep the code tidy in terms of how much Google has to read before it gets to the real valuable content, I doubt Google's going to worry about whether users have to scroll a little to get to it.
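One way to reconcile the two points above — important content early in the source, but the product grid first visually — is CSS reordering. A rough sketch of the idea (assumes flexbox support; the class names and copy are illustrative, not from any particular site):

```html
<!-- The descriptive copy comes first in the source, so crawlers reading
     top-to-bottom reach it early, while flexbox `order` puts the product
     listings first visually for users. -->
<style>
  .page     { display: flex; flex-direction: column; }
  .copy     { order: 2; } /* second on screen, first in source */
  .products { order: 1; } /* first on screen, second in source */
</style>

<main class="page">
  <section class="copy">
    <h1>Blue Widgets</h1>
    <p>Keyword-rich category description, high in the source code…</p>
  </section>
  <section class="products">
    <!-- product listings -->
  </section>
</main>
```

Worth noting that heavy-handed reordering purely for crawlers can shade into manipulation, so keep the visible experience and the source order telling the same story.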
-
Hi Becky,
As far as I understand, Google will not ignore it; however, Google does treat some parts of the page as more important than others. For instance, suppose you have written a description of the product and part of that description is hidden.
Google will treat the visible portion as the important piece of content and the hidden portion as less important.
I don't see any reason for Google to ignore content below the fold.
-
Content below the fold is still read; however, less value is placed on it. So it is still worth producing content for below the fold, as it will still help that page rank.
Show the user what they want to see when they land on the page; the majority of the time, in doing this you will also show Google what it needs to rank you.