Content Below the Fold
-
Hi
I wondered what the view is on content below the fold?
We have the H1, product listings & then some written content under the products - will Google just ignore this?
I can't hide it under a tab or put a lot of content above products - so I'm not sure what the other option is?
Thank you
-
Hi Becky,
Here is what I found:
The pros and cons of hiding content using JavaScript and CSS (display: none) have been a topic of some debate within the SEO industry, and Google's comments over time have somewhat added to the confusion.
- **November 2014** – Google's John Mueller stated that Google _"may not"_ index or rank hidden content. In a Google+ Hangout the following month, John repeated this, stating that hidden content would be _"discounted"_ and had been for a number of years.
- **21 July 2015** – Google's Gary Illyes, contributing to a Stack Overflow forum thread, clarified this by stating that this type of content is given _"way less weight in ranking"_.
- **27 July 2015** – In a separate Stack Overflow thread on the same topic, Gary Illyes again confirmed that _"[Google] will index that but the content's weight will be lower since it's hidden"_.
So the content will still be indexed, but deemed less important by the crawlers.
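To make that concrete, here's a minimal sketch of the kind of tabbed markup this debate is about (the class names and copy are hypothetical, purely for illustration). The hidden panel's text is present in the HTML that Googlebot fetches, but it only becomes visible after a click:

```html
<!-- Minimal tab sketch: both panels are in the source, only one is shown.
     Class names and copy are hypothetical, for illustration only. -->
<style>
  .tab-panel { display: none; }          /* hidden until selected */
  .tab-panel.active { display: block; }
</style>

<button onclick="showTab('desc')">Description</button>
<button onclick="showTab('details')">More Details</button>

<div id="desc" class="tab-panel active">
  <p>Visible by default: the short product description.</p>
</div>
<div id="details" class="tab-panel">
  <p>Hidden by default: the longer written content that, per the
     quotes above, may be given less weight.</p>
</div>

<script>
  function showTab(id) {
    document.querySelectorAll('.tab-panel').forEach(function (panel) {
      panel.classList.toggle('active', panel.id === id);
    });
  }
</script>
```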
-
Yeah, it's disappointing.
I've tried having some content behind a tab and some under the products and I am not seeing either one as having much of an effect.
Unless I remove it altogether, I'm not sure what else I can do with it?
-
Hi
Yes I tried different pages and it's still the same. I think it's to do with things we have blocked in robots.txt...
-
I'm not seeing a problem in my Googlebot simulators, Becky, but the one within your Google Search Console is still the best judge. Have you tried re-running the Fetch as Google a couple of times? And tried it on different pages?
-
Yup - Google still says content that can only be seen after a user interaction is given less importance. Kinda stupid, given that things like tabs/accordions are a major usability enhancement, but that's still where we are.
P.
-
Hi
So I did Fetch as Google, and I'm seeing the page quite differently as Googlebot vs. as a visitor.
Googlebot just sees a few big images; I can't see it rendering any product listings or written content. Do I now have a bigger problem?
Thank you
-
Hi
Thank you for the replies. I don't want to hide it; I just can't have it pushing products down the page so they can't be seen.
I thought the Google webmaster guidelines included a comment saying they will ignore content behind tabs?
Becky
-
Any content below the fold will still be read. Are you trying to hide it but still get the SEO value? If that's the case, I would create a collapsible tab to keep the content on the page but hidden. If you want it to be visible, leave it as is and don't worry about Google not reading it—it will be read.
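If you do go the collapsible route, one simple way to sketch it is the native HTML `<details>` element (the copy below is hypothetical, purely illustrative), which keeps the full text in the page source while collapsing it visually:

```html
<!-- Collapsible section sketch: the text stays in the page source,
     but users only see it after expanding. Copy is hypothetical. -->
<details>
  <summary>More about our products</summary>
  <p>
    The longer written content that would otherwise push the
    product grid down the page goes here.
  </p>
</details>
```

Bear in mind the quotes earlier in the thread, though: content revealed only after an interaction may still be given less weight, so this is a layout compromise rather than a free pass.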
-
While theoretically logical, Google's own John Mueller stated last week that code-to-text ratio has absolutely no effect on the crawling of a site, and in a follow-up question he directly told me that text/code ratio has no effect as a ranking factor either.
These used to be very minor considerations back when the search engine crawlers weren't as powerful, but no longer.
Fully agree with Pia that the idea of "above the fold" influencing ranking is nonsense as well. Given that the sweet spot for consistently high-ranking pages is ~2,200 words, the idea that only the first paragraph or two matter is unsupportable.
Hope that helps?
Paul
-
Additionally, do check the content using Fetch as Google in Google Search Console / Webmaster Tools. It will really help you see how spiders see the content compared to users. This is an important aspect of SEO that a lot of people ignore: you want to confirm that the spiders see a structured view of the content, not a messy one. I hope this helps; if you have further questions, please feel free to ask. Regards, Vijay
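P.S. A common cause of the "Googlebot only sees a few big images" symptom Becky described is CSS and JavaScript files being disallowed in robots.txt, which stops Googlebot rendering the page properly. A hypothetical example of the kind of rule to look for, and one way to fix it (all paths are made up for illustration):

```
# Hypothetical robots.txt - all paths are illustrative only.
# A blanket rule like this stops Googlebot fetching the CSS/JS
# it needs to render the page:
User-agent: *
Disallow: /assets/

# Safer: keep genuinely private paths blocked, but explicitly
# allow the rendering resources (Google honours the more
# specific Allow rule):
Allow: /assets/css/
Allow: /assets/js/
Disallow: /admin/
```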
-
There's no manipulation whatsoever. In fact, Google encourages website developers and SEOs to optimise and tidy their code and keep a good code-to-content ratio. This is why Google gives us so many tools to do so. It makes our sites easier to crawl, and in return Google may even like us more for it!
Just found an article that sums it up quite nicely:
"Essentially what is being stated is a fairly logical conclusion: reduce the amount of code on your page and the content (you know, the place where your keywords are) takes a higher priority. Additionally compliance will, by necessity, make your site easy to crawl and, additionally, allow you greater control over which portions of your content are given more weight by the search engines. The thinking behind validating code for SEO benefits is that, once you have a compliant site, not only will your site be a better user experience on a much larger number of browsers, but you’ll have a site with far less code, that will rank higher on the search engines because of it."
- http://www.hobo-web.co.uk/official-google-prefers-valid-html-css/
But going back to your original post: "above-the-fold is dead", yadda yadda... So long as your content in the source is metaphorically "above the fold" and not drowning in heavy code, then on the page itself just worry about giving your users the experience they're looking for, not about how many pixels from the top of the browser your content sits. Hope that makes more sense!
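To show what "drowning in heavy code" means in practice, here's a hypothetical before/after sketch: the same sentence of content, first buried under presentational wrappers and inline styles, then as lean semantic HTML with the styling moved to an external stylesheet:

```html
<!-- Before (hypothetical): the content is buried in presentational markup -->
<table><tr><td>
  <div style="font-family:Arial;font-size:14px;color:#333;">
    <span style="font-weight:bold;">
      <font size="3">Hand-made leather satchels, crafted in the UK.</font>
    </span>
  </div>
</td></tr></table>

<!-- After: lean and semantic, with styling in an external stylesheet -->
<link rel="stylesheet" href="/css/site.css">
<p class="intro"><strong>Hand-made leather satchels, crafted in the UK.</strong></p>
```

Same content, a fraction of the code for a crawler to wade through before it reaches the words.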
-
Great, thank you - you read so many conflicting articles that it's difficult to know.
I'll see if we can look at our code, but I'd want to be mindful of not manipulating Google.
Thank you!
-
I feel prioritising elements to be "above the fold" is a bit of an outdated concept these days.
Where is the fold? Different devices and screen resolutions will have different folds, and more websites are being designed now to make the traditional "above the fold" section more visually interesting and designed for user experience, rather than packed full of content.
The higher the content is in the source code itself, the more weight it will have on the page. That doesn't necessarily mean the same as how visually high the content sits on the page. Google reads your code from top to bottom, so naturally you want the most important content and links to be found first.

As long as you meet (or exceed!) the user's expectation of the content upon arrival, and you keep the code tidy in terms of how much Google has to read before it gets to the real valuable content, I doubt Google's going to worry about whether users have to scroll a little to get to it.
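A sketch of that source-order idea (class names are hypothetical): put the written content early in the markup, then let CSS decide what users actually see at the top of the screen, for example with flexbox `order`:

```html
<!-- Content-first source order: the main copy comes first in the HTML,
     while CSS controls the visual order. Class names are hypothetical. -->
<style>
  .page     { display: flex; flex-direction: column; }
  .hero     { order: 1; }  /* big banner shown first visually */
  .products { order: 2; }  /* product grid next */
  .seo-copy { order: 3; }  /* written content lower on screen */
</style>

<div class="page">
  <section class="seo-copy">
    <h1>Leather Satchels</h1>
    <p>The important written content, early in the source.</p>
  </section>
  <section class="products"><!-- product listings --></section>
  <section class="hero"><!-- large banner images --></section>
</div>
```

How much weight source order really carries is debatable, but those are the mechanics of the approach being described.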
-
Hi Becky,
As far as I understand, Google will not ignore it, but Google does treat some parts of the page as more important than others. For instance, if you have written a description of a product and some of that description is hidden, Google will treat the part displayed to the user as the important piece of content and the hidden part as less important.
I don't see any reason for Google to ignore below-the-fold content entirely.
-
Content below the fold is still read; however, less value is placed on it. So it is still worth producing content for below the fold, as it will still help that page rank.
Show the user what they want to see when they land on the page; the majority of the time, in doing this you will also show Google what it needs to rank you.