Does page size and relative content position affect SEO?
-
Good morning,
Each product page of our e-commerce site consists of a fairly lengthy header and footer. The header contains links to ~60 product categories, the logo, etc., while the footer contains information such as the latest posts from our blog, links to support, and so on. The main "content" of the page is, of course, product-related information, which also happens to contain a bit of templated data, such as links which, when clicked, open sliders containing information about our return and shipping policies.
The question: We wonder whether the relative "size" of the page has anything to do with SEO results. As an example, suppose the page header consists of 20% of the total page size, the important page-specific content consumes 60%, and the footer consumes the final 20%. Is this relevant?
Or to rephrase the question: Should we be concerned about keeping our headers and footers as small as possible?
Thanks!
-
My sample size is small (I am currently maintaining just four sites) but all of them include a large banner on top, which leaves 95% of text content below the fold. None of them have been penalized for this so far, and all of them rank well.
As for the original question, such a ratio does not seem to affect the standings of the sites I've worked on. However, keep in mind that a heavily skewed page layout tends to look bad, hurting your customers' UX. I would try to keep your footer at most as large as the body of your smallest page.
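If you want to put a number on that ratio, a rough way is to measure how much visible text sits in each template region. Below is a minimal sketch assuming the template uses semantic `<header>`, `<main>`, and `<footer>` tags (an assumption; adjust the tag names to your own markup):

```python
from html.parser import HTMLParser

# Rough sketch: estimate what share of a page's visible text sits in the
# header, main content, and footer. Tag names are assumptions -- adjust
# to your own template's markup.
class SectionSizer(HTMLParser):
    SECTIONS = ("header", "main", "footer")

    def __init__(self):
        super().__init__()
        self.stack = []                            # currently open section tags
        self.chars = {s: 0 for s in self.SECTIONS}

    def handle_starttag(self, tag, attrs):
        if tag in self.SECTIONS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

    def handle_data(self, data):
        if self.stack:
            self.chars[self.stack[-1]] += len(data.strip())

def section_ratios(html):
    p = SectionSizer()
    p.feed(html)
    total = sum(p.chars.values()) or 1
    return {s: round(p.chars[s] / total, 2) for s in p.chars}
```

Run it against a few of your own product pages; if the header and footer together dwarf the unique content, that's more of a UX smell than a confirmed ranking factor.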
-
Certainly your content should be substantial, or you risk being penalized for thin content, though no one knows for sure where the threshold lies. You would also want to see some content above the fold. Google recently rolled out an update that penalizes pages with no content above the fold. As I read it, it mainly targets pages that stack lots of ads and other clutter above the fold, pushing the content you searched for below it, but it could mean that you need some above-the-fold text that mentions the search terms.
So if you want to rank for widgets, you would want to have the keyword "widgets" above the fold. http://googlewebmastercentral.blogspot.com.au/2012/01/page-layout-algorithm-improvement.html
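There's no API that tells you what Google considers "above the fold," but as a crude proxy you can check whether the target keyword shows up early in the raw markup. A hypothetical heuristic, with the character cutoff purely an assumption (not anything Google documents):

```python
def keyword_appears_early(html, keyword, cutoff=2000):
    """Crude proxy for 'above the fold': is the keyword in the
    first `cutoff` characters of the page's markup? The cutoff
    value is a guess, not a documented Google threshold."""
    return keyword.lower() in html[:cutoff].lower()
```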
Related Questions
-
What's the best way to test an AngularJS-heavy page for SEO?
Hi Moz community, Our tech team has recently decided to try switching our product pages to be JavaScript-dependent; this includes rendering links, product descriptions, and things like breadcrumbs in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Universal, but they said the lift was too great, so we're testing to see if this works. I've read a lot of the articles in this guide to all things SEO and JS and am fairly confident in understanding when a site uses JS and how to troubleshoot to make sure everything is getting crawled and indexed. https://sitebulb.com/resources/guides/javascript-seo-resources/ However, I am not sure I'll be able to test the QA pages, since they aren't indexable and live behind a login. I will be able to crawl them using Screaming Frog, but that generally reflects what a crawler should be able to crawl, not what Googlebot will actually be able to crawl and index. Any thoughts on this; is this concern valid? Thanks!
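One quick sanity check along these lines: compare what the server sends before any JavaScript runs against what users eventually see. With client-side-only Angular, the raw HTML is typically an empty app-root shell. A minimal sketch (the sample markup below is hypothetical):

```python
def present_without_js(raw_html, phrase):
    """True if the phrase is in the server-sent HTML, i.e. visible
    without executing any JavaScript."""
    return phrase.lower() in raw_html.lower()

# Hypothetical server responses for the same product page:
server_rendered = "<html><body><h1>Blue Widget</h1><p>In stock</p></body></html>"
client_rendered = "<html><body><app-root></app-root></body></html>"
```

Fetch the QA pages with urllib or curl (authenticating past the login) and run the same check; if the product name only appears after rendering, you're relying entirely on Googlebot's rendering pass, which is exactly the risk you describe.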
Technical SEO | znotes
-
Is it good to redirect millions of pages to a single page?
My site has approximately 1 million (10 lakh) genuine URLs. But due to some unidentified bugs, the site has created roughly 10 million irrelevant URLs. Since we don't know the origin of these non-relevant links, we want to redirect or remove all of them. Please suggest: is it better to redirect such a high number of URLs to the home page, or to return a 404 for these pages? Any other suggestions to solve this issue are welcome.
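For what it's worth, blanket-redirecting junk URLs to the home page is usually treated by Google as a soft 404 anyway; returning 404 or 410 directly is cleaner. If the genuine URLs follow known patterns, one hedged approach is to whitelist those patterns and send everything else a 410. A sketch with made-up patterns (swap in your site's real URL structure):

```python
import re

# Hypothetical sketch: decide the HTTP status for an incoming path.
# VALID_PATTERNS is an assumption -- replace with your real URL structure.
VALID_PATTERNS = [
    re.compile(r"^/product/\d+/?$"),
    re.compile(r"^/category/[a-z0-9-]+/?$"),
]

def status_for(path):
    if any(p.match(path) for p in VALID_PATTERNS):
        return 200        # genuine URL, serve normally
    return 410            # junk URL: Gone, so crawlers drop it quickly
```

Wire a rule like this into your server or framework routing; 410 ("Gone") tends to get bad URLs dropped from the index faster than 404.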
Technical SEO | vivekrathore
-
After I 301 redirect duplicate pages to my rel=canonical page, do I need to add any tags or code to the non-canonical pages?
I have many duplicate pages. Some pages have 2-3 duplicates, most of which have uppercase and lowercase paths (generated by Microsoft IIS). Does this implementation of 301 and rel=canonical suffice? Or is there more I could do to optimize the passing of duplicate-page link juice to the canonical? Thank you!
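On the question itself: a page that returns a 301 never renders its own HTML, so there is nothing to add to the non-canonical URLs; the rel=canonical tag belongs on the destination page. For the IIS case-variant problem specifically, normalizing every request to a lowercase path, so each variant 301s to one URL, is the usual fix. A framework-agnostic sketch (not actual IIS config; the wiring into your server is left out):

```python
from urllib.parse import urlsplit, urlunsplit

# Sketch: normalize mixed-case paths to lowercase so every casing
# variant 301s to a single canonical URL.
def canonical_redirect(url):
    parts = urlsplit(url)
    lower_path = parts.path.lower()
    if lower_path != parts.path:
        return 301, urlunsplit(parts._replace(path=lower_path))
    return 200, url
```

In IIS itself you'd express the same rule with the URL Rewrite module's ToLower function rather than application code.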
Technical SEO | PFTools
-
Content on top-level-domain vs. content on subpage
Hello SEOmoz community, I just built a new website, mainly for a single affiliate program, and it ranks really well at Google. Unfortunately, the merchant doesn't like the name of my domain, which is why I was thrown out of the affiliate program. So suppose the merchant is a computer monitor manufacturer and his name is "Digit". The name of my domain is something like monitorsdigital.com at the moment. (It's just an example; I don't own this URL.) The structure of my website is: one homepage with much content on it, plus a blog. The last 5 blog entries are displayed on the homepage. Because I got kicked out of the affiliate program, I want to permanently redirect monitorsdigital.com to another domain. But what should the new website look like? I have two possibilities:
1. Copy the whole monitorsdigital website to a new domain, called something like supermonitors.com.
2. Integrate the monitorsdigital website into my existing website about different monitor manufacturers, e.g. allmonitors.com/digit-monitors.html (that URL is permitted by the merchant).
What do you think is the better way? I just got the impression that it seems to be a little easier to rank high with a top-level domain (www.supermonitors.com) than with a subpage (www.allmonitors.com/digit-monitors.html). However, the subpage can benefit from the domain authority that was generated by other subpages. Thanks for your help and best regards, MGMT
Technical SEO | MGMT
-
Does a domain containing the desired keyword have any positive or negative effect on SEO results?
Hello! Does it have any positive or negative effect on SEO results if the domain contains the desired keyword? Thanks for the answer.
Technical SEO | Brainsum
-
On-Page SEO of the SEOmoz Blog Section
Hey everyone, my brain isn't working this morning (only had 1 cup of coffee so far; #2 is on its way) and I could use some help. We're creating a blog on a site for a client of ours, and I've been looking at the SEOmoz blog for best practices when it comes to the implementation of pagination, canonical tags, and noindex. My questions:
1. There is no use of the canonical tag on the main blog page or any of the paginated pages, but it is being used on blog post pages. Why not use it on the main blog pages as well? I'm assuming that because the blog pages are always changing with different content, there is not much point?
2. Paginated pages in the category sections, i.e. http://www.seomoz.org/blog/category/1?page=2, are noindexed, but paginated pages in the main blog section, i.e. http://www.seomoz.org/blog?page=2, are not. Is this because of a duplicate content concern, since the posts in the category sections are in the main blog section as well? If that's the case, why wouldn't the main category page, i.e. http://www.seomoz.org/blog/category/1, be noindexed as well?
3. What's the reason for noindexing the "Show # Posts" pages, i.e. http://www.seomoz.org/blog?show=5? I'm assuming another duplicate content concern?
Any insights into these questions would be greatly appreciated and would help with the implementation of our client's blog. Thanks, Ken
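One way to reason about rules like these is to write them down as an explicit function from URL to robots directive; it makes the asymmetries (category pagination noindexed, main-blog pagination not) easy to see and to test. The sketch below encodes only my reading of the patterns described in the question, not SEOmoz's actual implementation:

```python
from urllib.parse import urlsplit, parse_qs

# Hypothetical encoding of the indexing rules described above:
# paginated category pages and "show N" views get noindex; everything
# else stays indexable. Path and parameter names are assumptions.
def robots_meta(url):
    parts = urlsplit(url)
    qs = parse_qs(parts.query)
    if parts.path.startswith("/blog/category/") and "page" in qs:
        return "noindex, follow"
    if "show" in qs:
        return "noindex, follow"
    return "index, follow"
```

Writing the policy this way also makes it trivial to flip a rule (say, noindexing the main category pages too) and see exactly which URLs change.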
Technical SEO | noBulMedia
-
A lot of product pages with very similar content
I'm working with someone who's setting up an online jewelry store. The jewelry is available in many metal types, so we're creating filters to provide a good user experience in narrowing down their choice. Let's take an example of a wedding ring that's available in these options:
10kt yellow gold
10kt white gold
18kt yellow gold
18kt white gold
Palladium
Platinum
These are all entered as separate products, so that they can be used in the filtering system. However, apart from some minor changes to the title and description, most of the content will be identical across these 6 product pages. Also, many wedding ring styles are going to be very similar, so we're going to have very similar descriptions for a lot of the rings. We're concerned about problems this might cause with the search engines in terms of duplicate content. There are 2 issues that I can see (there may be more!):
1. They will not index many of the pages, and we'll leak link juice to those pages that will never get indexed.
2. They do index all the variations, but the content is so similar that we have different pages competing for essentially the same keywords.
Also, these products are likely to come and go, so investing heavily in creating really unique content for them isn't really sustainable or affordable. Any advice? Thanks!
Technical SEO | Leighm
-
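A common pattern for exactly this situation is to keep the metal variants as separate filterable products but point each variant's rel=canonical at one parent ring page, so the near-duplicates consolidate instead of competing. A tiny sketch with hypothetical slugs and domain:

```python
# Hypothetical variant -> parent mapping; in practice this would come
# from the product database, not a hard-coded dict.
VARIANT_PARENT = {
    "classic-band-10kt-yellow-gold": "classic-band",
    "classic-band-platinum": "classic-band",
}

def canonical_tag(slug):
    """Emit the canonical link tag for a product page (example.com is a placeholder)."""
    parent = VARIANT_PARENT.get(slug, slug)
    return f'<link rel="canonical" href="https://example.com/rings/{parent}" />'
```

The trade-off is that only the parent page ranks; whether that's acceptable depends on whether anyone actually searches metal-specific queries like "platinum classic band".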
How does a mega drop-down affect SEO?
We are looking at implementing a "mega drop-down" as our main menu on our website. Will that be good or bad for SEO? My company is a big tour operator, so our website contains a lot of pages describing all our destinations, hotels, etc. We have noticed that our visitors have some trouble navigating to all these pages, since it requires a lot of clicks to reach a specific page. To make this easier, we have looked at this popular mega drop-down thing that we all love. But what about Google? Will Google love or hate us for doing this? An example showing what I mean by mega drop-down: http://www.phonehouse.se/
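Mega menus are generally fine for Google as long as the links are plain, crawlable anchor elements, but they do add every one of those links to every page on the site. A quick way to see how many links you'd be adding is to count anchors inside the nav element; a rough sketch assuming semantic `<nav>` markup:

```python
from html.parser import HTMLParser

# Sketch: count links inside the <nav> element -- a mega menu repeated
# site-wide adds this many links to every single page. Markup is assumed.
class NavLinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_nav = False
        self.links = 0

    def handle_starttag(self, tag, attrs):
        if tag == "nav":
            self.in_nav = True
        elif tag == "a" and self.in_nav:
            self.links += 1

    def handle_endtag(self, tag):
        if tag == "nav":
            self.in_nav = False

def nav_link_count(html):
    p = NavLinkCounter()
    p.feed(html)
    return p.links
```

If that number runs into the hundreds, it's worth checking that the menu links are real hrefs (not JS-only handlers) and that the pages they point to are ones you actually want crawled from every page.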
Technical SEO | Googleankan