Does page size and relative content position affect SEO?
-
Good morning,
Each product page of our e-commerce site consists of a fairly lengthy header and footer. The header contains links to ~60 product categories, the logo, etc., while the footer contains information such as the latest posts from our blog, links to support, and so on. The main "content" of the page is of course product-related information, which also includes a bit of templated data, such as links that, when clicked, open sliders containing our return and shipping policies.
The question: we wonder whether the relative "size" of each part of the page has any bearing on SEO results. For example, suppose the header accounts for 20% of the total page size, the important page-specific content for 60%, and the footer for the final 20%. Is this relevant?
Or to rephrase the question: Should we be concerned about keeping our headers and footers as small as possible?
Thanks!
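If you want to sanity-check this ratio on your own pages, one quick estimate is to compare the byte size of each template region against the whole document. A minimal sketch, assuming the template uses `<header>`, `<main>`, and `<footer>` elements (the tag names and sample page below are illustrative; adjust them to match your actual markup):

```python
# Rough estimate of how much of a page's HTML each template section
# consumes. Assumes <header>, <main>, and <footer> elements exist.
import re

def section_shares(html: str) -> dict:
    """Return each section's share of the total HTML size (0.0-1.0)."""
    total = len(html)
    shares = {}
    for tag in ("header", "main", "footer"):
        # Non-greedy match of the first <tag>...</tag> block.
        m = re.search(rf"<{tag}\b.*?</{tag}>", html, re.DOTALL)
        shares[tag] = len(m.group(0)) / total if m else 0.0
    return shares

# A fabricated page skewed roughly 20/60/20:
page = (
    "<html><body>"
    "<header>" + "x" * 200 + "</header>"
    "<main>" + "x" * 600 + "</main>"
    "<footer>" + "x" * 200 + "</footer>"
    "</body></html>"
)
print(section_shares(page))  # main dominates; header and footer are equal
```

This only measures markup weight, not rendered screen area, but it is a cheap way to spot a template where boilerplate dwarfs the unique content.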
-
My sample size is small (I currently maintain just four sites), but all of them include a large banner on top, which leaves 95% of the text content below the fold. None of them has been penalized for this so far, and all of them rank well.
As for the original question, this ratio does not seem to affect the rankings of the sites I've worked on. However, keep in mind that such a heavily skewed page layout tends to look bad, hurting your customers' UX. I would try to keep your footer no larger than the body of your smallest page.
-
Your content certainly needs to be substantial, or you will be dinged for thin content, though no one knows the exact threshold for sure. You also want some content above the fold: Google recently rolled out an update that penalizes pages with no content above the fold. As I read it, it mainly targets pages that stack ads and other clutter above the fold and push the content you searched for below it, but it could also mean you need some above-the-fold text that mentions the search terms.
So if you want to rank for widgets, you would want the keyword "widgets" above the fold: http://googlewebmastercentral.blogspot.com.au/2012/01/page-layout-algorithm-improvement.html
Related Questions
-
Content aggregator services: good or bad for SEO?
I've just had a demo of a content aggregator service called NewsCred. Essentially, the service licenses content from multiple sources for use on our own site. They claim that because the content is all properly referenced back to the original sites, there are no SEO implications for the host site. Are they correct? Should we stay away?
Technical SEO | J_Sinclair
Can Googlebot crawl the content on this page?
Hi all, I've read Google's documentation on AJAX and JavaScript (https://support.google.com/webmasters/answer/174992?hl=en) and also this post: http://moz.com/ugc/can-google-really-access-content-in-javascript-really. I am trying to evaluate whether the content on this page, http://www.vwarcher.com/CustomerReviews, is crawlable by Googlebot. It appears not to be. I perused the sitemap and don't see any ugly AJAX URLs included, as Google suggests doing. Also, the page is definitely indexed, but it appears the content is only indexed via its original sources (Yahoo!, Citysearch, Google+, etc.). I understand why they are using this dynamic content: it looks nice to an end user and requires little to no maintenance. But is it providing them any SEO benefit? It appears to me that it would be far better to take these reviews and simply build them into the HTML. Thoughts?
Technical SEO | danatanseo
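One way to approximate what a crawler that does not execute JavaScript sees is to fetch the raw HTML and check whether the review text actually appears in it. A minimal sketch; the fetch helper, sample markup, and phrases below are illustrative, not taken from the actual site:

```python
# Check whether given phrases appear in unrendered HTML source -- a rough
# proxy for whether a non-JS crawler could see that content at all.
from urllib.request import urlopen, Request

def phrases_in_raw_html(html: str, phrases: list[str]) -> dict[str, bool]:
    """Report which phrases are present in the raw (unrendered) HTML."""
    lowered = html.lower()
    return {p: p.lower() in lowered for p in phrases}

def fetch_raw_html(url: str) -> str:
    """Fetch a page without executing any JavaScript (illustrative helper)."""
    req = Request(url, headers={"User-Agent": "Mozilla/5.0 (crawl-check)"})
    with urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

# Simulated check against a page whose reviews are injected client-side:
raw = ("<html><body><div id='reviews'></div>"
       "<script src='reviews.js'></script></body></html>")
print(phrases_in_raw_html(raw, ["great service", "highly recommend"]))
# Both False: the review text never appears in the HTML source itself.
```

If the phrases are missing from the raw source but visible in the browser, the content only exists after client-side rendering, which is consistent with it being indexed only at its original sources.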
50,000 pages or a page with parameters
I have a site with about 12k pages on a topic... each of these pages could use another several pages to go into deeper detail about the topic. So I am wondering: for SEO purposes, would it be better to have something like 50,000 new pages, one for each subtopic, or one page that I would pass parameters to, with the page built on the fly in the code-behind? The drawback to the one-page-with-parameters approach is that the URLs would not be static, but the effort to implement would be minimal. I am also not sure how Google would index a single page with parameters. The drawback to the 50k-pages model is the dev effort, and possibly committing some faux pas by unleashing so many links to my internal pages. I might also have to mix .aspx with .html because my project can't be that large. Has anyone here ever had this sort of choice to make? Is there a third way I am not considering?
Technical SEO | Banknotes
How different does content need to be to avoid a duplicate content penalty?
I'm implementing landing pages that are optimized for specific keywords. Some of them are substantially the same as other pages (perhaps 10-15 words different). Are the landing pages likely to be identified by search engines as duplicate content? How different do two pages need to be to avoid a duplicate content penalty?
Technical SEO | WayneBlankenbeckler
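One rough way to gauge how similar two landing pages are before publishing is to compare their visible text with a sequence matcher. A minimal sketch; the sample sentences are made up, and the 0.9 threshold is an arbitrary illustration, not a number any search engine publishes:

```python
# Word-level similarity between two page texts, as a rough proxy for
# "how near-duplicate are these pages?".
from difflib import SequenceMatcher

def text_similarity(a: str, b: str) -> float:
    """Return a 0.0-1.0 similarity ratio between two page texts."""
    return SequenceMatcher(None, a.split(), b.split()).ratio()

base = "Affordable blue widgets shipped nationwide with a two year warranty"
variant = "Affordable red widgets shipped nationwide with a two year warranty"
score = text_similarity(base, variant)
print(f"{score:.2f}")  # 0.90 -- only one word in ten differs
if score >= 0.9:
    print("Near-duplicates: differentiate the copy or canonicalize one page")
```

Pages that differ by only 10-15 words out of hundreds will score very high on any such measure, which is a reasonable hint that a search engine would treat them the same way.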
Duplicate content with "no results found" search result pages
We have a motorcycle classifieds section that lets users search for motorcycles for sale using various drop-down menus to pick year-make-type-model-trim, etc. These searches create URLs such as:
www.example.com/classifieds/search.php?vehicle_manufacturer=Triumph&vehicle_category=On-Off Road&vehicle_model=Tiger&vehicle_trim=800 XC ABS
We understand that all of these URL varieties are considered unique URLs by Google. The issue is that we are getting duplicate content errors on the pages that have no results, as they have no content to distinguish themselves from each other. URLs like:
www.example.com/classifieds/search.php?vehicle_manufacturer=Triumph&vehicle_category=Sportbike
and
www.example.com/classifieds/search.php?vehicle_manufacturer=Honda&vehicle_category=Streetbike
will each have a results page that says "0 results found". I'm wondering how we can distinguish these "unique" pages better? Some thoughts:
-make sure the <title> reflects what was searched
-add a heading that may say "0 results found for Triumph On-Off Road Tiger 800 XC ABS"
Can anyone please help out and lend some ideas in solving this? Thank you.
Technical SEO | seoninjaz
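A common mitigation, sketched here as an illustration rather than an official recommendation, is to combine the dynamic title/heading ideas above with a robots meta tag that keeps zero-result pages out of the index entirely. The function name and tag handling below are hypothetical, written for a `search.php`-style page:

```python
# Build a descriptive <title> and a robots meta tag for a classifieds
# search page, marking zero-result pages noindex so near-identical
# "0 results found" pages never compete as duplicates.
def search_page_head(filters: dict[str, str], result_count: int) -> str:
    """Return <title> and robots meta markup for a search results page."""
    label = " ".join(filters.values())  # e.g. "Triumph On-Off Road"
    title = f"{result_count} results found for {label}"
    robots = "noindex, follow" if result_count == 0 else "index, follow"
    return f'<title>{title}</title>\n<meta name="robots" content="{robots}">'

print(search_page_head(
    {"vehicle_manufacturer": "Triumph", "vehicle_category": "On-Off Road"},
    result_count=0,
))
```

This way the pages with real listings stay indexable under distinct, filter-derived titles, while the empty permutations drop out of the duplicate pool altogether.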
Redirecting Entire Microsite Content to Main Site Internal Pages?
I am currently working on improving site authority for a client site. The main site has significant authority, but I have learned that the company owns several other resource-focused microsites which are stagnant but have accrued significant page authority of their own (though still less than the main site). Recognizing the fault in housing good content on a microsite rather than the main site, my thought is to redirect the content of the microsites to internal pages on the main site as a "Resources" section. I am wondering a) if this is a good idea and b) the best way to transfer site authority from these microsites. I am also wondering how to organize the content and whether, for example, an entire microsite domain (e.g. microsite.com) should in fact be redirected to internal resource pages (e.g. mainsite.com/resources). Any input would be greatly appreciated!
Technical SEO | RightlookCreative
Duplicate Page Content
I've got several pages of similar products that Google has listed as duplicate content. I have them all set up with rel="prev" and rel="next" tags telling Google that they are part of a group, but it still has them listed as duplicates. Is there something else I should do for these pages, or is that just a shortcoming of Google's webmaster tools? One of the pages: http://www.jaaronwoodcountertops.com/wood-countertop-gallery/walnut-countertop-9.html
Technical SEO | JAARON
Our development team is planning to make our website nearly 100% AJAX and JavaScript. My concern is crawlability, or the lack thereof. Their contention is that Google can read the pages using the new #! URL string. What do you recommend?
Discussion around AJAX implementations and if anybody has achieved high rankings with a full AJAX website or even a partial AJAX website.
Technical SEO | DavidChase