PageSpeed vs. Page Size
-
Hi,
We all know that Google doesn't like slow-loading pages, fair enough! However, for one of my websites, user interactivity is key to its success. Each of my pages is fairly large (in the range of 1.8 to 2.5 MB) because it has a lot of pictures, CSS, and at times some JavaScript elements.
However, I have tried to ensure that the code is optimized: HTML minified and compressed, caching enabled, images optimized and served through a CDN, etc. In spite of the high page size, my GTmetrix PageSpeed score is 93+ for most pages.
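For what it's worth, those delivery optimizations can be spot-checked programmatically. Here is a small Python sketch (the specific check list is my own simplification of common PageSpeed recommendations, not an official audit) that flags missing compression or caching directives in a dict of HTTP response headers:

```python
def audit_response_headers(headers: dict) -> list:
    """Return a list of delivery optimizations missing from HTTP response headers.

    `headers` maps header names to values, as returned by any HTTP client.
    The checks mirror common PageSpeed recommendations: compressed transfer
    and long-lived browser caching.
    """
    issues = []
    # Text assets should arrive gzip- or Brotli-compressed.
    if headers.get("Content-Encoding", "").lower() not in ("gzip", "br"):
        issues.append("no gzip/brotli compression")
    # Static assets should carry an explicit caching lifetime.
    cache_control = headers.get("Cache-Control", "")
    if "max-age" not in cache_control and "Expires" not in headers:
        issues.append("no browser caching directives")
    return issues
```

Feeding it the headers from, say, `urllib.request.urlopen(url).headers` flags roughly the same things GTmetrix complains about.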
However, the number of requests served is 100+ and the page load time is 4.5s+ according to GTmetrix and Pingdom.
My question is: should this matter from an SEO perspective? Is Google likely to penalize me for a high load time even though I am serving highly optimized pages? I really don't want to cut down on the interactivity of my website unless I have to for SEO reasons.
Please advise. Here is my homepage, just to give you an idea of what I am talking about:
-
Thanks, Cyrus and Max,
Very good answers, and I am going to act on your suggestions.
-
As Max said, from a ranking perspective, Time to First Byte seems to be the most important factor. The author of that same post offered some tips for improving time to first byte: http://moz.com/blog/improving-search-rank-by-optimizing-your-time-to-first-byte
Oftentimes, you simply have a lot of assets to load and it's difficult to cut anything back. In these cases, the order in which things load becomes increasingly important for user experience (asynchronous JavaScript, for example).
Regardless, doing everything you can to improve speed and checking with Google PageSpeed Insights is usually the best advice. I've never seen a website where improving speed performance didn't help with traffic metrics (whether rankings or engagement), so I believe it's an investment worth making.
-
What Google really cares about is TTFB (Time To First Byte); to check it, head to Google Webmaster Tools (GWT), under Crawl Stats.
To date, the general consensus is that above 1s is bad and Google could penalize you, while below 0.5s is good and Google could improve your ranking a little.
Google suggests using WebPageTest to check a website's performance; if you run the test for your website, you will see the TTFB is not that bad: http://www.webpagetest.org/result/141124_MF_14DY/
Your overall load time is 10s, though, and I agree that is too much: it presumably worsens your user experience, increasing your bounce rate and alienating some of your visitors. You should work to improve it. WebPageTest suggests compressing images and leveraging browser caching, which are good suggestions.
Analyze the waterfall closely to investigate further and identify other areas for intervention.
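If you want to sanity-check TTFB yourself between WebPageTest runs, here is a rough Python sketch. It times only request-to-first-byte over an already-open connection, so tools that also count DNS lookup and connection setup will report higher numbers:

```python
import socket
import ssl
import time
from urllib.parse import urlparse

def measure_ttfb(url: str, timeout: float = 10.0) -> float:
    """Roughly measure time-to-first-byte: from sending the GET request
    until the first byte of the response arrives (excludes DNS/connect)."""
    parts = urlparse(url)
    port = parts.port or (443 if parts.scheme == "https" else 80)
    sock = socket.create_connection((parts.hostname, port), timeout=timeout)
    if parts.scheme == "https":
        ctx = ssl.create_default_context()
        sock = ctx.wrap_socket(sock, server_hostname=parts.hostname)
    request = (
        f"GET {parts.path or '/'} HTTP/1.1\r\n"
        f"Host: {parts.hostname}\r\n"
        "Connection: close\r\n\r\n"
    )
    start = time.perf_counter()
    sock.sendall(request.encode("ascii"))
    sock.recv(1)  # blocks until the server sends its first response byte
    elapsed = time.perf_counter() - start
    sock.close()
    return elapsed
```

For example, `measure_ttfb("https://example.com/")` returns the delay in seconds; anything consistently above ~1s is worth digging into on the server side.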
-
Hi there,
I think it would improve page load if the YouTube video were the last thing to load.
Hope it helps you.
-
You are right, which is why I don't want to compromise on usability. Thanks for your response!
-
Give it some time! It should be OK. The main question with speed is whether your users are fine with it. Think of people before SEO and you'll be fine!
-
Thanks for your response, but the images are about as optimized as they can be. I use ImageOptim for Mac to optimize them; they are all JPEGs (stripped of all metadata) and served as (mildly lossy) WebP on supported browsers.
Do you feel there might be anything else that I could do?
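(For context, the "WebP on supported browsers" part is plain content negotiation keyed on the browser's Accept request header; real setups usually do it in the web server or CDN config. A toy Python version of the decision, with hypothetical file names, would look like:)

```python
def pick_image_variant(accept_header: str, base_name: str) -> str:
    """Return the WebP variant when the browser's Accept header advertises
    support for image/webp; otherwise fall back to the JPEG original."""
    if "image/webp" in accept_header.lower():
        return base_name + ".webp"
    return base_name + ".jpg"
```

For example, Chrome sends `Accept: image/avif,image/webp,...` and gets `hero.webp`, while a browser without WebP support gets `hero.jpg`.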
-
I am sure you could work on the optimization a bit more, especially of the images.
Nonetheless, if you require the same structure and are unable to change the size, I would not worry so much about it. Having a fast website is only one of the hundreds of factors that affect SEO. Just work on the other factors and it will be fine!