Page speed - what do you aim for?
-
Hi Mozzers - I was just looking at website speed. I know the Google guidelines on average page load time, but I'm not sure whether Google issues guidelines on any of the other four metrics. Do you know of any guidance on domain lookup, server response, server connection, or page download time?
Page Load Time (sec) - I tend to aim for 2 seconds max: http://www.hobo-web.co.uk/your-website-design-should-load-in-4-seconds/
Server Response Time - Google recommends 200ms: https://developers.google.com/speed/docs/insights/
Server Redirection Time (sec) - dependent on the number of redirects, so probably no guide figure
Domain Lookup Time (sec)
Server Connection Time (sec)
Page Download Time (sec)
Thanks, Luke
-
IMHO, if somebody is paying us for SEO, then our GOAL is to get the homepage to load in a second or less, especially if most of the users are on mobile. If it's in the mid-1-second range, we can grudgingly live with that.
I'm glad you asked about server response times. For most sites, after the content is optimized (smaller images, cleaned-up clunky code, etc.), the initial server response time is usually the culprit for pushing load time over a second - as long as the rest of the home page is "light". Light to us is under 1MB. Depending on your CMS, there are a variety of ways to get the response time to 200ms or less.
Google Pagespeed, as David said, is a good measurement, but it's not the holy grail of measurements. We use it only to identify areas that need improvement. Waterfalls tell us what's taking so long and what's heavy.
You didn't ask about plugins, which are a major culprit behind caching problems, minification errors, conflicts, extra weight, and slow speeds. We limit all active plugins to TEN (including caching, SEO, and security). For some sites, plugin cleanup is the easiest way to speed up a site.
At the end of the day, nothing beats clean code, light images and a lightning-fast server.
-
Thanks for all the feedback everyone - much appreciated, Luke
-
As long as the page loads quickly for users, I wouldn't put a huge focus on this. It's true that Google looks at page load speed, but I wouldn't put all your eggs in that basket. We have sites that show a 2.5-3.5 second load time, and they still dominate ranking results. Focus on creating a better experience.
One of the simple ways to speed up load times is to minify and compress CSS and JavaScript files as much as possible, but be sure to check that the minification does not break areas of the site. We have seen improvements as high as 75% just from completing this step alone.
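To see what minification actually does, here's a toy sketch in Python. It's illustrative only - `minify_css` and `gzip_savings` are made-up helpers, real minifiers (clean-css, cssnano, etc.) handle far more edge cases, and in practice your web server's gzip/brotli handles the compression step:

```python
import gzip
import re

def minify_css(css):
    """Tiny CSS minifier sketch: strips comments and collapses whitespace.
    Real tools also rewrite values, merge rules, etc."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # drop /* comments */
    css = re.sub(r"\s+", " ", css)                     # collapse whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)       # tighten around punctuation
    return css.strip()

def gzip_savings(text):
    """Fraction of bytes saved by gzip-compressing the text."""
    raw = text.encode()
    return 1 - len(gzip.compress(raw)) / len(raw)
```

Minification shrinks the bytes the parser sees; gzip then shrinks the bytes on the wire - the two stack, which is why doing both can produce the large savings mentioned above.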
If you have a Joomla or WordPress website, here is a great plugin that will do this for you: https://www.jch-optimize.net/
-
Hi Luke! When using this tool (https://developers.google.com/speed/pagespeed/insights/) we aim to have our clients above 80 for both mobile and desktop.
-
I will be honest, I don't trust Google with PageSpeed. There are too many unanswered questions about how the score actually ties into ranking and which metrics are used. One example: Google says resources like Google Analytics do not count against your score in the SERPs, but in the test they do. If you use several Google assets like AdWords, Analytics, fonts, etc., you will show a very low score, and using them can actually hide other things that you could fix.
What we have started doing is figuring out what Google actually needs to see and presenting only that. We hide tracking codes (Google Analytics, Bing, Facebook, etc.) from Google's crawler. I figured out that Google's test servers have the FA library and also their font library locally on the machine, so we have started hiding those from it as well. Any third-party script Google does not need to see has been hidden too - Segmentify, Olark, anything really. Doing these things has raised our score quite a bit.
-
"if you have two or even three redirects, mobile users can wait five seconds before seeing anything. Hint: that's why I won't click most bit.ly, ow.ly, or goo.gl links on Twitter, Facebook, or G+ when I'm on mobile - they first pass through a t.co redirect, then the redirect I can actually see, and sometimes even a third one."
Just adding a bit of weight to what you said, here's a test of a t.co link through bit.ly: https://i.gyazo.com/ca87c486a903914c2b058612cc93f3f0.png - on 3G, it's 4.27s to even start loading Google. Without t.co: https://i.gyazo.com/f22c18a0879f76ecf653662153e17c43.png - which is 2.35s.
-
PageSpeed score means nothing, unfortunately. HTTP/2 puts a spanner in the works for a lot of it.
https://blog.newrelic.com/2016/02/09/http2-best-practices-web-performance/
Specifically, this section:
- Concatenating JavaScript and CSS files: Combining smaller files into a larger file to reduce the total number of requests.
- Image spriting: Combining multiple small images into one larger image.
- Domain sharding: Spreading requests for static assets across several domains to increase the total number of open TCP connections allowed by the browser.
- Inlining assets: Bundling assets with the HTML document source, including base-64 encoding images or writing JavaScript code directly inside the HTML.
-
It's hard to explain, but in general "Less is MORE!" for these numbers.
Example - redirects. Redirects can kill your site's speed, especially for mobile users; even a simple site redirect can take a second or two. Example: www.example.com -> 301 -> m.example.com. Looks simple, doesn't it? But in reality, after the client receives the 301 it must do a new DNS resolution (for m.example.com) and then open a new connection to the new server. And that's the simple case... if you have two or even three redirects, mobile users can wait five seconds before seeing anything. Hint: that's why I won't click most bit.ly, ow.ly, or goo.gl links on Twitter, Facebook, or G+ when I'm on mobile - they first pass through a t.co redirect, then the redirect I can actually see, and sometimes even a third one. I know marketers want to see "clicks", but it isn't good for mobile users.
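The cost of a redirect chain can be sketched with a back-of-the-envelope model. The function and all timing figures below are illustrative assumptions, not measurements - each hop pays a round trip for the 3xx response, and landing on a new hostname adds a DNS lookup, a TCP handshake, and optionally a TLS handshake:

```python
def redirect_delay_ms(hops, rtt_ms=100, dns_ms=50, new_host=True, tls=False):
    """Rough model of what a redirect chain costs before the first useful byte.

    Each hop spends one round trip fetching the 3xx response; a hop to a
    new hostname also pays DNS resolution plus a TCP handshake (one more
    round trip), and optionally ~two more round trips for a TLS handshake.
    """
    per_hop = rtt_ms                   # request -> 3xx response
    if new_host:
        per_hop += dns_ms + rtt_ms     # DNS lookup + TCP handshake
        if tls:
            per_hop += 2 * rtt_ms      # TLS-1.2-style handshake
    return hops * per_hop
```

With a 300 ms mobile RTT, three cross-host HTTPS redirects work out to roughly 3.9 seconds before any real content even starts loading.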
Server connection time also needs to be low, which means the server needs to be closer to the user. The best example is Australia: there, even simple DNS resolution plus connection can take a full second, and the client hasn't received a single byte from the server yet. You can check this on WebPageTest.org (there are Australian test servers). Of course, running your own server there is expensive, so you need deep pockets to do it. That's why most companies offer CDN support: since the CDN endpoint is closer to the user, it makes things a little faster for them - and if the CDN is set up correctly, much faster.
So the idea is "Less is More!". The best approach is to use WPT to benchmark your site from all over the world, and also set up Analytics to track speed, because speed under perfect datacenter conditions is different from speed in the real world.
-
Hi Luke,
Here is what Google recommends in terms of page speed: a server response time of less than 200 ms.
As for the PageSpeed Insights tool that Google provides to measure page speed ratings (1-100), the Google PageSpeed score is indeed a strong indicator of a website's loading performance in terms of time.
As per my research, a total page download time of less than 10 seconds corresponds to a 75-85 PageSpeed score.
I hope this helps.
Thanks,
Vijay
-
Thanks Tom for picking up on that error - ugh - corrected now. Brain working sluggishly this morning lol!
-
Hi Luke,
"Avg. Page Load Time (sec) [Google recommends 200ms]:" That's actually for the server response time.
Personally, the only thing that matters is that the overall page load time is quick. I aim, if possible, for sub-2 seconds for any page.
Tom