Page speed - what do you aim for?
-
Hi Mozzers - I was just looking at website speed. I know the Google guideline on average page load time, but I'm not sure whether Google issues guidelines on any of the other four metrics. Do you know of any guidance on domain lookup, server response, server connection or page download?
Page Load Time (sec) - I tend to aim for 2 seconds max: http://www.hobo-web.co.uk/your-website-design-should-load-in-4-seconds/
Server Response Time [Google recommends 200ms]: https://developers.google.com/speed/docs/insights/
Redirection Time (sec) [dependent on the number of redirects, so probably no guide figure]
Domain Lookup Time (sec)
Server Connection Time (sec)
Page Download Time (sec)
Thanks, Luke
-
IMHO, if somebody is paying us for SEO, then our GOAL is to get the homepage to load in a second or less... especially if most of the users are mobile. If it's in the mid-one-second range, we can grudgingly live with that.
I'm glad you asked about server response times... for most sites, after the content is optimized (smaller images, cleaned-up code, etc.), the initial server response time is usually the culprit when a page takes over a second - as long as the rest of the home page is "light". Light to us is under 1MB. Depending on your CMS, there are a variety of ways to get the response time to 200ms or less.
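Those two numbers - Google's 200ms response-time recommendation and the "light = under 1MB" rule of thumb above - can be checked with a trivial budget function. A minimal sketch (the thresholds come from this discussion; the function and its names are illustrative, not any standard tool):

```javascript
// Toy "speed budget" check against the two figures discussed above:
// Google's recommended 200ms server response time, and this poster's
// rule of thumb that a "light" page weighs under 1MB.
function withinBudget({ ttfbMs, pageBytes }) {
  return {
    responseOk: ttfbMs <= 200,              // Google's 200ms recommendation
    weightOk: pageBytes <= 1 * 1024 * 1024, // "light" = under 1MB
  };
}

console.log(withinBudget({ ttfbMs: 180, pageBytes: 900 * 1024 }));
// → { responseOk: true, weightOk: true }
```

Feed it real numbers from a waterfall (time to first byte, total transferred bytes) to see which side of the budget a page falls on.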
Google Pagespeed, as David said, is a good measurement, but it's not the holy grail of measurements. We use it only to identify areas that need improvement. Waterfalls tell us what's taking so long and what's heavy.
You didn't ask about plugins - a major culprit behind caching problems, minification errors, conflicts, slow load times, and extra weight. We limit all active plugins to TEN (including caching, SEO, and security). For some sites, plugin clean-up is the easiest way to speed things up.
At the end of the day, nothing beats clean code, light images and a lightning-fast server.
-
Thanks for all the feedback everyone - much appreciated, Luke
-
As long as the page loads quickly for users, I wouldn't put a huge focus on this. It's true that Google looks at page load speed, but I wouldn't put all your eggs in that basket. We have sites that show a 2.5-3.5 second load time, and they still dominate ranking results. Focus on creating a better experience.
One of the simplest ways to speed up load times is to minify and compress CSS and JavaScript files as much as possible, but be sure to check that the minification does not break areas of the site. We have seen improvements as high as 75% just from completing this step alone.
If you have a Joomla or WordPress website, here is a great plugin that will do this for you: https://www.jch-optimize.net/
-
Hi Luke! When using this tool (https://developers.google.com/speed/pagespeed/insights/) we aim to have our clients above 80 for both mobile and desktop.
-
I will be honest: I don't trust Google with PageSpeed. Too little is explained about how the score actually fits in with ranking and which metrics are used. One example: Google says resources like Google Analytics do not count against your score in the SERPs, but in the test they do. If you use several Google assets like AdWords, Analytics, fonts, etc., you will show a very low score, and those items will actually keep you from seeing other things that you can fix.
What we have started doing is figuring out what Google actually needs to see and presenting only that. We hide the tracking codes for Google, Bing, Facebook, etc. from Google's crawler, including our own analytics tracking script. I also figured out that Google's test servers have the FA (Font Awesome) library and Google's own font library stored locally on the machine, so we have started hiding those from it as well - along with any third-party script Google does not need to see: Segmentify, Olark, anything really. Doing these things has raised our score quite a bit.
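One way the hiding described above is commonly done is a user-agent check before injecting non-essential scripts. A minimal sketch - the user-agent substrings below are assumptions that may change, and note this borders on cloaking, so it carries its own risk:

```javascript
// Sketch of the technique described above: skip loading third-party
// scripts when the visitor looks like a speed-auditing bot.
// The substrings matched here are assumptions, not a guaranteed list.
function isSpeedTestBot(userAgent) {
  return /Chrome-Lighthouse|Google Page Speed Insights/i.test(userAgent);
}

// In the browser, gate each non-essential script on the check:
if (typeof document !== 'undefined' && !isSpeedTestBot(navigator.userAgent)) {
  const s = document.createElement('script');
  s.async = true;
  s.src = 'https://www.google-analytics.com/analytics.js'; // or any third-party script
  document.head.appendChild(s);
}
```

The same gate works for chat widgets, personalization scripts, and anything else the audit doesn't need to see.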
-
"if you have 2 or even 3 redirects, mobile users wait 5 seconds before they see anything. Hint: that's why I won't click most bit.ly, ow.ly, or goo.gl links on Twitter, Facebook, or G+ when I'm on mobile, because they first pass through a t.co redirect, then the redirect I can see, and sometimes even a third."
Just adding a bit of weight to what you said, here's a test of a t.co link through bit.ly: https://i.gyazo.com/ca87c486a903914c2b058612cc93f3f0.png - on 3G, it's 4.27s before Google even starts loading. Without t.co: https://i.gyazo.com/f22c18a0879f76ecf653662153e17c43.png - that's 2.35s.
-
PageSpeed score means nothing, unfortunately. HTTP/2 puts a spanner in the works for a lot of it.
https://blog.newrelic.com/2016/02/09/http2-best-practices-web-performance/
Specifically, this section:
- **Concatenating JavaScript and CSS files:** Combining smaller files into a larger file to reduce the total number of requests.
- **Image spriting:** Combining multiple small images into one larger image.
- **Domain sharding:** Spreading requests for static assets across several domains to increase the total number of open TCP connections allowed by the browser.
- **Inlining assets:** Bundling assets with the HTML document source, including base-64 encoding images or writing JavaScript code directly inside
-
It's hard to explain, but in general "less is MORE!" for those numbers.
Take redirects, for example. Redirects can kill your site's speed, especially for mobile users; even a simple site redirect can take a second or two. Example: www.example.com -> 301 -> m.example.com. Looks simple, doesn't it? But in reality, after the client receives the 301, it must do a new DNS lookup (for m.example.com) and then open a new connection to the new server (m.example.com). And that's the simple case... if you have 2 or even 3 redirects, mobile users wait 5 seconds before they see anything. Hint: that's why I won't click most bit.ly, ow.ly, or goo.gl links on Twitter, Facebook, or G+ when I'm on mobile, because they first pass through a t.co redirect, then the redirect I can see, and sometimes even a third. I know that marketers want to see "clicks", but it isn't good for mobile users.
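The cost of a redirect chain can be sketched with a back-of-the-envelope model. The per-hop figures below are illustrative assumptions for a slow mobile connection, not measurements - the point is that each hop to a new hostname multiplies the fixed costs:

```javascript
// Rough model of the latency a redirect chain adds. Each redirect to a
// new hostname costs a DNS lookup, a new TCP connection, and another
// request/response round trip before any content arrives.
// The millisecond defaults are assumed mobile-network figures.
function redirectOverheadMs(hops, { dnsMs = 200, connectMs = 300, requestMs = 300 } = {}) {
  return hops * (dnsMs + connectMs + requestMs);
}

console.log(redirectOverheadMs(1)); // single 301 to m.example.com: 800ms
console.log(redirectOverheadMs(3)); // t.co -> shortener -> final: 2400ms
```

Under these assumptions a three-hop chain burns about 2.4 seconds before the final page even starts loading, which lines up with the t.co timings shown elsewhere in this thread.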
Server connection time also needs to be low, which means the server needs to be closer to the user. Australia is the best example: there, even simple DNS resolution plus connection can take a second - and the client hasn't received a single byte from the server yet. You can see this on WebPageTest.org (it has Australian test servers). Of course, providing a dedicated server there is expensive, so you need deep pockets. That's why most companies offer CDN support: since the CDN endpoint is closer to the user, it makes things a little faster for them, and if the CDN is set up correctly it should be much faster.
So the idea is "Less is More!". Best of all is to use WPT to benchmark your site from all over the world, and also set up Analytics site-speed tracking, because speed under perfect datacenter conditions is different from speed in the real world.
-
Hi Luke,
Here is what Google recommends in terms of page speed: server response time should be less than 200 ms.
Now, coming to the PageSpeed Insights tool that Google provides to measure page speed ratings (1-100): the Google PageSpeed score is indeed a strong indicator of a website's loading performance in terms of time.
As per my research, a total page download time of less than 10 seconds corresponds to a PageSpeed score of roughly 75-85.
I hope this helps.
Thanks,
Vijay
-
Thanks Tom for picking up on that error - ugh - corrected now. Brain working sluggishly this morning lol!
-
Hi Luke,
"Avg. Page Load Time (sec) [Google recommends 200ms]:" That's actually for the server response time.
Personally, the only thing that matters is that the overall page load time is quick. I aim if possible for sub 2 seconds for any page.
Tom