How important is Lighthouse page speed measurement?
-
Hi,
Many experts cite the Lighthouse performance score as an important factor for search ranking. It's confusing because several top sites have Lighthouse scores of 30-40, yet they rank well. Also, some sites that feel fast have a low Lighthouse score: when I test on mobile and desktop they load much quicker than Lighthouse suggests.
When we look at other image-rich sites (such as Airbnb, John Deere, etc.), the Lighthouse score can also be 30-40.
Our site https://www.equipmentradar.com/ loads quickly on desktop and mobile, but the Lighthouse score is similar to Airbnb's and so forth. We have many photos, probably 30-40, many of which load asynchronously.
Should we spend more time optimizing for Lighthouse, or is the current score OK? Are large images fine to load asynchronously?
Thank you,
Dave
-
It's absolutely essential that your company website is fast.
Don't purchase slow, cheap web hosting, regardless of your business type.
Instead, purchase super-fast hosting for your business.
Sometimes, it's much more expensive, but it's well worth it as it can help improve your organic SEO.
We purchased lightning-fast hosting; this is one reason why we are now selling more bath garden offices than ever before.
-
It is important to distinguish between PageSpeed Insights and Lighthouse; for your website, it may be more useful to follow PageSpeed Insights. The differences between the two are explained clearly in this article: https://rush-analytics.com/blog/google-pagespeed-insights-vs-lighthouse-how-do-they-differ.
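To see the distinction concretely, the PageSpeed Insights v5 API returns both the Lighthouse lab result and the CrUX field data for a URL in a single response. Here is a minimal sketch in TypeScript (the API key is a placeholder, and the exact response field names are assumptions worth checking against the API documentation):

```typescript
// Minimal sketch: compare Lighthouse lab data with CrUX field data using the
// PageSpeed Insights v5 API. API_KEY is a placeholder, and the exact response
// field names are assumptions to verify against the API documentation.
const API_KEY = "YOUR_API_KEY";

async function comparePsiData(pageUrl: string): Promise<void> {
  const endpoint =
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed" +
    `?url=${encodeURIComponent(pageUrl)}&strategy=mobile&key=${API_KEY}`;

  const data = await (await fetch(endpoint)).json();

  // Lab data: the Lighthouse performance score (0-1) from a simulated, throttled run.
  const labScore = data.lighthouseResult?.categories?.performance?.score;

  // Field data: real-user metrics from the Chrome UX Report, when available.
  const fieldLcp =
    data.loadingExperience?.metrics?.LARGEST_CONTENTFUL_PAINT_MS?.percentile;
  const fieldCls =
    data.loadingExperience?.metrics?.CUMULATIVE_LAYOUT_SHIFT_SCORE?.percentile;

  console.log(`Lighthouse (lab) performance score: ${labScore}`);
  console.log(`Field LCP percentile: ${fieldLcp} ms`);
  console.log(`Field CLS percentile: ${fieldCls}`);
}

comparePsiData("https://www.equipmentradar.com/");
```

If the field numbers look healthy while the lab score sits at 30-40, that often just means Lighthouse's simulated throttling is harsher than your real visitors' devices and connections.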
-
My understanding is that "Page Experience" signals (including the new Core Web Vitals) will be combined with existing signals like mobile-friendliness and HTTPS security in May 2021, according to Google's announcements:
https://developers.google.com/search/blog/2020/05/evaluating-page-experience
https://developers.google.com/search/blog/2020/11/timing-for-page-experience
So these will be search signals, but there are many other important search signals that can outweigh them. Even if a page on John Deere doesn't pass the Core Web Vitals criteria, it is still likely to rank highly for "garden tractors".
If you are looking at Lighthouse, I would point out a few things:
- The Lighthouse audits run on your own local machine will differ from those run on hosted servers like PageSpeed Insights, and those in turn will differ from the "field data" in the Chrome UX Report (CrUX).
- In the end, it's the field data that will be used for the Page Experience validation, according to Google. But lab-based tools are very helpful for getting immediate feedback, rather than waiting 28 days or more for field data to accumulate.
- If your concern is solely about the impact on search rankings, then it makes sense to pay attention specifically to the three metrics being considered as part of Core Web Vitals: CLS, FID, and LCP.
- But also realize that while you are improving scores for criteria that will become search signals, you are likely improving the user experience too. Taking CLS as an example: users are frustrated when they try to click a button and end up clicking something else because of a layout shift, and frustrated users generally mean lower conversion rates. So by focusing on measures like these (I realize your question about large images doesn't pertain specifically to CLS), you are optimizing both for search ranking and for conversions. A small sketch of collecting these metrics from real visitors follows below.
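Since the field data is what ultimately counts for the Page Experience signal, here is a minimal sketch using Google's open-source web-vitals library to collect the same three metrics from real visitors (the function names follow its v3 API, where onFID has since been superseded by onINP; the /analytics endpoint is a hypothetical placeholder on your own server):

```typescript
// Minimal sketch: report Core Web Vitals from real users to your own analytics.
// Assumes the open-source "web-vitals" package (v3-style API: onCLS/onFID/onLCP);
// "/analytics" is a hypothetical endpoint on your own server.
import { onCLS, onFID, onLCP, type Metric } from "web-vitals";

function sendToAnalytics(metric: Metric): void {
  const body = JSON.stringify({
    name: metric.name,   // "CLS", "FID", or "LCP"
    value: metric.value, // the measured value for this page view
    id: metric.id,       // unique id, useful for deduplication
  });

  // sendBeacon survives page unloads; fall back to fetch if it is unavailable.
  if (navigator.sendBeacon) {
    navigator.sendBeacon("/analytics", body);
  } else {
    fetch("/analytics", { method: "POST", body, keepalive: true });
  }
}

onCLS(sendToAnalytics);
onFID(sendToAnalytics);
onLCP(sendToAnalytics);
```

On the separate question about large async-loaded images: giving each image explicit width and height attributes (or a CSS aspect-ratio) lets the browser reserve the space before the file arrives, so lazy or async loading doesn't have to cause layout shifts.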