How important is Lighthouse page speed measurement?
-
Hi,
Many experts cite the Lighthouse performance score as an important factor for search ranking. It's confusing because several top-ranking sites score only 30-40 in Lighthouse. Some sites with low scores also load quickly in practice; when I test them on mobile and desktop, they load much faster than Lighthouse suggests.
When we look at other image-rich sites (such as Airbnb, John Deere, etc.), the Lighthouse score can be 30-40.
Our site https://www.equipmentradar.com/ loads quickly on desktop and mobile, but its Lighthouse score is similar to Airbnb's. We have many photos similar to the photo below (probably 30-40), many of which load asynchronously.
Should we spend more time optimizing for Lighthouse, or is it fine as it is? Are large images OK to load asynchronously?
Thank you,
Dave
-
It's absolutely essential that your company website is fast.
Don't purchase slow, cheap web hosting, regardless of your business type.
Instead, purchase super-fast hosting for your business.
Sometimes it's much more expensive, but it's well worth it, as it can help improve your organic SEO.
We purchased lightning-fast hosting; this is one reason why we are now selling more bath garden offices than ever before.
-
It is important to distinguish between PageSpeed Insights and Lighthouse; for your website, it may be more important to follow PageSpeed Insights. The differences between the two are explained in an accessible way in this article: https://rush-analytics.com/blog/google-pagespeed-insights-vs-lighthouse-how-do-they-differ.
-
My understanding, based on Google's announcements, is that "Page Experience" signals (including the new Core Web Vitals) will be combined with existing signals like mobile-friendliness and HTTPS security in May 2021.
https://developers.google.com/search/blog/2020/05/evaluating-page-experience
https://developers.google.com/search/blog/2020/11/timing-for-page-experience
So these will be search signals, but there are lots of other very important search signals that can outweigh them. Even if a page on John Deere doesn't pass the Core Web Vitals criteria, it is still likely to rank highly for "garden tractors".
If you are looking at Lighthouse, I would point out a few things:
- The Lighthouse audits run on your own local machine will differ from those run on hosted servers like PageSpeed Insights, and those in turn will differ from the "field data" in the Chrome UX Report (CrUX).
- In the end, it's the field data that will be used for the Page Experience validation, according to Google. But lab-based tools are very helpful for immediate feedback, rather than waiting 28 days or more for field data to accumulate; the PageSpeed Insights API sketch after this list returns both kinds of data in one response.
- If your concern is solely the impact on search rankings, it makes sense to pay attention specifically to the three metrics being considered as part of Core Web Vitals: CLS, FID, and LCP.
- But also realize that while you are improving scores for criteria that will be used as search signals, you are likely improving the user experience at the same time. Taking CLS as an example: users are frustrated when they try to click a button and end up clicking something else because of a layout shift, and frustrated users generally mean lower conversion rates. So by focusing on measures like these (I realize your question about large images doesn't pertain specifically to CLS), you are optimizing both for search ranking and for conversions; the image-loading sketch below shows one way to keep large, lazily loaded photos from causing layout shifts.
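To make the lab-versus-field distinction concrete, here is a rough sketch (not an official example) that calls the PageSpeed Insights v5 API, which returns both the Lighthouse lab result and the CrUX field data for a URL in a single response. The response field names used below are assumptions from memory, so verify them against Google's API reference before relying on them.

```typescript
// Hedged sketch: fetch lab (Lighthouse) and field (CrUX) data for one URL from
// the PageSpeed Insights v5 API. The response field paths are assumptions and
// should be checked against the official API documentation.
const PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed";

async function checkPageExperience(url: string): Promise<void> {
  const requestUrl = `${PSI_ENDPOINT}?url=${encodeURIComponent(url)}&strategy=mobile`;
  const response = await fetch(requestUrl);
  const data = await response.json();

  // Lab data: the Lighthouse performance score (0 to 1) computed on Google's servers.
  const labScore = data.lighthouseResult?.categories?.performance?.score;

  // Field data: aggregated real-user (CrUX) metrics, present only when the URL or
  // origin has enough traffic over the trailing 28-day window.
  const fieldMetrics = data.loadingExperience?.metrics;

  console.log(`Lab (Lighthouse) performance score: ${labScore}`);
  console.log("Field (CrUX) metrics:", fieldMetrics);
}

// Example usage with the site from the question:
checkPageExperience("https://www.equipmentradar.com/").catch(console.error);
```

If the field metrics come back empty, that usually just means the page doesn't yet have enough Chrome UX Report traffic, which is exactly the 28-day lag mentioned above.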
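Back on the original question about large photos that load asynchronously: lazy loading them is generally fine, provided the layout reserves space for each image (so the late arrival doesn't cause a layout shift and hurt CLS) and the main above-the-fold image is not deferred (so LCP doesn't suffer). Below is a minimal illustrative sketch of that idea using the browser's native lazy loading; the ".listing-photo" selector and the 640x480 dimensions are hypothetical placeholders, not anything taken from equipmentradar.com.

```typescript
// Illustrative sketch only: defer below-the-fold listing photos with native lazy
// loading and give every photo intrinsic dimensions so the browser reserves its
// layout box before the bytes arrive (avoiding CLS).
// "img.listing-photo" and 640x480 are made-up placeholders.
document.querySelectorAll<HTMLImageElement>("img.listing-photo").forEach((img, index) => {
  if (index > 0) {
    // Keep the first (likely above-the-fold, possibly LCP) image eager; defer the rest.
    img.loading = "lazy";    // fetched only when it approaches the viewport
    img.decoding = "async";  // decoded off the critical path where supported
  }
  img.width = 640;           // intrinsic width: space is reserved before load
  img.height = 480;          // intrinsic height: no layout shift when it arrives
});
```

In practice the same effect is usually achieved directly in the markup, with loading="lazy" and explicit width/height (or a CSS aspect-ratio) on each img tag; the script form above is just to keep the example self-contained.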