Lighthouse is an open-source automated tool for auditing the quality of web pages. You can perform Lighthouse tests on any web page, and get a series of scores for performance, accessibility, SEO, and more.
You can run Lighthouse within Chrome DevTools, from the command line, and as a Node module. But there's no need to do any of those if you're already using SpeedCurve Synthetic monitoring. Every time you run a synthetic test, your Lighthouse scores appear at the top of your test results page by default.
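If you do want to try Lighthouse outside SpeedCurve, the command-line version is the quickest way. A minimal sketch using the Lighthouse CLI (it requires Node.js and Chrome; exact flags may vary between Lighthouse versions):

```shell
# Install the Lighthouse CLI globally (requires Node.js and Chrome)
npm install -g lighthouse

# Audit a page and open the HTML report in your browser
lighthouse https://example.com/ --view

# Or save the raw results as JSON for further processing
lighthouse https://example.com/ --output=json --output-path=./report.json
```

The JSON output contains the same category scores and audits you see in the HTML report, which makes it handy for scripting.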
Why we added Lighthouse to SpeedCurve
There were a number of compelling reasons why we chose to add Lighthouse to SpeedCurve:
1. Track metrics that correlate to UX
The best performance metrics are those that help you understand how people experience your site. One of the things we like about Lighthouse is that – like SpeedCurve – it tracks user-oriented metrics like Speed Index, First Meaningful Paint, and Time to Interactive.
2. See all your improvement recommendations in one place
SpeedCurve already gives you performance recommendations on your test results pages. Now you can get all your recommendations – including those for accessibility and SEO – on the same page. Because we also include your PageSpeed score, SpeedCurve is the only place where you can get your Lighthouse audits and your PageSpeed Insights under one roof.
3. Improve SEO, especially for mobile
Several of our customers have told us that their SEO teams are very interested in using Lighthouse to help them feel their way forward now that Google has announced that page speed is a ranking factor for mobile search.
4. Monitor a unicorn metric for CEOs and executives
When Google talks, executives listen. Many of our customers have told us that their CEO or other C-level folks don't really care about individual metrics. They want a single aggregated score – a unicorn metric – that's easy to digest and to track over time.
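Under the hood, that single number is an aggregate: Lighthouse's performance score is a weighted average of the scores for individual metrics such as Speed Index and Time to Interactive. Here's a sketch of the idea – note that the weights below are purely illustrative, not Lighthouse's actual weighting, which changes between versions:

```javascript
// Illustrative weights only -- Lighthouse's real weights vary by version.
const WEIGHTS = {
  speedIndex: 0.3,
  firstMeaningfulPaint: 0.3,
  timeToInteractive: 0.4,
};

// Each metric score is in the 0-1 range, as in Lighthouse's JSON output;
// the aggregate is scaled to the familiar 0-100 display score.
function aggregateScore(metricScores) {
  let total = 0;
  for (const [metric, weight] of Object.entries(WEIGHTS)) {
    total += metricScores[metric] * weight;
  }
  return Math.round(total * 100);
}

console.log(aggregateScore({
  speedIndex: 0.8,
  firstMeaningfulPaint: 0.9,
  timeToInteractive: 0.5,
}));
// 0.8*0.3 + 0.9*0.3 + 0.5*0.4 = 0.71 -> 71
```

That's what makes it such a good executive-level metric: one number that still reflects movement in the underlying user-oriented metrics.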
5. Get alerts when your Lighthouse scores "fail"
One of the great things about running your Lighthouse tests within SpeedCurve is that you can use your Favorites dashboard to create custom charts that let you track each Lighthouse metric. You can also create performance budgets for the metrics you (or your executive team) care about most and get alerts when that budget goes out of bounds.
For example, the custom chart below tracks three Lighthouse scores – performance, accessibility, and SEO – for the Amazon.com home page. I've also created a (very modest) performance budget of 50 out of 100 for the Lighthouse Performance score. That budget is tracked in the same chart, and you can see that the budget has gone out of bounds a couple of times in the week since we activated Lighthouse. You can also see, interestingly, that this score has much more variability than the other scores. This would merit some deeper digging.
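The budget check itself is conceptually simple. As a sketch (the function name and data shape here are ours for illustration, not SpeedCurve's API), it amounts to flagging the test runs where a score dropped below its budget:

```javascript
// Flag runs whose Lighthouse score fell below a performance budget.
// `runs` is an array of {date, score} objects with scores on a 0-100 scale.
function budgetBreaches(runs, budget) {
  return runs.filter((run) => run.score < budget);
}

const runs = [
  { date: '2018-07-01', score: 62 },
  { date: '2018-07-02', score: 48 },  // below budget -- would trigger an alert
  { date: '2018-07-03', score: 55 },
];

console.log(budgetBreaches(runs, 50));
```

In SpeedCurve itself you'd configure this as a performance budget on your Favorites dashboard rather than writing any code – the alerting happens automatically.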
Lighthouse vs PageSpeed
In November 2018, Google released v5 of the PageSpeed Insights tool. It was a major change: under the hood, it switched to using Lighthouse for all its audits and scoring.
We introduced Lighthouse audits and scores to SpeedCurve in July 2018, so we'll soon be deprecating and removing the old v4 PageSpeed score.
If you haven't already, you should update any performance budgets to use the "Lighthouse Performance" score.
Your Lighthouse scores might be different from your other test results
While PageSpeed sometimes runs the same version of Lighthouse that we do, from time to time they may be out of sync with us. (Note that we always run the latest release.) The performance score is strongly influenced by TTI, which can be quite different depending on the test environment. You can’t really compare them directly like you could with the older version. The Lighthouse team have written some great background on what can cause variability in your scores.
At the bottom of the Lighthouse report, it says "Provided by environment" for network and CPU throttling. This is because we apply the throttling at the OS level, which is more accurate than Chrome's built-in throttling. We always use a 3G network speed, and we don't apply any CPU throttling.
The Lighthouse report doesn't reuse the same page load that SpeedCurve/WebPageTest does. It's a separate page load done at the end of the SpeedCurve test, so the metric numbers will be different. You can't directly compare metrics like Time to Interactive in the SpeedCurve UI with the same metrics in the Lighthouse report. Depending on the nature of your page and the network throttling in your test settings, there could be some variation in each page load.
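Given that run-to-run variability, a common approach – and one in line with the Lighthouse team's own guidance on variance – is to run a test several times and look at the median rather than trusting any single run. A sketch:

```javascript
// Take the median of a metric across several runs to smooth out
// run-to-run variability.
function median(values) {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 === 0
    ? (sorted[mid - 1] + sorted[mid]) / 2
    : sorted[mid];
}

// Time to Interactive (ms) from five hypothetical runs of the same page
const ttiRuns = [5120, 4870, 6340, 5010, 5290];
console.log(median(ttiRuns)); // 5120
```

The median is more robust than the mean here because a single outlier run (like the 6340ms one above) doesn't drag the result around.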
We recommend that you don't overly focus on wondering why your metrics don't "match"
We've tried to make it easy to compare the recommendations and metrics coming from those different sources. But we recommend that you don't overly focus on wondering why your metrics don't "match". The idea is to have them all in one place so you can compare and decide which to focus on.
You shouldn't really consider any of these metrics as "reality" – that's what RUM is for. Synthetic testing is more about establishing a baseline in a clean and stable environment, and then improving those metrics by X% over time.
"My Lighthouse scores aren't showing up."
If you're not seeing your Lighthouse scores, it could be because there are security certificate issues on the page you're testing. If that's the case, then Lighthouse won't run.
To find out what the issue is, click through to your WebPageTest results via the link just below your waterfall chart:
On the WebPageTest results page, click on the Lighthouse Test Log to read the log and find out why the Lighthouse test didn't run.
"Why am I seeing gaps in my charts?"
If you're seeing gaps in your Lighthouse charts, that means we don’t have any data for that metric as part of the test. This is a Lighthouse issue, so we don't have any control over it. When we notice more issues than normal, we pass on bug reports to the Lighthouse team.
If you're concerned about excessive gaps in your data, you can see if your issue is already being discussed in their GitHub repo, and then you can either ask them directly via GitHub or Twitter (@____lighthouse).
Lighthouse Accessibility scores are provided "as is"
We provide the Lighthouse accessibility scores as part of the overall Lighthouse integration. Accessibility isn't our area of expertise, so we give you those Lighthouse scores and audits "as-is".
Sometimes people ask whether WCAG 2.1 is being checked against. According to the Google Lighthouse GitHub repo, they run both the wcag2a and wcag2aa tags. This means the tests check for compliance at the A and AA levels.
If you have questions about your Lighthouse Accessibility score, the best option is to contact the Lighthouse team directly. You can see if your issue is already being discussed in their GitHub repo, and if it's not already there, post your question. Or you could ask them via Twitter (@____lighthouse).
Get started with Lighthouse
If you're already a SpeedCurve user, you can drill down into your individual test results and find your Lighthouse scores at the top of the page.
To see the full Lighthouse report, click on the score you're interested in (e.g., 'Performance'), which will open up the list of audits for that score. At the bottom of that list is a text link called 'Full Lighthouse Report', which will take you to the detailed report on the Lighthouse site.
If you're not a SpeedCurve user, you can sign up for a free trial and check out your Lighthouse scores and the dozens of other performance and UX metrics we track for you.
As always, we'd love your feedback!
We'd especially love to hear what your experience has been with Lighthouse. Are you using it to help with something not covered in this post? Have you learned something with Lighthouse that might otherwise have eluded you? Let us know at firstname.lastname@example.org.