Lighthouse is an open-source automated tool for auditing the quality of web pages. You can perform Lighthouse tests on any web page, and get a series of scores for performance, accessibility, SEO, and more.

You can run Lighthouse within Chrome DevTools, from the command line, and as a Node module. But there's no need to do any of those if you're already using SpeedCurve Synthetic monitoring. Every time you run a synthetic test, your Lighthouse scores appear at the top of your test results page by default. 
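
If you're curious what a standalone run looks like, here's a rough sketch of using Lighthouse as a Node module. The package names and options come from the Lighthouse docs and can differ slightly between releases, and the URL is just a placeholder:

    import lighthouse from 'lighthouse';
    import * as chromeLauncher from 'chrome-launcher';

    async function auditPage(url: string) {
      // Launch a headless Chrome instance for Lighthouse to drive.
      const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
      try {
        // Run the audit for the categories you care about.
        const result = await lighthouse(url, {
          port: chrome.port,
          output: 'html',
          onlyCategories: ['performance', 'accessibility', 'seo'],
        });
        // Category scores in the result (lhr) are on a 0-1 scale.
        const perf = result?.lhr.categories.performance.score;
        console.log('Performance score:', perf == null ? 'n/a' : perf * 100);
      } finally {
        await chrome.kill();
      }
    }

    auditPage('https://www.example.com');

The command-line equivalent is roughly "npx lighthouse https://www.example.com --view", which opens the HTML report when the run finishes.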

Why we added Lighthouse to SpeedCurve

There were a number of compelling reasons why we chose to add Lighthouse to SpeedCurve:

1. Track metrics that correlate to UX

The best performance metrics are those that help you understand how people experience your site. One of the things we like about Lighthouse is that – like SpeedCurve – it tracks user-oriented metrics like Speed Index, First Meaningful Paint, and Time to Interactive. 

2. See all your improvement recommendations in one place

SpeedCurve already gives you performance recommendations on your test results pages. Now you can get all your recommendations – including those for accessibility and SEO – on the same page. Because we also include your PageSpeed score, SpeedCurve is the only place where you can get your Lighthouse audits and your PageSpeed Insights under one roof.

3. Improve SEO, especially for mobile

Several of our customers have told us that their SEO teams are very interested in using Lighthouse to help them feel their way forward, now that Google has announced that page speed is a ranking factor for mobile search.

4. Monitor a unicorn metric for CEOs and executives

When Google talks, executives listen. Many of our customers have told us that their CEO or other C-level folks don't really care about individual metrics. They want a single aggregated score – a unicorn metric – that's easy to digest and to track over time. 

5. Get alerts when your Lighthouse scores "fail"

One of the great things about running your Lighthouse tests within SpeedCurve is that you can use your Favorites dashboard to create custom charts that let you track each Lighthouse metric. You can also create performance budgets for the metrics you (or your executive team) care about most and get alerts when that budget goes out of bounds.

For example, the custom chart below tracks three Lighthouse scores – performance, accessibility, and SEO – for the Amazon.com home page. I've also created a (very modest) performance budget of 50 out of 100 for the Lighthouse Performance score. That budget is tracked in the same chart, and you can see that the budget has gone out of bounds a couple of times in the week since we activated Lighthouse. You can also see, interestingly, that this score has much more variability than the other scores. This would merit some deeper digging.

Lighthouse vs PageSpeed

In November 2018, Google released v5 of the PageSpeed Insights tool. This was a major change: under the hood, PageSpeed Insights switched to using Lighthouse for all of its audits and scoring.
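
If you want to see that change for yourself, the v5 API embeds a full Lighthouse result in its response. Here's a rough sketch for Node 18+ run as an ES module (the endpoint and field names are from Google's v5 API documentation, an API key is needed for anything more than occasional requests, and the exact response shape may evolve):

    // Query the PageSpeed Insights v5 API and read the embedded Lighthouse score.
    const pageUrl = 'https://www.example.com';
    const endpoint =
      'https://www.googleapis.com/pagespeedonline/v5/runPagespeed' +
      `?url=${encodeURIComponent(pageUrl)}&category=performance&strategy=mobile`;

    const response = await fetch(endpoint);
    const data = await response.json();

    // v5 responses carry the Lighthouse result under `lighthouseResult`.
    const score = data.lighthouseResult?.categories?.performance?.score;
    console.log('Lighthouse performance score via PSI:', score == null ? 'n/a' : score * 100);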

We introduced Lighthouse audits and scores to SpeedCurve back in July 2018, so we'll soon be deprecating and removing the old v4 PageSpeed score.

If you haven't already, you should update any performance budgets to use the "Lighthouse Performance" score.

Your Lighthouse scores might be different from your other test results

PageSpeed Insights runs an alpha version of Lighthouse, while we run the latest release. The Performance score is strongly influenced by Time to Interactive (TTI), which can vary considerably depending on the test environment, so you can't compare the two scores directly the way you could with the older PageSpeed version.

Lighthouse is run separately from your main SpeedCurve tests, and uses the "3G Fast" network throttling regardless of your SpeedCurve browser profile. Because of this, it's normal for Lighthouse to report slightly different numbers than your SpeedCurve dashboards. 

We don't use any of Lighthouse's built-in network and CPU throttling settings. Instead, all network and CPU throttling is set up by WebPageTest at the OS layer, which is why the Lighthouse report shows "Provided by environment" for network and CPU throttling. That means the WebPageTest test and the Lighthouse audit run under the same conditions from your SpeedCurve settings.
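
For context, the Lighthouse setting behind that label is throttlingMethod: 'provided', which tells Lighthouse to apply no throttling of its own. The sketch below just illustrates that setting, not our actual internal setup; the port number is a placeholder for a Chrome instance you've already started with remote debugging:

    import lighthouse from 'lighthouse';

    // Network and CPU throttling are assumed to be applied outside Lighthouse
    // (e.g., at the OS level), so the run is told not to add any of its own.
    const result = await lighthouse('https://www.example.com', {
      port: 9222,                    // placeholder: an already-running Chrome with remote debugging
      throttlingMethod: 'provided',  // the report will show "Provided by environment"
    });

    console.log(result?.lhr.configSettings.throttlingMethod); // "provided"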

We recommend that you don't overly focus on wondering why your metrics don't "match"

We've tried to make it easy to compare the recommendations and metrics coming from these different sources. But we recommend that you don't overly focus on wondering why your metrics don't "match". The idea is to have them all in one place so you can compare them and decide which to focus on.

You shouldn't really consider any of these metrics as "reality" – that's what RUM is for. Synthetic testing is more about establishing a baseline in a clean and stable environment, and then improving those metrics by X% over time.

More: Why are the SpeedCurve results different from my other test results?

Get started

If you're already a SpeedCurve user, you can drill down into your individual test results and find your Lighthouse scores at the top of the page. 

To see the full Lighthouse report, click on the score you're interested in (e.g., 'Performance'), which will open up the list of audits for that score. At the bottom of that list is a text link called 'Full Lighthouse Report', which will take you to the detailed report on the Lighthouse site. 

If you're not a SpeedCurve user, you can sign up for a free trial and check out your Lighthouse scores and the dozens of other performance and UX metrics we track for you. 

As always, we'd love your feedback!

We'd especially love to hear what your experience has been with Lighthouse. Are you using it to help with something not covered in this post? Have you learned something with Lighthouse that might otherwise have eluded you? Let us know at support@speedcurve.com.
