WebPageTest has supported Google's Lighthouse as part of our test suite for a while now, and there are a number of great reasons to choose to run Lighthouse on WebPageTest: accurate packet-level throttling, a clean test environment carefully configured for accuracy and repeatability, always the latest Lighthouse version, and more.
That said, we've long felt that there were opportunities to make better use of Lighthouse results alongside WebPageTest's broader test analysis. Today, we took the first steps towards that goal with a new Lighthouse result page that has a number of unique features we think you'll find very useful.
A New Lighthouse Result Page
As of today, WebPageTest's Lighthouse results will now be displayed like other first-party test result pages on WebPageTest:
Custom Lighthouse Sharing Previews
These new result pages also offer a custom social preview for sharing your Lighthouse results, complete with scores and metrics! So whenever you want to share a Lighthouse score on the socials, here's how it'll look:
Now, before I move on I should mention that if you want to access the familiar default Lighthouse page experience, it's still there! You can find it in this menu:
To run Lighthouse on WebPageTest, just check the "Run Lighthouse Audit" box on the Start Test page, or run a Lighthouse-only test if that's all you need. That said, choosing to include Lighthouse in a full WebPageTest run now enables some unique features that we think you'll find useful...
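If you prefer to kick off tests programmatically, WebPageTest's public API accepts a `lighthouse` parameter on its `runtest.php` endpoint. As a rough sketch (the endpoint and parameter names reflect WebPageTest's API docs as we understand them; `YOUR_API_KEY` is a placeholder), building such a request might look like:

```python
from urllib.parse import urlencode

# Build a WebPageTest API request that includes a Lighthouse audit.
# Parameter names follow WebPageTest's documented runtest.php API;
# "YOUR_API_KEY" is a placeholder for your own key.
params = {
    "url": "https://example.com",  # the page to test
    "k": "YOUR_API_KEY",           # your WebPageTest API key
    "lighthouse": 1,               # include a Lighthouse audit in the run
    "f": "json",                   # return the test response as JSON
}
test_url = "https://www.webpagetest.org/runtest.php?" + urlencode(params)
print(test_url)
```

Submitting that URL with any HTTP client starts the test; the JSON response includes links for polling results once the run completes.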
Lighthouse, Meet WebPageTest Pro Experiments!
Perhaps most exciting of all, this tighter Lighthouse integration lets us offer ways to act on many of the bottlenecks Lighthouse finds, using other powerful WebPageTest features! As you may be aware, WebPageTest offers No-Code Experiments as part of its Pro tier product (notably, some experiments are available with a free starter account as well).
In our new Lighthouse result page, on Lighthouse audits that do not pass, WebPageTest will now offer links to relevant No-Code Experiments that aim to fix or mitigate the specific bottlenecks Lighthouse finds! Take the following audit, for example, which says that the site's LCP image was lazy-loaded, causing it to be fetched later than it should be. In this case, WebPageTest is able to offer an experiment that will re-test the page with lazy-loading removed from that image (applied live to the actual website being tested, mid-request!):
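To illustrate what that audit is flagging: an LCP image marked with `loading="lazy"` tells the browser to defer its fetch, which delays the largest paint. The change the experiment simulates is simply removing that attribute (hypothetical markup, not taken from any real tested site):

```html
<!-- Before: the hero image (often the LCP element) is lazy-loaded,
     so the browser defers fetching it and LCP suffers -->
<img src="hero.jpg" loading="lazy" alt="Hero image">

<!-- After: with the attribute removed, the browser fetches the image eagerly -->
<img src="hero.jpg" alt="Hero image">
```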
Clicking "View Experiment" will take you to that experiment on the Opportunities & Experiments page, which will let you run a relevant experiment (along with any others you'd like to include).
And then you'll arrive at our Experiment result page, allowing you to see if that change is worth making to the site.
...in this case, the change did not improve the site's LCP metric, which tells you either that the change isn't worthwhile on its own, or that its impact is being held back by other optimizations that need to happen alongside it. Useful stuff!
One minor note on experiments: Lighthouse results do not currently reflect changes made in an experiment, because Lighthouse runs separately from our experiment runner. We plan to show experiment impact in Lighthouse results in the future. For now, the metrics collected by WebPageTest will show you an experiment's impact on any metrics that change (and you can expect a similar impact in Lighthouse if you implement the change on the site itself).
This is our first release of the new Lighthouse page, and we expect to make refinements in the coming weeks to tighten things up wherever needed. We also intend to link more experiments from Lighthouse audits as we find relevant opportunities, so keep an eye out for that!
As always, if you run into any trouble or have questions or ideas, we'd love to hear from you in the issue tracker. Thanks!