How to create an Apache-licensed Private WebPageTest setup, and get the Classic Interface

In my previous articles I took you through the process of setting up your own private WebPageTest, either via the AWS interface or via Terraform (infrastructure as code).

By default, this would create a Private WebPageTest instance that uses the latest code on the release branch of the official WPO Foundation GitHub repo for WebPageTest.

new webpagetest ui

This is great if you like the newer UI (it’s not as up to date as the official WebPageTest.org site, which obviously evolves much faster), but it might not be what you want for a couple of reasons:

  1. You preferred the original, classic, WebPageTest UI, or
  2. You plan to monetise your private WebPageTest setup, which would violate the release branch’s LICENSE.md terms on "Noncompete" and "Competition"

Since WebPageTest existed loooong before Catchpoint bought it up, the original version of the code (the fully open source version) still exists, and has no such non-competition concerns. It does have a LICENSE file, but that just lists all the licenses associated with the other libraries WebPageTest uses.

By the way, the same is also true for the WebPageTest agent – master branch & release branch vs apache branch – so bear that in mind if you’re creating a competing product. Presumably this is what SpeedCurve do, for example. (Apparently so!)

In this article I’ll show you how to tweak the previous private WebPageTest installation scripts and setup processes to use the apache branch, thus reverting to the "classic" UI and freeing you up from non-competition concerns. (If you get rich because of this article, please buy me a coffee and hire me, thanks 😁)
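
For a taste of what that tweak boils down to, here’s a minimal sketch of checking out the Apache-licensed branches by hand – the repos and branch names are the official WPO Foundation ones mentioned above, but the destination paths are purely illustrative:

    # Server: clone the apache branch instead of the default release branch
    # (/var/www/webpagetest is an illustrative path, not necessarily where
    # your installation scripts expect the code to live)
    git clone --branch apache https://github.com/WPO-Foundation/webpagetest.git /var/www/webpagetest

    # Agent: the wptagent repo has an equivalent apache branch
    git clone --branch apache https://github.com/WPO-Foundation/wptagent.git ~/wptagent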

Continue reading

Automate Your WebPageTest Private Instance With Terraform: 2021 Edition

This article assumes you have an understanding of what Terraform is, what WebPageTest is, and some AWS basics.

all the logos

Have you ever wanted to have your WebPageTest setup managed as infrastructure as code, so you can keep all those carefully tuned changes and custom settings in source control, ready to confidently and repeatedly destroy and rebuild on a whim?

Sure you have.

In this article I’ll show how to script the setup of your new WPT server, installing from a base OS, and configuring customisations – all within Terraform so you can easily rebuild it with a single command.
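
To give a flavour of that single command, here’s a hypothetical rebuild – aws_instance.wpt_server is a made-up resource address standing in for however you’ve declared the server in your .tf files:

    # Recreate just the WPT server in place (Terraform 0.15.2 or newer)
    terraform apply -replace=aws_instance.wpt_server -auto-approve

    # Or tear the whole stack down and stand it back up again
    terraform destroy -auto-approve && terraform apply -auto-approve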


Continue reading

Google’s Chrome User Experience Data in WebPageTest

This article assumes that you know the basics of AWS, WebPageTest, SSH, and at least one Linux text editor.

Chrome User Experience Report logo + WebPageTest Logo + Core Web Vitals' LCP thresholds. Quite a busy hero image, I'll admit.

When talking to people about website performance stats, I’ll usually split them into Real User Metrics (RUM) and Automated (Synthetic/Test Lab):

  • RUM is performance data collected from real visitors to the website you own, reported into the analytics tool you have integrated.
  • Automated tests are scripted tests that you run from your own performance testing tool against any website you like.

RUM is great: you get real performance details from real user devices and can investigate the difference in performance for many different options.

For example, iPhone vs Android, Mac vs Windows, Mobile vs Desktop, Chrome vs Firefox, UK vs US, even down to ISP and connection type, in order to see who is getting a good experience and whose experience could be improved.

This data is invaluable in prioritising performance improvements, since you can tie it back to the approximate number of users it will affect, and therefore the impact on your business.

There are loads of vendors who can provide this for you (I’ve used many of them), or you can write your own – if you’re a glutton for punishment (and high AWS bills – ask me how I know… 😁)

However, since this is measured on your own website and reported into your own tooling, you can only see such real-world performance detail for your own website; no real-world user experience data from your competitors.

Automated tests are great: you get detailed measurements of any website you can access – your own, a competitor’s, basically anything public – in a repeatable fashion so that you can track changes over time.

You can have as many automated tests as you like, you can test from wherever you’re able to set up a test agent, and with whatever device you’re able to automate or emulate.

However, since these are all automated tests running because you said so, you can’t use them to understand how users are using your site: on which devices, which devices underperform others, and from where.

Again, there are a load of vendors who can provide this for you; writing your own is a bit more of a headache though – I wouldn’t recommend it, especially while WebPageTest continues to be free for self-hosting.

What if you could get some of the usual key performance metrics you’re used to with RUM, but for sites you don’t own such as your competitors?
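
That’s essentially what Google’s CrUX dataset offers, and there’s a public API for it. As a minimal sketch – CRUX_API_KEY is a key you’d create yourself in the Google Cloud console, and https://example.com stands in for whichever origin you’re curious about:

    # Ask the Chrome UX Report API for an origin's field data
    # (CRUX_API_KEY is assumed to be set in your environment)
    curl -s "https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${CRUX_API_KEY}" \
      -H 'Content-Type: application/json' \
      -d '{"origin": "https://example.com", "formFactor": "PHONE"}'

The response includes histograms and 75th-percentile values for metrics such as LCP, which is what lets you get RUM-style numbers for sites you don’t own.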

In this article I’ll talk about the Google Chrome User Experience dataset and how to use it in your performance test setup to find the intersection of RUM and Automated performance test results, wiring it all up into your WebPageTest setup!

Continue reading
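
On the WebPageTest side, the wiring amounts to handing your private instance that same API key. As a hypothetical sketch – both the crux_api_key setting name and the settings.ini path are assumptions from my reading of the private-instance setup, so verify them against your own install:

    # Hypothetical: enable CrUX lookups on a private instance by appending
    # your key to settings.ini (key name and path assumed – check your install)
    echo "crux_api_key=${CRUX_API_KEY}" | sudo tee -a /var/www/webpagetest/www/settings/settings.ini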