
Web Performance Benchmark: Three Key Steps

In this article, you will learn how to benchmark web performance against competitors in three steps.

One of the most frequent questions I receive from clients is "How do I benchmark my performance against the competition?" There are different approaches to benchmarking, some better than others. The key to a successful benchmark is to plan it carefully and collect the right data points.

I recommend that companies follow these three steps:

  1. Define the end goal of the benchmark. Ask yourself: what will you do with this data? Are you trying to improve your website, a webpage, or a process? Are you trying to build a business case for a project or initiative?
  2. Determine which areas of the site/application/system you need to benchmark. If you are benchmarking to figure out your infrastructure distribution, you might care more about DNS and performance in the geographical areas where your end users are located. If you are planning to redesign or rebuild the site, you might care about the full performance of key webpages or key processes, such as a shopping cart.
  3. Determine what tests to run, from which locations, and at what frequency. Based on the purpose and areas of the benchmark, you can determine how and where to test from. For example, you might decide to benchmark DNS performance from the key US states that account for the majority of your end users. You might decide to run the tests every 10 minutes if you are planning major changes, or every 30 minutes if you are simply using the data for a business case. These decisions can be captured in a simple plan, sketched below.
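
To make the three steps concrete, here is a minimal sketch of what such a plan might look like in Python. The structure, field names, and values are hypothetical illustrations, not tied to Catchpoint or any particular tool.

```python
# Hypothetical benchmark plan capturing the three steps above.
# All names and values are illustrative, not a real tool's schema.
benchmark_plan = {
    # Step 1: the end goal drives everything else.
    "goal": "Build a business case for improving homepage performance",
    # Step 2: the areas to benchmark.
    "areas": ["dns", "homepage", "shopping_cart"],
    # Step 3: tests, locations, and frequency.
    "tests": [
        {"type": "dns", "locations": ["NY", "CA", "TX", "IL"],
         "frequency_minutes": 10},
        {"type": "webpage", "locations": ["NY", "CA", "TX", "IL"],
         "frequency_minutes": 30},
    ],
}

for test in benchmark_plan["tests"]:
    print(f"{test['type']}: every {test['frequency_minutes']} min "
          f"from {len(test['locations'])} locations")
```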

Over the years, I have come across several benchmarks that failed for various reasons. Some of the major pitfalls are:

  • Comparing Apples and Oranges. Sadly, one of the biggest mistakes is not comparing the correct items. If you are benchmarking DNS performance, you can't simply average the DNS time of HTTP requests. If you have a DNS TTL of 5 minutes and your competitor has a TTL of 15 minutes, the averages will lie: the longer TTL means far more requests are answered from cache with near-zero DNS time, dragging the average down regardless of actual resolver performance.
  • Looking only at averages. If you are looking at a single average across different cities, you might lose sight of issues that affect only some locations; per-city breakdowns and percentiles reveal them (see the sketch after this list).
  • Looking at a metric without understanding what it means. Quite often people pay attention only to the full load time of a webpage and ignore the rest of the data. However, webpages are different, and the full load time might not be comparable across them, especially when pages dynamically modify their content.
  • Looking only at one metric. You collected all this data, but looking at only one metric is not going to help you. Dig deeper into the data so you can understand why others are better or worse. Learn from your competitors' successes and failures so you can improve.
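
As a quick illustration of the first two pitfalls, the sketch below uses invented response times to show how a single global average hides a problem that per-city medians expose. The cities and numbers are made up for demonstration.

```python
import statistics

# Invented response times (ms) per city; Dallas has a regional problem.
samples = {
    "New York": [120, 130, 125, 135, 128],
    "Chicago":  [140, 150, 145, 155, 148],
    "Dallas":   [950, 980, 940, 990, 960],
}

all_times = [t for times in samples.values() for t in times]
print(f"Global average: {statistics.mean(all_times):.0f} ms")  # looks mediocre, not alarming

# Per-city medians expose what the global average masks.
for city, times in samples.items():
    print(f"{city}: median {statistics.median(times):.0f} ms")
```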

Case Study: E-commerce Benchmark

Recently, we assisted an e-commerce customer who had created a benchmark in Catchpoint to compare how the homepages of key competitors ranked. The benchmark included the homepages of BestBuy, Amazon, Apple, and Newegg. The goal was to understand where their homepage ranked relative to their competitors, and to determine the steps to improve their web performance.

[Chart: Web performance for Apple, NewEgg, BestBuy, and Amazon]

Based on the data collected, they concluded that the homepage of Apple.com was the fastest. Several factors explain why Apple's homepage is fast:

  • The response of the URL, the total time from issuing the request to receiving the entire HTML of the page, was very fast.
  • Fewer downloaded bytes: Apple's homepage was 30-50% lighter.
  • Fewer requests and hosts on the page (a sketch for computing these metrics from a HAR file follows the charts below).

[Chart: Bytes downloaded for Apple, NewEgg, BestBuy, and Amazon]
[Chart: Requests, hosts, and connections for Apple, NewEgg, BestBuy, and Amazon]
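
If you want to reproduce this kind of page-weight comparison yourself, here is a minimal sketch that computes requests, unique hosts, and downloaded bytes from a HAR export of a page load. It assumes the standard HAR 1.2 format; the filename is a placeholder.

```python
import json
from urllib.parse import urlparse

# Placeholder filename; export a HAR from your browser or monitoring tool.
with open("homepage.har") as f:
    entries = json.load(f)["log"]["entries"]

# bodySize is -1 when unknown in HAR 1.2, so clamp at zero.
total_bytes = sum(max(e["response"]["bodySize"], 0) for e in entries)
hosts = {urlparse(e["request"]["url"]).hostname for e in entries}

print(f"Requests:         {len(entries)}")
print(f"Unique hosts:     {len(hosts)}")
print(f"Downloaded bytes: {total_bytes}")
```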

This might seem like a successful benchmark; however, there was one little issue that made it inaccurate.

The goal of the client was to compare the homepages of the competing e-commerce sites. But in Apple's case, they were testing the corporate homepage, which had a different business goal, and therefore a different design and implementation. The homepage of Apple's e-commerce site is www.apple.com/store, not www.apple.com.

When benchmarking against the correct Apple e-commerce page, the picture changed: Apple was not much faster than the rest of the stores. (I kept Apple's corporate homepage in the comparison to show the differences.)

[Chart: Web performance for Apple, Apple Store, NewEgg, BestBuy, and Amazon]
[Chart: Bytes downloaded for Apple, Apple Store, NewEgg, BestBuy, and Amazon]
[Chart: Requests, hosts, and connections for Apple, Apple Store, NewEgg, BestBuy, and Amazon]

To get a better look at the impact on user experience, we also looked at other metrics such as time to title and render start time.

[Chart: Response, webpage response, render start time, and time to title]

Visually, this is what loading those five sites looked like from a node located in New York on the Verizon backbone (captured at 400 ms intervals; the blink of an eye is about 400 ms).

[Filmstrip: from the NY Verizon node, for Amazon, Apple, Apple Store, BestBuy, and Newegg]

We also used Apdex, an excellent way to score and compare numbers from diverse pages. Apdex normalizes the data against target goals, which vary from webpage to webpage (as we saw with Apple): samples at or below the target count as satisfied, samples up to four times the target count as tolerating, and everything slower counts as frustrated. For demonstration purposes, I used an Apdex target response of 5,000 ms (5 seconds) for all the tests above.
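
For reference, here is a minimal sketch of the standard Apdex formula in Python; the sample response times are invented for demonstration.

```python
def apdex(response_times_ms, target_ms=5000):
    """Standard Apdex: (satisfied + tolerating / 2) / total samples."""
    satisfied = sum(1 for t in response_times_ms if t <= target_ms)
    tolerating = sum(1 for t in response_times_ms
                     if target_ms < t <= 4 * target_ms)
    return (satisfied + tolerating / 2) / len(response_times_ms)

# Invented sample response times (ms) for demonstration.
print(f"Apdex: {apdex([3200, 4800, 6100, 9500, 21000]):.2f}")  # 0.60
```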

[Chart: Apdex scores for Apple, Apple Store, NewEgg, BestBuy, and Amazon]

To sum it up: a successful benchmark depends on clear end goals; everything else follows from them.

Happy Benchmarking!

Mehdi – Catchpoint

(Methodology: all sites were measured from 26 US nodes, every 10 minutes, with Internet Explorer 8.)
