
Performance Monitoring with Real vs. Headless Browsers

You must understand your users to build a sound performance strategy. Learn more about monitoring with Chrome and the difference between headless and real browsers.

The purpose of performance monitoring is to detect issues and minimize their impact on your end users. There are several different ways to do this, one of which is synthetic testing. Synthetic testing involves simulating real users by loading critical pages and transaction flows such as logging in and, if you’re an ecommerce company, checking out. This simulation should accurately represent the typical behavior of your users. Synthetic transaction testing for websites can come in two forms: browser emulation and real browsers.

Browser Emulation

At its simplest, browser emulation means requesting an HTML file, then parsing out and requesting the resources it references (CSS, JavaScript, images, etc.), without ever involving a JavaScript engine. Because no JavaScript executes, any additional requests that executed scripts would normally initiate are never made.
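A minimal sketch of this kind of emulation, using only Python's standard library: fetch (here, hard-code) a page, collect the statically referenced resources, and note that the inline script never runs. The `ResourceParser` class and the sample page are illustrative, not part of any monitoring product.

```python
# Sketch of simple browser emulation: parse the HTML for sub-resources,
# but never execute JavaScript.
from html.parser import HTMLParser


class ResourceParser(HTMLParser):
    """Collects URLs of sub-resources statically referenced by a page."""

    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("script", "img") and "src" in attrs:
            self.resources.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.resources.append(attrs.get("href"))


html = """
<html><head>
  <link rel="stylesheet" href="/styles/main.css">
  <script src="/js/app.js"></script>
</head><body>
  <img src="/img/logo.png">
  <script>fetch('/api/data');</script>
</body></html>
"""

parser = ResourceParser()
parser.feed(html)
print(parser.resources)
# Only the statically referenced resources are found; the fetch('/api/data')
# call inside the inline script never runs, so /api/data is a blind spot.
```

This is exactly the gap described above: an emulator sees the three static resources but misses every request that JavaScript would have triggered.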

Another variation of browser emulation is a headless browser. Headless browsers are simply browsers without a Graphical User Interface (GUI), often used to automate testing. Forgoing a UI has benefits: the browser is more lightweight and less resource-intensive, and scripted automation executes faster.


PhantomJS is a common headless browser that uses WebKit as its layout and rendering engine and JavaScriptCore as its JavaScript engine. Phantom is well suited to automation and integrates with CI tools like Jenkins. It's capable of taking screenshots, and it lets you override the User-Agent header sent to the server to imitate a different browser. At most, though, that imitation causes the server to send different content; modifying the User-Agent doesn't change how the browser works behind the scenes.
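The limits of User-Agent imitation can be sketched with Python's standard library. The Chrome-like User-Agent string below is illustrative, not tied to any real release; the point is that the server sees the imitated identity while the client's actual engine is unchanged.

```python
# Sketch: override the User-Agent header on a request. The server may vary
# its response based on this string, but the client doing the fetching is
# still plain urllib, with no rendering or JavaScript engine behind it.
import urllib.request

# Illustrative Chrome-like User-Agent string (an assumption, not a real build).
chrome_ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
             "AppleWebKit/537.36 (KHTML, like Gecko) "
             "Chrome/120.0.0.0 Safari/537.36")

req = urllib.request.Request("https://example.com/",
                             headers={"User-Agent": chrome_ua})
print(req.get_header("User-agent"))  # the identity the server will see
# urllib.request.urlopen(req) would send this header, but the response is
# still fetched and processed by urllib, not rendered like Chrome would.
```

The same caveat applies to headless browsers: spoofing the header may change what the server sends, never how the client behaves.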

Real Browsers

Real browsers are the browsers most of us are familiar with: Chrome, Safari, Internet Explorer, Firefox, Opera, and so on. They include a GUI and often many other features that enrich the browsing experience. The GUI and those extra features make real browsers heavier, more resource-intensive, and slower to execute automated commands. Like headless browsers, they also let you change the User-Agent string to imitate other browsers when communicating with servers.


In 2013, Chrome transitioned from WebKit to Blink as its rendering engine; it uses V8 as its JavaScript engine. Google has also been a leader in the "make the web faster" initiative, through which it has often pioneered new web protocols and standards. Chrome holds the majority of browser market share.

Chrome vs. PhantomJS

Much of the cutting-edge technology that lands directly in Chrome, such as HTTP/2, often isn't supported in headless browsers like PhantomJS as quickly, if at all. This can leave blind spots in your monitoring strategy, and blind spots can become costly. Additionally, overcoming challenges and defects in PhantomJS typically takes longer than scripting a real browser with Selenium, because of the difference in community size and contributions. And since Chrome and PhantomJS use different rendering/layout and JavaScript engines, PhantomJS is not suitable for representing the user experience on Chrome.

To put it simply, you need to test where your users are: focus on the most common and critical user flows, from vantage points (ISP and location) where most of your users are, using similar technology. Since Chrome currently leads in browser market share, it's most beneficial to run synthetic tests on Chrome rather than relying on headless browser tests to tell you whether your users are experiencing issues.

