
What The World’s Worst DNA Mixup Teaches Us About Monitoring

Synthetic testing analyzes web applications in a controlled environment that produces consistent results; any deviation in the data points to an issue.

The suspect’s DNA had been found at 40 crime scenes, linking her to burglaries, drug offenses, and six murders. She was Germany’s most-wanted serial killer with a $400,000 bounty on her head, yet no one knew who she was.

It turns out the infamous ‘phantom’ wasn’t a murderer at all. Police were hunting an innocent factory worker who fatefully handled the same cotton swabs used to collect DNA samples from the crime scenes. With her DNA already on the swabs, lab results from several investigations pinned a hypothetical crime spree on her.

The Cost of Contaminated Cotton

Forensic scientists thought they’d controlled all testing variables in the lab, but an outside shipment of ‘dirty swabs’ contaminated results and sent detectives on a futile goose chase that cost police more than 14,000 man-hours and $18 million. The lesson? If you rely on data for answers, you’d better know what factors can impact it. Just one skewed variable can slant your data and derail your test or experiment.

The same lesson applies to Web Performance Monitoring. No DevOps engineer wants to chase down problems outside of their control, like being woken up at 2 AM because a consumer ISP blew a fuse in NYC. Skewed or misleading monitoring data can cost IT teams time investigating false positives, prolong troubleshooting, and hurt overall web performance.

Controlling the Wild, Wild Web

Getting clean, accurate data on your website’s performance across the Internet, the most volatile of environments, is very challenging. There are simply too many factors that impact what the end user experiences, many of which are unknown or not reproducible.

Synthetic testing, or active monitoring, helps by testing end-to-end web application performance in a controlled environment: you define which page or endpoint is tested, at what frequency, and from which city and ISP. This controlled environment produces consistent results, so any deviation in the data points to an issue that must be addressed. Because the environment is controlled, you can also troubleshoot on the spot: synthetic testing lets you run tools like traceroutes, screen captures, and packet captures alongside the test. Real User Measurement (RUM), on the other hand, is not always reproducible, has many uncontrolled variables, and does not allow for on-the-spot troubleshooting. Its benefit is that you are measuring real user experience and can easily correlate the data to revenue and other engagement metrics.
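To make the contrast concrete, here is a minimal sketch of the synthetic-testing idea, not Catchpoint’s implementation, just a hypothetical Python probe. The endpoint, interval, and deviation threshold are illustrative assumptions; the point is that a fixed test against a fixed target from a fixed vantage point produces a baseline, and anything outside it is worth investigating.

```python
import statistics
import time
import urllib.request

# Hypothetical synthetic check: probe one endpoint on a fixed schedule
# from one controlled vantage point, and flag deviations from a baseline.
ENDPOINT = "https://www.example.com/"   # page/endpoint under test (assumed)
INTERVAL_SECONDS = 300                  # test frequency: every 5 minutes
BASELINE_SAMPLES = 12                   # samples used to establish the baseline
DEVIATION_FACTOR = 3                    # alert if time > mean + 3 standard deviations

def measure_response_time(url: str) -> float:
    """Fetch the URL once and return total response time in seconds."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=30) as response:
        response.read()                 # drain the body so timing covers the full download
    return time.monotonic() - start

def run_synthetic_check() -> None:
    baseline: list[float] = []
    while True:
        elapsed = measure_response_time(ENDPOINT)
        if len(baseline) < BASELINE_SAMPLES:
            baseline.append(elapsed)    # still building the controlled baseline
        else:
            mean = statistics.mean(baseline)
            stdev = statistics.stdev(baseline)
            if elapsed > mean + DEVIATION_FACTOR * stdev:
                # In a real setup this is where traceroutes, packet captures,
                # and alerts would kick in.
                print(f"DEVIATION: {elapsed:.2f}s vs baseline {mean:.2f}s ± {stdev:.2f}s")
            else:
                print(f"OK: {elapsed:.2f}s")
        time.sleep(INTERVAL_SECONDS)

if __name__ == "__main__":
    run_synthetic_check()
```

Because every variable in this loop is fixed by design, a spike in the numbers means something changed on the path or the application, not in the test itself, which is exactly the property RUM data cannot give you.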

All in All

When a lab loses control (as in the case of the DNA swabs), data becomes skewed, leading scientists to chase the wrong culprit or conclusion. That’s why it’s important for DevOps teams to test in a regulated setting through synthetic monitoring, which gives them a mechanism to monitor performance with fixed variables in a mostly contained system.

Forensic scientists don’t have the time to pre-test every cotton swab that enters the lab for DNA, and IT teams are far too swamped to vet every measurement. Controlling the environment you test in is what keeps the data, and the conclusions drawn from it, trustworthy.

Mehdi – Catchpoint
