In my last post, I talked about how “digital business”—the creation of new business designs by blurring the digital and physical worlds—was the talk of Gartner’s Data Center, Infrastructure & Operations Management Conference earlier this month. But what does digital business mean for application performance management? I dropped in on Gartner analyst Cameron Haight’s presentation on that very topic: “Rethinking APM in a Digital Business Era.”
Haight painted a picture of growing complexity in modern IT environments, driven by: microservices; multiple programming languages (Java, Node.js, PHP, Ruby, etc.); continuous software release cycles; more interactions via mobile and the Internet of Things; more ephemeral resources, like containers, that can be rapidly provisioned and de-provisioned; and new back-end infrastructure driven by software-defined networking.
This complexity is not only difficult to manage but more expensive too, as monitoring software pricing hasn’t adjusted to the new reality of more, and more variable, objects to monitor. Haight called for an APM 3.0: a new class of APM technologies, if not vendors, to better monitor modern application environments. In Haight’s taxonomy, APM 1.0 was the 1990s and early-2000s APM software companies like Wily and Precise. APM 2.0 was led by high-flying startups of the last decade like New Relic and AppDynamics. Whether APM 3.0 brings new vendors or new offerings from existing players remains to be seen.
While APM 2.0 has brought us deep-dive monitoring capabilities at the application and server levels, combined with powerful analytics, that isn’t enough anymore, Haight argued. APM 3.0 needs to go everywhere containers and other resources go: Linux kernels, resource managers like Apache Mesos and ZooKeeper, network taps, and system logs. APM 3.0 might also include crowd-sourced metrics, such as tapping into social media to gauge sentiment on how an application is really performing. And it will learn as it goes, improving with usage via machine learning, moving “from topologies to probabilities,” as Haight put it.
User interfaces and data visualizations will have to get better too. Haight spoke of “immersive” environments where the results find you, instead of the other way around. Pricing will need to evolve as well, away from charging per monitored object and toward processing events. Skills like forensics, statistics, storytelling and coding will all be in demand in APM 3.0.
Customers remain unconvinced that their current vendors can get them there. In a flash poll during the session, 68% of respondents indicated that they didn’t think their current APM tools could meet their future digital business needs.
Haight reminded the audience that the purpose of APM was “digital business insight,” then ticked off the ways APM technologies needed to adapt to this new reality, in terms of both technology and process.
The growing complexity of IT environments is real; we talk about it all the time in internal meetings at Catchpoint. Haight focused mostly on back-end technologies, but we see it on the front end as well, where content delivery networks, DNS resolution, API calls, and third-party tags all add new capabilities, but also new complexities, to your customer-facing applications.
Back-end metrics are certainly important, but they won’t tell you much if you don’t have good visibility into what your customers are experiencing. Gartner’s clients already told the firm earlier this year that end-user experience monitoring was the critical dimension in enterprise APM. That reality was brought home again in Gartner’s CIO Survey, with customer experience management topping the list of CEOs’ five-year investment intentions, as reported by their CIOs.
The business gets it. As complexity grows, APM will need to evolve to better manage that complexity. But monitoring customer experience, for speed, availability and reliability, must remain at the center of any APM 3.0 strategy.