



What Customers expect in a new generation APM (2.0) solution

In the last blog, I discussed the challenges with an APM 1.0 solution. 


As an application owner or member of the application support team, you want to:

  • Exceed service levels and avoid costly, reputation-damaging application failures through improved visibility into the end-user experience
  • Ensure reliable, high-performing applications by detecting problems faster and prioritizing issues based on service levels and impacted users
  • Improve time to market with new applications, features, and technologies, such as virtualization, acceleration, and cloud-based services


APM 2.0 products enable you to manage application performance, leading with real-user activity monitoring. The following are some of the top capabilities they provide to help you achieve your business objectives.

Visibility to real users and end-user driven diagnostics

  • APM 2.0 solutions provide visibility into end-to-end application performance as experienced by real end users, helping application support focus on the critical issues affecting those users.


As an example, the dashboard shown in Figure 1 provides real-time visibility into application performance as experienced by users.





  • As an application owner, you probably care about which users are impacted, which pages they are navigating, and what kinds of errors they encounter. You want your APM product to improve MTTR by identifying what is causing the latency or failure: the network, the load balancer, an ADN such as Akamai, SSL, or the application tier itself. Figure 2 shows a specific user session, the pages the user navigated, and identifies the application tier as the cause.


  • The “details” link in Figure 2 allows application support personnel to drill down further into which application tier is responsible for the slow or failed transaction, in the context of that specific user. This lets support trace an end-user request down to the line of code.
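The tier attribution described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's API: the tier names, timing marks, and helper functions are invented, and the assumption is simply that per-tier cumulative timing checkpoints have already been collected for one user request.

```python
# Hypothetical sketch: attributing end-to-end latency to tiers, assuming
# we already collected cumulative per-tier timing marks for one request.
# Tier names and numbers are illustrative only.

def tier_breakdown(marks):
    """marks: ordered list of (tier_name, cumulative_ms) checkpoints."""
    breakdown = []
    prev = 0.0
    for tier, cumulative_ms in marks:
        breakdown.append((tier, cumulative_ms - prev))  # time spent in this tier
        prev = cumulative_ms
    return breakdown

def slowest_tier(marks):
    """Return the tier contributing the most latency."""
    return max(tier_breakdown(marks), key=lambda t: t[1])[0]

# One request: cumulative time when it left each tier.
request_marks = [
    ("network", 40.0),
    ("load_balancer", 55.0),
    ("ssl", 80.0),
    ("app_tier", 480.0),   # 400 ms spent here: the culprit
]

print(slowest_tier(request_marks))  # app_tier
```

With a breakdown like this, support can see at a glance that the application tier, not the network or SSL, is the place to drill down.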

Ease of use and superior time-to-value

You want a product that is simple for your application support and operations team to use.

  • A modern APM solution does not require manual definition of instrumentation policies.
  • It should not require manual changes, such as JavaScript injection, to gain visibility into the end user.
  • APM 2.0 tools let you drill down from the end-user view into deep-dive diagnostics, and drill up from deep-dive data to identify the impacted user and the transaction context, without manual correlation or jumping between consoles.
  • Agent installation is typically a 5–10 minute process in modern APM deep-dive tools.
  • The APM 2.0 deep-dive solution automatically detects application servers, business transactions, frameworks, and so on.


Figure 3 shows a specific user transaction request and the latencies by tier. It also shows SQL statements and their latencies.
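The "no manual instrumentation policies" idea above can be illustrated with a generic wrapper: instead of hand-writing a policy per method, an agent can wrap callables uniformly and record latency as a side effect. The names here (`auto_instrument`, `RECORDED`, the tier label) are invented for this sketch and do not come from any particular product.

```python
import functools
import time

# Illustrative sketch of automatic instrumentation: wrap any callable
# generically and record its latency, rather than defining per-method
# instrumentation policies by hand.

RECORDED = []  # (tier, function_name, elapsed_ms)

def auto_instrument(tier):
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                elapsed_ms = (time.perf_counter() - start) * 1000.0
                RECORDED.append((tier, fn.__name__, elapsed_ms))
        return inner
    return wrap

@auto_instrument("app_tier")
def checkout(cart_total):
    return round(cart_total * 1.07, 2)  # stand-in for business logic

print(checkout(100.0))  # 107.0, and one latency sample is recorded
```

A real agent would apply such wrapping automatically at load time (for example, via bytecode instrumentation in Java), which is what removes the manual-policy burden.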




Suitable for production deployment

  • The real user monitoring tool should be non-invasive, and it should not put additional overhead on application response time.
  • You should be able to deploy an always-on, deep-dive monitoring and diagnostic solution for your production enterprise and cloud-based applications.
  • It should work in an agile environment without having to configure new instrumentation policies with application releases.
  • It should scale to large production deployments with thousands of application servers in your production environment.
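One common way tools keep "always-on" monitoring cheap enough for production is to record lightweight timings for every transaction but capture expensive deep-dive detail for only a sampled fraction. The sketch below is illustrative; the class name and the 1% rate are assumptions for the example, not a recommendation from any specific product.

```python
import random

# Sketch of bounding monitoring overhead in production: lightweight timing
# for all transactions, deep-dive capture for only a sampled fraction.
# The 1% default rate is illustrative.

class DeepDiveSampler:
    def __init__(self, rate=0.01, rng=random.random):
        self.rate = rate      # fraction of transactions to capture in depth
        self._rng = rng       # injectable for deterministic testing

    def should_deep_dive(self):
        return self._rng() < self.rate

# Deterministic demonstration: with the random draw pinned at 0.5,
# a 1% sampler declines the deep dive.
sampler = DeepDiveSampler(rate=0.01, rng=lambda: 0.5)
print(sampler.should_deep_dive())  # False
```

Production-grade samplers are usually adaptive (for example, always capturing slow or failed transactions regardless of the sample rate), but the principle of decoupling cheap always-on data from expensive diagnostics is the same.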


Operations-ready product that enables DevOps collaboration

APM 1.0 products were originally built for developers, so they were not very intuitive for operations use. APM 2.0 products are operations friendly. You would also expect some of them to enable DevOps collaboration through intelligent escalation to development.

  • Most application support personnel do not know which frameworks or application technologies an application uses. The majority of deep-dive tools on the market jump too quickly from a transaction view to the line of code, providing little value to the operations team.


For example, Figure 4 shows the transaction breakdown by the specific technologies the transaction uses. It also provides baselines for the different tiers, along with per-tier system resource usage, to support intelligent decisions. Figure 3 shows an application flow map for a specific transaction and the time spent in each SQL statement or remote web service call, without requiring a drill-down to the line of code.



  • There are many instances where the operations team needs to escalate problems to developers. The tool should allow application support personnel to escalate to Tier 3/development for diagnostics by sending a direct link to the diagnostic instance. However, in many organizations developers do not have access to the production environment; as shown in Figure 5, the solution from BMC allows exporting the diagnostic call tree, with latencies, parameters, and so on, in HTML format.
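The export-for-developers workflow can be sketched as follows: render a diagnostic call tree, with per-call latency, into a self-contained HTML snippet that can be handed to developers who lack production access. The tree shape and field names below are invented for illustration and are not BMC's format.

```python
import html

# Sketch: render a diagnostic call tree (with per-call latency) as HTML
# so it can be shared with developers who have no production access.
# The dict structure ("name", "ms", "children") is invented for the example.

def call_tree_to_html(node, depth=0):
    indent = "&nbsp;" * 4 * depth
    line = "{}{}: {} ms<br>\n".format(
        indent, html.escape(node["name"]), node["ms"])
    return line + "".join(
        call_tree_to_html(child, depth + 1)
        for child in node.get("children", []))

tree = {
    "name": "POST /checkout", "ms": 480,
    "children": [
        {"name": "OrderService.place", "ms": 430,
         "children": [{"name": "SELECT * FROM orders", "ms": 390}]},
    ],
}

report = "<html><body>\n" + call_tree_to_html(tree) + "</body></html>"
print("SELECT * FROM orders" in report)  # True
```

Because the output is plain HTML, it travels over email or a ticketing system without requiring the developer to log in to any monitoring console.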



Adaptive to virtualization and Cloud environment

The new APM 2.0 products are purpose-built and architected for cloud and virtualized environments. 

  • APM 2.0 product components and agents are designed to communicate over a firewall-friendly protocol, and that communication can be encrypted and secured.
  • They support virtualized and dynamic environments without generating a flood of false alerts.
  • They support modern cloud frameworks and Big Data platforms such as Hadoop.
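A common pattern behind "firewall-friendly" agents is that the agent only makes outbound HTTPS calls to a collector on port 443, so no inbound firewall ports need to be opened and TLS provides encryption in transit. The endpoint URL and payload schema below are hypothetical, invented for this sketch.

```python
import json

# Sketch of a firewall-friendly agent design: outbound-only HTTPS reporting.
# The collector URL and JSON schema are hypothetical examples.

COLLECTOR_URL = "https://apm-collector.example.com/v1/metrics"  # hypothetical

def build_report(agent_id, samples):
    """Serialize latency samples into one outbound POST body."""
    return json.dumps({
        "agent": agent_id,
        "samples": [{"txn": name, "ms": ms} for name, ms in samples],
    })

body = build_report("web-42", [("GET /home", 120.5), ("POST /login", 310.0)])
# A real agent would then POST it, e.g.:
#   urllib.request.urlopen(COLLECTOR_URL, data=body.encode())
print(json.loads(body)["agent"])  # web-42
```

Because traffic flows only from agent to collector over a standard web port, this design works across cloud, virtualized, and on-premises segments without special firewall rules.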



An APM 2.0 solution provides the functionality you need to manage your applications, helping you exceed business expectations and increase customer loyalty. These tools also improve time to market, and they give you an understanding of how application performance affects user behavior, and how that behavior impacts the bottom line. You can leverage an APM 2.0 solution like BMC Application Performance Management to improve your application performance and thus meet your business objectives.


More Stories By Debu Panda

Debu Panda is a Director of Product Management at Oracle Corporation. He is lead author of the EJB 3 in Action (Manning Publications) and Middleware Management (Packt). He has more than 20 years of experience in the IT industry and has published numerous articles on enterprise Java technologies and has presented at many conferences. Debu maintains an active blog on enterprise Java at
