What Customers Expect in a New-Generation APM (2.0) Solution

In my previous blog, I discussed the challenges with APM 1.0 solutions.

As an application owner or a member of the application support team, you want to:

  • Exceed service levels and avoid costly, reputation-damaging application failures through improved visibility into the end-user experience
  • Ensure reliable, high-performing applications by detecting problems faster and prioritizing issues based on service levels and impacted users
  • Improve time to market with new applications, features, and technologies, such as virtualization, acceleration, and cloud-based services

 

APM 2.0 products enable you to manage application performance, leading with real-user activity monitoring. The following are some of the top capabilities they provide to help you achieve these business objectives.

Visibility into real users and end-user-driven diagnostics

  • APM 2.0 solutions provide visibility into end-to-end application performance as experienced by real end users, and help application support focus on the critical issues affecting those users.

 

As an example, the dashboard shown in Figure 1 provides real-time visibility into application performance as experienced by users.

 

[Figure 1: real-time dashboard of application performance as experienced by end users]

  • As an application owner, you probably care about which users are impacted, which pages they are navigating, and what kinds of errors they are getting. You want your APM product to improve MTTR by identifying what is causing the latency or failure: the network, the load balancer, an ADN such as Akamai, SSL, or the application tier itself. Figure 2 shows a specific user session, the pages the user navigated, and identifies the application tier as the cause.

[Figure 2: a specific user session, the pages navigated, and the application tier identified as the cause]

  • The “details” link in Figure 2 allows application support personnel to drill down further into which application tier is the culprit for the slow or failed transaction, in the context of that specific user. This lets them trace an end-user request down to the line of code; a minimal sketch of the idea behind such tracing follows.
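
How does an APM tool connect a slow page seen by one user to a specific tier and, ultimately, a line of code? One common building block is a correlation ID that is attached to each request and propagated across tiers so that timing data and logs can be stitched back to the originating user session. The servlet filter below is a minimal, hypothetical sketch of that idea; the X-Correlation-Id header name and the console logging are illustrative placeholders, not the mechanism of any particular APM product.

```java
import java.io.IOException;
import java.util.UUID;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

/**
 * Hypothetical sketch: tag every incoming request with a correlation ID so the
 * timing data from each tier can later be stitched back to the end-user
 * session that triggered it.
 */
public class CorrelationIdFilter implements Filter {

    private static final String HEADER = "X-Correlation-Id"; // illustrative header name

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;

        // Reuse an ID set by an upstream tier (load balancer, web server), or mint a new one.
        String correlationId = request.getHeader(HEADER);
        if (correlationId == null || correlationId.isEmpty()) {
            correlationId = UUID.randomUUID().toString();
        }
        response.setHeader(HEADER, correlationId);

        long start = System.nanoTime();
        try {
            chain.doFilter(req, res);
        } finally {
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            // A real agent would report this to a collector; here we simply log it.
            System.out.printf("correlationId=%s uri=%s elapsedMs=%d%n",
                    correlationId, request.getRequestURI(), elapsedMs);
        }
    }

    @Override
    public void init(FilterConfig filterConfig) { }

    @Override
    public void destroy() { }
}
```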

Ease of use and superior time-to-value

You want a product that is simple for your application support/operations team to use.

  • A modern APM solution does not require manual definition of instrumentation policies (a minimal sketch of this idea appears after this list).
  • It should not require manual changes, such as JavaScript injection, to gain visibility into the end-user experience.
  • APM 2.0 tools provide the ability to drill down from the end-user view into deep-dive diagnostics, and to drill up from deep-dive data to identify the impacted user and the transaction context, without manual correlation or jumping between consoles.
  • The agent install is typically a 5-10 minute process with modern APM deep-dive tools.
  • The APM 2.0 deep-dive solution provides automatic detection of application servers, business transactions, frameworks, and more.
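
To make the "no manual instrumentation policies" point concrete, here is a minimal sketch of automatic timing: instead of configuring which methods to monitor, every call through a component gets timed. Real APM 2.0 agents achieve this transparently with bytecode instrumentation at class-load time; the dynamic proxy and the made-up OrderService interface below are only for illustration.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

/**
 * Hypothetical sketch of instrumentation without per-method policies: wrap a
 * component once and time every call it receives, rather than listing which
 * methods to measure.
 */
public final class AutoTimer {

    @SuppressWarnings("unchecked")
    public static <T> T instrument(T target, Class<T> iface) {
        InvocationHandler handler = (proxy, method, args) -> {
            long start = System.nanoTime();
            try {
                return method.invoke(target, args);
            } finally {
                long elapsedMs = (System.nanoTime() - start) / 1_000_000;
                // A real agent would feed this into baselines and dashboards.
                System.out.printf("%s.%s took %d ms%n",
                        iface.getSimpleName(), method.getName(), elapsedMs);
            }
        };
        return (T) Proxy.newProxyInstance(iface.getClassLoader(), new Class<?>[] {iface}, handler);
    }

    // Minimal demonstration with a made-up service interface.
    interface OrderService { String placeOrder(String item); }

    public static void main(String[] args) {
        OrderService real = item -> "order placed for " + item;
        OrderService timed = instrument(real, OrderService.class);
        System.out.println(timed.placeOrder("book"));
    }
}
```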

 

Figure 3 shows a specific user transaction request with latency broken down by tier. It also shows the SQL statements executed and their latencies.

 

[Figure 3: a specific user transaction with latency broken down by tier, including the SQL statements executed and their latencies]
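
As a rough illustration of where per-SQL latency figures like those in Figure 3 come from, the helper below times a JDBC query. This is a hypothetical sketch: real deep-dive agents wrap the JDBC driver or data source transparently rather than requiring a helper such as executeTimed.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

/**
 * Hypothetical sketch: capture per-SQL latency by timing a JDBC call. The
 * caller is responsible for closing the returned ResultSet and its statement.
 */
public final class TimedQuery {

    public static ResultSet executeTimed(Connection conn, String sql, Object... params)
            throws SQLException {
        long start = System.nanoTime();
        PreparedStatement ps = conn.prepareStatement(sql);
        for (int i = 0; i < params.length; i++) {
            ps.setObject(i + 1, params[i]);   // JDBC parameters are 1-based
        }
        try {
            return ps.executeQuery();
        } finally {
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            // A real agent would attach this to the current transaction's call tree.
            System.out.printf("SQL [%s] took %d ms%n", sql, elapsedMs);
        }
    }
}
```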

Suitable for production deployment

  • The real user monitoring tool should be non-invasive in nature and should not add overhead to application response time (one common way agents bound their overhead is sketched after this list).
  • You should be able to deploy an always-on, deep-dive monitoring and diagnostics solution for your production enterprise and cloud-based applications.
  • It should work in an agile environment without requiring new instrumentation policies to be configured with each application release.
  • It should scale to large production deployments of thousands of application servers.
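
One common way an always-on agent keeps its overhead negligible is to retain deep-dive detail only for a small sample of requests plus any request that breaches a latency threshold. The sketch below illustrates that idea; the 1% sample rate and 2-second threshold are illustrative assumptions, not values from any specific product.

```java
import java.util.concurrent.ThreadLocalRandom;

/**
 * Hypothetical sketch: decide which requests get full deep-dive detail so an
 * always-on agent stays cheap. Slow requests are always kept; the rest are
 * sampled at a low rate.
 */
public final class DeepDiveSampler {

    private final double sampleRate;        // e.g. 0.01 = 1% of all requests
    private final long slowThresholdMs;     // always capture requests slower than this

    public DeepDiveSampler(double sampleRate, long slowThresholdMs) {
        this.sampleRate = sampleRate;
        this.slowThresholdMs = slowThresholdMs;
    }

    /** Decide whether this request's deep-dive detail should be retained. */
    public boolean shouldKeep(long responseTimeMs) {
        if (responseTimeMs >= slowThresholdMs) {
            return true;                               // slow requests are always interesting
        }
        return ThreadLocalRandom.current().nextDouble() < sampleRate;
    }

    public static void main(String[] args) {
        DeepDiveSampler sampler = new DeepDiveSampler(0.01, 2_000);
        System.out.println(sampler.shouldKeep(3_500)); // slow request: always kept
        System.out.println(sampler.shouldKeep(120));   // fast request: kept ~1% of the time
    }
}
```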

 

Operations-ready and enabling DevOps collaboration

APM 1.0 products were originally built for developers, and hence they were not very intuitive for operations use. APM 2.0 products are operations-friendly. You would also expect some of them to enable DevOps collaboration through intelligent escalation to development.

  • Most application support personnel do not know which frameworks or application technologies an application uses. The majority of deep-dive tools in the market jump straight from a transaction view to the line of code, and thus provide little value to the operations team.

 

For example, Figure 4 shows the transaction broken down by the specific technologies the transaction uses. It also provides baselines for the different tiers, along with per-tier system resource usage, so the operations team can make intelligent decisions. Figure 3 shows an application flow map for a specific transaction and the time spent in each SQL statement or remote web service call, without having to drill down to the line of code.

[Figure 4: transaction breakdown by the technologies used, with per-tier baselines and system resource usage]
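
To illustrate what a per-tier baseline might look like under the hood, the sketch below maintains an exponentially weighted moving average of a tier's latency and flags samples that drift far above it. The smoothing factor and the "3x baseline" rule are arbitrary illustrative choices, not any vendor's algorithm.

```java
/**
 * Hypothetical sketch: keep a per-tier latency baseline with an exponentially
 * weighted moving average, so a tier is flagged only when it drifts well above
 * its own normal behavior.
 */
public final class TierBaseline {

    private final double alpha;       // weight given to the newest sample
    private double baselineMs = -1;   // -1 means "no samples yet"

    public TierBaseline(double alpha) {
        this.alpha = alpha;
    }

    /** Feed the latest latency sample and return true if it looks anomalous. */
    public boolean updateAndCheck(double latencyMs) {
        if (baselineMs < 0) {
            baselineMs = latencyMs;    // first sample seeds the baseline
            return false;
        }
        boolean anomalous = latencyMs > 3 * baselineMs;
        baselineMs = alpha * latencyMs + (1 - alpha) * baselineMs;
        return anomalous;
    }

    public double baselineMs() {
        return baselineMs;
    }

    public static void main(String[] args) {
        TierBaseline appTier = new TierBaseline(0.1);
        for (double sample : new double[] {200, 220, 180, 210, 900}) {
            System.out.printf("sample=%.0f ms anomalous=%b baseline=%.0f ms%n",
                    sample, appTier.updateAndCheck(sample), appTier.baselineMs());
        }
    }
}
```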

  • There are many instances where the operations team needs to escalate problems to developers. The tool should allow application support personnel to escalate to Tier 3/development for diagnostics by sending a direct link to the diagnostic instance. However, in many organizations developers do not have access to the production environment; as shown in Figure 5, the solution from BMC allows exporting the diagnostic call tree, with latencies, parameters, and so on, in HTML format.

 

[Figure 5: diagnostic call tree with latencies and parameters, exported in HTML format]
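
As a rough sketch of what an HTML export of a diagnostic call tree could contain, the code below renders a small call tree, with per-node latencies, as nested HTML lists. The structure and the example method names are made up for illustration; this is not BMC's export format.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Hypothetical sketch: a diagnostic call tree with latencies rendered as plain
 * HTML, so it can be handed to developers who have no access to the production
 * console.
 */
public final class CallTreeHtmlExport {

    static final class Node {
        final String method;
        final long selfMs;
        final List<Node> children = new ArrayList<>();

        Node(String method, long selfMs) {
            this.method = method;
            this.selfMs = selfMs;
        }

        Node child(String method, long selfMs) {
            Node n = new Node(method, selfMs);
            children.add(n);
            return n;
        }
    }

    static void render(Node node, StringBuilder html) {
        html.append("<li>").append(node.method)
            .append(" (").append(node.selfMs).append(" ms)");
        if (!node.children.isEmpty()) {
            html.append("<ul>");
            for (Node child : node.children) {
                render(child, html);
            }
            html.append("</ul>");
        }
        html.append("</li>");
    }

    public static void main(String[] args) {
        // Illustrative call tree: controller -> service -> SQL, with self times in ms.
        Node root = new Node("OrderController.checkout", 40);
        root.child("OrderService.placeOrder", 25)
            .child("SELECT * FROM inventory WHERE sku = ?", 310);

        StringBuilder html = new StringBuilder("<html><body><ul>");
        render(root, html);
        html.append("</ul></body></html>");
        System.out.println(html);   // in practice this would be saved to a file and attached to a ticket
    }
}
```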

Adaptive to virtualization and cloud environments

The new APM 2.0 products are purpose-built and architected for cloud and virtualized environments. 

  • The APM 2.0 product components and agents are designed to communicate over a firewall-friendly protocol, and that communication can be encrypted and secured (a minimal sketch of the pattern follows this list).
  • They support virtualized and dynamic environments without generating a flood of false alerts.
  • They support modern cloud frameworks and Big Data platforms such as Hadoop.
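
"Firewall-friendly" in practice usually means the agent makes only outbound HTTPS calls to its collector, so no inbound ports have to be opened and the traffic is encrypted by TLS. The sketch below shows that pattern; the collector URL and the JSON payload are made-up placeholders, not any product's actual endpoint or format.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

/**
 * Hypothetical sketch: an agent ships metrics to its collector with an
 * outbound-only HTTPS POST, which passes through most firewalls and is
 * encrypted by TLS.
 */
public final class MetricsShipper {

    private static final String COLLECTOR_URL = "https://apm-collector.example.com/metrics"; // illustrative

    public static void ship(String jsonPayload) throws Exception {
        HttpClient client = HttpClient.newHttpClient();   // TLS is negotiated automatically for https URLs
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(COLLECTOR_URL))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(jsonPayload))
                .build();
        HttpResponse<Void> response = client.send(request, HttpResponse.BodyHandlers.discarding());
        System.out.println("collector responded with HTTP " + response.statusCode());
    }

    public static void main(String[] args) throws Exception {
        ship("{\"tier\":\"app\",\"avgLatencyMs\":212,\"errorRate\":0.4}");
    }
}
```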

 

Conclusion

An APM 2.0 solution provides the functionality you need to manage your applications, helping you exceed business expectations and increase customer loyalty. These tools also improve time to market, and they help you understand how application performance affects user behavior and how that behavior impacts the bottom line. You can leverage an APM 2.0 solution like BMC Application Performance Management to improve your application performance and meet your business objectives.


More Stories By Debu Panda

Debu Panda is a Director of Product Management at Oracle Corporation. He is the lead author of EJB 3 in Action (Manning Publications) and Middleware Management (Packt). He has more than 20 years of experience in the IT industry, has published numerous articles on enterprise Java technologies, and has presented at many conferences. Debu maintains an active blog on enterprise Java at http://debupanda.blogspot.com.
