Definition - What does Application Performance mean?
Application performance, in the context of cloud computing, is the measurement of the real-world performance and availability of applications. The term is used particularly for cloud applications that run on remote servers and are delivered over a network such as the Internet. Application performance is a good indicator of the level of service a provider offers and is one of the most closely monitored IT metrics.
There are two sets of application performance metrics that are closely monitored. The first, and by far the more important, is the performance actually experienced by the end users of the application, such as the average response time under normal or peak load. These measurements typically cover response time and throughput:
The response time for the application to act on a user's action, such as navigation.
The time it takes to complete a specific unit of work, such as searching or sorting data.
The load on the system, measured as the volume of transactions the application must process, such as requests per second, transactions per second or pages per second.
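The metrics above can be sketched in a few lines of code. The snippet below is a minimal, hypothetical illustration (the `handle_request` function stands in for real application work): it times each simulated request, then reports the average and 95th-percentile response times and the overall throughput in requests per second.

```python
import random
import statistics
import time

def timed(fn, *args):
    """Run fn and return (result, elapsed wall-clock seconds)."""
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start

def handle_request(payload):
    # Stand-in for real application work: sort a batch of records.
    return sorted(payload)

# Simulate a load of 500 requests and collect per-request response times.
latencies = []
wall_start = time.perf_counter()
for _ in range(500):
    payload = [random.random() for _ in range(1000)]
    _, elapsed = timed(handle_request, payload)
    latencies.append(elapsed)
wall_elapsed = time.perf_counter() - wall_start

avg_ms = statistics.mean(latencies) * 1000
p95_ms = statistics.quantiles(latencies, n=20)[-1] * 1000  # 95th percentile
rps = len(latencies) / wall_elapsed  # throughput: requests per second

print(f"avg response time: {avg_ms:.2f} ms")
print(f"p95 response time: {p95_ms:.2f} ms")
print(f"throughput: {rps:.0f} requests/sec")
```

In practice these numbers come from monitoring agents or load-testing tools rather than inline timers, but the quantities being computed are the same.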
The second set involves measuring the computational resources the application consumes to do its work. This is a good indicator of whether there are enough resources to handle a given load, or whether the application consumes more resources than it should; the latter often suggests that the application is not optimized. Resource metrics are especially important for cloud applications, because users should have the same experience with an application delivered over the Internet as with one running locally on fully capable hardware.
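The second set of metrics can be illustrated the same way. The sketch below is a simplified, hypothetical example using only the Python standard library: it measures the CPU time and peak memory a single task consumes, the kind of per-task resource figures that reveal whether an application is using more than it should.

```python
import time
import tracemalloc

def profile_task(fn, *args):
    """Measure the CPU time and peak memory consumed by one task."""
    tracemalloc.start()
    cpu_start = time.process_time()
    result = fn(*args)
    cpu_used = time.process_time() - cpu_start
    _, peak_bytes = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, cpu_used, peak_bytes

def workload():
    # Stand-in task: build and sort a large list.
    data = list(range(200_000, 0, -1))
    return sorted(data)

result, cpu_s, peak = profile_task(workload)
print(f"CPU time: {cpu_s:.3f} s, peak memory: {peak / 1024:.0f} KiB")
```

Production systems gather these figures continuously via the platform's monitoring service rather than per call, but the underlying measurements (CPU time and memory consumed per unit of work) are the same ones compared against the resources provisioned for the application.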