Today's Internet users have high expectations for their online experiences. Users expect the web sites and web-based applications they use to work properly, to be highly responsive, and to load quickly in their browser. Despite this, slowly responding web pages are a common complaint among Internet users.
Users have short attention spans on web sites and a limited tolerance both for how long they will wait for a web page to respond to their actions, such as clicking a link or a button, and for how long a page takes to load in their browser. When a web page or web application fails to respond as expected, users quickly become frustrated and will abandon the site in search of a faster-performing alternative.
This abandonment costs businesses significant revenue due to lower online sales and conversion rates, damaged brand image, higher support costs, and lost customers. Further compounding the problem is that web sites continue to become more bandwidth-intensive, as more technologies and richer media are added to web pages to make sites more compelling and interactive.
Businesses often set goals for web page response time and strive to develop their web pages to load within a specific amount of time. They have adopted various techniques to try to understand what page load times users are actually experiencing, and to identify areas that need improvement.
In one such attempt, automated “robots” in various locations on the Internet are used to simulate user activity. These robots make periodic web requests to the web site server and monitor how long a page takes to be retrieved. While this approach is effective in determining whether a web site is up and operational, it does not provide an accurate view of the page load times that the actual web site users are experiencing on the site. This is because such robots do not account for the variables that affect responsiveness and page load times for the site's actual users, such as the performance of the computer system each user's browser runs on and the quality and throughput of each user's Internet connection, both of which can vary widely from user to user.
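The robot approach described above can be sketched as a simple probe that periodically fetches a URL and records the retrieval time. This is a minimal illustration, not any particular vendor's implementation; the function names, the sample count, and the polling interval are all illustrative assumptions.

```python
# Minimal sketch of a synthetic-monitoring "robot": it periodically fetches a
# URL and records how long each retrieval takes. Function names, sample count,
# and polling interval are illustrative assumptions, not from the original text.
import time
import urllib.request


def measure_fetch(url: str, timeout: float = 10.0) -> float:
    """Fetch `url` once and return the elapsed wall-clock time in seconds."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # drain the body so transfer time is included
    return time.monotonic() - start


def run_robot(url: str, samples: int = 3, interval: float = 1.0) -> list:
    """Take `samples` timing measurements, pausing `interval` seconds between them."""
    timings = []
    for _ in range(samples):
        timings.append(measure_fetch(url))
        time.sleep(interval)
    return timings
```

Note that such a probe measures only its own vantage point and connection, which is precisely why, as the text observes, it cannot reflect what real users on slower machines or connections experience.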
In yet other attempts, a basic timing function is used in isolation on a page to track the amount of time that elapses from the point at which the page starts loading to the point at which it finishes.
Such attempts at monitoring page load times do not provide an adequate view of network responsiveness. They fail to measure the total duration that elapses between the moment a user takes an action on one page and the moment the next page begins loading. Without this measurement, components of network and overall system performance that can significantly affect a site's responsiveness and page load times go unaccounted for. One therefore cannot accurately determine how long a page actually took to load from the user's viewpoint, which may be the viewpoint that matters most.
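The gap described above can be modeled outside a browser. In this sketch, a dictionary stands in for a cookie shared between two pages: page A records the click timestamp, and page B computes both the naive load-start-to-finish time and the full click-to-finish time. The store, function names, and simulated delays are assumptions made for illustration only.

```python
# Sketch of the measurement the text says is missing: total perceived latency
# runs from the user's click on page A to the finish of page B's load, not
# merely from page B's load start to its finish. `session_store` stands in for
# a cookie shared across pages; the sleep() delays simulate network and load time.
import time

session_store = {}  # illustrative stand-in for a cross-page cookie


def on_user_click() -> None:
    """Page A: record the moment the user clicks the link."""
    session_store["click_ts"] = time.monotonic()


def on_page_load_finished(load_start: float) -> dict:
    """Page B: report both the naive and the full measurement."""
    load_end = time.monotonic()
    return {
        "naive_load_time": load_end - load_start,  # misses all pre-load delay
        "full_perceived_time": load_end - session_store["click_ts"],
    }


# Simulated timeline: click on page A, network/server delay, then page B loads.
on_user_click()
time.sleep(0.05)   # network and server time before the next page starts loading
load_start = time.monotonic()
time.sleep(0.02)   # the page load itself
result = on_page_load_finished(load_start)
```

The difference between the two values is exactly the network and server time that an isolated in-page timer cannot see.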
In still other attempts, the load time data is buffered on the client and not transmitted and recorded until a later time. Should the browser crash or a network failure occur, the data may never be recorded and is permanently lost. Such attempts may also require that a particular web page on a site be retrieved to initiate the tracking session before other pages on the site can be tracked, thereby possibly limiting the number of pages from the user's session that can be monitored.
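The failure mode of client-side buffering can be contrasted with immediate transmission in a few lines. This is a hedged sketch, not any specific product's design: the `send_beacon` transport and the class names are assumptions standing in for whatever mechanism ships measurements to a collection server.

```python
# Sketch contrasting buffered vs. immediate recording of timing data.
# `send_beacon` is an illustrative stand-in for an HTTP request to a
# collection server; `received` stands in for that server's log.
received = []  # stands in for the collection server's log


def send_beacon(measurement: dict) -> None:
    """Ship one measurement right away; in practice, an HTTP request."""
    received.append(measurement)


class BufferedRecorder:
    """Buffers measurements locally; a crash before flush() loses them all."""

    def __init__(self) -> None:
        self.buffer = []

    def record(self, measurement: dict) -> None:
        self.buffer.append(measurement)

    def flush(self) -> None:
        for m in self.buffer:
            send_beacon(m)
        self.buffer.clear()


class ImmediateRecorder:
    """Transmits each measurement as it is taken; a crash loses at most one."""

    def record(self, measurement: dict) -> None:
        send_beacon(measurement)
```

If the browser crashes after several `record()` calls, the buffered recorder has transmitted nothing, while the immediate recorder has already delivered every completed sample.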
Data provided by such prior attempts may also be misinterpreted because it generally does not take into account the conditions under which the page was loaded, such as whether the page was served from the browser's cache, which can greatly affect page load times and yield misleading measurements as a result.
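One way to avoid conflating cache hits with full fetches is to tag each timing sample with the conditions under which it was taken and filter on that tag when aggregating. In this sketch the `from_cache` flag is a placeholder assumption; a real monitor would derive it from the browser or the response.

```python
# Sketch of annotating each timing sample with its load conditions so that
# cache hits are not averaged together with full network fetches. The
# `from_cache` flag is an illustrative placeholder, not a real detection API.
def make_sample(url: str, load_time_ms: float, from_cache: bool) -> dict:
    """Record one timing sample together with how the page was loaded."""
    return {"url": url, "load_time_ms": load_time_ms, "from_cache": from_cache}


def mean_load_time(samples: list, include_cached: bool = False) -> float:
    """Average load time, excluding cache hits unless asked to include them."""
    chosen = [s for s in samples if include_cached or not s["from_cache"]]
    return sum(s["load_time_ms"] for s in chosen) / len(chosen)
```

A cached hit of a few milliseconds mixed into an average of full fetches would otherwise make a slow page appear fast, which is the misleading measurement the text describes.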