Opportunities to provide services over a network continue to grow. To maintain service quality, service providers monitor the quality of the network.
Network quality monitoring is roughly divided into an analysis function and a statistical function. Each time a packet arrives, the analysis function analyzes the packet's header information to obtain, in near real time, various kinds of statistical information for each connection (the number of packets transmitted and received, the number of bytes transmitted and received, the number of lost packets, a delay time, etc.) and stores it in a statistical table. At every fixed statistical cycle, the statistical function compiles the information stored in the statistical table, performs statistical processing on the compiled result, and writes the result to a statistical information database. The statistical cycle is typically as short as one minute; in some systems it is as long as one day.
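The division of labor described above can be sketched as follows. This is an illustrative sketch only; the class and field names (`Monitor`, `ConnStats`, `on_packet`, `compile_cycle`) are assumptions, not taken from any particular system.

```python
from dataclasses import dataclass

@dataclass
class ConnStats:
    """Per-connection counters held in the statistical table."""
    packets: int = 0
    bytes: int = 0
    lost: int = 0
    delay_sum_ms: float = 0.0

class Monitor:
    def __init__(self):
        # Statistical table: connection key -> ConnStats
        self.table = {}

    def on_packet(self, conn_key, size, lost=False, delay_ms=0.0):
        """Analysis function: update per-connection counters on each arrival."""
        s = self.table.setdefault(conn_key, ConnStats())
        s.packets += 1
        s.bytes += size
        s.lost += 1 if lost else 0
        s.delay_sum_ms += delay_ms

    def compile_cycle(self):
        """Statistical function: at the end of a cycle, compile the table
        (here, compute averages) and reset it; in a real system the result
        would be written to the statistics database."""
        result = {}
        for k, s in self.table.items():
            result[k] = {
                "packets": s.packets,
                "bytes": s.bytes,
                "lost": s.lost,
                "avg_delay_ms": s.delay_sum_ms / s.packets if s.packets else 0.0,
            }
        self.table.clear()
        return result
```

The per-packet path touches only the in-memory table; the heavier compilation and database write happen once per statistical cycle.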
To maintain service quality, it is important to detect instantaneous degradations in network quality. For example, the amount of traffic in a network sometimes increases abruptly. Such an abrupt increase in the amount of traffic is called bursty traffic.
To detect bursty traffic, statistical information needs to be compiled, and statistics conducted, at cycles on the order of milliseconds. However, shortening the statistical cycle for packet analysis increases both the computational cost and the number of packet samples used in the analysis, and it therefore becomes difficult to record all of the statistical information obtained at every statistical cycle. For example, storing statistical information obtained every 10 milliseconds produces 6000 times more data than storing statistical information obtained every minute. Generating and storing such a large amount of statistical information would impose a high processing load and would be unrealistic.
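A back-of-the-envelope check of the factor stated above, with the record-per-day count added for scale (variable names are illustrative):

```python
# One record is stored per statistical cycle; compare a 10 ms cycle
# with a 1 minute cycle.
minute_ms = 60 * 1000
cycle_ms = 10

# Ratio of records produced: one per 10 ms versus one per minute.
factor = minute_ms // cycle_ms            # 6000

# Records produced per day at the 10 ms cycle (per statistical table entry).
day_ms = 24 * 60 * 60 * 1000
records_per_day_10ms = day_ms // cycle_ms  # 8_640_000
```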
For example, one proposed technique measures the amount of traffic flowing in a link at a very short time interval, and stores a measurement result only when it exceeds a preset threshold, or stores only a predetermined number of the highest-order data pieces. Another proposed technique provides a first memory for storing first statistical information and a second memory for storing second statistical information, obtains the first and second statistical information separately at different predetermined time cycles, and stores the first statistical information in the first memory and the second statistical information in the second memory.
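The first technique's two retention policies can be sketched as below. This is a minimal illustration under assumed names; the actual publications may differ in detail.

```python
import heapq

def filter_by_threshold(samples, threshold):
    """Keep only (time, volume) measurements whose volume exceeds the
    preset threshold; everything else is discarded rather than stored."""
    return [s for s in samples if s[1] > threshold]

def top_n(samples, n):
    """Alternatively, keep only the n highest-order measurements by volume."""
    return heapq.nlargest(n, samples, key=lambda s: s[1])
```

Either policy bounds the stored volume while still capturing the bursts, at the price of discarding the ordinary-traffic samples.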
See, for example, Japanese Laid-open Patent Publications Nos. 2002-118556 and 2012-199707.
However, with the technique of storing, out of the statistical information obtained at a very short time interval, only the portion corresponding to time periods satisfying certain conditions, it is not possible to conduct statistics at a relatively long time interval as is conventionally done. Even if the occurrence of bursty traffic is detected from statistical information obtained at a very short time interval, it is difficult to appropriately maintain network quality without also managing quality through network monitoring at a relatively long time interval.
To deal with the above, one idea is to provide both a storage function for statistical information at a very short time interval and a storage function for statistical information at a relatively long time interval. To this end, a data table for storing statistical information at a very short time interval (short-term statistical table) and a data table for storing statistical information at a relatively long time interval (long-term statistical table) are prepared. Then, each time a packet arrives, both the short-term statistical table and the long-term statistical table are accessed. Given the characteristics of computers, however, accesses to separate, discontiguous memory areas increase the processing cost and therefore degrade the processing performance of the analysis function. For example, an access to main memory takes 100 to 300 times longer than an ordinary basic arithmetic operation. Providing both the function of storing statistical information at a very short time interval and the function of storing statistical information at a relatively long time interval therefore imposes an excessive processing load on the computer.
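The dual-table idea, and why it doubles the memory traffic per packet, can be sketched as follows. The cycle lengths and table layout here are illustrative assumptions.

```python
from collections import defaultdict

SHORT_CYCLE_MS = 10      # short-term statistical cycle (assumed)
LONG_CYCLE_MS = 60_000   # long-term statistical cycle (assumed)

# Two separate data tables: (time bucket, connection key) -> byte count.
short_table = defaultdict(int)
long_table = defaultdict(int)

def on_packet(t_ms, conn, size):
    """Per-packet update: two tables, hence two accesses to separate,
    likely discontiguous memory areas for every arriving packet."""
    short_table[(t_ms // SHORT_CYCLE_MS, conn)] += size  # first table access
    long_table[(t_ms // LONG_CYCLE_MS, conn)] += size    # second table access
```

Since the per-packet path is the hot path of the analysis function, the extra access per packet is precisely where the processing-load problem described above arises.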