In recent years, there has been a growing need for systems that can process a large number of agents in a short time. One example is a system in which an agent prepared for each user supplies that user with information matched to his or her tastes at railroad stations or the like. In this service, when a previously registered user passes through an automatic ticket gate of a station, information concerning the vicinity of the station is retrieved according to the taste of the user and transmitted to a mobile terminal or the like of the user. Users who pass through automatic ticket gates at any station of a railroad line can become targets of this service. As the number of users passing through automatic ticket gates at the same time increases, it becomes necessary to process the messages of a large number of agents in a short time.
According to Japanese Patent Laid-open No. 2004-192047, in order to process the messages of a large number of agents efficiently, an agent having a high-priority message is processed preferentially over other agents. Further, the agents prepared for individual customers are differentiated and processed according to priority levels matched to those customers. Moreover, since the agents are managed in a database, they are cached in system memory to reduce the access load on the database. Processing moves to the next agent only after all messages stored in an agent's message queue have been processed, or after an upper limit on the number of consecutively processed messages has been reached; access to the database is thereby minimized. That is, messages are processed according to the priority levels of agents and messages in such a manner that agents existing in the cache memory are given high priority and the messages of one agent are processed consecutively, thereby reducing the number of database accesses and accelerating message processing.
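As a minimal sketch, the consecutive-processing rule of this prior-art scheme can be expressed as follows. The queue layout, the function name, and the value of the upper limit are illustrative assumptions, not details taken from the publication:

```python
# Hypothetical sketch of the prior-art rule: once an agent has been
# fetched (from the cache or the database), its queued messages are
# drained consecutively, up to an upper limit, so that the agent is
# loaded at most once per scheduling turn.

MAX_CONSECUTIVE = 100  # assumed upper limit; the publication leaves the value open


def process_agent(agent, handle):
    """Drain up to MAX_CONSECUTIVE messages from one agent's queue,
    then yield the turn back to the scheduler."""
    processed = 0
    while agent["queue"] and processed < MAX_CONSECUTIVE:
        handle(agent["queue"].pop(0))
        processed += 1
    return processed
```

Either the queue empties or the cap is reached; in both cases control returns to the scheduler, so the database is touched at most once per turn for each agent.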
In the technology of Japanese Patent Laid-open No. 2004-192047 described above, an agent having a high-priority message is processed preferentially. In the actual processing of messages, however, in order to reduce the number of database accesses, not only high-priority messages but also low-priority messages are processed consecutively for one agent. Accordingly, the high-priority messages of the next agent are processed only after the low-priority messages of the current agent have been processed. If there are a large number of agents each having both high-priority and low-priority messages, both kinds of messages are processed for each agent in turn. As a result, the processing of a high-priority message of an agent processed later is considerably delayed because the processing of low-priority messages intervenes.
For example, such delay problems have been pointed out in station vicinity information delivery services. A station vicinity information delivery service is a service in which advertisement or traffic information concerning the vicinity of a station, matching an information category previously registered by a user, is transmitted from an agent to a mobile phone or the like of the user when the user's commuter pass is inserted into an automatic ticket gate. When such a service is provided, there are restrictions on time, place, and taste information: for example, advertisement or traffic information must reach its destination within the short time in which the user has not yet moved far from the station, and the contents of the information must correspond to the taste information or the like of each user. The service must be provided to many users while these conditions are satisfied. An agent performing these processes exists for each user and, in addition to information delivery, performs a process for registering new advertisement information to be delivered to the user. Further, when advertisements are registered, they need to be filtered according to subscription information concerning each user. However, a heavy process in which the database is searched and the results are customized for each user every time an automatic ticket gate is passed through must be avoided; accordingly, each user's agent holds in advance the advertisement information and the like to be delivered. In this service, high speed is required for the information delivery process performed when a user passes through an automatic ticket gate, but is less required for a process such as registering advertisement information for each user.
Moreover, similar delay problems can also occur in a service in which position information concerning a user is obtained using a GPS or the like built into the user's mobile phone, and information precisely matched to the time, place, and tastes of the user is delivered to the mobile phone in response to a request from the user. That is, in a crowded shopping mall or at a tourist spot, in addition to processes for registering and delivering shop advertisement information, there can be processes for registering and delivering seat availability information concerning a restaurant or a cafe in response to a request from the user. In this case, the process for delivering information to the user must be completed within the short time in which the user has not moved far from the place where the information was requested, and thus requires high speed. Among registration processes, on the other hand, relatively high speed is required for registering seat availability information concerning a restaurant or a cafe, whereas high speed is not required for registering advertisement information concerning a sale starting in the next month.
However, when the technology of the aforementioned Japanese Patent Laid-open No. 2004-192047 is adopted, a long delay occurs when advertisement information matching a wide range of user categories is registered. The description below focuses on a station vicinity information delivery service; however, this is merely one example and is not intended to limit the scope of application of the present invention to such a service.
That is, in a station vicinity information delivery service, after an information delivery process is performed when a user passes through an automatic ticket gate, the user's agent subsequently performs an advertisement registration process. Considering that a plurality of automatic ticket gates are operating at every station of a railroad line, many users can be expected to pass through automatic ticket gates at about the same time. In some cases, information delivery by the agent of a user processed later may be significantly delayed by the advertisement registration processes performed by the agents of other users.
Supposing that an advertisement is delivered during the evening rush hour in this station vicinity information delivery service, a simple calculation is performed for the following specific case:
- The number of automatic ticket gates is assumed to be 500, on the assumption that the average number of gates per station is 10 and the number of stations is 50; users of this service are assumed to be 10% of all passengers.
- The cache size for agents is assumed to be 50% of the total number of agents.
- All automatic ticket gates are assumed to be fully occupied because of the busy period.
- It is assumed that a user passes through an automatic ticket gate in one second and that the next user starts entering immediately.
- It is assumed that one advertisement is registered for each of 25 stations, half of all stations, and that half of all users are targets to whom the registered advertisements are to be delivered.
- As for agent processing times when the system is fully loaded and a plurality of agents are processed in parallel, the CPU time of an agent handling passage through an automatic ticket gate is assumed to be 18 ms when the agent exists in the cache and 21 ms when it does not, and the CPU time of an agent handling advertisement information registration is assumed to be 20 ms when the agent exists in the cache.

There are the following four patterns of agent processing:
A) an agent which exists in the cache and is a target of an advertisement registration process
B) an agent which exists in the cache and is not a target of an advertisement registration process
C) an agent which does not exist in the cache and is a target of an advertisement registration process
D) an agent which does not exist in the cache and is not a target of an advertisement registration process
Since the cache size is 50% and the agents targeted by advertisement registration processes are half of all agents, each of the four patterns accounts for 12.5 of the agents processed in one second (50 agents, because 500 persons pass through automatic ticket gates per second and 10% of them are members). Here, an automatic ticket gate passage process should be processed preferentially over an advertisement registration process, so the priority level of a message indicating passage through an automatic ticket gate is set one level higher than that of a message indicating an advertisement registration request. The agent scheduler selects an agent existing in the cache, executes the automatic ticket gate passage process for it, and subsequently executes the advertisement registration process for the same agent. When no agents awaiting messages indicating passage through automatic ticket gates remain in the cache, an agent not existing in the cache is read into the cache, its automatic ticket gate passage process is executed, and subsequently its advertisement registration process is executed.
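The selection order described above can be sketched as follows. The data layout and names are hypothetical, and lower numbers denote higher priority:

```python
def schedule(agents):
    """Return the order in which messages are processed: cached agents
    first, and all of one agent's messages back to back, by priority."""
    order = []
    # cached agents are selected before agents read in from the database
    for agent in sorted(agents, key=lambda a: not a["in_cache"]):
        # one agent's messages run consecutively, highest priority first
        for priority, name in sorted(agent["messages"]):
            order.append((agent["id"], name))
    return order


agents = [
    {"id": "A", "in_cache": True,
     "messages": [(2, "register_ad"), (1, "gate_pass")]},
    {"id": "B", "in_cache": False,
     "messages": [(1, "gate_pass")]},
]
# Agent B's high-priority gate_pass runs only after agent A's
# low-priority register_ad -- the delay discussed in the text.
print(schedule(agents))
# -> [('A', 'gate_pass'), ('A', 'register_ad'), ('B', 'gate_pass')]
```

Even though gate-passage messages outrank registration messages, the consecutive-processing rule lets a low-priority message of one agent run ahead of a high-priority message of the next agent.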
The time required to complete the processing of all agents existing in the cache is 12.5×18 ms+12.5×(18 ms+20 ms)=700 ms, and the time required to complete the processing of all agents not existing in the cache is 12.5×21 ms+12.5×(21 ms+20 ms)=775 ms. The sum of these is 1475 ms, which exceeds one second by 475 ms. Once one second has elapsed, the next users enter the automatic ticket gates, so new messages indicating passage through automatic ticket gates reach the system one after another. The agent scheduler then again processes the agents existing in the cache, and the processing of the agents not processed in the previous second (those not existing in the cache) is postponed. Since users pass through automatic ticket gates one after another, unprocessed agents accumulate. This means that the advertisement does not reach some users. As described above, a process requiring quick response cannot be completed in an appropriate time in a situation in which messages at various priority levels are mixed.
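The arithmetic above can be checked with a short calculation. All figures are the assumptions listed earlier, not measured values:

```python
# 500 gates x 10% members = 50 member agents per second,
# split into 4 equally sized patterns (A-D) of 12.5 agents each.
per_pattern = 500 * 0.10 / 4

T_GATE_CACHED = 18    # ms, gate-passage process, agent in cache
T_GATE_UNCACHED = 21  # ms, gate-passage process, agent read from database
T_REGISTER = 20       # ms, advertisement registration, agent in cache

# Patterns A/B (in cache): half are registration targets, half are not.
cached = per_pattern * T_GATE_CACHED + per_pattern * (T_GATE_CACHED + T_REGISTER)
# Patterns C/D (not in cache): likewise.
uncached = per_pattern * T_GATE_UNCACHED + per_pattern * (T_GATE_UNCACHED + T_REGISTER)

print(cached)             # 700.0 ms for agents already in the cache
print(uncached)           # 775.0 ms for agents read into the cache
print(cached + uncached)  # 1475.0 ms total -- 475 ms over the one-second budget
```

Because one second of arrivals takes 1475 ms of CPU time to absorb, the backlog grows by roughly 475 ms of work every second, which is why the uncached agents fall ever further behind.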