1. Field of the Invention
The present invention relates to technology for supporting the recording of a person's actions.
2. Description of the Related Art
Systems for improving operations are conventionally well known. For example, when the current operation procedure and work times are input, such a system analyzes the current situation and outputs an improvement proposal. This analysis requires detecting the time taken for each task constituting the operation. As a result of the analysis, points of waste are identified and an improvement proposal is output.
The system of Patent Document 1 is an example of a conventional operation analysis support system. When the current operation procedure and related data are input, the system quantifies the operation, compares it with predetermined operation feature patterns to generate a model, and displays the features of the operation, the model, and an improvement proposal. The input data takes the form of a table for user comprehensibility. However, to prepare that data, the user must grasp and input the operation contents, process targets, etc. through interviews with the operators and the like.
The conventional system is suitable for operations whose procedure is well defined. For example, it suits cases such as a factory production line, where the procedure of an operation is definite and an observer can visually confirm what an operator is doing, so that the observer can easily measure the time for each operation with a stopwatch or the like. However, since correctly grasping and inputting the procedure of an ill-defined operation through interviews and the like is difficult, such an operation cannot be expected to be analyzed correctly.
On the other hand, in operations performed mainly as desk work, the procedure is not definite. Additionally, when a plurality of operations are performed concurrently, the actions themselves can hardly be measured externally. For example, system engineers, researchers, and others engaged in knowledge work are constantly thinking about their work, but what they are doing can hardly be observed visually. At present, for desk work performed on a computer, the operation contents and work times can be automatically grasped and analyzed to some extent by monitoring the computer operations. For work that does not involve computer operations, however, there is no system capable of grasping such offline actions, even over comparatively long periods such as 30 minutes or an hour. Moreover, keeping records by having an operator manually enter descriptions of operation contents and work times, or select from predefined options, merely because the offline actions must be grasped for operation improvement, is undesirable, since it requires considerable recording time and degrades the efficiency of the entire operation. Therefore, a system is desired that automatically or semi-automatically records the operation contents and procedures of an operator mainly engaged in desk work, regardless of whether or not the operation involves a computer.
Operations that do not involve a computer can include operations involving the movement of a person, such as going out to meet a client. Patent Documents 2 through 5 disclose systems that detect the movement of a person.
The system of Patent Document 2 performs processing depending on whether or not a user is present at his or her desk (for example, accepting a message when the user is absent). A user can explicitly inform the computer on his or her desk of his or her presence or absence, and the system can also estimate the presence/absence state. The estimation is based on the user's location detected by a camera or the like, schedule information, presence-state history, etc.
The system of Patent Document 3 estimates the action and position of a user from a personal schedule. When the user's terminal has a GPS (Global Positioning System) function, the estimated position is corrected according to the GPS information, and a Web screen displaying a map, store information, and the like related to that position in a user-selectable manner is transmitted to the user terminal.
The system of Patent Document 4 analyzes the movement of a user according to position and time information reported from a mobile terminal. In this system, the means of transportation used for travel is estimated from the moving speed and the travel locus.
Patent Document 5 discloses a terminal that records predetermined operation contents when a user presses a predetermined key of the terminal, thereby managing operations. Movement such as walking is a large source of loss in work. However, because such movement is repeated frequently, entering data into the terminal at every movement is an undesirably laborious process. Therefore, the terminal further includes a vibration sensor and automatically detects and records the walking and other movements of the person carrying it.
However, the above-mentioned systems have problems: they concentrate on the position and movement of a person, do not associate these with the person's actual actions, and cannot handle ill-defined operations. Therefore, they are not suitable for automatically recording, or for supporting a user in semi-automatically recording, in a unified manner, the operation contents and work times of work that includes both operations performed with a computer and operations performed without one, and that can hardly be expressed in a fixed form.
        [Patent Document 1] Japanese Patent Application Laid-open No. 2002-352064
        [Patent Document 2] Japanese Patent Application Laid-open No. H10-268959
        [Patent Document 3] Japanese Patent Application Laid-open No. 2002-189656
        [Patent Document 4] Japanese Patent Application Laid-open No. H10-111877
        [Patent Document 5] Japanese Patent Application Laid-open No. H11-143935