1. Field of the Invention
The present invention relates to an apparatus and a method for creating artificial feelings, and more particularly to an apparatus and a method for creating artificial feelings which make it possible for a machine apparatus such as a robot, etc. to actually express feelings like the combined feelings of a human being, in such a way that the current feelings of the robot are created in the form of groups consisting of feeling values drawn from different feeling value groups, each having a plurality of basic feelings.
2. Description of Related Art
The feelings of a machine apparatus such as a robot, etc. are generally limited to feelings created at a specific position in a feeling space with a certain number of feelings, which is set in advance in accordance with an input received via a sensor.
FIG. 1 is a schematic view illustrating a conventional robot's feeling expression method.
The current feeling value of a robot must be calculated in order for the robot to express its feelings. An emotion, in other words a feeling, rarely occurs in a single pure form such as happiness or sadness. Even when a human being currently feels happiness, parts of other feelings such as surprise and anger inevitably appear, reflected in a combined form. In other words, the expression of a feeling results from the reflection of highly combined and detailed feelings. In order to implement an actual feeling expression in a robot, the feeling values applied to the robot may be expressed in the form of a vector which reflects different detailed feelings such as happiness, sadness, surprise, anger, etc.
As shown in FIG. 1, in order to express the feelings of a robot, each feeling and the feeling expression corresponding to it are mapped onto a certain position in a fixed 2-dimensional or 3-dimensional space. The feeling values can then be expressed and calculated as vector values corresponding to positions in that space.
In other words, feelings are mapped onto multiple points in a vector space, and a feeling expression is mapped 1:1 to each feeling. When a specific feeling vector is given, the closest one among the feelings mapped in the vector space is selected, and the feeling expression mapped 1:1 to the selected feeling is finally performed.
Since there is a limit to manually mapping feelings and their corresponding feeling expressions onto a large number of coordinates in the vector space, the conventional method of FIG. 1 selects a small number of coordinates, maps a feeling and a corresponding feeling expression behavior to each coordinate, and then analyzes the feeling value of the robot and selects the feeling of the closest coordinate, thus performing a feeling expression.
For example, a feeling value 1 {happiness 1, sadness 0, surprise 0, anger 0} is set to be expressed at coordinate 1 in a 4-dimensional vector space. When a feeling value 2 {happiness ¾, sadness ¼, surprise 0, anger 0} and a feeling value 3 {happiness ¾, sadness 0, surprise ¼, anger 0} are closer to coordinate 1 than to any coordinate which expresses another feeling, the feeling values 1, 2 and 3 all trigger the feeling expression set at coordinate 1.
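The nearest-coordinate selection described above can be sketched as follows. This is a minimal illustration only, assuming Euclidean distance and one mapped coordinate per basic feeling; the expression names and the exact distance metric are hypothetical and do not appear in the conventional method itself:

```python
import math

# Hypothetical mapped coordinates: each maps a 4-D feeling vector
# (happiness, sadness, surprise, anger) to a fixed expression behavior.
mapped = {
    "express_happiness": (1.0, 0.0, 0.0, 0.0),  # "coordinate 1" in the text
    "express_sadness":   (0.0, 1.0, 0.0, 0.0),
    "express_surprise":  (0.0, 0.0, 1.0, 0.0),
    "express_anger":     (0.0, 0.0, 0.0, 1.0),
}

def select_expression(feeling):
    """Return the expression whose mapped coordinate is closest (Euclidean)."""
    return min(mapped, key=lambda name: math.dist(mapped[name], feeling))

# Feeling values 1, 2 and 3 from the example all lie closest to coordinate 1,
# so all three trigger the same expression behavior despite differing internally.
v1 = (1.0, 0.0, 0.0, 0.0)
v2 = (0.75, 0.25, 0.0, 0.0)
v3 = (0.75, 0.0, 0.25, 0.0)
for v in (v1, v2, v3):
    print(select_expression(v))  # prints "express_happiness" three times
```

The sketch makes the limitation concrete: distinct internal feeling vectors collapse onto a single mapped coordinate, so the expression organ always produces the same output for all of them.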
In the above mentioned way, the conventional method is configured such that even though the internally created feeling values actually differ from each other, the most similar among the mapped coordinates, here coordinate 1, is selected for all of them. Since the feeling expression behavior is selected based on the feeling values of the same coordinate, the expressions appearing through the expression organ are the same.
Referring to FIG. 2, this will be described in more detail. FIG. 2 is a schematic view for explaining a procedure for creating the feelings of a robot from a feeling state input value in the conventional art. In FIG. 2, a 1-dimensional feeling coordinate system is used for a simplified explanation.
It is assumed that a basic feeling of happiness is set at coordinate 1 of the feeling coordinate system, and a basic feeling of sadness is set at coordinate −1. When an input value is 0.3, the basic feeling closest to 0.3 is extracted. As shown in FIG. 2, since 0.3 is closer to 1 than to −1, the basic feeling of happiness is extracted. Since the basic feeling of happiness is at coordinate 1 of the feeling coordinate system, the feeling of the robot finally becomes coordinate 1.
According to the above described feeling creation method, even when the input value is 0.5, the final feeling of the robot becomes coordinate 1, just as when the input value is 0.3.
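The 1-dimensional procedure of FIG. 2 can likewise be sketched as follows; a minimal illustration using the coordinates assumed above (happiness at +1, sadness at −1), with hypothetical function and variable names:

```python
# Basic feelings on a 1-D feeling coordinate system, as assumed for FIG. 2.
basic_feelings = {"happiness": 1.0, "sadness": -1.0}

def create_feeling(input_value):
    """Return the coordinate of the basic feeling nearest the input value."""
    nearest = min(basic_feelings,
                  key=lambda name: abs(basic_feelings[name] - input_value))
    return basic_feelings[nearest]

# Different feeling state input values collapse to the same final coordinate:
print(create_feeling(0.3))  # 1.0
print(create_feeling(0.5))  # 1.0
```

As the sketch shows, any input closer to +1 than to −1 yields the identical final feeling, which is why different input values such as 0.3 and 0.5 produce the same expression.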
As for the feeling expression apparatus which expresses feelings with eyes, a mouth, a gesture, etc. by receiving the feelings of the robot created by the above described feeling creation method, even though the feeling state input values differ, such as 0.3 and 0.5, the same coordinate 1 is expressed as the feeling of the robot. Thus, in most conventional robots, the same feelings are expressed even though the input values differ.
In terms of the feelings of a human being, happiness can be combined with other feelings such as sadness, surprise, etc. The degree of happiness therefore appears different depending on how much of another feeling such as sadness or surprise is reflected in the happiness, and there may also be a difference in the expression of feelings depending on the degree of happiness.
Since what a robot's feeling expression most needs is a feeling expression which is closest to that of a human being, a combined feeling resembling the combined feelings of a human being should be created in order for the robot to express feelings closest to those of a human being.
Korean patent publication number 2007-0061054 discloses a robot and a method directed to creating a plurality of feelings such as a first feeling and a second feeling. However, the first feeling disclosed in the above mentioned publication is a feeling (a surprise feeling, a fear feeling, etc.) created in the form of a robot's feeling, without external evaluations, based on the information received from a sensor part, and the second feeling is a feeling (happiness, anger, rejection, a neutral state, sadness, partial fear, etc.) created by means of an evaluation based on a list standard of a database in addition to the information from the sensor, an influence evaluation of time, etc. In other words, the above mentioned patent publication does not disclose combined feelings which are closest to the combined feelings of a human being, so it has the same limits as the technologies of FIGS. 1 and 2.