US 12,169,591 B2
Method and apparatus for dynamic human-computer interaction
Juan Perez-Rua, Staines (GB); Tao Xiang, Staines (GB); and Maja Pantic, Staines (GB)
Assigned to Samsung Electronics Co., Ltd., Suwon-si (KR)
Appl. No. 17/630,303
Filed by Samsung Electronics Co., Ltd., Suwon-si (KR)
PCT Filed Dec. 4, 2020, PCT No. PCT/KR2020/017662
§ 371(c)(1), (2) Date Jan. 26, 2022,
PCT Pub. No. WO2021/125647, PCT Pub. Date Jun. 24, 2021.
Claims priority of application No. 1918852 (GB), filed on Dec. 19, 2019.
Prior Publication US 2022/0269335 A1, Aug. 25, 2022
Int. Cl. G06F 3/01 (2006.01); G06V 10/70 (2022.01); G06V 40/16 (2022.01); G06V 40/20 (2022.01); G10L 25/63 (2013.01)
CPC G06F 3/011 (2013.01) [G06V 10/70 (2022.01); G06V 40/174 (2022.01); G06V 40/20 (2022.01); G10L 25/63 (2013.01); G06F 2203/011 (2013.01)] 16 Claims
OG exemplary drawing
 
1. A method for dynamically adjusting a human-computer interaction (HCI) on a user device in response to user emotion, the method comprising:
initiating an HCI directed graph when a user launches an interactive activity on the user device, the HCI directed graph comprising a plurality of nodes and defining transitions between the plurality of nodes to perform the interactive activity, wherein at least one of the plurality of nodes is an emotion-based option node that is linked to at least two output nodes;
receiving, from at least one sensor when the HCI directed graph is at the emotion-based option node, data indicating user emotion;
identifying, using a machine learning model, a user emotion from the received data; and
selecting an output node among the at least two output nodes based on the identified user emotion,
wherein the at least two output nodes are located at the step immediately following the emotion-based option node in the flow of the interactive activity, thereby continuing the flow through the HCI directed graph based on the identified user emotion.
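
The sketch below is a minimal, hypothetical illustration of the flow recited in claim 1, not code from the patent's disclosure: an interactive activity modeled as a directed graph in which an emotion-based option node maps an emotion label, identified by a machine-learning model from sensor data, to one of at least two output nodes at the next step of the flow. All names (Node, EmotionOptionNode, run_activity, the emotion labels, and the stub sensor/classifier callables) are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional


@dataclass
class Node:
    """A step in the interactive activity's HCI directed graph."""
    name: str
    next_nodes: List["Node"] = field(default_factory=list)


@dataclass
class EmotionOptionNode(Node):
    """A node whose outgoing transition depends on the identified user emotion."""
    # Maps an emotion label (e.g. "happy", "frustrated") to an output node
    # located at the next step of the flow.
    outputs_by_emotion: Dict[str, Node] = field(default_factory=dict)
    default_output: Optional[Node] = None

    def select_output(self, emotion: str) -> Optional[Node]:
        # Select one of the at least two output nodes based on the emotion.
        return self.outputs_by_emotion.get(emotion, self.default_output)


def run_activity(start: Node,
                 read_sensor: Callable[[], bytes],
                 classify_emotion: Callable[[bytes], str]) -> None:
    """Walk the HCI directed graph, branching on emotion at option nodes."""
    node = start
    while node is not None:
        print(f"Step: {node.name}")
        if isinstance(node, EmotionOptionNode):
            data = read_sensor()              # sensor data indicating user emotion
            emotion = classify_emotion(data)  # ML model stands in here
            node = node.select_output(emotion)
        else:
            node = node.next_nodes[0] if node.next_nodes else None


# Example wiring: a quiz step that branches on the identified emotion.
encourage = Node("show_encouraging_hint")
advance = Node("present_harder_question")
quiz = EmotionOptionNode(
    name="quiz_step",
    outputs_by_emotion={"frustrated": encourage, "happy": advance},
    default_output=advance,
)
start = Node("welcome", next_nodes=[quiz])
run_activity(start,
             read_sensor=lambda: b"fake-camera-frame",   # stub sensor input
             classify_emotion=lambda data: "frustrated")  # stub classifier
```

In this sketch the graph is static and the branch decision happens only when execution reaches the emotion-based option node, mirroring the claim's ordering: the graph is initiated at launch, sensor data is received at the option node, an emotion is identified by the model, and an output node at the next step is selected to continue the flow.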