1. Field of the Invention
The present invention relates to a method of fitting a virtual item and a system for providing a virtual item fitting service, and more particularly, to a method and system for determining the fit and overall look of a virtual item without the user actually wearing a desired item, through use of a three-dimensional (3D) human body model that reflects the appearance and skeletal structure of the user.
2. Description of the Related Art
In recent times, the introduction of depth sensors has allowed users to readily obtain user shape information and joint motion information at a reasonable price. Consequently, applications of depth sensors have brought changes to the user interfaces used in various services. More particularly, following these changes to the user interface, new interactive user services are emerging. One such interactive user service is a virtual item fitting service, in which a user wears items, such as clothes or accessories, in a virtual setting without actually wearing the items.
In a conventional virtual item fitting service, a motion of a user captured by a depth sensor is received, an avatar that has been rigged or skinned is controlled accordingly, and the rigged avatar is animated. However, such services are provided irrespective of the body shape of the user, and thus suffer from a lack of realism.
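The rigging-and-skinning pipeline described above can be sketched as follows. This is a minimal 2D linear blend skinning illustration with hypothetical bone names and a hard-coded pose standing in for joint angles streamed from a depth sensor; it is not the implementation of any particular service.

```python
import math

# Hypothetical two-bone skeleton: each bone has a pivot (origin) about
# which it rotates. In a real service these would come from the avatar's rig.
bones = {
    "upper_arm": {"origin": (0.0, 0.0)},
    "forearm":   {"origin": (1.0, 0.0)},
}

def rotate(point, origin, angle):
    """Rotate a 2D point around an origin by angle (radians)."""
    px, py = point[0] - origin[0], point[1] - origin[1]
    c, s = math.cos(angle), math.sin(angle)
    return (origin[0] + c * px - s * py, origin[1] + s * px + c * py)

def skin_vertex(vertex, weights, pose):
    """Linear blend skinning: transform the vertex by each influencing
    bone, then blend the results by the per-bone skinning weights.
    `pose` maps bone name -> current joint angle (e.g. from a sensor)."""
    x = y = 0.0
    for bone_name, w in weights.items():
        origin = bones[bone_name]["origin"]
        tx, ty = rotate(vertex, origin, pose[bone_name])
        x += w * tx
        y += w * ty
    return (x, y)

# A mesh vertex near the elbow, influenced equally by both bones.
vertex = (1.0, 0.1)
weights = {"upper_arm": 0.5, "forearm": 0.5}

# Animate: the sensor reports the forearm bent 90 degrees.
pose = {"upper_arm": 0.0, "forearm": math.pi / 2}
print(skin_vertex(vertex, weights, pose))
```

Because the deformation depends only on the rig and the tracked joint angles, the avatar's geometry (and hence the user's actual body shape) plays no role in the result, which is the source of the lack of realism noted above.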
In addition, a conventional service for creating a customized item is provided by scanning the appearance of a user with a 3D full-body scanner. However, the 3D full-body scanner used in such a service is highly expensive, and motions of the user that change in real time may not be fully reflected because the motion of the user is restricted during the scanning process.
Accordingly, there is a need for a solution that operates in real time and provides a more realistic and intuitive service reflecting the motion of the user, even though the user wears the clothes or accessories only in a virtual setting.