1. Field of the Invention
The invention relates generally to computer-generated virtual world environments, and particularly, to a system, method and computer program product for providing human identification proofs verifying an avatar in the virtual world environment as being controlled by a human user, as opposed to an automated program, e.g., “bot”.
2. Description of the Related Art
A virtual world is a computer-based simulated environment in which avatars (i.e., virtual representations of users) inhabit and interact with other avatars. In a virtual world (e.g., Second Life, World of Warcraft, Google Lively, Activeworld™), a human typically projects himself/herself into the virtual world in the form of an actor (e.g., a motional avatar) that can interact within the virtual world. However, not all virtual worlds require avatar representation, as some provide first-person views only and others involve no “sight” of other users.
Published PCT Patent Application No. WO03058518A2, incorporated by reference herein, provides an avatar user interface system and provides general teaching of virtual world environments. Examples of virtual worlds include, but are not limited to, Second Life®, 3DVirtual, IMetaverse, and MMORPGs (Massively Multiplayer Online Role-Playing Games).
“Internet bots”, also known as web robots, WWW robots, or simply bots, are software applications that run automated tasks over the Internet (See http://en.wikipedia.org/wiki/Internet_bot, Retrieved 27 August 07). Typically, bots perform tasks that are both simple and structurally repetitive, at a much higher rate than would be possible for a human alone.
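The repetitive, high-rate behavior that distinguishes a bot from a human operator can be illustrated with a minimal sketch. The function names and the stubbed page-fetching step below are hypothetical and serve only to show the structure of such an automated task:

```python
import time


def fetch_page(url):
    # Hypothetical network call; stubbed so the sketch is self-contained.
    return "<html>contents of %s</html>" % url


def simple_bot(urls, delay=0.0):
    """Visit each URL in turn -- a structurally repetitive task that a bot
    repeats far faster than a human could perform it manually."""
    results = []
    for url in urls:
        results.append(fetch_page(url))
        time.sleep(delay)  # a human would pause seconds between actions; a bot need not
    return results


pages = simple_bot(["http://example.com/a", "http://example.com/b"])
print(len(pages))  # 2
```

With `delay=0.0`, the loop executes as fast as the machine allows, which is what makes such automation attractive for spamming, resource farming, and click fraud.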
Programs and algorithms can be used to create bots that mimic the actions of avatars within virtual environments. Bots could be a particular issue within virtual store environments, creating a three-dimensional version of e-mail spamming and junk mail. For example, as more retailers enter the realm of Second Life, bots could be used as a viral marketing technique, with avatars created for no reason other than to promote products, harass customers, etc.
This type of activity is already a concern for MMORPGs, where bots are used to “farm” resources that would take a significant amount of time (and tedium) for a human player to obtain. In gaming environments, these bots are, for the most part, merely a nuisance that violates the terms and conditions of the particular game. However, in other environments, the presence of bots can have more drastic implications. Within a virtual retail store, for example, bots could interfere with the profitability of a particular store by annoying clients and customers, reducing productivity, etc. In another example, within an online poker environment, bots can attend to many more games than human players can and can compromise the fairness of the game.
In another form of ‘botting,’ “click fraud” is a type of Internet crime that occurs in pay-per-click online advertising when a person, automated script, or computer program imitates a legitimate user of a web browser clicking on an ad, for the purpose of generating a charge per click without having actual interest in the target of the ad's link (http://en.wikipedia.org/wiki/Click_fraud, Retrieved 27 August 07).
CAPTCHAs (Completely Automated Public Turing tests to tell Computers and Humans Apart, trademarked by Carnegie Mellon University) are challenge-response tests used to determine whether users are human. In short, because computers are unable to solve a CAPTCHA, a correct answer begets the presumption that the user is human (http://www.captcha.net/, Retrieved 27 August 07). CAPTCHAs can be qualified as Human Identification Proofs (HIPs), an inverse version of a Turing test.
Character-based HIPs are the most widely employed HIPs in commercial web sites today because of their ease of use: a user reads a HIP comprising distorted or warped characters and, in the entry field below the HIP, must correctly type the characters in order to prove he/she is a human.
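The challenge-response flow of a character-based HIP can be sketched as follows. This is an illustrative skeleton only: the function names are hypothetical, the distortion/rendering step (which is what actually defeats the computer) is omitted, and only the generation of the expected answer and its verification are shown:

```python
import random
import string


def generate_hip(length=6):
    """Generate the answer string for a character-based HIP.
    In a real system this string would be rendered as a distorted or warped
    image; here we return only the expected answer."""
    return "".join(random.choice(string.ascii_uppercase) for _ in range(length))


def verify_hip(expected, typed):
    """A correct transcription begets the presumption that the user is human.
    Comparison is case-insensitive, as is common for character HIPs."""
    return typed.strip().upper() == expected.upper()


challenge = generate_hip()
print(verify_hip(challenge, challenge.lower()))  # True
print(verify_hip(challenge, "wrong!"))           # False
```

A production system would also expire each challenge after one use and apply the distortion server-side, so that the expected answer never reaches the client in machine-readable form.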
Current methods employed to prevent spamming include those systems and methods described in: U.S. Pat. No. 6,868,498 B1, directed to an e-mail system that verifies that a suspected nonhuman message is human by requesting a confirmation message; US 2007/0071200 A1, directed to a system that surfaces randomly selected phonetically encoded messages, which must be read and confirmed as human; US 2006/0026242, directed to a system for generating an automated group spam information database; US 2004/0199597, directed to a method for challenging outbound e-mails with a CAPTCHA to determine whether the sender is human; and US 2003/0204569 A1, directed to a system that initiates surfacing of a CAPTCHA when a virus-infected e-mail is discovered, to determine whether it was a computer or human sender.
However, it would be highly desirable to provide a human identification proof system and method for use in a virtual world environment, as well as in other online gaming environments, and particularly a system and method for verifying whether an avatar, representing a user in a virtual world environment, is controlled by a human user or by an automated program, i.e., a bot.