User interfaces to computer applications have evolved over the years from text-based interfaces to graphical interfaces. Interfaces are further expected to evolve beyond the graphical user interface (GUI) of the 1990s to a more natural interface in the decades ahead. International Data Corporation® (IDC) has given this new interface a name: the natural user interface (NUI).
The primary objective of the NUI is to broaden the use of speech and natural language capabilities in human interaction with computers. Accordingly, the two key requirements of the NUI are the ability of application software to accommodate conversational dialogs between system and user, and the incorporation of a common-sense knowledge base to enhance the system's ability to properly interpret the meaning of the user's conversation.
Present-day computer users are familiar with manipulating computer applications via graphical user interfaces, which in turn are manipulated via a mouse and a keyboard. While the color, modeless GUI was a great improvement over its predecessors, the ability to operate such interfaces is not natural but, on the contrary, an acquired ability. A human nevertheless learns the art of pointing at an object on the screen and clicking a mouse button with relative ease, as opposed to the complexity involved in having to type a set of cryptic commands. Touch screens fall into the same category. There are more natural ways for us to communicate, however, especially when keypads become too small for serious typing, as in the case of portable computer-based devices such as personal digital assistants (PDAs).
As stated earlier, IDC expects the user interface of the next decade to be much more natural. This new interface will incorporate one or more of natural language understanding, speech recognition and speech synthesis, and handwriting recognition. Although the user interface language was an issue in developing GUI-based applications, the language impact on the new natural interfaces will be much greater and much more sophisticated. Of all the features proposed by IDC for the natural language interface, handwriting recognition finds the greatest applicability in the area of PDAs and other hand-held computer-based devices, since, upon successful implementation, users will be able to write information (which in turn is identified via a handwriting recognition algorithm) on the screen of the PDA instead of typing it in using a small virtual keyboard.
Handwriting recognition is the technique by which a computer system can recognize characters and other symbols written by hand. In theory, handwriting recognition should free us from our keyboards, allowing us to write and draw in a more natural way. It is considered one of the key technologies that will determine the ultimate success or failure of PDAs and other hand-held devices. To date, however, the technology has had only limited success. This is partly because it is still a new technology and is not as fast or accurate as it needs to be.
Although prior art applications have implemented, with minimal success, handwriting recognition algorithms for languages such as English, they have yet to overcome the complex linguistic challenges posed by other languages such as Arabic. One reason Arabic poses problems for handwriting recognition is that a myriad of external marks (including diacritics) are associated with the Arabic script, making it complicated for algorithms to discern the exact written content. Thus, there is a need for an Arabic handwriting recognition system. More specifically, there is a need for an Arabic handwriting recognition system, with a natural user interface, that takes into account the complex nature of the Arabic language.
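To illustrate one source of this difficulty, the short sketch below (an illustrative example, not part of the original text; the sample word is chosen only for demonstration) uses Python's Unicode tables to strip the optional diacritic marks (tashkeel) from an Arabic word. Because these marks are optional in everyday writing, several visually distinct handwritten forms collapse onto the same bare letter skeleton, which a recognizer must then disambiguate from context.

```python
import unicodedata

def strip_diacritics(text: str) -> str:
    # Arabic short-vowel marks (tashkeel) are Unicode combining marks.
    # Removing every combining character leaves only the bare letter
    # skeleton that a recognizer sees when a writer omits the marks.
    return "".join(ch for ch in text if not unicodedata.combining(ch))

# The fully vocalized form carries fatha marks on each letter;
# the stripped form is the skeleton shared with other readings.
word_with_marks = "كَتَبَ"  # "kataba" (he wrote), written with diacritics
print(strip_diacritics(word_with_marks))  # prints the bare skeleton كتب
```

Both the marked and the unmarked spellings are legitimate handwritten renderings of the same word, which is one reason an Arabic recognizer cannot rely on a one-to-one mapping between strokes and characters.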