The present invention relates to content aggregation and organization, and more particularly to creating and displaying wiki entries based on a user's personal knowledge.
An encyclopedia is a type of reference work or compendium holding a comprehensive summary of information from either all branches of knowledge or a particular branch of knowledge. Encyclopedias are divided into articles or entries, which are usually accessed alphabetically by keyword. Encyclopedia articles focus on factual information covering the thing or concept to which the article name relates.
In lexical analysis, tokenization is the process of breaking a stream of text up into words, phrases, symbols, or other meaningful elements called tokens. The list of tokens becomes input for further processing such as parsing or text mining. Tokenization is useful both in linguistics (where it is a form of text segmentation), and in computer science, where it forms part of lexical analysis.
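The tokenization process described above can be sketched as follows. This is a minimal illustration only; the function name and the regular-expression pattern are assumptions for the sake of example and do not form part of the disclosure:

```python
import re

def tokenize(text):
    """Break a stream of text into word and symbol tokens.

    Hypothetical sketch: \\w+ matches runs of word characters,
    while [^\\w\\s] matches single punctuation symbols, so both
    words and meaningful symbols become tokens.
    """
    return re.findall(r"\w+|[^\w\s]", text)

tokens = tokenize("Encyclopedias are divided into articles, or entries.")
# tokens == ['Encyclopedias', 'are', 'divided', 'into', 'articles',
#            ',', 'or', 'entries', '.']
```

The resulting list of tokens may then serve as input to further processing such as parsing or text mining, as noted above.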