The subject matter discussed in the background section should not be assumed to be prior art merely as a result of its mention in the background section. Similarly, a problem mentioned in the background section or associated with the subject matter of the background section should not be assumed to have been previously recognized in the prior art. The subject matter in the background section merely represents different approaches, which in and of themselves may also be inventions.
In computer science, a queue is a particular kind of collection in which the elements are kept in order, and the principal (or only) operations on the collection are the addition of elements to the rear terminal position, known as enqueue, and the removal of elements from the front terminal position, known as dequeue. These operations make the queue a First-In-First-Out (FIFO) data structure, in which the first element added to the queue is the first element to be removed. Equivalently, once a new element is added, all elements that were previously added must be removed before the new element can be removed.
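The FIFO behavior described above can be sketched with a standard Java queue; the class name and example strings here are illustrative only:

```java
import java.util.ArrayDeque;
import java.util.Queue;

public class FifoDemo {
    public static void main(String[] args) {
        // Enqueue elements at the rear terminal position of the queue.
        Queue<String> queue = new ArrayDeque<>();
        queue.add("first");
        queue.add("second");
        queue.add("third");

        // Dequeue elements from the front terminal position:
        // removal order matches insertion order (FIFO).
        System.out.println(queue.remove()); // prints "first"
        System.out.println(queue.remove()); // prints "second"
        System.out.println(queue.remove()); // prints "third"
    }
}
```

Note that `ArrayDeque` is one of several `Queue` implementations in the Java standard library; any of them would exhibit the same FIFO ordering through `add` and `remove`.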
Queues perform the function of a buffer when various elements such as data, objects, or events are stored and held to be processed later. For example, users may request to purchase products through an e-commerce website, and the website administrator may create a data structure similar to the one below to maintain a product list for each user on the web server side. Although this example data structure is based on the Java programming language and may be accessed by a Java thread pool executor, data structures that are accessed by other executors are similar.
public class UserProductList {
    private String userId;
    :
    :
    :
    private ConcurrentHashMap<ProductId, ProductOrderDetails> productMap;
}
If a user sequentially requests to purchase 2 products, the web server can add these 2 user requests to a queue, such that 2 concurrently executing threads can remove these user requests from the queue and process them concurrently. Although receiving simultaneous user requests from the same user is rare, the website administrator still needs to use server-side data structures that are safe for concurrent threads, in case two concurrent threads finish processing 2 user requests from the same user at approximately the same time. Since these parallel threads need to update the same UserProductList object, the website administrator defines the map as a concurrent hash map, which ensures that this data structure inside the UserProductList object is safe for concurrent access.
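A minimal sketch of this scenario follows, assuming a simplified product map keyed by strings rather than the ProductId and ProductOrderDetails types above (which are not defined here); the class and key names are illustrative. Two pooled threads update the same concurrent hash map, and both updates land safely even if the threads finish at approximately the same time:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ConcurrentUpdateDemo {
    public static void main(String[] args) throws InterruptedException {
        // One product map shared by all threads processing this user's requests.
        ConcurrentHashMap<String, String> productMap = new ConcurrentHashMap<>();

        // A fixed thread pool stands in for the web server's thread pool executor.
        ExecutorService pool = Executors.newFixedThreadPool(2);
        pool.submit(() -> productMap.put("product-1", "order details 1"));
        pool.submit(() -> productMap.put("product-2", "order details 2"));
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);

        // Both concurrent updates are preserved without external locking.
        System.out.println(productMap.size()); // prints 2
    }
}
```

Had the map been a plain HashMap, concurrent puts from the two pooled threads could corrupt its internal state; ConcurrentHashMap makes each update atomic with respect to other threads.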
FIG. 1 depicts the functioning of a system 100 that includes a multi-thread queue. A web server 102 receives all user requests from the users 110-118, and adds these user requests to a multi-thread queue 130, so that concurrently executing threads 140-146 can remove the user requests from the multi-thread queue 130 and concurrently process these user requests. Although FIG. 1 depicts the system 100 as including 5 users 110-118, 1 queue 130, and 4 threads 140-146, the system 100 can include any number of users 110-118, only 1 queue 130, and at least 2 threads 140-146.
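The producer/consumer arrangement of FIG. 1 can be sketched with a blocking queue; the class name, request strings, and the choice of 4 worker threads (mirroring threads 140-146) are illustrative assumptions. Worker threads block on the shared queue until the web server side enqueues requests:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CountDownLatch;

public class MultiThreadQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // The shared multi-thread queue the web server adds user requests to.
        BlockingQueue<String> requests = new ArrayBlockingQueue<>(16);
        CountDownLatch done = new CountDownLatch(4);

        // Four worker threads, each removing one request from the queue.
        for (int i = 0; i < 4; i++) {
            new Thread(() -> {
                try {
                    String request = requests.take(); // blocks until a request is available
                    // ... process the user request here ...
                    done.countDown();
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }).start();
        }

        // The web server enqueues the incoming user requests.
        for (int i = 1; i <= 4; i++) {
            requests.put("request-" + i);
        }

        done.await(); // all four requests were dequeued and processed concurrently
        System.out.println("processed all requests");
    }
}
```

Because `take` blocks when the queue is empty and `put` blocks when it is full, the queue also acts as the buffer described earlier, smoothing bursts of user requests across the available threads.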