Network systems permeate computer technology and everyday business operations. Networks interconnect computer systems in nearby locations with computer systems in remote locations, and carry data for business transactions ranging from cargo waybills to financial transfers. Each business transaction sends and receives data over a computer network, and millions of business transactions may occur in an hour or a day. Latency in handling network traffic causes delays and decreases productivity in businesses. Thus, there is a continuing desire to decrease latency in computer networks and computer systems.
Conventional computer networking systems employ, for example, a circular queue for managing network data. FIG. 1 is a block diagram illustrating a conventional network queue for processing network requests received from an application. A circular queue 102 includes, for example, locations 102A and 102B. When an application 104 needs to transfer data over a network, the application inserts the data into the queue 102, such as into location 102A. Although only one application 104 is shown, multiple applications may insert data into the queue 102. A network handler 106 periodically polls the queue 102 to determine whether new data is in the queue 102 for transmission through a network adapter 108. The handler 106 may be part of an operating system executing on a computer system that includes the network adapter 108. When the handler 106 determines that new data is ready for transmission through the network adapter 108, the handler 106 retrieves the data from the queue 102 and controls the network adapter 108 to transmit the data.
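The conventional scheme of FIG. 1 can be sketched as follows. This is a minimal illustration only; the class and method names (`CircularQueue`, `insert`, `poll`) are hypothetical and not taken from any actual implementation, and the sketch omits the locking a real system would need between the application and the handler.

```python
class CircularQueue:
    """Fixed-size circular queue; slots correspond to locations 102A, 102B, etc."""

    def __init__(self, size):
        self.slots = [None] * size
        self.head = 0   # next slot the handler reads
        self.tail = 0   # next slot an application writes
        self.count = 0  # number of occupied slots

    def insert(self, data):
        # Application side (104): place data into the next free location.
        if self.count == len(self.slots):
            raise RuntimeError("queue full")
        self.slots[self.tail] = data
        self.tail = (self.tail + 1) % len(self.slots)
        self.count += 1

    def poll(self):
        # Handler side (106): at each polling event, retrieve all pending
        # data so it can be handed to the network adapter (108).
        pending = []
        while self.count:
            pending.append(self.slots[self.head])
            self.slots[self.head] = None
            self.head = (self.head + 1) % len(self.slots)
            self.count -= 1
        return pending


queue = CircularQueue(size=8)
queue.insert(b"waybill record")
queue.insert(b"payment record")
print(queue.poll())  # [b'waybill record', b'payment record']
```

Note that data inserted between polling events simply sits in the queue: nothing notifies the handler, which is the source of the latency discussed next.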
Polling of the network queue 102 by the handler 106 delays processing of a network request until the next polling event. The average latency of processing a network request is generally half of the polling interval. This delay adds a significant amount of latency that detracts from performance of the network interface.
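The half-interval figure follows from requests arriving at arbitrary times within a polling interval: a request that arrives just after a polling event waits nearly the full interval, one that arrives just before waits almost nothing, and on average the wait is half the interval. A short simulation illustrates this; the 100-microsecond polling interval is an arbitrary value chosen for the example.

```python
import random

POLL_PERIOD_US = 100.0  # hypothetical polling interval, in microseconds

# Model requests arriving at uniformly random times within one polling
# interval; each request waits until the next polling event at time
# POLL_PERIOD_US, so its wait is the remaining portion of the interval.
random.seed(0)
arrivals = [random.uniform(0.0, POLL_PERIOD_US) for _ in range(100_000)]
waits = [POLL_PERIOD_US - t for t in arrivals]

avg_wait = sum(waits) / len(waits)
print(round(avg_wait, 1))  # close to POLL_PERIOD_US / 2, i.e. about 50.0
```

The simulated average converges on half the polling interval, matching the latency estimate above: halving the polling interval halves the average latency, but at the cost of more frequent polling work by the handler.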