Network latency is the amount of time a message takes to traverse a network system. In a computer network, it is an expression of how much time it takes for a packet of data to travel from one designated point to another; it is sometimes measured as the time required for a packet to be returned to its sender. Latency depends on the speed of the transmission medium (e.g., copper wire, optical fiber, or radio waves) and on delays introduced by devices along the way (e.g., routers and modems). Low latency indicates high network efficiency. Latency and throughput are the two most fundamental measures of network performance. They are closely related, but whereas latency measures the amount of time between the start of an action and its completion, throughput is the total number of such actions that occur in a given amount of time.
Sending data in large packets has a higher throughput than sending the same data in small packets both because of the smaller number of packet headers and because of reduced startup and queuing latency. If the data is streamed (i.e., sent in a continuous flow), propagation latency has little effect on throughput, but if the system waits for an acknowledgment after each packet before sending the next, the resulting high propagation latency will greatly reduce throughput.
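The effect described above can be sketched numerically. The following is an illustrative model, not a measurement: the link rate, round-trip time, and header size are assumed values, and the two functions compare a stop-and-wait sender (one acknowledgment awaited per packet) against a continuously streamed flow.

```python
# Illustrative model of throughput vs. packet size (all constants assumed).
LINK_RATE_BPS = 100e6   # assumed 100 Mbit/s link
RTT_S = 0.05            # assumed 50 ms round-trip propagation delay
HEADER_BYTES = 40       # assumed per-packet header overhead

def stop_and_wait_throughput(payload_bytes: int) -> float:
    """Sender waits one round trip for an ACK after every packet."""
    packet_bits = (payload_bytes + HEADER_BYTES) * 8
    time_per_packet = packet_bits / LINK_RATE_BPS + RTT_S
    return payload_bytes * 8 / time_per_packet   # useful bits per second

def streamed_throughput(payload_bytes: int) -> float:
    """Packets flow continuously; only header overhead reduces throughput."""
    packet_bits = (payload_bytes + HEADER_BYTES) * 8
    return LINK_RATE_BPS * (payload_bytes * 8 / packet_bits)

for size in (256, 1460, 8960):
    print(f"{size:5d} B payload: stop-and-wait "
          f"{stop_and_wait_throughput(size)/1e6:.3f} Mbit/s, "
          f"streamed {streamed_throughput(size)/1e6:.1f} Mbit/s")
```

With these assumed values, streaming keeps throughput near the link rate regardless of packet size, while stop-and-wait throughput is dominated by the round-trip delay and improves mainly by enlarging the packets.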
Latency is an important consideration with regard to other aspects of computers, particularly where real-time (i.e., nearly instantaneous) response is required. For example, in some internet games, a high latency (also called lag) can add to the difficulty of determining which player performed an action first (such as shooting an opponent or answering a question). In playing computer-based musical instruments, latencies greater than 100 milliseconds make it difficult for players to get the nearly instantaneous feedback that they require.
Network latency in a packet-switched network is measured either one-way (the time from the source sending a packet to the destination receiving it) or round-trip (the one-way latency from source to destination plus the one-way latency from the destination back to the source). Round-trip latency is more often quoted because it can be measured from a single point. Latency limits total throughput in reliable two-way communication systems, as described by the bandwidth-delay product. In particular, the latency that may be reduced by the instant disclosure is connection latency, i.e., the time spent establishing a connection before data can flow.
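The bandwidth-delay product mentioned above can be made concrete with a short calculation. The link rate and round-trip time below are assumed example values; the point is that a sender whose window is smaller than the bandwidth-delay product cannot fill the pipe, so latency caps throughput.

```python
# Bandwidth-delay product (BDP): data that must be "in flight" to keep a
# reliable two-way link fully utilized. Values below are assumptions.

def bandwidth_delay_product(rate_bps: float, rtt_s: float) -> float:
    """Bytes in flight needed to fill the pipe: rate * RTT (in bytes)."""
    return rate_bps * rtt_s / 8

def max_throughput(window_bytes: float, rtt_s: float) -> float:
    """Throughput ceiling when at most window_bytes can be unacknowledged."""
    return window_bytes * 8 / rtt_s

# Assumed example: a 1 Gbit/s path with an 80 ms round-trip time.
bdp = bandwidth_delay_product(1e9, 0.080)
print(f"BDP: {bdp / 1e6:.1f} MB in flight")            # 10.0 MB

# A classic 64 KiB window on the same path caps throughput well below 1 Gbit/s:
print(f"Cap: {max_throughput(64 * 1024, 0.080) / 1e6:.2f} Mbit/s")  # 6.55 Mbit/s
```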
When computers are communicating over a network, several actions for connectivity are performed for every network connection before the transmission of data. For example, a TCP SYN packet, a TCP ACK packet, and a TLS handshake (with or without a session cache) may be exchanged to establish network connections for computer communication. As faster speed is almost always desirable in computer networking, as in communication systems for health care and other industries, there is clearly a need to speed up and/or eliminate some of these actions performed to establish a network connection.
The instant disclosure of a system and method to reduce network latency with a pool of ready connections may be designed to solve at least some of the problems described above.
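To make the idea of a pool of ready connections concrete, the following is a minimal sketch with a hypothetical API, not the disclosed system itself: connections are dialed ahead of time so that a request can skip the handshakes entirely, falling back to a fresh dial only when the pool is exhausted.

```python
# Minimal sketch (hypothetical API) of a pool of ready connections.
import queue
import socket

class ConnectionPool:
    def __init__(self, host: str, port: int, size: int):
        self._host, self._port = host, port
        self._pool: queue.Queue = queue.Queue()
        for _ in range(size):
            self._pool.put(self._dial())     # handshake cost paid up front

    def _dial(self) -> socket.socket:
        # Connection establishment (e.g., TCP handshake) happens here,
        # before any request arrives.
        return socket.create_connection((self._host, self._port))

    def acquire(self) -> socket.socket:
        try:
            return self._pool.get_nowait()   # ready connection: no handshake
        except queue.Empty:
            return self._dial()              # pool exhausted: fall back

    def release(self, conn: socket.socket) -> None:
        self._pool.put(conn)                 # return the connection for reuse
```

A real implementation would also need liveness checks, eviction of stale connections, and replenishment of the pool in the background; those concerns are omitted here for brevity.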