Videoconferencing has a wide range of applications, such as desktop and room-based conferencing, video and audio over the Internet and over telephone lines, surveillance and monitoring, telemedicine (medical consultation and diagnosis at a distance), computer-based training and education, and the like. In each application the video information is transmitted over a variety of telecommunication links, such as IP networks, analog telephone lines, ISDN lines, etc. Practical communication channels have a limit to the number of bits that they can transmit per second. Sometimes an endpoint is limited to a predetermined bandwidth which is less than the bandwidth capability of its channel. In many cases the bit rate is fixed, i.e., is a Constant Bit Rate (CBR), for example over POTS, ISDN and the like. As a result, different endpoints participating in a conference can accommodate different bandwidths (i.e., call rates): some endpoints may have a relatively high bandwidth, some a lower bandwidth, and others connect at an intermediate bandwidth.
The term “communication channel”, as used herein, refers to the physical media and devices which provide the means for transmitting information from one endpoint to other endpoints, an endpoint being the terminus of a communication path between a transmitter and a receiver. With respect to a communication channel, the “transmitter” is the entity that writes to the endpoint, and the “receiver” is the entity that reads from the endpoint.
Today it is possible to carry out conferences between several endpoints at the same time using suitable conference means, such as those that comprise a Multipoint Conferencing Unit (MCU). An MCU is a multi-port device that allows intercommunication of three or more audiographic, audiovisual or multimedia terminals in a conference configuration, such as the viaIP MCUs of RADVISION, Ltd. Typical MCUs include a Multipoint Controller and a Multipoint Processor; the Multipoint Controller is used for administering the conference (e.g., by making required decisions regarding the endpoints participating in the conference); the Multipoint Processor is used for processing the content of the conference (such as voice, video and other relevant data). Usually, the decisions made by the Multipoint Controller are executed by the Multipoint Processor.
The video codec used in such an MCU is usually a variable bit-rate codec (e.g., H.263). While using such an MCU, participants connected via endpoints at different bandwidths may be allowed to join the same conference at different call rates.
In the prior art, several attempts have been made to enable endpoints running video at different bit-rates to join one and the same conference while each maintains its own rate and associated video quality. For example, bit-rate matching of different endpoints allows 112 or 128 Kbps systems to join 336 or 384 Kbps systems in a single conference. However, bit-rate matching in conferences between endpoints with different bit-rates results in all endpoints dropping down to the same rate as the endpoint running the lowest bit-rate, unless video processing resources (a resource capable, among other functions, of transcoding video from one higher rate to another lower rate) are available. However, when this type of resource is available, the bit rates allowed in the conference are, in the prior art, manually predetermined and not necessarily optimized for optimum quality. In current implementations the administrator/operator chooses a set of bandwidths to be used during the conference, and each endpoint joining is generally automatically assigned a bandwidth for the duration of the conference. For example, a call with three 384 Kbps-capable video systems and one 128 Kbps system would result, according to the prior art, in all systems operating at 128 Kbps when no video processing resource is available. If such a resource is available and the operator determined the allowed rates to be 384 Kbps and 128 Kbps, then each endpoint would receive optimum quality video. However, if the 128 Kbps endpoint decided to join at 256 Kbps without the knowledge of the conference administrator, it would still receive 128 Kbps video, which is obviously not optimal.
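The prior-art rate assignment described above can be illustrated by a short sketch. The function below is a hypothetical illustration, not part of any actual MCU implementation: given the call rates requested by the endpoints, it either drops every endpoint to the lowest rate (no video processing resource) or maps each endpoint to the highest operator-predetermined rate not exceeding its own call rate (resource available). Note how, in the final example, an endpoint joining at 256 Kbps against allowed rates of 384 and 128 Kbps still ends up with 128 Kbps video.

```python
def effective_rates(call_rates, allowed_rates=None):
    """Sketch of prior-art bit-rate matching (hypothetical, for illustration).

    call_rates: requested call rate (Kbps) of each endpoint.
    allowed_rates: the operator-predetermined rates, available only when a
        video processing (transcoding) resource exists; None means no resource.
    """
    if allowed_rates is None:
        # No transcoding resource: every endpoint drops to the lowest
        # rate present in the conference.
        floor = min(call_rates)
        return [floor for _ in call_rates]
    result = []
    for rate in call_rates:
        # With a resource, each endpoint gets the highest predetermined
        # rate it can sustain; an endpoint below every allowed rate is
        # clamped to the lowest allowed rate (an assumption of this sketch).
        eligible = [r for r in allowed_rates if r <= rate]
        result.append(max(eligible) if eligible else min(allowed_rates))
    return result


# Three 384 Kbps systems and one 128 Kbps system, no resource:
print(effective_rates([384, 384, 384, 128]))            # all drop to 128
# Same conference with a resource and allowed rates {384, 128}:
print(effective_rates([384, 384, 384, 128], [384, 128]))
# The low endpoint joins at 256 Kbps instead; it still gets only 128:
print(effective_rates([384, 384, 384, 256], [384, 128]))
```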
Other bit-rate matching methods involve splitting the endpoints into predetermined groups according to specific bandwidth resources. However, the video bit-rate used by each such predetermined group is fixed and, as a result, the overall quality experienced by the participants in the conference is not optimal.
It is an object of the invention to provide a method for administering output rates in a multipoint conference environment, which overcomes the aforementioned drawbacks of the prior art.
It is another object of the present invention to provide a method which allows the selection of a set of output rates in such a manner that the resources utilized provide the best possible video quality to all of the conference participants at any given time.
Other objects and advantages of the invention will become apparent as the description proceeds.