I. Field of the Invention
The present invention relates to the Shannon bound on communications capacity and to symbol modulation and demodulation for high-data-rate wired, wireless, and optical communications. It includes the symbol modulations phase-shift keying (PSK), quadrature amplitude modulation (QAM), bandwidth-efficient modulation (BEM), Gaussian minimum shift keying (GMSK), pulse position modulation (PPM), and the plurality of current and future modulations for single links and multiple-access links, which include time division multiple access (TDMA), orthogonal frequency division multiple access (OFDMA), code division multiple access (CDMA), spatial division multiple access (SDMA), frequency hopping (FH), optical wavelength division multiple access (WDMA), orthogonal wavelet division multiple access (OWDMA), combinations thereof, and the plurality of radar, optical, laser, spatial, temporal, sound, imaging, and media applications. Communication application examples include electrical and optical wired, mobile, point-to-point, point-to-multipoint, multipoint-to-multipoint, cellular, multiple-input multiple-output (MIMO), and satellite communication networks.
II. Description of the Related Art
The Shannon bound is the Shannon capacity theorem for the maximum data rate C. It can equivalently be restated as a bound on the corresponding number of modulation bits per symbol as well as a bound on the communications efficiency, and it is complemented by the Shannon coding theorem. From Shannon's paper "A Mathematical Theory of Communication", Bell System Technical Journal, 27:379-423, 623-656, July, October 1948, and B. Vucetic and J. Yuan's book "Turbo Codes", Kluwer Academic Publishers, 2000, the Shannon (Shannon-Hartley theorem) capacity theorem, the corresponding Shannon bound on the information bits b per symbol, the Shannon bound on the communications efficiency η, and the Shannon coding theorem can be written as equations (1).

Shannon bounds and coding theorem   (1)

1 Shannon capacity theorem
   C = B log2(1 + S/N)
     = channel capacity in bits/second = Bps for an additive white Gaussian noise (AWGN) channel with bandwidth B, wherein "log2" is the logarithm to the base 2
     = maximum rate at which information can be reliably transmitted over a noisy channel, where S/N is the signal-to-noise ratio in the bandwidth B

2 Shannon bound on b, η, and Eb/No
   max{b} = max{C/B} = log2(1 + S/N) = max{η}
   Eb/No = [2^max{b} − 1]/max{b}
   wherein
      b = C/B, Bps/Hz = bits/symbol
      η = b/(TsB), Bps/Hz
      Ts = symbol interval

3 Shannon coding theorem for the information bit rate Rb
   For Rb < C there exist codes which support reliable communications
   For Rb > C there are no codes which support reliable communications
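As an illustration of equations (1), the capacity theorem and the Eb/No bound can be sketched in a few lines of Python. The function names and the sample bandwidth and S/N values below are illustrative choices, not part of the source:

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon capacity theorem: C = B*log2(1 + S/N) in bits/second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

def max_bits_per_symbol(snr_linear):
    """Shannon bound max{b} = C/B = log2(1 + S/N) in bits/symbol (Bps/Hz)."""
    return math.log2(1.0 + snr_linear)

def eb_no_bound(max_b):
    """Shannon bound on Eb/No: [2^max{b} - 1]/max{b}."""
    return (2.0 ** max_b - 1.0) / max_b

# Illustrative example: a 1 MHz AWGN channel at S/N = 15 (about 11.8 dB)
B = 1.0e6
snr = 15.0
C = shannon_capacity(B, snr)       # 4e6 bits/second, since log2(16) = 4
b_max = max_bits_per_symbol(snr)   # 4 bits/symbol
print(C, b_max, eb_no_bound(b_max))  # eb_no_bound(4) = 15/4 = 3.75
```

Note that `eb_no_bound` returns a linear ratio; convert with 10*log10 for a value in dB.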
Using the assumption that the symbol rate 1/Ts is maximized, which means 1/Ts = (Nyquist rate) = bandwidth B and is equivalent to TsB = 1, enables 1 in equations (1) defining C to be rewritten to calculate max{b} as a function of the signal-to-noise ratio S/N, and to calculate Eb/No, which is the ratio of energy per information bit Eb to the noise power density No, as a function of max{b} in 2, wherein max{b} is the maximum value of the number of information bits per symbol b. Since the communications efficiency η = b/(TsB) in bits/sec/Hz, it follows that the maximum values of b and η are equal. The derivation of the equation for Eb/No uses the definition Eb/No = (S/N)/b in addition to 1 and 2. Reliable communications in the statement of the Shannon coding theorem 3 means an arbitrarily low bit error rate (BER).
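The derivation stated above can be checked numerically: with TsB = 1 so that b = C/B = log2(1 + S/N), the definition Eb/No = (S/N)/b reduces to the bound Eb/No = [2^b − 1]/b, since S/N = 2^b − 1. A minimal sketch, with the sample S/N values chosen for illustration:

```python
import math

# Verify that Eb/No = (S/N)/b equals [2^b - 1]/b when b = log2(1 + S/N),
# i.e. under the TsB = 1 (Nyquist-rate) assumption.
for snr in (1.0, 3.0, 7.0, 100.0):
    b = math.log2(1.0 + snr)                 # max{b} from the capacity theorem
    ebno_from_definition = snr / b           # Eb/No = (S/N)/b
    ebno_from_bound = (2.0 ** b - 1.0) / b   # Eb/No = [2^b - 1]/b
    assert math.isclose(ebno_from_definition, ebno_from_bound)
    print(snr, b, ebno_from_bound)
```

In the limit max{b} → 0 this bound approaches ln 2 ≈ 0.693, i.e. about −1.59 dB, the familiar Shannon limit on Eb/No.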