In a conventional OTDM transmitter, several optical signals modulated at bit rate B using the same carrier frequency are multiplexed optically to form a composite optical signal at a higher bit rate nB, where n is the number of multiplexed optical channels.
Specifically, the n constituent bit streams are multiplexed by launching them into an optical fiber with suitable relative time delays. The bit stream in the j-th channel is delayed optically by (j−1)/(nB), where j = 1, …, n. The outputs of all channels are then combined to form a composite return-to-zero (RZ) signal with bit slot T = 1/(nB). In this composite bit stream, the n consecutive bits within each interval of duration 1/B belong to n different channels, as required by the TDM scheme.
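The per-channel delays and the resulting bit interleaving can be sketched numerically. The function name and the example parameters (n = 4 channels at B = 10 Gb/s) are illustrative, not taken from the text:

```python
def channel_delay(j, n, B):
    """Optical delay (seconds) applied to the j-th channel, j = 1..n,
    so that its bits fall into composite slots of width T = 1/(nB)."""
    return (j - 1) / (n * B)

# Illustrative case: n = 4 channels at B = 10 Gb/s -> composite rate 40 Gb/s
n, B = 4, 10e9
delays_ps = [channel_delay(j, n, B) * 1e12 for j in range(1, n + 1)]
# Successive channels are offset by the composite bit slot T = 1/(nB) = 25 ps,
# so within each 100 ps interval (1/B) one bit from each channel appears.
```

With these numbers the four delays step through 0, 25, 50, and 75 ps, reproducing the interleaving described above.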
The optical delays above are typically implemented using fiber segments of controlled lengths. As an example, a 1 mm length of fiber introduces a delay of about 5 ps. The relative delay in each channel must therefore be controlled precisely to ensure proper alignment of the bits belonging to different channels. For the delay precision typically required of a 40 Gb/s OTDM signal, the fiber length must be controlled to within about 20 μm, which at 5 ps/mm corresponds to a timing tolerance of roughly 0.1 ps.
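The length-to-delay conversion follows from the group velocity of light in silica fiber. A minimal sketch, assuming a group index of about 1.47 for standard single-mode fiber (an illustrative value, not stated in the text):

```python
C = 299792458.0   # speed of light in vacuum, m/s
N_G = 1.47        # assumed group index of silica fiber (illustrative)

def fiber_length_for_delay(tau):
    """Fiber length (m) that delays an optical signal by tau seconds."""
    return tau * C / N_G

# Check the rule of thumb cited in the text: 1 mm of fiber -> ~5 ps delay
delay_per_mm = 1e-3 * N_G / C   # about 4.9e-12 s

# Length needed for one composite bit slot at 40 Gb/s (T = 25 ps): ~5.1 mm
slot_length_m = fiber_length_for_delay(25e-12)
```

This confirms the figures in the text: a 20 μm length error scales to about 0.1 ps of timing error.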
However, as the link rate increases beyond 40 Gb/s, conventional OTDM systems and methods begin to suffer from timing inaccuracies and smearing of the time differentials between bits of the composite signal launched into the optical fiber.