Three-dimensional sensors are important for autonomous vehicles, drones, and other applications. They may be used, for example, for obstacle detection in an autonomous vehicle. A conventional three-dimensional imaging system may include two cameras separated by a baseline distance. Images of an object captured by the two cameras appear at different image coordinates due to lateral disparity, and the distance from the imaging system to the object can be calculated from this disparity. The conventional three-dimensional imaging system, however, may have a number of limitations. For example, if the object has no distinguishable features, corresponding points in the two images cannot be identified. Also, if illumination is poor, or is so strong that it exceeds the dynamic range of the cameras, the depth measurement can become unreliable. Furthermore, if the object is located at a distance that is long relative to the baseline, the measurement accuracy may depend critically on the image quality and on the stability of the baseline distance, both of which may be difficult to control, especially in an automotive environment. Therefore, improved three-dimensional imaging systems are desired.
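The depth calculation described above can be sketched as follows. This is a minimal illustration, assuming a rectified pinhole stereo pair where depth Z = f·B/d (f = focal length in pixels, B = baseline, d = disparity in pixels); the function name and the numeric values are illustrative, not from the source.

```python
def depth_from_disparity(disparity_px: float,
                         focal_length_px: float,
                         baseline_m: float) -> float:
    """Depth Z = f * B / d for a rectified stereo pair.

    A zero or negative disparity corresponds to an object at or
    beyond infinity, for which no finite depth can be recovered.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px


# Illustrative values: 1000 px focal length, 0.5 m baseline,
# 25 px measured disparity between the two images.
z = depth_from_disparity(25.0, 1000.0, 0.5)
print(f"depth = {z} m")  # depth = 20.0 m

# Sensitivity: since Z = f*B/d, a 1 px disparity error changes the
# depth estimate by roughly Z^2 / (f*B), which grows quadratically
# with distance -- illustrating why long-range accuracy depends so
# strongly on image quality and baseline stability.
err_per_px = z**2 / (1000.0 * 0.5)
print(f"~{err_per_px} m of depth error per pixel of disparity error")
```

The sensitivity estimate at the end makes the long-range limitation from the passage concrete: at 20 m with this geometry, a single pixel of disparity error already shifts the depth estimate by nearly a meter.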