The present invention relates to a method for transforming a 3D object point into a 2D image point using linear pushbroom sensors, and more particularly to a transformation method for rectifying linear pushbroom images so that the images can be geometrically referenced, and for generating 3D geometric information from linear pushbroom images.
Herein, the terms '2D' and '3D' mean '2-dimensional' and '3-dimensional', respectively, unless otherwise specified.
Linear pushbroom images are images taken by sensors that are in motion during imaging and that have one focal point per line, or per part, according to the sensor's scanning mechanism. Sensors that take images in this manner are referred to as linear pushbroom sensors. Herein, the term 'sensor' means 'linear pushbroom sensor' unless otherwise specified. In contrast to linear pushbroom images, perspective images are images that have a single focal point per image.
For perspective images, the problem of mapping a 3D object point onto a 2-dimensional image point is well understood and solutions are in wide use. For linear pushbroom images, however, a robust numerical solution to the problem has not been found.
A previously proposed solution to the problem is based on the Newton-Raphson method. However, the Newton-Raphson method works only within a region where the equation to be solved varies monotonically, and hence it is very sensitive to the initial value. In some cases the method diverges or produces large errors; even in cases where it does work, it is not easy to choose appropriate initial values. Therefore, that solution cannot be applied to the rectification of linear pushbroom images or to the generation of 3D information from them.
In order to eliminate the difficulties of previous approaches, this invention proposes a new, powerful and robust method for transforming a 3D object point into a 2D image point for linear pushbroom images.
In order to achieve the goal, this invention proposes a method for transforming an object point in a 3D coordinate system into an image point on a linear pushbroom image captured by a linear pushbroom sensor and represented in a 2D coordinate system. The method comprises steps for: setting collinearity equations in relation with the object point and the image point for the linear pushbroom image; assuming an initial coordinate on a first coordinate axis in the 2D coordinate system and calculating the attitude of the linear pushbroom sensor using the initial coordinate; obtaining a temporary coordinate by solving the collinearity equation, while assuming that the linear pushbroom sensor keeps a constant attitude as calculated; calculating a differential between the initial coordinate and the temporary coordinate; comparing the differential with a given threshold; repeating the attitude calculation step through the differential comparison step after considering the temporary coordinate as the initial coordinate if the differential exceeds the given threshold; deciding on the temporary coordinate to be a final coordinate on the first coordinate axis if the differential does not exceed the given threshold; and obtaining a coordinate on a second coordinate axis in the 2D coordinate system by solving the collinearity equations using the final coordinate on the first coordinate axis.
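The iteration described above can be sketched in Python. This is a minimal illustration, not the definitive implementation: it assumes, purely for the sketch, that the sensor position is linear in the line coordinate, S(x) = S0 + v·x, and that the caller supplies a hypothetical function R_at(x) returning the 3x3 rotation matrix for the attitude at line x. Under those assumptions, holding the attitude fixed makes the first collinearity equation linear in x, so the temporary coordinate has a closed form.

```python
import numpy as np

def project(P, S0, v, R_at, f, x0=0.0, tol=1e-6, max_iter=50):
    """Transform a 3D object point P into a 2D image point (x, y).

    Sketch assumptions (not fixed by the method itself): the sensor
    trajectory is S(x) = S0 + v*x, and R_at(x) returns the rotation
    matrix for the sensor attitude at line coordinate x.
    """
    P, S0, v = (np.asarray(a, dtype=float) for a in (P, S0, v))
    x = x0                                   # initial coordinate on the first axis
    for _ in range(max_iter):
        R = R_at(x)                          # attitude at the current estimate, then held fixed
        # With R constant, 0 = r11*(X-XS) + r21*(Y-YS) + r31*(Z-ZS)
        # becomes linear in x and yields the temporary coordinate:
        x_new = (R[:, 0] @ (P - S0)) / (R[:, 0] @ v)
        done = abs(x_new - x) < tol          # differential vs. given threshold
        x = x_new                            # temporary coordinate becomes the new estimate
        if done:                             # differential below threshold: final coordinate
            break
    # Second axis: evaluate the second collinearity equation at the final x.
    R = R_at(x)
    d = P - S0 - v * x
    y = -f * (R[:, 1] @ d) / (R[:, 2] @ d)
    return x, y
```

For a sensor flying along the X axis with a level, constant attitude, the loop converges in one update, since the first equation is then exactly linear in x.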
Preferably, the collinearity equations are set as follows:

0 = -ƒ · [r11(X - XS) + r21(Y - YS) + r31(Z - ZS)] / [r13(X - XS) + r23(Y - YS) + r33(Z - ZS)]

y = -ƒ · [r12(X - XS) + r22(Y - YS) + r32(Z - ZS)] / [r13(X - XS) + r23(Y - YS) + r33(Z - ZS)]
in which X, Y and Z are the coordinates of an object point; XS, YS and ZS are the coordinates of the sensor at the moment of capturing the image of that object point; r11 through r33 are the elements of a rotation matrix R determined by the sensor attitude at that moment; ƒ is the focal length of the camera; and x and y are the coordinates of an image point.
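Given these definitions, the two right-hand sides can be evaluated directly. The sketch below is a straightforward transcription of the equations above; the function name and argument layout are illustrative only. The first returned value is a residual that must be zero at the true imaging instant (the line actually containing the point), and the second is the cross-track image coordinate y.

```python
import numpy as np

def collinearity(P, S, R, f):
    """Evaluate both pushbroom collinearity expressions for object
    point P, sensor position S, rotation matrix R, focal length f."""
    d = np.asarray(P, dtype=float) - np.asarray(S, dtype=float)
    denom = R[:, 2] @ d            # r13(X-XS) + r23(Y-YS) + r33(Z-ZS)
    x_residual = -f * (R[:, 0] @ d) / denom
    y = -f * (R[:, 1] @ d) / denom
    return x_residual, y
```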
It is also preferred that the temporary coordinate obtainment step comprises sub-steps for: calculating the elements of the rotation matrix R using the attitude of the linear pushbroom sensor; and solving the collinearity equation using those elements of the rotation matrix R.
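The first sub-step, computing the elements r11 through r33 from the sensor attitude, might look as follows. The roll-pitch-yaw parameterization and the composition order R = Rx(ω) · Ry(φ) · Rz(κ) are assumptions of this sketch; the method itself does not prescribe a particular angle convention.

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Rotation matrix R (elements r11..r33) from assumed attitude
    angles: omega (roll), phi (pitch), kappa (yaw), in radians."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz     # assumed composition order
```

Whatever convention is used, R must be orthonormal (R·Rᵀ = I), and a zero attitude yields the identity matrix.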
According to the present invention, there is also provided a machine-readable medium on which a program is recorded, wherein the program conducts to transform an object point in a 3D coordinate system into an image point on a linear pushbroom image captured by a linear pushbroom sensor and represented in a 2D coordinate system by taking steps for: setting collinearity equations in relation with the object point and the image point for the linear pushbroom image; assuming an initial coordinate on a first coordinate axis in the 2D coordinate system and calculating the attitude of the linear pushbroom sensor using the initial coordinate; obtaining a temporary coordinate by solving the collinearity equation, while assuming that the linear pushbroom sensor keeps a constant attitude as calculated; calculating a differential between the initial coordinate and the temporary coordinate; comparing the differential with a given threshold; repeating the attitude calculation step through the differential comparison step after considering the temporary coordinate as the initial coordinate if the differential exceeds the given threshold; deciding on the temporary coordinate to be a final coordinate on the first coordinate axis if the differential does not exceed the given threshold; and obtaining a coordinate on a second coordinate axis in the 2D coordinate system by solving the collinearity equations using the final coordinate on the first coordinate axis.