It is often required to determine securely a difference measure between two signals. Conventional methods typically use cryptographic hash functions to determine whether two signals are different. If the hashes of signals x and y are equal, then the signal x equals the signal y, assuming that hash collisions occur with a negligibly low probability. Such comparison of cryptographic hashes is fundamental to most password and key management applications.
An essential property of conventional cryptographic hash functions is that the hashes do not preserve the underlying structure of the signals that are compared. Specifically, if one signal is a noisy version of another signal, the cryptographic hashes of the two signals are different, even if the noise is small. Therefore, a cryptographic hash cannot, by itself, be used for comparing the signals in noisy environments, e.g., storage devices and communication channels.
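The property described above can be illustrated with a short sketch. The choice of SHA-256 and the sample values here are illustrative only; the behavior holds for any conventional cryptographic hash function:

```python
import hashlib

x = bytes([10, 20, 30, 40])
y = bytes([10, 20, 31, 40])  # x with one unit of noise in a single sample

hx = hashlib.sha256(x).hexdigest()
hy = hashlib.sha256(y).hexdigest()

# The hashes are completely different even though the signals differ
# by only one unit in one component.
print(hx == hy)  # False
```

Because the hash reveals nothing about how close y is to x, it can only answer "equal or not equal," which is why a distance computation is needed for noisy signals.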
Determining the difference between signals in a secure manner is important in many applications. For example, private medical data are to be analyzed and classified by a third party, without revealing the medical data to the third party. In addition, the third party does not want to reveal the classification method, nor the database used for the classification.
This problem is often defined as a secure multiparty computation (SMC). Computationally secure methods, such as oblivious transfer (OT) and secure inner product (SIP), can be used as primitives to perform more complicated operations, thereby enabling SMC. For example, U.S. patent application Ser. No. 11/005,293 describes such a method. That method performs object detection without revealing the image supplied by the user, or the classification method used by the classifier. However, the method requires a large number of exchanges between the user and the classifier, and the overhead, in terms of exchanges and key management, is very large.
The differences between signals are determined according to distance metrics. Examples of the distance metrics are the squared distance, the Hamming distance, and the Manhattan distance. There are a number of methods in the art for determining securely the squared distance and the Hamming distance between the signals. See, for example, U.S. patent application Ser. No. 12/495,721 filed by Rane et al. on Jun. 30, 2009, and incorporated herein by reference. However, there is no two-party method in the art for determining securely the Manhattan distance between two signals while having low communication overhead. As defined herein, “securely” means that each party, e.g., a processor, keeps its signal secret from the other party throughout the computation.
The Manhattan distance between two points is the sum of the absolute differences of their coordinates. Manhattan distance is also referred to as the L1 distance, taxicab distance, city block distance and rectilinear distance.
In the case of signals, the Manhattan distance between signals of dimension n, x=(x1, x2, . . . , xn) and y=(y1, y2, . . . , yn), is

||x − y||_1 = Σ_{i=1}^{n} |x_i − y_i|,

where x and y are signals, and x_i and y_i are the individual components of the signals x and y.
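As a plain (non-secure) illustration of this definition, the Manhattan distance can be computed directly when both signals are held by one party; the secure two-party setting discussed herein is precisely the case where this direct computation is not possible:

```python
def manhattan_distance(x, y):
    # L1 distance: sum of the absolute differences of the components.
    return sum(abs(xi - yi) for xi, yi in zip(x, y))

# Components differ by 3, 2, and 0, so the distance is 5.
print(manhattan_distance((1, 2, 3), (4, 0, 3)))  # 5
```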
There are a number of methods in the art for approximating the Manhattan distance in different metric spaces. However, none of those methods is secure by design, and all require a significant communication overhead between the parties. Hence, it is desired to determine the Manhattan distance between two signals securely.
Similarly, the squared distance between two points is the sum of the squared differences of their coordinates. Thus, the squared distance between signals of dimension n, x=(x1, x2, . . . , xn) and y=(y1, y2, . . . , yn) is
(||x − y||_2)^2 = Σ_{i=1}^{n} (x_i − y_i)^2.
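For comparison with the Manhattan distance above, a plaintext (non-secure) sketch of the squared distance, again assuming a single party holds both signals:

```python
def squared_distance(x, y):
    # Squared L2 distance: sum of the squared differences of the components.
    return sum((xi - yi) ** 2 for xi, yi in zip(x, y))

# Components differ by 3, 2, and 0, so the squared distance is 9 + 4 + 0 = 13.
print(squared_distance((1, 2, 3), (4, 0, 3)))  # 13
```

Unlike the Manhattan distance, the squared distance expands into inner products of the signals, which is why secure inner product primitives suffice for it but not directly for the L1 case.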