There are circumstances in which human senses are incapable of, impaired in, obstructed from, or vulnerable to interference (of various sorts) with, acquiring information relevant to an environment or place at or near a person's location or, by way of another example, in relation to a tracked, projected or otherwise determined path along which the person is walking, driving, etc. Even with the aid of conventional listening devices (e.g., microphones, hearing aids) and viewing devices (e.g., cameras, corrective lenses), in some circumstances even persons without hearing or vision impairments are prevented from acquiring, or at least timely acquiring, such information, which in some instances may be of critical importance or interest to that person. For example, a person who is listening to music while driving, or walking or bicycling while wearing headphones, may not hear or see an emergency (or other motorized) vehicle approaching from around a sharp corner in the road in sufficient time to avoid a collision. Human senses and conventional listening and viewing devices alike may, in some circumstances, be unable to sense or detect objects, events or phenomena of actual or potential interest or concern to a person, or sounds or information associated with or related to such objects, events or phenomena, even if the person is attempting to hear or see such objects, events or phenomena utilizing all or various of such senses and conventional listening and viewing devices as may be available. In some environments and circumstances, one or more of the aforementioned human senses and conventional listening and viewing devices may not be suitable for sensing (or detecting) in relation to acquiring sounds or information associated with or related to such objects, events or phenomena.
It would be helpful to be able to provide a listening device, system or methodology that facilitates sensing (or detecting) of or in relation to such objects, events or phenomena, and obtaining or providing sounds or information associated with or related to such objects, events or phenomena.
It would be helpful to be able to provide a listening device or system that provides aural capability (e.g., short range aural capability, long range aural capability, or both) in a manner other than, and independent of whether the device or system utilizes, conventional acoustic or vibration sensing (or detecting) responsive to the movement of air or other molecules.
It would be helpful in some circumstances to be able to provide a listening device or system capable of operation in a vacuum or other environment entirely or substantially devoid of matter (or atmosphere).
It is contemplated and envisioned by the inventor(s) of the present invention(s) that it would be helpful in some circumstances to be able to provide a listening device or system (e.g., such as described herein) to a device or system that does not generate or otherwise provide or obtain a viewing output (e.g., captured images).
It is contemplated and envisioned by the inventor(s) of the present invention(s) that it would be helpful in some or other circumstances to be able to provide a listening device or system (e.g., such as described herein) to a device or system that is capable of or adaptable to be capable of generating or otherwise providing or obtaining a viewing output (e.g., captured images).
It is contemplated and envisioned by the inventor(s) of the present invention(s) that it would be helpful to be able to provide a listening device or system (e.g., such as described herein) to one or more of the following: a device or system for providing or obtaining viewing and/or listening outputs, signals, or content; a camera or other image capturing device or system; a device or system for providing, obtaining, utilizing, generating or reproducing sounds; hearing aids and hearing augmentation devices and systems; devices and systems facilitating or utilizing sensory enhancement and/or sensed feedback; headphones; wearable technology items or apparel; personal computing and/or communications devices or peripherals; warning or detection devices or systems; safety and other specialized devices or systems; detection or measurement (and analysis) devices and systems involving, for example, aerospace, diagnostic, medical, meteorological, military, security, or space applications.
In relation to the aerospace industry, for example, many interesting and potentially observable things can happen during an aerospace vehicle flight test. On-board sensors provide a good view of the vehicle state, assuming that radio telemetry links can be maintained or on-board recorders can be recovered. Motion picture film cameras have been used onboard test and chase aircraft to provide visual records of flight test events. Recoverable film camera pods were sometimes carried on early space launch vehicles, but their use was limited by the technical complexities and expense of post-flight film recovery. These camera pods observed rocket engine plume characteristics, staging, engine start and shutdown, propellant tank ullage, spacecraft deployment and other flight phenomena. Space launch vehicles now often carry non-recoverable television cameras with real-time video downlinks.
Additional flight test information can be remotely obtained using space, airborne, ship and ground based sensors. Space assets, and to a lesser extent aircraft, are difficult to access and very costly to use. In the past, ground based flight test sensors were primarily large tracking radars and long range optics. These systems are government-owned, with access being expensive and difficult or impossible to obtain for private use. Moreover, the high acquisition and operating costs and operational complexities of currently available long range telescopic camera systems frequently result in an undersampling of desired imagery and associated spatial, temporal and geometric data.
Cinema started in the late 1800s with the introduction of the moving picture. This was followed shortly thereafter by color movies. Next was the introduction of recorded sound, with early experiments starting around 1900, major film studio interest by 1927, and widespread popularity by 1930. For a long time the sound recordings were monophonic, until the production of the stereophonic Disney film “Fantasia” in 1940. By 1953 stereophonic sound was becoming commonplace, followed by multichannel audio recording. Essentially all Hollywood feature films produced since 1930 include a sound track, and now virtually all feature films include multichannel audio. Of great importance typically in modern cinema is the technical production of high resolution color imagery of excellent quality, in addition to a high quality synchronized multichannel sound track.
In relation to a camera or other image capturing device or system having aural as well as visual capabilities, it would be helpful to be able to provide one or more of: improved sensor performance and/or utilization, better low light (imaging) sensitivity, and simplified associated electronics.
Electro-optical remote sensing satellites are increasingly able to provide high-resolution video observations of terrestrial activity. On-orbit videos show the dynamic spatial and temporal characteristics of both cultural and natural activity. It would be interesting and/or useful if such satellites could also remotely sense (or provide or in some manner facilitate remote sensing of) associated terrestrial acoustic activity. The fusion of visual and aural observations would improve an overall understanding of a scene and provide a more cinema-like viewing experience. Remote Acoustic Sensing (RAS) technology potentially provides a pathway for obtaining/sensing acoustic observations of planetary surface activity from a satellite, spacecraft, or other space-based vehicle or device.
As is well known, sound—the brain's interpretation of a modulated frequency sensed by the human ear with air particles being the carrier—does not propagate through a vacuum. In a vacuum, air is absent, but photons can carry corresponding or equivalent information content. The ear cannot sense photons, but optical signals and their information content can be converted into an acoustic signal (referred to herein as an “acousto-optic signature”) that provides (e.g., facilitates obtaining or reproduction of, or is otherwise representative of) information content corresponding or equivalent to that of the optical signals.
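The conversion described above can be illustrated with a minimal sketch. The function below (hypothetical, for illustration only; the specification does not prescribe any particular algorithm) takes a sampled photodetector intensity time series carrying an audio-frequency modulation, removes the steady (DC) light level, and scales the residual modulation into the range expected for audio reproduction:

```python
import numpy as np

def optical_to_audio(intensity):
    """Convert a photodetector intensity time series into a
    normalized audio waveform: strip the constant light level,
    keep the audio-frequency modulation, and scale the result
    to the [-1, 1] range expected for audio playback."""
    x = np.asarray(intensity, dtype=float)
    x = x - x.mean()              # remove the steady (DC) light level
    peak = np.max(np.abs(x))
    if peak > 0:
        x = x / peak              # normalize to [-1, 1]
    return x

# Illustrative input: steady light carrying a 440 Hz intensity ripple
fs = 8000                          # samples per second (assumed)
t = np.arange(fs) / fs
intensity = 1.0 + 0.05 * np.sin(2 * np.pi * 440 * t)
audio = optical_to_audio(intensity)
```

In this toy case the recovered waveform is simply the 440 Hz ripple rescaled to full audio amplitude; a practical readout would additionally require band-limiting and noise handling not shown here.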
Remote Acoustic Sensing (RAS) (e.g., applied to RASSat as described herein—see below) is a form of a distributed microphone system that separates an acoustic sensor into two (or more) spatially separated elements. It utilizes/involves, for example, a naturally formed in situ terrestrial acousto-optic modulator (AOM) and a remote readout sensor. In the case of a space-based remote readout sensor, an attempt to obtain acoustic observations (e.g., attempting to hear terrestrial sounds from an orbiting spacecraft) utilizing RAS would not require acoustic wave propagation in space, but would instead passively couple the AOM to the readout sensor over orbital distances using ambient light fields. Acoustic wave propagation is only required between the sound source and the in situ AOM, and not to the readout sensor.
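The two-element RAS chain described above can be sketched end to end as follows (all numbers and the modulation model are illustrative assumptions, not taken from the specification): a sound wave reaches an in situ AOM, which imposes a small audio-frequency modulation on the ambient light it reflects; the distant readout sensor observes only that light field, never the acoustic wave itself, and recovers the sound from it.

```python
import numpy as np

fs = 16000                                    # sample rate (assumed)
t = np.arange(fs) / fs
sound = np.sin(2 * np.pi * 300 * t)           # source near the in situ AOM

# In situ AOM: reflected light = ambient level * (1 + depth * sound),
# i.e., a small intensity modulation riding on the ambient light field
ambient, depth = 1.0, 0.02
reflected_light = ambient * (1.0 + depth * sound)

# Remote readout sensor: sees only the light field (coupled by light,
# not by acoustic propagation) and recovers the modulation from it
received = reflected_light
recovered = received - received.mean()        # strip the ambient level
recovered = recovered / np.max(np.abs(recovered))

# The recovered waveform tracks the original sound at the source
correlation = np.corrcoef(sound, recovered)[0, 1]
```

Because the toy modulation is linear and noiseless, the recovered waveform correlates essentially perfectly with the source sound; the point of the sketch is only that the acoustic path ends at the AOM, with light carrying the information the rest of the way.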
In the case of a space-based remote readout sensor, utilization of conventional RAS (as presently known) to obtain acoustic observations is problematic with respect to, among other things, the logistics, cost and intrusiveness of adding/integrating required/additional sensors and wiring to a spacecraft, for example, particularly on multiple spacecraft sides and locations. (See e.g., FIGS. 11 and 12, which show a six-sided full sphere video camera and a CubeSat with solar arrays on all six sides, respectively.)
It would be helpful to be able to provide devices and methods for obtaining or providing listening or detection of audio frequency modulated optical energy in relation to, and/or facilitating, one or more of the methodologies/technologies described herein.