With the advent of low-cost microprocessors and the explosive growth of the PC industry, electronic gaming has proliferated at a blinding rate. Games and simulations are now executed, in homes and arcades, on a vast array of available hardware platforms, each of which yields its own unique combination of complexity, fidelity, and cost. Depending on the hardware implementation, game players may have many different types of control input devices at their disposal for interacting with a game or simulation. For example, driving games and simulations may use any combination of control input devices such as steering wheels, gear shifters, and gas/brake/clutch pedal units. Flight games and simulations may use any combination of control input devices such as throttles, weapons controllers, joysticks, rudder pedals, and flight yokes. First-person-perspective action games may use any combination of joystick, mouse, or 3D controller. In most cases, a person playing a game or simulation is sitting in a seat of some kind while interacting with the hardware control input devices.
In order for tactile sensation to be effectively implemented by a modern electronic gaming system, where there are so many inconsistencies between various hardware systems and their software applications, an entirely new approach is necessary. To be most effective, a gaming or simulation system must be configured to provide the illusion that all of the available disparate control input devices that control a given simulation are each part of a unified whole, rather than independent, physically disconnected devices. This is necessary to suspend the disbelief of the person interacting with the game or simulation system.
Due to the current and future countless implementations of computer based video game and simulation systems, and due to the continually expanding library of game and simulation applications that can be executed on such systems, a need exists in the art for a truly universal tactile feedback system that can function without regard to the specific apparatus, implementation, or application of any given system. Furthermore, a need exists in the art for a universal system that can accommodate currently existing and future control input devices, via simple and inexpensive tactile feedback actuators, that can be readily connected to or embedded within said devices, such that these disparate devices become part of a unified whole.
Additionally, a need exists in the art for a universal system that will function both with and without support by the host gaming apparatus, achieving said functionality by implementing a reprogrammable audio analysis function and/or a direct digital control function. Moreover, a need exists in the art for a tactile feedback seating unit that is not based on a low frequency speaker system, such that vehicle-based games and simulations can be more realistically rendered, both with and without support by the host gaming apparatus. Furthermore, a need exists in the art for a vest-based tactile sensation generator, such that both open-body games and vehicle-based games can be more realistically rendered, both with and without support by the host gaming apparatus. Finally, a need exists in the art for a universal tactile feedback system that is versatile, inexpensive, reliable, lightweight, quiet, reconfigurable, reprogrammable, and expandable.
Accordingly, it is one of many objectives of some example embodiments to introduce a tactile feedback seating unit that can produce tactile feedback within a seat, that is not based upon a low frequency speaker system, and that can function via host-independent digital audio analysis and/or host-dependent direct digital control, the digital signal not necessarily being specific to the actuators in the seat, but rather being a general control signal for a distributed system. The unit thereby represents tactile sensations occurring in real time within a computer generated game or simulation, such that the person sitting in the seat feels this representation, and the tactile feedback provided by such a system further enhances the believability of the simulation. It is an additional objective of some example embodiments to implement the tactile feedback seating unit as a self-contained unit, where a plurality of tactile feedback actuators are embedded inside a semi-rigid sealed foam cushion, such that the unit is portable, lightweight, and quiet, and can fit in almost any chair and function with almost any application.
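The two operating modes described above can be illustrated with a short sketch. This is not the application's implementation; all function names, message fields, and the crude amplitude-envelope heuristic are assumptions made for illustration. Both modes reduce to the same output: a set of generic drive levels for a distributed group of actuators.

```python
# Illustrative sketch only: two paths to the same generic actuator commands.
from dataclasses import dataclass
from typing import Sequence

@dataclass
class ActuatorCommand:
    actuator_id: int
    intensity: float  # 0.0 (off) to 1.0 (full strength)

def from_audio_analysis(samples: Sequence[float], n_actuators: int) -> list:
    """Host-independent mode: derive a drive level from the game's audio,
    with no cooperation from the host software (a crude envelope follower)."""
    if not samples:
        return [ActuatorCommand(i, 0.0) for i in range(n_actuators)]
    envelope = sum(abs(s) for s in samples) / len(samples)  # mean amplitude
    level = min(1.0, envelope)
    return [ActuatorCommand(i, level) for i in range(n_actuators)]

def from_direct_control(message: dict, n_actuators: int) -> list:
    """Host-dependent mode: the host sends a general digital control message,
    not actuator-specific data; the distributed system maps it locally."""
    level = min(1.0, max(0.0, message.get("intensity", 0.0)))
    return [ActuatorCommand(i, level) for i in range(n_actuators)]
```

The design point this sketch captures is that the control signal is general purpose: the same message or audio stream can drive a seat, a vest, or an actuator embedded in a control input device, each of which interprets the level in its own way.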
The implementation of tactile feedback in computer gaming, simulation, and training systems, where “tactile feedback” refers to physical sensations that one feels while interacting with said systems, has heretofore been plagued by the fundamental limitation of poor developer support. Typically, software developers are pressed for both time and money, and they are under constant pressure to release their software into the marketplace as soon as is practicable. In this competitive field, it is difficult for any given developer to spend the time necessary to create thoughtful and artistic tactile feedback. With regard to tactile feedback, proper implementation is more of an art than a science. No right or wrong answer necessarily exists for how a particular tactile event should feel, such as driving a car on gravel or firing a fictional laser cannon in a fictional spacecraft. Therefore, much time can be spent (and from the developer's point of view, even wasted) in tweaking the subjective feel of the desired effects, ad infinitum. In the end, there is a vast difference between the quality of tactile feedback that was merely implemented in a given software application and tactile feedback that was thoughtfully and artistically crafted to render effective results (given, of course, the limitations of the particular tactile feedback hardware in use). In cases where the code that renders tactile feedback is incorporated by any given developer into that developer's wares, it is difficult for that code to be upgraded once a given product is released. Developers are usually fighting the time constraints of the post-release cycle, in which updates to a given piece of software are periodically released to fix bugs or to add features that were left out of the original release (or are functioning with substantial deficits) due to time constraints in effect during the prerelease period.
Ultimately, servicing tactile feedback support is a low priority for the developer, and the effectiveness of said tactile feedback suffers as a result. Furthermore, given the subjective nature of tactile feedback, its effectiveness and quality vary greatly across titles, applications, and implementations.
With regard to the patent applications and patents for which this application is a continuation, much of the difficulty presented by lack of developer support has been overcome by AudioSense®, which is a sound analysis technology that generates tactile feedback in real time without developer support of any kind. However, AudioSense® has inherent limitations that are overcome and/or minimized by intelliVIBE®, which is the type of tactile feedback that has typically required developer support—and therefore, intelliVIBE® suffers from the very limitations described previously herein. U.S. patent application Ser. No. 08/409,327 (filed Mar. 23, 1995, now U.S. Pat. No. 5,669,818) and Ser. No. 08/309,763 (filed Sep. 21, 1994, now U.S. Pat. No. 5,684,722) are incorporated herein in their entirety by reference.
Until the innovation that is the subject matter of this patent application, software developers were relied upon to decide what events in their software required what sensations, and what the triggering mechanisms were for initiating those sensations. For example, Immersion Corporation of 801 Fox Lane, San Jose, Calif. 95131 USA (NASDAQ: IMMR) markets and licenses technologies collectively falling under the trademark name TouchSense™. In using TouchSense™, developers can utilize a toolkit known as “Immersion Studio” (as of this writing, currently in version 4.1.0). When using Immersion Studio, developers can more easily create tactile effects, but are relied on to determine the subjective artistic tactile feel of those effects, and when such effects are to be rendered. This is precisely the burden that this patent application is intended to relieve.
Therefore, a need exists in the art for a relatively quick and simple method by which any given software developer is required only to activate a simple shared data structure, which will provide real-time telemetry to an external executable, application, function, or code segment (known as an “intelliVIBE® module”), which in turn will itself generate the necessary tactile feedback control signals via general purpose algorithms that are shaped by said telemetry, thereby relieving the developer of the most substantial burdens of supporting an interface such as intelliVIBE®, and/or Immersion Corporation's TouchSense™, and/or any other proprietary methodology for generating tactile feedback. For reference, throughout this application, the term “tactile feedback” should be understood to include and be synonymous with the term “force feedback”. In much the same way that AudioSense® audio analysis technology allows tactile feedback hardware to function with no support whatsoever from any given developer of any given software title, application, or hardware system, this telemetry based method allows intelliVIBE® (and other similar interfaces) to self-generate effective and thoughtful tactile feedback with minimal, simple, and one-time-only support by any given developer. Furthermore, due to the telemetry basis for the tactile feedback, the teachings herein can be applied to other areas, such as remotely piloted vehicles, remotely controlled vehicles, UAVs (Unmanned Aerial Vehicles), spacecraft, and other types of tele-operated or tele-presence vehicles, where telemetry is available or can be provided, thereby producing effective tactile feedback for the operators or spectators of such vehicles.
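The division of labor described above can be sketched as follows. The telemetry field names and the weighting constants are hypothetical, chosen only to illustrate the principle: the developer merely populates a shared structure, and the external module's general purpose algorithms decide, on their own, how each telemetry value should feel.

```python
# Illustrative sketch only: the developer's entire burden is filling in
# Telemetry; the external module owns the subjective "feel" logic.
from dataclasses import dataclass

@dataclass
class Telemetry:
    speed: float        # vehicle speed, arbitrary units (hypothetical field)
    rpm: float          # engine RPM (hypothetical field)
    surface: str        # e.g. "asphalt" or "gravel" (hypothetical field)
    weapon_fired: bool  # one-shot event flag (hypothetical field)

def vibration_level(t: Telemetry) -> float:
    """General purpose algorithm shaped by telemetry: the module, not the
    game developer, decides how each field maps to a tactile sensation."""
    level = min(1.0, t.rpm / 8000.0) * 0.5       # engine rumble scales with RPM
    if t.surface == "gravel":
        level += min(1.0, t.speed / 50.0) * 0.3  # road texture scales with speed
    if t.weapon_fired:
        level += 0.4                             # transient kick for weapon fire
    return min(1.0, level)                       # clamp to actuator range
```

Because the developer only exposes raw game state and never authors effects, the module's algorithms can later be tuned or upgraded without any change to the released game, which is precisely the post-release maintenance burden discussed earlier.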