For many years, paper served as the basic medium for handling information in the health care industry. From the scholarly work reporting on medical studies, to health records and prescriptions, nearly every aspect of health-care-related information and processes was formerly managed using paper. However, as the potential of network computing has been applied to the health care industry, many of the old ways are rapidly changing.
By implementing computers and network computing in the health care industry, information management and the actual provision of medical care have rapidly improved. As evidenced by the development of electronic medical records, the ability to handle prescriptions and order entries electronically, and the development of numerous software and hardware electronic devices that improve capabilities in all areas of providing medical services, the health care industry is in a period of rapid change.
As indicated above, part of the rapid change being experienced in the health care industry relates to the development of new software for improving various aspects of information management, including new services and applications associated with network computing environments. The services and applications that are developed, and continue to be developed, are typically supported by a combination of hardware platforms and corresponding software. For example, a new software release may be planned for integration on a computer system that interacts with other existing, and perhaps external, systems. Accordingly, it is important that, when developing a new software release, compatibility and interoperability be accounted for with respect to the external systems. However, new software is often produced with some level of urgency to get to market, and there is significant risk of damage to marketability if potential problems are not identified prior to release. Thus, from a business perspective, it may be important that software testing be enabled to start as early as possible, so that the overall time required to bring a quality product to market can be minimized.
To accomplish the task of enabling testing for interoperability, developers typically need to “fake” the corresponding external systems. Currently, static, preconfigured simulators are typically employed to conduct testing with third-party live test networks. These simulators typically must be configured to give a certain kind of response before activity is started. Then, orders may be processed to generate the configured response. Accordingly, to obtain a different response, activity must be stopped so that the simulator can be reconfigured to provide the corresponding different response.
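The stop-reconfigure-restart cycle described above can be sketched as follows. This is a minimal illustration with hypothetical names (`StaticSimulator`, `handle_order`, and the response dictionaries are not drawn from any particular product); it shows only the constraint that the response is fixed before activity starts and cannot change while the simulator is running.

```python
class StaticSimulator:
    """Stands in for an external system with one preconfigured response."""

    def __init__(self, canned_response):
        self.canned_response = canned_response
        self.running = False

    def start(self):
        self.running = True

    def stop(self):
        self.running = False

    def reconfigure(self, new_response):
        # Reconfiguration is only possible while activity is stopped.
        if self.running:
            raise RuntimeError("stop the simulator before reconfiguring")
        self.canned_response = new_response

    def handle_order(self, order):
        # Every incoming order receives the same preconfigured response.
        if not self.running:
            raise RuntimeError("simulator is not running")
        return self.canned_response


sim = StaticSimulator({"status": "ACCEPTED"})
sim.start()
print(sim.handle_order({"order_id": 1}))  # same response for every order
sim.stop()                                # activity must stop...
sim.reconfigure({"status": "REJECTED"})   # ...before a different response is set
sim.start()
print(sim.handle_order({"order_id": 2}))
```

Note that the simulator cannot vary its answer based on the order's content or timing, which is precisely the limitation discussed next.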
For interaction with a relatively simple external system, the above-described mechanism for testing may be adequate, although a bit inflexible. However, with some new and complex systems that are currently being developed, complicated and asynchronous responses may be provided by the external networks. Simulators like those described above fall significantly short in such cases because, for example, the number of different kinds of messages, and the range of parameter values within those messages, can be so large and varied that conventional simulators may not be able to effectively simulate such new and complex systems.
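The shortfall can be illustrated with a hypothetical external system whose behavior a single canned response cannot capture: one order may trigger a sequence of asynchronous messages, and both the kinds of messages and their parameter values depend on the order itself. All names here (`external_system`, the `status` values, the `urgent` flag) are invented for illustration only.

```python
import asyncio

async def external_system(order):
    """Emit a sequence of asynchronous messages for one order; the number
    and kind of messages vary with the order's content."""
    yield {"order_id": order["id"], "status": "RECEIVED"}
    await asyncio.sleep(0)  # messages arrive at unpredictable times
    if order.get("urgent"):
        yield {"order_id": order["id"], "status": "ESCALATED"}
    yield {"order_id": order["id"], "status": "COMPLETED"}

async def main():
    # Two orders yield different message sequences, so no single
    # preconfigured response can stand in for this system.
    async for msg in external_system({"id": 7, "urgent": True}):
        print(msg)
    async for msg in external_system({"id": 8}):
        print(msg)

asyncio.run(main())
```

Because the response is a data-dependent stream rather than a single value, a simulator that must be stopped and reconfigured for each distinct response cannot keep pace with such a system.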
Accordingly, it may be desirable to provide a network simulator that may address some of the issues mentioned above.