Enterprise Resource Planning (ERP) packages are business management systems that integrate all facets of a business, including planning, manufacturing, sales, and marketing. One of the major shortcomings of traditional ERP packages is that they are inflexible and unable to handle highly complex business processes. Such business applications are initially created using specific business model templates, which are customized by manually changing settings and parameters in the software package. Changes in the business processes are therefore always constrained by the templates in these packages.
These shortcomings are addressed by a so-called “framework-based” business application technology. The underlying principle of the framework is to create a meta-model of the business processes and then use this model to generate the application directly. The meta-model is built from very simple elements. Subsequent changes of any kind can then be made without any constraints imposed by a template, because the new application is not merely a variation of the old application with changed settings and parameters; it is generated directly from the updated meta-model and actually replaces the old application. A framework-based implementation allows easy configuration of application components to create dynamic applications. This dramatically reduces both the overall time needed to create a business application from a business model and the effort needed to reconfigure the application every time the business model changes.
The available commercial technologies based on the “framework-based” technique use a combination of object-oriented and software agent technology. The various business documents, and the operations that can be performed over these documents, are modeled as objects. The meta-model is prepared by taking these document and operation objects and linking them together using a visual interface. The business application is generated by creating software agents that execute the operations over the documents. The connections between different agents are created based on the processes defined in the meta-model. This agent framework is essentially the backend of the application.
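The linking of document and operation objects into agents can be illustrated with a minimal sketch. The class names here (Document, Operation, Agent) are illustrative assumptions, not the API of any particular commercial framework, and Python is used for brevity:

```python
# Sketch of a framework-based meta-model: documents and operations are
# objects, and agents generated from the meta-model execute the
# operations over the documents.

class Document:
    """A business document with a mutable state."""
    def __init__(self, name, state):
        self.name = name
        self.state = state

class Operation:
    """An operation that can be performed over a document."""
    def __init__(self, name, transition):
        self.name = name
        self.transition = transition  # function: old state -> new state

class Agent:
    """Generated from the meta-model; executes its operations over a document."""
    def __init__(self, operations):
        self.operations = operations

    def execute(self, doc):
        for op in self.operations:
            doc.state = op.transition(doc.state)

# Regenerating the application from an updated meta-model amounts to
# rebuilding the agents from a new list of Operation objects.
order = Document("SalesOrder", "created")
agent = Agent([Operation("approve", lambda s: "approved"),
               Operation("ship", lambda s: "shipped")])
agent.execute(order)
print(order.state)  # -> shipped
```

The point of the sketch is that no template constrains a change: replacing the operation list and rebuilding the agent is equivalent to regenerating the application from the updated meta-model.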
Discrete event simulation concerns the modeling of an operational system, as it evolves over time, by a computer representation in which state variables change instantaneously at the discrete points in time at which events occur. An event is defined as an instantaneous occurrence that may change the state of the system. In this way an artificial history of the system can be generated by operating a computer model of the system. This kind of rapid experience acquisition is normally infeasible with the real system, because experimentation with the real system is often disruptive, seldom cost-effective and sometimes impossible. The capability of discrete event simulation to portray random effects and the dynamic behavior of an operational system makes it a very powerful technology for operational performance analysis.
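The mechanics just described, a future-event list, a clock that jumps between event times, and state variables that change only at those instants, can be sketched with a minimal example. The single-server queue modeled here is an illustrative assumption, not a system from the present application:

```python
import heapq
import random

def simulate_queue(arrival_rate, service_rate, horizon, seed=0):
    """Minimal discrete event simulation of a single-server queue.

    State variables (queue length, server busy) change only at the
    discrete instants at which events occur; the clock jumps from one
    event time to the next rather than advancing continuously.
    """
    rng = random.Random(seed)
    events = []  # future-event list: (time, kind)
    heapq.heappush(events, (rng.expovariate(arrival_rate), "arrival"))
    clock, queue, busy, served = 0.0, 0, False, 0
    while events:
        clock, kind = heapq.heappop(events)  # next event in time order
        if clock > horizon:
            break
        if kind == "arrival":
            # schedule the next customer arrival
            heapq.heappush(events, (clock + rng.expovariate(arrival_rate), "arrival"))
            if busy:
                queue += 1
            else:
                busy = True
                heapq.heappush(events, (clock + rng.expovariate(service_rate), "departure"))
        else:  # departure: a service has completed
            served += 1
            if queue > 0:
                queue -= 1
                heapq.heappush(events, (clock + rng.expovariate(service_rate), "departure"))
            else:
                busy = False
    return served

print(simulate_queue(arrival_rate=1.0, service_rate=2.0, horizon=1000.0))
```

Each run of the loop generates one artificial history of the system; varying the random seed produces the random effects that make such histories useful for performance analysis.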
The High Level Architecture (HLA) is a general-purpose architecture for simulation reuse and interoperability. The HLA was developed under the leadership of the Defense Modeling and Simulation Office (DMSO) to support reuse and interoperability across the large numbers of different types of simulations developed and maintained by the Department of Defense. The Object Management Group (OMG) adopted the HLA as the Facility for Distributed Simulation Systems 1.0 in November 1998. The HLA was approved as an open standard through the Institute of Electrical and Electronics Engineers (IEEE), as IEEE Standard 1516, in September 2000. The HLA MOA was signed and approved in November 2000. An HLA-compliant simulation, termed a federation, is a collection of simulation models, termed federates.
In today's business environment, “change” has emerged as the only constant. The speed, cost and efficiency with which these changes are executed have become an important competitive advantage for a corporation. However, improvements in operations and business are often discarded at the design table today, as their implementation involves too much work and the advantage is lost by the time the changes are made. Further, the effects of business changes on operations, and vice versa, are investigated either by employing external consultants or by assigning internal resources, but seldom involve validation of the findings through techniques such as simulation.
Framework-based business application software systems have reduced the implementation time and the effort needed to reconfigure an enterprise application to incorporate alternative policies and changes in the highly complex business processes of organizations.
Within prior art framework-based implementations of business applications, simulations have been used to increase the efficiency of the application and test for system integrity as illustrated in FIG. 1 (Prior Art). The business processes themselves are deterministic in nature.
Randomness is introduced into a scenario by programs that emulate the real situation according to some probability distribution (i.e. simulation models) and by data files containing scenario-specific data that emulate the databases. In the example of FIG. 1, the programs are Java programs which emulate the externals (e.g. a customer coming onto a web site and placing an order). Also in the example of FIG. 1, the data files are “XML scenario files”. They represent a particular state of the corporation and can be used to define scenarios; they emulate the databases of the corporation. A scenario is a set of chosen policies together with a particular configuration of the system. It defines a situation corresponding to a real-life situation that could occur, and includes a sequence of events following a certain configuration and initialization of the system. As an example of a particular scenario, one could choose to have all the machines in an operations model heavily loaded with orders and then add more orders. First there is a warm-up in which orders are fed into the system to fill the queues.
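A scenario in this sense, a chosen configuration plus an initialization such as a warm-up that pre-fills the queues, can be sketched as follows. The field names and the round-robin order assignment are illustrative assumptions, not the actual XML scenario-file schema:

```python
# Sketch of a scenario: a particular configuration of the system plus a
# sequence of initialization events (here, a warm-up that fills the
# machine queues with orders before more orders are added).

scenario = {
    "config": {"machines": 3, "policy": "first-come-first-served"},
    "warmup_orders": 30,   # orders fed in during the warm-up
    "extra_orders": 10,    # orders added once the machines are loaded
}

def initialize(scenario):
    """Build per-machine queues and run the warm-up phase."""
    queues = [[] for _ in range(scenario["config"]["machines"])]
    order_id = 0
    # Warm-up: feed orders into the system to fill the queues.
    for _ in range(scenario["warmup_orders"]):
        queues[order_id % len(queues)].append(order_id)
        order_id += 1
    # The scenario proper then adds more orders to the loaded system.
    for _ in range(scenario["extra_orders"]):
        queues[order_id % len(queues)].append(order_id)
        order_id += 1
    return queues

queues = initialize(scenario)
print([len(q) for q in queues])  # -> [14, 13, 13]
```

In an actual implementation the dictionary above would be read from an XML scenario file, so that the same program can be initialized into different states of the corporation.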
Statistics are collected for each individual agent and for each scenario, and statistics collected over multiple replications are used for analysis. Here, running a simulation of the same scenario a number of times is called replicating the simulation. The statistics comprise the various values associated with the particular scenario study. For example, the statistics can describe the number of transactions between the database and the system, the load (transactions) per agent in the system, etc. These values, or statistics, are indicative of the performance of the system in relation to the various scenarios. The agent here is a piece of software with an objective, a process for achieving the objective, and a set of resources to use to reach the objective. In the present case, the agents run the system and execute the various business processes. However, this kind of simulation, which is not discrete event simulation (DES), is not sufficiently able to provide direct and obvious information on the impact of changes in the business model upon operational and business performance.
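Replication in this sense, running the same scenario several times and aggregating the statistics across runs, can be sketched as follows. The per-agent transaction load is an illustrative statistic, and the numbers of agents and transactions are assumptions:

```python
import random
import statistics

def run_scenario(seed, n_agents=4, n_transactions=200):
    """One simulation run of a scenario: randomly assign transactions to
    agents and return the per-agent load (transactions handled)."""
    rng = random.Random(seed)
    load = [0] * n_agents
    for _ in range(n_transactions):
        load[rng.randrange(n_agents)] += 1
    return load

def replicate(n_replications=20, n_agents=4):
    """Replicate the same scenario and aggregate statistics over the
    replications: here, the mean load per agent across all runs."""
    per_agent = [[] for _ in range(n_agents)]
    for seed in range(n_replications):
        for agent, load in enumerate(run_scenario(seed, n_agents)):
            per_agent[agent].append(load)
    return [statistics.mean(samples) for samples in per_agent]

means = replicate()
print(means)  # average transactions handled per agent across replications
```

Because each run is seeded differently, the spread of the per-agent loads across replications indicates how the system performs under the randomness of the scenario, rather than in one arbitrary history.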
On the other hand, with conventional (discrete event) simulation approaches it is possible to analyze different business scenarios and to optimize the underlying models based on the results. But implementation/realization of these changes in the corresponding business applications remains cumbersome and in some cases impossible.
Also, today's business models are driven by customer demand (pull), whereas conventional simulation models are input-release driven and represent only the operational execution. Input release, however, is the result of a complex translation from customer demand into material quantities to be released into, and moved within, the system at pre-specified times. Consequently, in a demand-driven business world, simulation models can be made much more realistic if the process of translating demand into input release (i.e. planning and order management) and the complex interdependencies between the business processes and the operational execution are incorporated into them.
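The translation from customer demand into input release can be sketched with a simple lead-time offset, in which each order's material is released one lead time ahead of its due date. The fixed lead time and the backward-scheduling rule are illustrative assumptions; real planning and order management are far more complex:

```python
# Sketch of translating customer demand (due period, quantity) into an
# input-release plan: material for each order is released into the
# system one lead time before its due date (backward scheduling).

LEAD_TIME = 3  # assumed fixed production lead time, in periods

def plan_releases(demand):
    """demand: list of (due_period, quantity) from customer orders.
    Returns a sorted list of (release_period, quantity) that would
    drive an input-release-based simulation model."""
    releases = []
    for due, qty in demand:
        release = max(0, due - LEAD_TIME)  # offset backward by the lead time
        releases.append((release, qty))
    return sorted(releases)

demand = [(5, 10), (7, 4), (2, 6)]
print(plan_releases(demand))  # -> [(0, 6), (2, 10), (4, 4)]
```

A conventional simulation model would take the release list on the right as its given input; incorporating the translation itself lets the model respond directly to changes in customer demand.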
In summary, some of the shortcomings when applying framework-based business applications or discrete event simulation systems individually are:
1) Even with advanced framework-based applications, the performance of business operations can only be addressed based on the real history of the system, i.e. on real events that have occurred and been reported (e.g. the production profile of a month once the month is over).
2) Operational models that are optimized using conventional discrete event simulation technology are difficult to implement.
3) The scope of traditional simulation models does not sufficiently take into account the fact that the underlying business is typically customer-demand driven rather than input-release driven.