Over the past few years, Electronic Commerce has become an increasingly central part of the economy. An internet presence is considered an essential part of doing business, rather than an exotic add-on to a company. More and more transactions, both from business to consumer and between businesses, are taking place online. Simple fixed cost business transactions are often automated at one or both ends, and auctions are overwhelmingly conducted by automated auctioneer software. Agent technology has been proposed as a means of automating some of the more sophisticated negotiations which businesses are involved in. Background prior art documents are listed at the end of this description and hereafter are referenced by author and date of publication.
Research into automated negotiation has long been an important part of distributed AI and multi-agent systems. Initially it focused primarily on negotiation in collaborative problem solving, as a means towards improving co-ordination of multiple agents working together on a common task. As electronic commerce became increasingly important, the work expanded to encompass situations with agents representing individuals or businesses with potentially conflicting interests. The Contract Net (Smith, 1980) provides an early architecture for the distribution of contracts and subcontracts to suppliers. It uses a form of distributed request-for-proposals. However, it does not discuss algorithms for determining what price to ask in a proposal. Jennings et al. (1996) use a more sophisticated negotiation protocol to allow the subcontracting of aspects of a business process to third parties. This is primarily treated as a one-to-one negotiation problem, and various heuristic algorithms for negotiation in this context are discussed in (Faratin et al., 1998). Vulkan and Jennings (1998) recast the problem as a one-to-many negotiation, and provide an appropriate negotiation protocol to handle this.
Other relevant work in one-to-one negotiation includes the game-theoretic approach of Rosenschein and Zlotkin (1994) and the logic-based argumentation approach of Parsons et al. (1998). As much electronic commerce involves one-to-many or many-to-many negotiation, work in the agent community has broadened to explore these cases too. The Kasbah system (Chavez et al., 1997) featured agents involved in many-to-many negotiations to make purchases on behalf of their users. However, the algorithm used by the agents (a simple version of those in (Faratin et al., 1998)) was more appropriate to one-to-one negotiation, and so gave rise to some counter-intuitive behaviours by the agents. Preist and van Tol (1998) and Cliff and Bruten (1998) present adaptive agents able to bid effectively in many-to-many marketplaces, and are the first examples of work which borrows techniques from experimental economics to analyze the dynamics of agent-based systems. Preist (1999) demonstrates how these can be used to produce a market mechanism with desirable properties. Park et al. (1999) present a stochastic-based algorithm for use in the University of Michigan Digital Library, another many-to-many market.
Gjerstad and Dickhaut (1998) use a belief-based modelling approach to generate appropriate bids in a double auction. Their approach is close in spirit to that of the present invention, in that it combines belief-based learning of individual agents' bidding strategies with utility analysis. However, it is applied to a single double auction marketplace, and does not allow agents to bid in a variety of auctions. Vulkan and Preist (1999) use a more sophisticated learning mechanism that combines belief-based learning with reinforcement learning. Again, the context for this is a single double auction marketplace. Unlike Gjerstad's approach, this focuses on learning the distribution of the equilibrium price. The work of Garcia et al. (1998) is clearly relevant. They consider the development of bidding strategies in the context of the Spanish fish market tournament. Agents compete in a sequence of Dutch auctions, and use a combination of utility modelling and fuzzy heuristics to generate their bidding strategy. Their work focuses on Dutch rather than English auctions, and on a sequence of auctions run by a single auction house rather than parallel auctions run by multiple auction houses.
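The combination of belief-based learning with utility analysis may be illustrated by a minimal sketch, in which a buyer with a private limit price estimates the probability that a given bid would be accepted from the history of bids it has observed, and then chooses the bid maximising expected surplus. The function names, the discrete price grid and the simple frequency estimate below are illustrative assumptions, not the algorithm as published by Gjerstad and Dickhaut.

```python
def acceptance_belief(price, accepted_bids, rejected_bids):
    """Estimate the probability that a bid at `price` would be accepted,
    from bids previously observed in the marketplace: accepted bids at or
    below `price` count as evidence for acceptance, rejected bids at or
    above it as evidence against. With no evidence, assume 0.5."""
    taken = sum(1 for b in accepted_bids if b <= price)
    refused = sum(1 for b in rejected_bids if b >= price)
    total = taken + refused
    return taken / total if total else 0.5

def choose_bid(limit_price, accepted_bids, rejected_bids, step=1):
    """Pick the bid maximising expected surplus:
    acceptance_belief(price) * (limit_price - price)."""
    candidates = range(0, limit_price + 1, step)
    return max(candidates,
               key=lambda p: acceptance_belief(p, accepted_bids, rejected_bids)
                             * (limit_price - p))
```

For example, an agent with limit price 100 that has seen bids of 50, 60 and 70 accepted and bids of 40 and 30 rejected would bid 50: the lowest price it believes certain to be accepted, giving the largest expected surplus.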
In Preist et al., 2001a, there are presented algorithms which allow agents to participate simultaneously in multiple auctions for the purchase of a number of similar goods. In Preist et al., 2001b, it is shown how agents using these algorithms in multiple auctions can create a more efficient and stable market. It is interesting to contrast this analysis with that of Greenwald and Kephart (1999). They demonstrate that the use of dynamic price-setting agents by sellers, to adjust their price in response to other sellers, can lead to an unstable market with cyclical price wars occurring. Preist et al., 2001b, however, show that (in a very different context) the use of agents improves the dynamics and stability of the market. From this, it can be concluded that agent technology is not a-priori “good” or “bad” for market dynamics, but that each potential role must be studied to determine its appropriateness.
Sandholm (2000) proposes a sophisticated marketplace able to handle combinatorial bidding, and able to provide guidance to buyers and sellers as to which market mechanism to adopt for a particular negotiation. In the long term, as the different auction houses merge or fold and only a few remain, this approach will be ideal. In the short term, the applicant expects improved market dynamics will occur through autonomous agents in multiple auctions.
A specific class of business process that will become increasingly important in the virtual economy is service composition. This application will now consider the different technical issues that the present applicant has identified as needing to be addressed if service composition is to be automated, and will describe, specifically, methods for the purchase of composite services from a group of auctions.
Over the last decade, companies have been encouraged by business consultants such as Tom Peters to focus on their core competencies. By trying to do everything—welding, graphic design, supply chain management, customer care, keeping the photocopiers running, producing good food in the office canteen—companies run the risk of being “jack of all trades, but master of none”. As a result, there is a danger that other smaller companies focused on the same core business will outperform them. To avoid that risk, and become more competitive, large companies are going through a process of “disaggregation”. In some cases, this can mean splitting a large company into several parts, each of which can focus on one core business (such as the recent move by Hewlett Packard to separate its test and measurement business from its computing business, creating a new company, Agilent, from the former). In other cases, it can mean outsourcing more and more of a company's activities to other companies, maintaining only those activities that it truly excels in.
This trend is beginning to have an impact on many E-businesses, as well as traditional bricks-and-mortar companies. Companies would like to be able to outsource some of their activities over the internet. Initially, this has focused on semi-permanent arrangements, with the web acting as an intermediary. (For example, career guidance information is provided to HP employees via a web-based third party.) However, as this trend is becoming increasingly important, much research and development effort has been focussing on a new vision for the internet: e-services. E-services are virtual entities that provide a service over the network through an open standard interface. The service may be information, such as the latest stock prices, or may be a virtual representation of some physical good or activity, such as a contract to transport a crate from one location to another. Because the service is offered through an open standard interface, any client familiar with this standard can use it. Furthermore, the output from one service can be fed directly into another service. This makes possible the creation of composite services and complex business processes which cross organisational boundaries. Potentially, this can be done automatically and dynamically, and agent technology will play a key role in this.
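The principle that services sharing a standard interface can be chained, with the output of one fed directly into the next, can be sketched as follows. The interface, the service names and the crate-shipping example are illustrative assumptions, not any actual e-services standard.

```python
from typing import Callable

# An e-service is modelled here as a function from a request to a result,
# the "open standard interface" shared by every service.
Service = Callable[[dict], dict]

def quote_service(request: dict) -> dict:
    """Illustrative service: quotes a price for transporting a crate."""
    return {**request, "price": 10 * request["weight_kg"]}

def booking_service(request: dict) -> dict:
    """Illustrative service: books the transport at the quoted price."""
    return {**request, "booked": True}

def compose(*services: Service) -> Service:
    """Feed the output of each service directly into the next, yielding
    a composite service that presents the same interface as its parts."""
    def composite(request: dict) -> dict:
        for service in services:
            request = service(request)
        return request
    return composite

# The composite can itself be offered as a new service.
ship_crate = compose(quote_service, booking_service)
result = ship_crate({"weight_kg": 3})
```

Because the composite presents the same interface as its components, it can in turn be composed into still larger business processes, which is what makes dynamic, cross-organisational composition possible.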
This leads to the emergence of an important role in the virtual economy—the service composer. As companies focus on their core competencies, other companies can focus on creating composite packages. This is not new—travel agents, among others, have done exactly that for years—but what is new is that it will be able to take place dynamically, automatically, over the internet.
One specific aspect of the service composition problem is that of negotiating to purchase composable services. An agent's task may be to buy a set of services which can be composed to sell on as a bespoke composite service, possibly to a specific customer with special requirements. There may be several ways of creating this composite service from the individual services for sale in the auctions. The agent's task is to purchase one such set of services which can be composed, while reducing the risk of accidentally purchasing additional, unnecessary services. The present invention is directed to a method and data structure useful for this task.
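The purchasing task just described can be sketched minimally as follows: the agent holds several alternative sets of services, any one of which composes into the required composite service, and bids only for services that could still contribute to an incomplete alternative. The representation and the travel-package example are illustrative assumptions, not the data structure of the invention itself.

```python
# Each alternative is a set of individual services which, purchased
# together, compose into the bespoke composite service. (Illustrative
# example: a travel package reachable by flight or by train.)
alternatives = [
    {"flight", "hotel"},
    {"train", "hotel"},
]

def completed(alternatives, purchased):
    """True once some alternative set of services is fully purchased."""
    return any(alt <= purchased for alt in alternatives)

def next_targets(alternatives, purchased):
    """Services the agent should still consider bidding for. Returns the
    empty set once a composable set is complete, so the agent withdraws
    from the remaining auctions rather than buy unnecessary services."""
    if completed(alternatives, purchased):
        return set()
    targets = set()
    for alt in alternatives:
        targets |= alt - purchased
    return targets
```

For instance, having purchased only the hotel, the agent may still bid for either the flight or the train; once a full alternative such as {train, hotel} is held, it bids for nothing further.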