Interoperability is one of the biggest problems facing businesses and agencies that operate in data-rich environments. One common problem is that many new systems coming online can communicate with other systems only if custom interfaces are developed between the new system and each of the others. Many businesses and government organizations are continually working to get data, such as intelligence data, from the collection source into the hands of users, such as analysts, in an efficient manner. For example, the Intelligence Community is continually searching for a way to make its programs interoperable. Data is being collected, yet it is not finding its way to the appropriate analysts. Frequently this is due to incompatibility between the data source and the analysis application. Other times the problem is a matter of geography: for example, an analyst is at one site and the archive is at another, with no means to query between the two.
Although efforts are being expended to resolve geographical issues, issues relating to incompatibility are rarely addressed adequately. Further, because of tight budgets at many government agencies, features such as interoperability are sometimes viewed as an unnecessary luxury.
Another reason that past efforts have not always succeeded is that there is often little to no benefit to the developers of data archive systems in making their data more accessible. Rather, limiting access to data can provide a strategic or business advantage to the holder of a data archive system. For example, by maintaining a firm hold on the data from collection through analysis, these developers can exercise monopolistic control over who and what can see their data.
Past efforts to resolve some of these problems have focused on using so-called data brokers, object brokers, network control programs, and other forms of middleware having varying levels of complexity. Middleware can, for example, include: software that sits between two or more types of software and translates information between them; software that provides a common application programming interface (API); and/or software development tools that enable users to create programs by selecting services and linking them with a scripting language. While solutions such as middleware sometimes work in the short term, effort must go into maintaining those tools and updating them when new data sources come online. Also, many middleware data brokers are developed by, and/or centered on, a specific Community of Interest (COI) and, unfortunately, may not be as useful to other communities, such as the Intelligence Community.
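The translation role described above can be sketched in a few lines. The following is a minimal, hypothetical illustration of a data broker that converts records from one system's format into a common representation consumed through a shared API; the names (`DataBroker`, `AnalystApp`, the pipe-delimited legacy format) are illustrative assumptions, not drawn from any real system.

```python
def legacy_record() -> str:
    # Hypothetical source system emits pipe-delimited text: id|date|payload
    return "42|2003-01-15|signal intercept"

class DataBroker:
    """Middleware layer: sits between the archive and the analysis tool,
    translating the legacy format into a common representation."""
    def translate(self, raw: str) -> dict:
        rec_id, date, payload = raw.split("|")
        return {"id": int(rec_id), "date": date, "data": payload}

class AnalystApp:
    """Consumer that understands only the common dict-based API."""
    def ingest(self, record: dict) -> str:
        return f"record {record['id']} ({record['date']}): {record['data']}"

broker = DataBroker()
app = AnalystApp()
result = app.ingest(broker.translate(legacy_record()))
print(result)  # record 42 (2003-01-15): signal intercept
```

Note that each new data source with its own format would require another `translate` method, which is exactly the maintenance burden the passage above describes.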
In addition, some middleware is information centric, attaching meaning to the data that it is linking or translating. This so-called “heavyweight” middleware tends to use significant amounts of memory and processor resources, which may increase the complexity, cost, and size of the middleware.
While solutions such as those described above may be adequate in the short term, they are costly to maintain and impose substantial overhead on the data query and retrieval process.