As users come to expect faster performance from their computing devices, evolutionary advances in hardware are not always sufficient to meet those expectations. Efficiencies in software therefore also receive a great deal of attention, but some computing tasks are computationally intensive and will always take longer than average to complete. Rather than force a user to accept this delay, it may be possible to precompute the results of certain tasks. Caching of various data represents one example of precomputing, or pre-performing, certain tasks. Caching of data, in the context of this invention, means saving some identifier of the data together with the results of all or part of the computed task. This saved copy may reside in RAM, on disk, or in any other storage medium.
Database queries are one such task that can be computationally intensive and result in long response times. As such, some have attempted to cache database queries. The typical caching of database search queries generally depends on user input to determine which queries (and their results) to cache. For example, typical caching may require user input, or analysis of previous search queries, to determine which inputted search queries are the most popular. Alternatively, all prior user queries may be cached, and when the allocated cache space fills up, those least often repeated may be removed. Queries (and their results) can be cached to provide better performance the next time the query is performed. In fact, in a typical relational database, this type of user-action-dependent caching is the only practical way to cache queries, because there is no practical way to determine whether a query result should be cached before a user has inputted that query. It would be an advantage to cache only those queries that take the longest to compute. However, this requires a practical way of estimating the response times of queries (without performing all of them, which is impractical) in order to then cache those responses that take the longest time, i.e., the long query responses.
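The user-action-dependent scheme described above, in which each query's result is cached under an identifier (here, the query text itself) and the least often repeated entries are evicted when the allocated cache space fills, can be sketched as follows. This is a minimal illustrative sketch, not part of the invention; the class name `QueryCache`, the `capacity` parameter, and the `compute_fn` callback that stands in for executing the actual database query are all assumptions introduced for illustration.

```python
from collections import Counter


class QueryCache:
    """Cache of query results keyed by the query text (hypothetical sketch).

    Evicts the least often repeated query when the allocated space fills,
    mirroring the user-action-dependent caching described in the text.
    """

    def __init__(self, compute_fn, capacity=2):
        self.compute_fn = compute_fn  # stands in for running the actual query
        self.capacity = capacity      # the allocated cache space
        self.results = {}             # query identifier -> cached result
        self.repeats = Counter()      # how often each query has been requested

    def run(self, query):
        self.repeats[query] += 1
        if query in self.results:     # cache hit: the expensive work is skipped
            return self.results[query]
        result = self.compute_fn(query)
        if len(self.results) >= self.capacity:
            # Allocated space is full: remove the cached query repeated
            # least often, as in the eviction scheme described above.
            least_repeated = min(self.results, key=lambda q: self.repeats[q])
            del self.results[least_repeated]
        self.results[query] = result
        return result
```

Note that the eviction decision here depends entirely on the history of user-inputted queries; nothing in this scheme predicts, before a query is first run, whether its result will be expensive enough to be worth caching, which is exactly the limitation the text identifies.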