It is not uncommon for large corporations or other large entities to serve hundreds of thousands of customers, or even more than one million. At that scale it is increasingly common, and in most cases essential, for such entities to use robust, complex enterprise systems or databases to manage customer or member data. Beyond simply storing the data, such entities also need the ability to manipulate, authenticate, add to, or delete customer data in the system or database that holds it.
While companies may operate specialized database systems for storing and querying specific types of data, such as customer data, there is often a need to add large quantities of data to those existing databases. For example, it is common for entities to merge, or for one entity to acquire another. In such cases, customer data from one entity may need to be combined with the existing customer data of the acquiring entity.
Typically, adding large quantities of data to a database requires invoking processes on the current database servers. Such server processes can be slow and can require extensive runtime processing by the server, and as a result can degrade the server's ability to handle normal database services such as queries and other updates. In addition, merging data from disparate database systems often requires transforming the data into a format consistent with the target database, consuming further server runtime. Thus, there is a need for an efficient method of processing and loading large quantities of customer data into an existing database, wherein the method does not interfere with the normal operations of the database. In addition, there is a need for an efficient method of replicating such data to back-up or replicated data stores.
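One way the offline approach alluded to above can work is to transform source records into the target database's format ahead of time and emit a file suitable for a bulk-load utility, rather than driving per-row insert processes through the server at runtime. The following is a minimal sketch under assumed conditions: the field names, the mapping, and the CSV output format are hypothetical illustrations, not part of any particular system described here.

```python
import csv
import io

# Hypothetical mapping from a source system's field names to the
# target database's schema; a real mapping would be configuration-driven.
FIELD_MAP = {"cust_name": "name", "cust_phone": "phone"}

def transform(record):
    """Rename fields and normalize values offline, so the database
    server performs no transformation work at load time."""
    return {FIELD_MAP[k]: v.strip() for k, v in record.items() if k in FIELD_MAP}

def to_bulk_load_file(records, out_stream):
    """Write transformed records as CSV, a format that common bulk-load
    utilities (e.g., PostgreSQL COPY, MySQL LOAD DATA INFILE) accept."""
    writer = csv.DictWriter(out_stream, fieldnames=["name", "phone"])
    writer.writeheader()
    for rec in records:
        writer.writerow(transform(rec))

# Example: one source record transformed and written to an in-memory file.
source = [{"cust_name": " Ada Lovelace ", "cust_phone": "555-0100"}]
buf = io.StringIO()
to_bulk_load_file(source, buf)
print(buf.getvalue())
```

Because the transformation and file generation run on a separate machine or process, the database server's only task is the bulk load itself, leaving it free to serve normal queries and updates in the meantime.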