Marketers are continually seeking better ways to create, execute, and automate campaigns with the goal of growing revenue and strengthening customer loyalty. A workflow engine can be used to generate output materials (e.g., email messages) used in carrying out marketing campaigns. A marketer can use the workflow engine to configure a series of connected workflow execution components that together make up a marketing campaign. Some engines allow the marketer to visually design, manage, and automate a multi-stage lifecycle marketing program through a drag-and-drop user interface and a library of pre-built program templates.
Certain aspects of such marketing programs and their constituent marketing campaigns emerge in the course of generating personalized messages (e.g., email messages, web page content, advertisement messages, text messages, various forms of mobile messaging, etc.). In a conventional process for generating personalized messages, the marketer designs, during a design phase, a template for a personalized message that includes stock message sections and personalization sections. The stock message sections contain standardized content that is the same for all recipients of the personalized message. The personalization sections include fields based on commands that invoke queries against the marketer's relational database management system (RDBMS) or any other database system. During processing, a loop iterates through the customer records in the database system, forming one or more queries to retrieve data that is then used to populate the personalization sections of a personalized message before the message is sent to the customer.
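The conventional flow described above can be sketched in a few lines of Python. This is an illustrative assumption of how such a loop might look, not any particular product's implementation; the table name, column names, and message template are all hypothetical.

```python
import sqlite3

# Hypothetical template: a stock section plus personalization fields that
# are filled from per-record database look-ups.
TEMPLATE = "Hello {first_name}, your {tier} membership earns {points} points."

def generate_messages(conn):
    """Iterate over customer records, issuing one look-up per record."""
    messages = []
    for (customer_id,) in conn.execute("SELECT id FROM customers").fetchall():
        # One random read per message -- the pattern that limits throughput.
        row = conn.execute(
            "SELECT first_name, tier, points FROM customers WHERE id = ?",
            (customer_id,),
        ).fetchone()
        messages.append(
            TEMPLATE.format(first_name=row[0], tier=row[1], points=row[2]))
    return messages

# Minimal in-memory example standing in for the marketer's database system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers "
             "(id INTEGER PRIMARY KEY, first_name TEXT, tier TEXT, points INTEGER)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?, ?)",
                 [(1, "Ada", "gold", 120), (2, "Ben", "silver", 40)])
msgs = generate_messages(conn)
```

The key point is the inner query: every personalized message triggers at least one round trip to the database, which is what the following paragraphs identify as the bottleneck.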
When the customer records in a marketer's database system number in the hundreds or a few thousand, the method described above may complete in a reasonable amount of time. However, the method does not scale well when the number of customer records grows to tens of thousands, millions, or even billions. In such cases, where a high throughput on the order of hundreds, thousands, tens of thousands, or more messages per second is desirable, the method described above exhibits too much latency to sustain the desired throughput. Moreover, the latency per generated message is highly variable, because each message depends on random reads associated with database look-ups.
The observed latency can be smoothed out by introducing a cache facility; however, if the data in the back-end database is constantly changing, the cache would often be “dirty”, i.e., out-of-sync with the back-end database.
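The staleness problem can be seen in a minimal read-through cache sketch. The TTLCache class below is a hypothetical illustration of a time-to-live caching policy, not a reference to any specific cache product.

```python
import time

class TTLCache:
    """Hypothetical read-through cache with a fixed time-to-live."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry_time)

    def get(self, key, loader):
        """Return a cached value, reloading from the back end on expiry."""
        entry = self._store.get(key)
        now = time.monotonic()
        if entry is not None and now < entry[1]:
            return entry[0]      # may be stale if the back end changed
        value = loader(key)      # read-through to the back-end database
        self._store[key] = (value, now + self.ttl)
        return value

# A dict stands in for the back-end database, which changes while a cached
# copy is still considered live.
backend = {"points:42": 100}
cache = TTLCache(ttl_seconds=60)

first = cache.get("points:42", backend.__getitem__)   # loads 100
backend["points:42"] = 250                            # back end updates
second = cache.get("points:42", backend.__getitem__)  # still 100: "dirty"
```

The second read returns the cached value even though the back-end data has changed, which is exactly the out-of-sync behavior the paragraph above describes.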