1. The Field of the Invention.
The present invention relates to computer systems that utilize date information, especially systems in which a host computer supplies date information remotely to a local workstation requiring date information, and especially to methods for modifying existing systems and for implementing the same.
2. The State of the Art.
Various application (software) programs utilize date functions in a number of different ways. Date information is typically included as a data field in a database for record entry, such as tracking telephone calls, insurance claims, stock trading and/or settlement dates, order entry, and the like. Date information is also utilized for automated (macro) processing, such as paying bills from an accounting or personal finance application, and as a trigger for data backups and other transmissions or actions that are performed on a regular or periodic basis. Software applications can be programmed to treat date data as text (i.e., alphabetical, non-numeric), as a numerical or formula-based value (i.e., numeric), and sometimes as both.
In the programming arts, date data is usually handled in a particular format predetermined by the programmer. Very often, this format is the typical month-day-year American format (and sometimes, optionally, the European format of day-month-year). The date data is typically represented by numbers, and sometimes by a three-letter abbreviation for the month (e.g., "jan" for January and "jul" for July), with a single delimiter or a set of defined data delimiters, such as hyphens and/or slashes, for separating the three data units defining typical date data (e.g., 1-1-90 or 1-jan-90 for Jan. 1, 1990). Internally, software applications typically convert the input date data into a more usable, numerical form. Some date data also requires the use of fractional date parts, such as hours and minutes.
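The multi-format parsing and internal conversion described above can be sketched as follows; the particular format list is illustrative only and is not taken from the original text:

```python
from datetime import datetime

# Illustrative set of predefined formats (American numeric,
# day-with-month-abbreviation, and a slash-delimited variant).
FORMATS = [
    "%m-%d-%y",   # e.g., 1-1-90
    "%d-%b-%y",   # e.g., 1-jan-90
    "%m/%d/%Y",   # e.g., 1/1/1990
]

def parse_date(text):
    """Try each predefined format in turn; return a datetime,
    or None if no format matches."""
    for fmt in FORMATS:
        try:
            return datetime.strptime(text, fmt)
        except ValueError:
            continue
    return None

parse_date("1-jan-90")   # matches the day-month-abbreviation format
parse_date("1-1-90")     # matches the American numeric format
```

Note that even this sketch embeds a two-digit-year assumption: the `%y` code must guess the century, which is precisely the class of ambiguity discussed below.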
Because it would be unusual for a calculation to require a data entry, an output, or a computation based on a date on the order of a century prior to the present, applications (or rather, their designers and programmers) often assume that the date entered or output refers to a date in the present "century," i.e., between 1900 and 1999. In either case, an arbitrary date, such as Jan. 1, 1900, is denoted as day number one (a base date) and all other dates are calculated as the number of days from the base date (analogous to a Julian calendar starting on a different arbitrary date); this sequential reference number is sometimes called a serial date or (incorrectly) a Julian date.
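The base-date scheme just described can be shown in a minimal sketch, assuming Jan. 1, 1900 as day number one:

```python
from datetime import date

# Arbitrary base date, denoted day number one.
BASE_DATE = date(1900, 1, 1)

def serial_date(d):
    """Return the sequential day number of date d,
    counting the base date itself as day 1."""
    return (d - BASE_DATE).days + 1

serial_date(date(1900, 1, 1))    # day 1
serial_date(date(1900, 1, 31))   # day 31
```

Any two dates can then be compared or differenced as plain integers, which is why applications convert input date data into this numerical form internally.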
With the advent of cheaper, faster, and more reliable communications devices and media, various sizes of computers requiring date data can be connected to each other for controlled communications and interfacing. In a network system, in which one or more communications backbones allow various computers to interface with each other, it may be desirable to have certain computers operate software applications locally and to have others operate software applications that, when operating, are shared among users. In both cases, passing date data among the various computers is often frustrated by the different predefined limitations on the form and range of the date data which is acceptable as input and/or output for each of the various applications. For example, distributed applications may be running on different platforms, or a shared application may be required to output data to workstations running on different platforms, exacerbating the communications and interface problem.
Still another problem, which has received some attention but little solution, is the often-inherent limitation in various software applications that a given date will range between 1900 and 1999 when, in commerce today, timetables and scheduling for events in the third millennium CE (2000 and beyond) are already required. Assumptions about the date range required for application operation and data were originally made to save memory allocation space, because a year assumed to be between 1900 and 1999 can be defined by a smaller value requiring less storage area (e.g., "85" for "1985"). Because of the inherent assumption in such applications as to the form and range of the date data, errors occur in processing. The solutions to such a problem include at least some rewriting of the software code, and especially modifying the stored data so that it accurately reflects the century. However, it has been estimated that upgrading all of the world's data and the applications reliant thereon to accurately reflect changes in the numerical representation of the century will cost hundreds of billions of dollars. Various "solutions" to the year 2000 problem (often termed the "Y2K problem") are described on internet world wide web sites (e.g., Bellcore, IEEE, Information Technology Assn. of America) and in various printed publications (e.g., "Year 2000 or Bust," Chem. Eng., July 1997). These publications do not, however, provide a solution but rather a methodology towards a solution; namely, purchasing date-compliant software, and reprogramming and adequately testing the system to assure that it is date-compliant.
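The processing error caused by the two-digit-year assumption, and one commonly discussed mitigation (a pivot "window" that maps two-digit years onto a sliding century), can be sketched as follows. The pivot value of 50 is an illustrative assumption, not a technique taken from the publications cited above:

```python
def years_elapsed_legacy(start_yy, end_yy):
    """Legacy logic: both two-digit years assumed to lie in 1900-1999."""
    return (1900 + end_yy) - (1900 + start_yy)

# Correct within the assumed century...
years_elapsed_legacy(95, 99)   # 4

# ...but year "00" (intended as 2000) is treated as 1900,
# yielding a large negative elapsed time.
years_elapsed_legacy(99, 0)    # -99

# One mitigation: a pivot window (value 50 chosen for illustration).
PIVOT = 50

def expand_year(yy):
    """Map a two-digit year into 1950-2049 via the pivot window."""
    return 2000 + yy if yy < PIVOT else 1900 + yy
```

Windowing avoids rewriting the stored two-digit data, but only defers the ambiguity to the edges of the chosen window; fully accurate processing still requires the code and data modifications described above.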