The execution steps in most research, development, and engineering experiments generally involve manual operations carried out on unconnected technology platforms. The scientist or engineer works in what are essentially isolated technology islands, with manual operations providing the only bridges. To illustrate, when there is a Standard Operating Procedure (SOP) Guide for the experimental work, it is often an electronic document, for example in Microsoft Word. The experimental plan (Step 1) within the SOP Guide must be transferred to the target device (instrument, instrument platform, or component module) for execution (Step 2) by manually re-keying the experiment into the device's instrument control program (ICP), the device's controlling application software. In a few cases the statistical analysis of results (Step 3a) can be done within the ICP, but it is most often done within a separate statistical analysis software package or spreadsheet program such as Microsoft Excel. This also requires manually transferring the results data from the ICP to the analysis software package. Reporting of results (Step 3b) is usually carried out in Microsoft Word, and therefore requires the manual transfer of all results tables and graphs from the separate statistical analysis software package. The manual operations within the general execution sequence steps are presented below. The isolated technology islands are illustrated in FIGS. 1 and 2.
FIG. 1 illustrates the manual tools and operations involved in carrying out a research and development experiment. In this work a statistical experiment design protocol is first generated, via step 12. This protocol is developed manually and off-line using non-validated tools such as Microsoft Word. The protocol must then be approved, once again manually and off-line, via step 14. Next, sample amounts are calculated using non-validated tools such as Microsoft Excel, via step 16. Thereafter the samples are prepared, via step 18, and the experiment is run on a target device, via step 20, for example, a high-performance liquid chromatograph (HPLC). Running the experiment requires manually reconstructing the statistical design within the target device's ICP. When this software does not exist, or does not allow for full instrument control, the experiment must be carried out in a fully manual mode by manually adjusting instrument settings between experiment runs.
FIG. 2 illustrates the manual tools and operations involved in analyzing the data and reporting the results of the research and development experiment, via step 22. The analysis and reporting of data is accomplished by first statistically analyzing and interpreting the data, off-line, using non-validated tools such as Microsoft Excel. Next, it is determined whether there is a need for more experiments, possibly using off-line generic design of experiments (DOE) software, via step 24. Then, data are entered and a report is written, via step 26. Finally, the report is archived, via step 28. As is seen from the above, the research, development, and engineering experimentation process involves a series of activities that are currently conducted in separate "technology islands" that require manual data exchanges among the tools that are used for each activity. However, until now, no overarching automation technology has existed that brings together all the individual activities under a single integrated-technology platform that is adapted to multiple devices and data systems.
Method validation activities encompass the planning and experimental work involved in verifying the fitness of an analytical method for its intended use. These activities are often captured in company Standard Operating Procedure (SOP) documents that usually incorporate Food and Drug Administration (FDA) and International Conference on Harmonization (ICH) requirements and guidances. Method validation SOP documents include a description of all aspects of the method validation work for each experiment type (e.g. accuracy, linearity) within a framework of three general execution sequence steps: (1) experimental plan, (2) instrumental procedures, and (3) analysis and reporting of results. The individual elements within these three general steps are presented below.
Step 1: Generate Experimental Plan
Select experiment type
Select target instrument
Define study variables:
        analyte concentrations
        instrument parameters
        environmental parameters
Specify number of levels per variable
Specify number of preparation replicates per sample
Specify number of injections per preparation replicate
Integrate standards
Include system suitability injections
Define Acceptance Criteria
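The Step 1 elements above can be gathered into a single structured record. The following is a minimal sketch in Python; the class name, field names, and the example values (concentration levels, replicate counts, the 0.999 acceptance limit) are all hypothetical illustrations, not part of any actual SOP or ICP format.

```python
from dataclasses import dataclass, field

@dataclass
class ExperimentalPlan:
    """Hypothetical container for the Step 1 elements of a validation plan."""
    experiment_type: str                     # e.g. "linearity", "accuracy"
    target_instrument: str                   # e.g. an HPLC identifier
    study_variables: dict = field(default_factory=dict)  # variable -> list of levels
    prep_replicates: int = 1                 # preparation replicates per sample
    injections_per_prep: int = 1             # injections per preparation replicate
    standards: list = field(default_factory=list)
    system_suitability_injections: int = 0
    acceptance_criteria: dict = field(default_factory=dict)

plan = ExperimentalPlan(
    experiment_type="linearity",
    target_instrument="HPLC-1",
    study_variables={"analyte_concentration": [50, 75, 100, 125, 150]},
    prep_replicates=3,
    injections_per_prep=2,
    system_suitability_injections=5,
    acceptance_criteria={"r_squared_min": 0.999},
)
print(plan.experiment_type)  # prints "linearity"
```

A record of this kind is what must currently be re-keyed by hand from the Word-based SOP Guide into the target device's ICP.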
Step 2: Construct Instrumental Procedures
Define required transformations of the experiment plan into the native file or data formats of the instrument's controlling ICP software (construction of Sample Sets and Method Sets or Sequence and Methods files).
Specify number of injections (rows)
Specify type of each injection (e.g., sample, standard)
Define required modifications to the analytical method (robustness)
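The Step 2 transformation of a plan into an injection sequence can be sketched as a simple row expansion. This is an illustrative assumption only: real Sample Sets or Sequences in a CDS carry many more columns (vial position, injection volume, method set, run time, and so on), and the function and field names here are hypothetical.

```python
def build_injection_rows(levels, prep_replicates, injections_per_prep,
                         suitability_injections=0):
    """Expand plan parameters into an ordered list of injection rows,
    one dict per row, system suitability injections first."""
    rows = []
    for i in range(suitability_injections):
        rows.append({"type": "system_suitability", "injection": i + 1})
    for level in levels:
        for prep in range(1, prep_replicates + 1):
            for inj in range(1, injections_per_prep + 1):
                rows.append({"type": "sample", "level": level,
                             "prep": prep, "injection": inj})
    return rows

rows = build_injection_rows([50, 100, 150], prep_replicates=2,
                            injections_per_prep=2, suitability_injections=5)
print(len(rows))  # 5 suitability + 3*2*2 sample injections = 17
```

In current practice this expansion is performed by hand, row by row, inside the ICP.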
Step 3: Analyze Data and Report Results
Specify analysis calculations and report content and format
Carry out numerical analyses
Compare analysis results to acceptance criteria (FDA & ICH requirements)
Specify graphs and plots that should accompany the analysis
Construct graphs and plots
Compile final report
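One of the Step 3 numerical analyses, a linearity fit with an acceptance-criteria check, can be sketched as follows. This is a minimal least-squares illustration; the function name and the 0.999 coefficient-of-determination limit are assumptions for the example, not FDA or ICH figures.

```python
def linearity_check(concentrations, responses, r2_min=0.999):
    """Least-squares fit of response vs. concentration, returning the
    slope, intercept, R^2, and whether R^2 meets the acceptance limit."""
    n = len(concentrations)
    mx = sum(concentrations) / n
    my = sum(responses) / n
    sxx = sum((x - mx) ** 2 for x in concentrations)
    sxy = sum((x - mx) * (y - my) for x, y in zip(concentrations, responses))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2
                 for x, y in zip(concentrations, responses))
    ss_tot = sum((y - my) ** 2 for y in responses)
    r2 = 1.0 - ss_res / ss_tot
    return slope, intercept, r2, r2 >= r2_min

slope, intercept, r2, passed = linearity_check(
    [50, 75, 100, 125, 150], [49.8, 75.3, 99.9, 125.4, 149.6])
```

Today this calculation is typically rebuilt by hand in a spreadsheet after manually transferring the results out of the ICP.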
The execution steps in analytical method validation generally involve manual operations carried out on unconnected technology platforms. To illustrate, an SOP Guide for the validation of an HPLC analytical method is often an electronic document in Microsoft Word. The experimental plan (Step 1) within the SOP Guide must be transferred to the HPLC instrument for execution (Step 2) by manually re-keying the experiment into the instrument platform's ICP—in the case of an HPLC this is typically referred to as a chromatography data system (CDS). In a few cases the statistical analysis of results (Step 3a) can be performed within the CDS, but it is most often carried out within a separate statistical analysis software package or spreadsheet program such as Microsoft Excel. This also requires manually transferring the results data from the CDS to the analysis software package. Reporting of results (Step 3b) is usually carried out in Microsoft Word, and therefore requires the manual transfer of all results tables and graphs from the separate statistical analysis software package. The manual operations within the three general execution sequence steps are presented below.
Step 1—Experimental Plan
Validation plan developed in Microsoft Word.
Experimental design protocol developed in off-line DOE software.
Step 2—Instrumental Procedures
Manually build the Sequences or Sample Sets in the CDS.
Raw peak (x, y) data reduction calculations performed by the CDS (e.g. peak area, concentration).
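The data reduction performed by the CDS can be illustrated with a trapezoidal-rule peak integration and a single-point external-standard concentration calculation. This is a simplified stand-in for the CDS's own baseline-corrected integration algorithms; the function names are hypothetical.

```python
def peak_area(times, signal):
    """Trapezoidal-rule area under a chromatographic peak,
    given raw (x, y) detector points."""
    return sum((times[i + 1] - times[i]) * (signal[i + 1] + signal[i]) / 2.0
               for i in range(len(times) - 1))

def single_point_concentration(sample_area, standard_area, standard_conc):
    """External-standard quantitation: detector response is assumed
    proportional to analyte amount."""
    return standard_conc * sample_area / standard_area

area = peak_area([0.0, 1.0, 2.0], [0.0, 1.0, 0.0])  # triangular peak, area 1.0
conc = single_point_concentration(area, standard_area=2.0, standard_conc=10.0)
```

These calculated results (areas and concentrations), not the raw (x, y) points, are what must then be manually transferred out of the CDS for statistical analysis.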
Step 3a—Statistical Analysis
Calculated results manually transferred from the CDS to Microsoft Excel.
Statistical analysis usually carried out manually in Microsoft Excel.
Some graphs generated manually in Microsoft Excel, some obtained from the CDS.
Step 3b—Reporting of Results
Reports manually constructed from template documents in Microsoft Word.
Graphs and plots manually integrated into report document.
Prior art systems are known in this area, but they do not address the overarching problem of removing the manually intensive steps required to bridge the separate technology islands. Relevant prior art that has been discovered by applicants is described herein below.