Performance testing is a process used to ascertain whether a software application can meet performance criteria, namely workload, reliability, resource usage, scalability and other requirements, in a dynamic or real-time use environment. Performance testing is a very important phase in the software application development lifecycle: it determines whether the software application qualifies against the performance criteria, and it can also determine the better application amongst two or more software applications available for a given purpose on the basis of the above-mentioned performance criteria.
For many years, conventional performance testing tools or systems have comprised load injectors, a system resource utilization measurement system, and application monitoring tools or utilities.
Over the years, with the progress in technology, more advanced performance testing techniques have been developed. These involve distributing the user load across multiple systems, using network bandwidth simulators supported by the load injection tool, and statistically profiling each application tier. The resultant performance test result of such methods is primarily a function of: (1) Workload: {Arrival, Activity and Departure Rates}; (2) Application State: {Data volumes, Impact of one component on another}; and (3) Resource Utilization: {Capacity of the underlying hardware}.
Such typical or advanced performance testing methodology is, at best, a bottleneck detection exercise and a due diligence activity to anticipate which tiers need more attention when the application is launched into a dynamic production environment.
It is further observed that in a performance test environment most performance test exercises are statically represented (fixed workload, one application state and one hardware environment), whereas in real life (after the software is launched into production) all of the above functions are in a constant state of flux.
In the production stage, it is observed that factors such as shared hosting, unanticipated application access patterns, growth in data volumes and integration with other applications have an impact on performance.
Hence there is an urgent need for a solution that overcomes the existing lacunae in the performance testing domain, namely: accounting for, distinguishing and addressing the differences between the static test environment and the dynamic production environment; ensuring productivity while working from different locations; handling the increased size of the database in the production environment as compared to the test environment; meeting scalability testing challenges despite increased hardware and software costs; documenting test results correctly and on time, and predicting accurate results for a constantly changing application; integrating the performance testing system with the software development life cycle of legacy systems to leverage previous investments in software and knowledge assets; performing extrapolation, validating results and distributing the think times of load testing to address enterprise performance testing challenges; and executing the right scripts with pre-determined think times, scientifically computing run duration and validating the data provided by the load generators.
Some of the inventions that deal with providing a solution to performance testing are listed below. None of them addresses the lacunae stated above.
U.S. Pat. No. 7,577,875 discloses a system and method that generates a list of parts of a tested software program that are responsible for significant performance regressions in the tested software, which may be ranked by significance. Further, the system analyzes sampled profile data, which provides information about processor activity during a test. An amount of processor resources used to execute the various functions, modules, and processes of the tested software can be determined.
U.S. Pat. No. 7,493,272 discloses a method of performance testing for software applications, wherein the tested electronic document is a web page presented as part of a web-based shopping system.
U.S. Pat. No. 7,457,723 discloses a method for providing performance testing, the method comprising: performing N iterations of a test; determining that the results of the test are valid if the test reached defined test criteria after performing the N iterations; deeming the test as passing if the test results compare favorably to a goal value; and deeming the test as failing if the test results compare unfavorably to the goal value.
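The iterate-validate-grade flow described in this patent can be sketched as follows. This is a rough reading of the claim language, not the patented implementation: the patent does not specify how the N iterations are aggregated, so a mean is assumed here, and all names are illustrative.

```python
# Illustrative sketch of an N-iteration pass/fail performance test.
# Aggregation by mean and the "lower is better" convention are assumptions.
def run_performance_test(measure, n_iterations, goal, lower_is_better=True):
    """Run N iterations of a measurement, check validity, and grade
    the aggregate result against a goal value."""
    samples = [measure(i) for i in range(n_iterations)]
    # results are valid only if the test completed all N iterations
    valid = len(samples) == n_iterations
    if not valid:
        return {"valid": False, "passed": None}
    result = sum(samples) / len(samples)
    # pass if the result compares favorably to the goal, fail otherwise
    passed = result <= goal if lower_is_better else result >= goal
    return {"valid": True, "result": result, "passed": passed}

# e.g. five latency measurements (ms) graded against a 200 ms goal
outcome = run_performance_test(lambda i: 180.0 + i, 5, goal=200.0)
```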
U.S. Pat. No. 7,415,635 discloses a method of testing a software application comprising: executing a test script that identifies the software application to be tested and a plurality of measurements.
U.S. Pat. No. 6,799,147 discloses a method for testing and monitoring applications by providing at least one integrated interface capable of controlling at least two monitoring programs, which each send functional test signals to respective applications and receive results functionally responsive to the test signals; wherein at least one of the monitoring programs sends test signals using HTTP-compliant communications; and wherein a second of the monitoring programs sends test signals using TCP/IP-compliant communications.
U.S. Pat. No. 6,735,719 discloses a method for performing a load test on application software in which dynamic data are required to be correlated, said method comprising: recording a test script; identifying, from a recording log, the location at which dynamic data are first generated within said recorded test script; inserting proper data correlation statements into said recorded test script; substituting the inserted data parameters throughout said recorded test script; verifying that all of said dynamic data have been captured and replaced; and performing a load test on the software application utilizing said recorded test script.
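The correlation step described here, replacing recorded dynamic values (such as session tokens) with parameters and verifying that none remain, can be sketched roughly as below. This is not the patented method itself; the function, pattern and parameter names are illustrative assumptions.

```python
# Illustrative sketch of dynamic-data correlation in a recorded test script.
# Pattern, parameter name and script contents are invented for the example.
import re

def correlate_dynamic_data(recorded_script, pattern, param_name):
    """Replace each recorded dynamic value matched by `pattern` with a
    parameter placeholder, and verify that no dynamic value remains."""
    captured = re.findall(pattern, recorded_script)
    parameterized = re.sub(pattern, "{%s}" % param_name, recorded_script)
    # verify all dynamic data have been captured and replaced
    if re.search(pattern, parameterized):
        raise ValueError("dynamic data still present after correlation")
    return parameterized, captured

# a recorded script in which the same session token recurs
script = "GET /cart?session=A1B2C3\nPOST /buy?session=A1B2C3"
new_script, values = correlate_dynamic_data(
    script, r"(?<=session=)\w+", "SESSION_ID")
```

Replaying `new_script` with a fresh value substituted for `{SESSION_ID}` is what lets the load test exercise the application with valid, per-user dynamic data.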
In order to address the need for such a solution, the present invention provides a resource-efficient system, having a scalable and pluggable framework, and a method for automated, cost-effective performance testing that meets performance and scalability requirements, bridges the gap between the dynamic production environment and the static test environment, maximizes the coverage of a performance testing engagement, provides insight into not only current but also latent performance defects which manifest as the system evolves, and provides end-to-end visibility of performance metrics to help improve productivity by compressing test-tune iterations.