Definition - What does Regression Testing mean?
Regression testing is a type of software testing used to determine whether new problems are the result of software changes.
Before applying a change, a program is tested. After a change is applied, the program is retested in selected areas to detect whether the change created new bugs or issues, or if the actual change achieved its intended purpose.
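The before-and-after testing described above can be sketched as a small automated test. This is a minimal illustration only; the `monthly_payment` function and its expected value are hypothetical stand-ins for a real module under test.

```python
# A minimal regression-test sketch using Python's built-in unittest.
# The function under test (monthly_payment) is a hypothetical example.

import unittest


def monthly_payment(principal, annual_rate, months):
    """Hypothetical loan-module function under test."""
    r = annual_rate / 12
    return principal * r / (1 - (1 + r) ** -months)


class RegressionTests(unittest.TestCase):
    def test_known_payment_unchanged(self):
        # Expected value captured from the version before the change;
        # a failure here signals that the change introduced a regression.
        self.assertAlmostEqual(monthly_payment(10_000, 0.06, 36), 304.22, places=2)


if __name__ == "__main__":
    unittest.main()
```

Rerunning a suite like this after every change is what turns one-off checks into regression testing.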
Techopedia explains Regression Testing
Regression testing is essential for large software applications, as it is often difficult to know whether changing one part of the application has created a new issue in a different part. For example, a change to a bank application's loan module may result in the failure of a monthly transaction report. In most cases, such issues appear unrelated at first, which is exactly why they are a common source of frustration for application developers.
Other situations that call for regression testing include verifying that a change actually accomplishes its intended goal and checking for new risks when a previously fixed issue reemerges after a trouble-free period.
Modern regression testing is primarily handled by specialized commercial testing tools that take snapshots of existing software behavior and compare them against the behavior observed after a specific change is applied. It is almost impossible for human testers to perform these tasks as efficiently as automated tools, especially for large, complex applications running in vast computing environments such as banks, hospitals, manufacturing enterprises and large retailers.