An experimental comparison of the effectiveness of branch testing and data flow testing
- 1 August 1993
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Software Engineering
- Vol. 19 (8), 774-787
- https://doi.org/10.1109/32.238581
Abstract
An experiment comparing the effectiveness of the all-uses and all-edges test data adequacy criteria is discussed. The experiment was designed to overcome some of the deficiencies of previous software testing experiments. A large number of test sets was randomly generated for each of nine subject programs with subtle errors. For each test set, the percentages of executable edges and definition-use associations covered were measured, and it was determined whether the test set exposed an error. Hypothesis testing was used to investigate whether all-uses adequate test sets are more likely to expose errors than are all-edges adequate test sets. Logistic regression analysis was used to investigate whether the probability that a test set exposes an error increases as the percentage of definition-use associations or edges covered by it increases. Error-exposing ability was shown to be strongly positively correlated with the percentage of covered definition-use associations in only four of the nine subjects. Error-exposing ability was also shown to be positively correlated with the percentage of covered edges in four different subjects, but the relationship was weaker.
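The logistic regression step described above can be illustrated with a minimal sketch. This is not the paper's code or data; the synthetic data below (coverage fractions and error-exposure outcomes) and the gradient-ascent fitting routine are assumptions standing in for the authors' measurements and statistical tooling. The point is only the shape of the analysis: model the probability that a test set exposes an error as a logistic function of its coverage percentage, then check whether the fitted slope is positive.

```python
import math
import random

def fit_logistic(xs, ys, lr=0.1, epochs=5000):
    """Fit P(error exposed) = sigmoid(a + b * coverage) by gradient
    ascent on the log-likelihood. Returns intercept a and slope b."""
    a, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        grad_a = grad_b = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            grad_a += (y - p)        # derivative of log-likelihood wrt a
            grad_b += (y - p) * x    # derivative of log-likelihood wrt b
        a += lr * grad_a / n
        b += lr * grad_b / n
    return a, b

random.seed(0)
# Hypothetical stand-in data: x = fraction of definition-use
# associations covered by a random test set, y = 1 if that test
# set exposed the program's error. Exposure probability is made
# to rise with coverage, mimicking the positive correlation the
# study reports for some subjects.
xs = [random.random() for _ in range(200)]
ys = [1 if random.random() < 1.0 / (1.0 + math.exp(-(-3.0 + 5.0 * x))) else 0
      for x in xs]

a, b = fit_logistic(xs, ys)
print(f"slope b = {b:.3f} (positive means exposure probability rises with coverage)")
```

A positive fitted slope corresponds to the paper's finding of a positive correlation between coverage and error-exposing ability; a slope near zero would correspond to the subjects where no strong relationship was found.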