Search-Based Software Testing (SBST) is a popular automated testing technique that uses a feedback mechanism to search for faults in software. Despite its popularity, it has fundamental challenges related to the design, construction, and interpretation of the feedback. Neural networks (NNs) have become hugely popular in recent years for a wide range of tasks. We believe they can address many of the issues inherent in common SBST approaches. Unfortunately, NNs require large and representative training datasets. In this work we present an SBST framework based on a deconvolutional generative neural network. Not only does it retain the beneficial qualities that make NNs appropriate for SBST tasks, it also produces its own training data, circumventing the data-acquisition problem that otherwise limits the use of NNs. We demonstrate through a series of experiments that this architecture is possible and practical. It generates diverse, sensible program inputs while exploring the space of program behaviours. It also creates a meaningful ordering over program behaviours and is able to find crashing executions. All of this is done without any prior knowledge of the program. We believe this proof of concept opens new directions for future work at the intersection of SBST and neural networks.
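To make the core idea concrete, the sketch below shows one plausible shape for such a generator: a small deconvolutional (transposed-convolution) network that decodes latent vectors into byte sequences usable as program inputs. The module layout, sizes, and sampling routine are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: a 1-D deconvolutional generator mapping latent
# vectors to candidate program inputs (byte sequences). All names and
# hyperparameters are assumptions for illustration only.
import torch
import torch.nn as nn


class InputGenerator(nn.Module):
    def __init__(self, latent_dim=64, out_len=256):
        super().__init__()
        # Upsample a latent vector (length 1) to `out_len` positions of
        # byte logits via stacked transposed convolutions: 1 -> 4 -> 16 -> 256.
        self.net = nn.Sequential(
            nn.ConvTranspose1d(latent_dim, 128, kernel_size=4, stride=4),
            nn.ReLU(),
            nn.ConvTranspose1d(128, 64, kernel_size=4, stride=4),
            nn.ReLU(),
            nn.ConvTranspose1d(64, 256, kernel_size=16, stride=16),
        )
        self.out_len = out_len

    def forward(self, z):
        # z: (batch, latent_dim) -> byte logits of shape (batch, 256, out_len)
        return self.net(z.unsqueeze(-1))


def sample_inputs(gen, batch=8, latent_dim=64):
    """Draw latent vectors and decode them into candidate program inputs."""
    z = torch.randn(batch, latent_dim)
    logits = gen(z)                 # (batch, 256, out_len)
    byte_ids = logits.argmax(dim=1) # most likely byte value at each position
    return [bytes(row.tolist()) for row in byte_ids]


if __name__ == "__main__":
    gen = InputGenerator()
    candidates = sample_inputs(gen)
    # In a full SBST loop, each candidate would be fed to the program under
    # test, and an observed-behaviour signal (e.g. coverage or a crash
    # indicator) would provide the feedback driving the search.
    print(len(candidates), len(candidates[0]))
```

In such a setup, the feedback from executing the generated inputs would steer the search over the latent space, which is one way the generator could produce its own training data.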