Explaining the Neuroevolution of Fighting Creatures Through Virtual fMRI

Abstract
While interest in artificial neural networks (ANNs) has been renewed by the ubiquitous use of deep learning to solve high-dimensional problems, we are still far from general artificial intelligence. In this article, we address the problem of emergent cognitive capabilities and, more crucially, of their detection, by relying on co-evolving creatures with mutable morphology and neural structure. The former is implemented via both static and mobile structures whose shapes are controlled by cubic splines. The latter uses ES-HyperNEAT not only to discover appropriate combinations of connections and weights but also to extrapolate the distribution of hidden neurons. The creatures integrate low-level perceptions (touch/pain proprioceptors, retina-based vision, frequency-based hearing) to inform their actions. By discovering a functional mapping between individual neurons and specific stimuli, we extract a high-level, module-based abstraction of a creature's brain. This drastically simplifies the discovery of relationships between naturally occurring events and their neural implementation. Applying this methodology to creatures resulting from solitary and tag-team co-evolution revealed remarkable dynamics such as range-finding and structured communication. Such discoveries were made possible by the abstraction provided by the modular ANN, which allowed groups of neurons to be viewed as functionally enclosed entities.