A hybrid versatile method for state estimation and feature extraction from the trajectory of animal behavior

Abstract
Animal behavior is the final, integrated output of brain activity. Thus, recording and analyzing behavior is critical for understanding the underlying brain function. While recording animal behavior has become easier than ever with the development of compact and inexpensive devices, detailed behavioral analysis still requires substantial prior knowledge and/or high-content data, such as video images of animal postures, so most animal behavioral data cannot be analyzed efficiently to understand brain function. Here, we report a versatile method using a hybrid supervised/unsupervised machine learning approach to efficiently estimate behavioral states and to extract important behavioral features from low-content animal trajectory data alone. As proof-of-principle experiments, we analyzed trajectory data of worms, fruit flies, rats, and bats in the laboratory, and of penguins and flying seabirds in the wild, which were recorded with various methods and span a wide range of spatiotemporal scales, from millimeters to 1,000 kilometers in space and from sub-seconds to days in time. We estimated several behavioral states and comprehensively extracted characteristic features of each behavioral state and/or specific experimental condition. Physiological and genetic experiments in worms revealed that the extracted behavioral features reflected specific neural or gene activities. Thus, our method provides a versatile and unbiased way to extract behavioral features from simple trajectory data to understand brain function.
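
To illustrate the general idea of a hybrid supervised/unsupervised analysis of trajectory data, the following is a minimal sketch, not the authors' actual algorithm (which the abstract does not specify). It derives simple kinematic features (speed, turning) from an x-y trajectory, segments time points into putative behavioral states without labels (unsupervised clustering), and then ranks which features separate two labeled conditions (supervised classification). All function names, parameters, and the synthetic data are illustrative assumptions.

```python
# Hypothetical illustration only; the actual method in the paper may differ.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

def trajectory_features(xy, dt=1.0):
    """Per-step speed and absolute turning angle from an (N, 2) trajectory."""
    v = np.diff(xy, axis=0) / dt
    speed = np.linalg.norm(v, axis=1)
    heading = np.arctan2(v[:, 1], v[:, 0])
    turn = np.abs(np.angle(np.exp(1j * np.diff(heading))))  # wrap to [-pi, pi]
    return np.column_stack([speed[1:], turn])

# Toy random-walk trajectory standing in for real tracking data
rng = np.random.default_rng(0)
xy = np.cumsum(rng.normal(size=(500, 2)), axis=0)
feats = trajectory_features(xy)

# Unsupervised step: cluster time points into putative behavioral states
states = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(feats)

# Supervised step: rank features that distinguish two labeled conditions
# (labels are synthetic here; in practice they would come from experiments)
labels = (feats[:, 0] + rng.normal(scale=0.5, size=len(feats))
          > np.median(feats[:, 0])).astype(int)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(feats, labels)

print("state counts:", np.bincount(states))
print("feature importances (speed, turning):", clf.feature_importances_)
```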