Context-dependent representations of objects and space in the primate hippocampus during virtual navigation

Abstract
The hippocampus is implicated in associative memory and spatial navigation. To investigate how these functions are mixed in the hippocampus, we recorded from single hippocampal neurons in macaque monkeys navigating a virtual maze during a foraging task and a context-object associative memory task. During both tasks, single neurons encoded information about spatial position, and a linear classifier also decoded position from population activity. However, the population code for space did not generalize across tasks, particularly where stimuli relevant to the associative memory task appeared. Single-neuron and population-level analyses revealed that cross-task changes were due to selectivity for nonspatial features of the associative memory task, both when they were visually available (perceptual coding) and after their disappearance (mnemonic coding). Our results show that neurons in the primate hippocampus nonlinearly mix information about space and nonspatial elements of the environment in a task-dependent manner; this efficient code flexibly represents unique perceptual experiences and corresponding memories.
Funding Information
  • Gouvernement du Canada | Instituts de Recherche en Santé du Canada | CIHR Skin Research Training Centre
  • Healthy Brains for Healthy Lives
  • NeuroNex
  • Gouvernement du Canada | Natural Sciences and Engineering Research Council of Canada