Volitional activation of remote place representations with a hippocampal brain‐machine interface
Authors/Creators
- 1. Howard Hughes Medical Institute
- 2. Beth Israel Deaconess Medical Center
Description
Overview
This repository is associated with the following paper: Lai C, Tanaka S, Harris TD, Lee AK. Volitional activation of remote place representations with a hippocampal brain‐machine interface. Science, 2023 (in press).
This dataset demonstrates the ability of animals to activate remote place representations within the hippocampus when they are not physically present at those locations. Such remote activation is a fundamental capability underpinning memory recall, mental simulation/planning, imagination, and reasoning. By employing a hippocampal map-based brain-machine interface (BMI), we designed two tasks to test whether animals can intentionally control their hippocampal activity in a flexible, goal-directed, and model-based manner. Our results show that animals can perform both tasks in real time and in single trials. This dataset provides the neural and behavioral data for these two tasks. The details of the tasks and results are described in the paper.
Dataset, pre-trained model and code access:
- Unzip `data.7z` to get a `data` folder. The `data` folder contains three subfolders:
  1. `Running`: This folder has two subfolders:
     - `run_before_jumper`: Contains data files for the Running task performed before the Jumper task.
     - `run_before_jedi`: Contains data files for the Running task performed before the Jedi task.
  2. `Jumper`: Contains data files for the Jumper task.
  3. `Jedi`: Contains data files for the Jedi task.
- Unzip `model.7z` to get a `pretrained_model` folder, which contains all 6 pretrained models (`.pth` files) trained using the data from the `Running` tasks: 3 used in the `Jumper` tasks and 3 used in the `Jedi` tasks.
- Unzip `code.7z`.
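After extracting `data.7z`, the folder layout described above can be sanity-checked with a short script. This is only a sketch: the subfolder names come from the list above, while the helper name `check_layout` and the constant `EXPECTED` are illustrative choices, and the files inside each task folder are not enumerated here.

```python
from pathlib import Path

# Subfolders expected under the extracted `data` folder, per the README list.
EXPECTED = [
    "data/Running/run_before_jumper",
    "data/Running/run_before_jedi",
    "data/Jumper",
    "data/Jedi",
]

def check_layout(root: Path) -> list:
    """Return the expected subfolders that are missing under `root`."""
    return [p for p in EXPECTED if not (root / p).is_dir()]
```

For example, running `check_layout(Path("."))` from the directory where the archives were extracted returns an empty list when all four task folders are present.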