User Story: TVB

Version 19.1 by michaels on 2020/06/02 08:59

From MRI to personalized brain simulation

Here we explain step by step how to use The Virtual Brain (TVB) tools for end-to-end personalized brain simulation. We start by finding shared MRI data in the EBRAINS Knowledge Graph, create a brain model from the extracted connectome using the TVB pipeline, and simulate neural activity using TVB brain network model simulators. We will use Jupyter notebooks on the EBRAINS Collaboratory platform as the frontend and supercomputers in the backend for intensive number crunching.

export_overview_new2.png

TVB pipeline: Extract connectomes

As a first step we browse the Knowledge Graph to find a suitable dataset for constructing a brain model. The dataset must contain diffusion-weighted MRI data, from which we can extract a structural connectome that forms the basis of a brain network model. Structural connectivity extracted from diffusion MRI quantifies how strongly brain regions interact in the brain model. Next, the dataset must contain functional MRI (fMRI) data, because a common approach is to tune the parameters of the brain model such that the simulated fMRI functional connectivity fits the empirical fMRI data. For fitting, we usually compute functional connectivity (FC) matrices from simulated and empirical data. Finally, we need anatomical T1-weighted MRI to extract cortical surfaces and to parcellate the brain into different regions.
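
This FC computation can be sketched with NumPy; the time-series array below is random stand-in data, not the dataset's actual fMRI signal:

```python
import numpy as np

# Hypothetical stand-in for an fMRI recording: 68 regions x 200 time
# points (real data would come from the dataset selected above).
rng = np.random.default_rng(42)
ts = rng.standard_normal((68, 200))

# Functional connectivity (FC): Pearson correlation between every pair
# of regional time series (one row of `ts` per region).
fc = np.corrcoef(ts)

print(fc.shape)  # (68, 68)
```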

img1.png

  • Download the imaging data. The full dataset is quite large (54.06 GB) and contains several subjects, modalities, tasks and runs, most of which we don’t need to demonstrate the workflow. We therefore download only the minimal set of files needed to form a valid BIDS dataset and to perform the following steps. Using the Dataset File Tree on the right, download the files indicated in the following folder tree. The interface unfortunately only allows downloading individual files, so you have to click each one of them and create the necessary folder structure (including the folders sub-01, anat, dwi, func) yourself. Note that the full dataset contains multiple sessions identified by the keyword “ses-XX”, where “XX” indicates the session number. Here we use only data from ses-00 and therefore omit the session folder, instead placing the folders “dwi”, “func”, and “anat” directly one level beneath “sub-01”. When you are done, your folder tree should look like this:

tree.png
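
Since the download interface does not create folders, the skeleton above can be prepared with a few lines of Python (the root folder name my_bids_dataset is an arbitrary placeholder):

```python
from pathlib import Path

# Recreate the minimal BIDS skeleton from the tree above: the session
# level (ses-00) is omitted, so "anat", "dwi" and "func" sit directly
# beneath "sub-01".
root = Path("my_bids_dataset")
for modality in ("anat", "dwi", "func"):
    (root / "sub-01" / modality).mkdir(parents=True, exist_ok=True)

print(sorted(p.relative_to(root).as_posix() for p in root.rglob("*")))
# ['sub-01', 'sub-01/anat', 'sub-01/dwi', 'sub-01/func']
```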

  • We now have an MRI dataset in BIDS format. The next step is to compress the folder (e.g. as a .zip or .tar.gz file) so that we can upload it as a single file to the EBRAINS Collaboratory and later to the supercomputer. In the next steps, we are going to use diffusion MRI tractography to reconstruct white matter fiber pathways and to estimate coupling weights between brain regions.
  • Open the TVB Pipeline EBRAINS Collab: https://wiki.ebrains.eu/bin/view/Collabs/tvb-pipeline/
  • The pipeline is implemented as a Jupyter notebook that shows how to upload data from the local filesystem to the EBRAINS drive, how to copy the data to the supercomputer, how to run the three Docker containers that perform the processing, and how to download the results to the local filesystem.
  • To use the notebook in the EBRAINS Collaboratory, download it, create a new Collab, and upload the notebook there. The process is described on the main Collab page and in the notebook itself.
  • For brain simulation the most important result of tractography is the structural connectome (SC), which consists of a coupling strengths matrix and a distances matrix. The former quantifies the strength of interaction between each pair of brain regions; the latter contains the average length of the respective fiber bundle. The exported SC can be directly imported into TVB: as one of the last steps, the pipeline stores the SC along with other TVB-readable data in the file "TVB_output.zip". Within that ZIP archive is the file “sub-<participant_label>_Connectome.zip”, which can be used to set up a brain network model in the other TVB workflows.
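
The compression step from the list above can be done with Python's standard library; the folder name my_bids_dataset is a placeholder for your local BIDS folder:

```python
import shutil
from pathlib import Path

# "my_bids_dataset" is a placeholder for the local BIDS folder prepared
# in the previous steps; it is created here so the example is
# self-contained.
Path("my_bids_dataset/sub-01/anat").mkdir(parents=True, exist_ok=True)

# Pack the whole folder into a single .zip file for upload.
archive = shutil.make_archive("my_bids_dataset", "zip",
                              root_dir="my_bids_dataset")
print(archive)
```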

sc.png
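
By convention, a TVB connectivity archive like “sub-<participant_label>_Connectome.zip” bundles plain-text matrices such as weights.txt and tract_lengths.txt. As an illustrative sketch (toy 3-region data, not the pipeline's actual output), the round trip looks like this:

```python
import io
import zipfile
import numpy as np

# Toy 3-region connectome: coupling weights and tract lengths (mm).
weights = np.array([[0.0, 1.0, 2.0],
                    [1.0, 0.0, 3.0],
                    [2.0, 3.0, 0.0]])
lengths = np.array([[0.0, 10.0, 20.0],
                    [10.0, 0.0, 30.0],
                    [20.0, 30.0, 0.0]])

# Write both matrices as plain-text files into an in-memory ZIP,
# mimicking the layout of a TVB connectivity archive.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("weights.txt",
                "\n".join(" ".join(map(str, row)) for row in weights))
    zf.writestr("tract_lengths.txt",
                "\n".join(" ".join(map(str, row)) for row in lengths))

# Read the weights back, as a TVB import would.
with zipfile.ZipFile(buf) as zf:
    w = np.loadtxt(io.StringIO(zf.read("weights.txt").decode()))

print(np.allclose(w, weights))  # True
```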

The Virtual Brain: Simulate brain activity

The Virtual Brain is the main TVB software package. It is a neuroinformatics platform that provides an ecosystem of tools for simulating and analysing large-scale brain network dynamics based on biologically realistic connectivity. TVB can be operated via a GUI or a programmatic Python interface. On the HBP Collaboratory platform, TVB Simulator usage is introduced through IPython notebooks. Additionally, the TVB GUI can be accessed directly as a web app (https://thevirtualbrain.apps.hbp.eu/user/profile). Via the web app, users can configure simulations that are run, depending on their complexity, either directly on the web server or on a supercomputer, thereby making resource-intensive TVB functionality accessible to researchers who do not have access to supercomputers. Compiled standalone versions of the main software package can be downloaded from thevirtualbrain.org. In the following we take you through the main steps of brain network model simulation.

Congratulations, you have performed your first brain simulation. You may now want to play with the parameters and observe how they affect the simulated FC; a typical goal is to maximize the fit between simulated and empirical FC. A good first step is often to vary the global coupling scaling factor: start at a low value (little exchange of synaptic currents between brain regions) and then increase it until the fMRI time series of the different brain regions become increasingly correlated.
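
The relationship between the global coupling factor and FC can be illustrated with toy linear node dynamics (pure NumPy, not the TVB API); the mean inter-regional correlation rises as the coupling G increases:

```python
import numpy as np

# Sketch of a global coupling sweep using toy linear dynamics (NOT the
# TVB API): for each global coupling value G we simulate noisy node
# activity on a fixed toy SC and measure the mean inter-regional
# correlation.
rng = np.random.default_rng(1)
n = 8                                    # number of toy brain regions
W = rng.random((n, n))                   # toy SC coupling weights
np.fill_diagonal(W, 0)

def mean_fc(G, steps=2000, dt=0.1):
    """Simulate and return the mean off-diagonal functional connectivity."""
    x = np.zeros(n)
    trace = np.empty((steps, n))
    for t in range(steps):
        x = x + dt * (-x + G * W @ x) + np.sqrt(dt) * rng.standard_normal(n)
        trace[t] = x
    fc = np.corrcoef(trace.T)            # FC of the simulated time series
    return fc[np.triu_indices(n, k=1)].mean()

for G in (0.0, 0.1, 0.2):
    print(f"G={G}: mean FC = {mean_fc(G):.3f}")
```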

TVB+NEST: Multiscale simulation

In the previous step we simulated the brain at a coarse spatial resolution: the macroscopic scale of brain regions (e.g. “M1”, “V1”, etc.) and long-range white matter fiber bundles. However, interesting computations often happen on smaller scales, like the mesoscopic scale of small neural populations or the microscopic scale of individual neurons and neural networks. TVB+NEST is a Python toolbox that makes it easier to simulate multi-scale networks, i.e., networks where one part simulates activity on a coarse scale and another part simulates activity on a finer scale. Essentially, TVB+NEST is a Python wrapper for The Virtual Brain neuroinformatics platform and the NEST spiking network simulator. TVB+NEST exists as a web app and as a download version. The web app runs on HBP computers, while the download version is a standalone Docker container. To run TVB+NEST, follow the instructions here:

https://collab.humanbrainproject.eu/#/collab/19/nav/2108?state=software,TVB%20and%20NEST%202

or directly open the App at

https://tvb-nest.apps.hbp.eu/.

Alternatively, download the standalone Docker container thevirtualbrain/tvb-nest from Docker Hub. In the previous sections you may have simulated a large-scale brain model; you may now be interested in how large-scale activity affects finer-scale activity in a specific region. To familiarize yourself with TVB+NEST, you may read through the following tutorials.
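
Conceptually, the coarse and fine scales exchange activity in both directions; a heavily simplified NumPy sketch of that idea (not the actual TVB+NEST API):

```python
import numpy as np

# Toy illustration of two-scale co-simulation (NOT the TVB+NEST API):
# node 0 of a coarse 4-node network is replaced by a fine-scale
# sub-network of 50 units. The long-range input to node 0 drives all
# fine units, and their mean activity is fed back as node 0's state.
rng = np.random.default_rng(7)
W = rng.random((4, 4))                   # toy coarse-scale SC
np.fill_diagonal(W, 0)
x = np.zeros(4)                          # coarse node activity
u = np.zeros(50)                         # fine-scale unit activity
dt = 0.1
for _ in range(500):
    coarse_in = 0.2 * W @ x              # long-range coupling input
    # Fine scale: units relax toward the input arriving at node 0.
    u += dt * (-u + coarse_in[0]) + 0.05 * rng.standard_normal(50)
    x[0] = u.mean()                      # upward coupling: mean field
    # Coarse scale: the remaining nodes follow the toy dynamics.
    x[1:] += dt * (-x[1:] + coarse_in[1:])
print(x.shape, u.shape)
```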

Fast_TVB: Fast and parallel simulation

Fast_TVB is thousands of times faster than the Python implementation of TVB, as it uses several optimization techniques and is implemented in the low-level language C. In addition, it can simulate in parallel: users can specify a number of threads that simultaneously perform the processing across multiple processors, as is common on supercomputers.

To perform a dense parameter space exploration or to simulate high-dimensional models (e.g. with the number of network nodes N > 10³), high-performance simulation code is necessary. For the ReducedWongWang model, a high-performance C version of the TVB simulator core is implemented as a Docker container:

https://hub.docker.com/r/thevirtualbrain/fast_tvb

The EBRAINS Collaboratory “TVB C -- High-speed parallel brain network models” explains how to use the container on supercomputer backends with a Jupyter notebook as frontend:

https://wiki.ebrains.eu/bin/view/Collabs/tvb-c-high-speed-parallel-brain-network-

  • Open the “TVB C -- High-speed parallel brain network models” Collab: https://wiki.ebrains.eu/bin/view/Collabs/tvb-c-high-speed-parallel-brain-network
  • Follow the instructions in the Collab notebook or at https://hub.docker.com/r/thevirtualbrain/fast_tvb to set up a brain model, simulate it and collect the results.
  • A single thread uses compute resources most efficiently, while multiple threads reduce the wall-clock time of a single simulation. Play around with the num_threads parameter and compare the execution speeds for different settings. If the execution speed of a single simulation is the primary goal, a higher number of threads is advised; if efficiency during parameter space exploration is the goal, it is advised to use multiple single-threaded instances of the program.
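
The last point can be orchestrated from Python; run_fast_tvb below is a hypothetical placeholder for invoking one single-threaded container run:

```python
from concurrent.futures import ThreadPoolExecutor

def run_fast_tvb(coupling):
    # Hypothetical placeholder: a real version would launch ONE
    # single-threaded Fast_TVB container run for this parameter value
    # (e.g. via subprocess.run) and collect its output files.
    return ("done", coupling)

# Parameter space exploration: many independent single-threaded
# simulations launched concurrently, each worker thread supervising
# one external simulator process.
couplings = [round(0.01 * i, 2) for i in range(8)]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_fast_tvb, couplings))
print(len(results))  # 8
```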

TVB-HPC: High-performance computing

In this project a toolbox has been created that supports efficiently porting TVB neural mass models between different computing architectures. This addresses the fact that most models simulated in TVB are written in Python and have not yet been optimized for parallel execution or deployment on high-performance computing architectures. At the heart of this project is a domain-specific language (DSL) that lets us define TVB models in a structured form that allows automatic code generation. Based on the model description, computing code for different environments or hardware is generated automatically.

In order to implement your own model in TVB-HPC DSL you may first familiarize yourself with the DSL and how to define models in it: https://collab.humanbrainproject.eu/#/collab/80668/nav/546280. As a next step, you may go on to learn how to generate CUDA code from your models: https://collab.humanbrainproject.eu/#/collab/80715/nav/546614.

Disease models

An advanced application of TVB is the specification of disease models, as done in the publication by Stefanovski et al.: https://www.frontiersin.org/articles/10.3389/fncom.2019.00054/full.

The accompanying Jupyter notebook can be viewed here:

https://nbviewer.jupyter.org/github/BrainModes/TVB_EducaseAD_molecular_pathways_TVB/blob/master/Educase_AD_study-LS-Surrogate.ipynb

In the INCF training space an extensive tutorial is provided that walks the user through the approach of the paper:

https://training.incf.org/lesson/linking-molecular-pathways-and-large-scale-computational-modeling-assess-candidate-disease

Tumor brain models

Planning for tumour surgery involves delineating eloquent tissue to spare. To this end, doctors analyze noninvasive neuroimaging data such as fMRI and dwMRI. Instead of analyzing these modalities independently, brain models provide a novel way to combine the information from the different modalities, which may reveal information that is not apparent from any individual analysis. Aerts et al. generated brain models of tumour patients before (https://doi.org/10.1523%2FENEURO.0083-18.2018) and after surgery (https://doi.org/10.1101%2F752931) and have now published their core brain model dataset:

https://kg.ebrains.eu/search/?facet_type[0]=Dataset&q=marinazzo#Dataset/a696ccc7-e742-4301-8b43-d6814f3e5a44

The dataset contains BOLD time series averaged over 68 regions of interest according to the Desikan-Killiany atlas, and a structural connectivity matrix describing the fibers connecting each pair of these regions of interest, derived from the DWI data. The locations of the areas, their centers, and the fiber lengths and densities are also included. The computational models are implemented at each of these regions of interest, connected according to the white matter fibers. The empirical functional connectivity matrix (the Pearson correlation among pairs of BOLD time series from each ROI) is used to fit the model.
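
The fit mentioned in the last sentence is typically quantified by correlating the upper triangles of the empirical and simulated FC matrices; a minimal sketch with toy data:

```python
import numpy as np

# Toy stand-ins for empirical and simulated FC: both derived from the
# same 68-region signal, with extra noise added to the "simulated" one.
rng = np.random.default_rng(3)
base = rng.standard_normal((68, 200))
fc_emp = np.corrcoef(base)
fc_sim = np.corrcoef(base + 0.5 * rng.standard_normal((68, 200)))

# Goodness of fit: Pearson correlation between the upper triangles of
# the two FC matrices (the diagonal is trivially 1 and is excluded).
iu = np.triu_indices(68, k=1)
fit = np.corrcoef(fc_emp[iu], fc_sim[iu])[0, 1]
print(f"model fit: {fit:.3f}")
```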

Having learned how to create and simulate first brain models in the initial chapters, how to implement them optimally in the middle chapters, and how to implement disease mechanisms in the last two chapters, researchers may now combine these workflows and extend them to study other healthy or pathological brain processes.

INCF training space

TVB EduPack provides didactic use cases for The Virtual Brain. Typically a use case consists of a Jupyter notebook and a didactic video. EduPack use cases help the user to reproduce TVB-based publications or to get started quickly with TVB. They demonstrate, for example, how to use TVB via the Collaboratory of the Human Brain Project, how to run multi-scale co-simulations with other simulators such as NEST, and how to process imaging data to construct personalized virtual brains of healthy individuals and patients.

https://training.incf.org/studytrack/virtual-brain-simulation-platform