Changes for page Co-Simulation The Virtual Brain Multiscale
Last modified by ldomide on 2024/04/08 12:55
Summary
-
Page properties (2 modified, 0 added, 0 removed)
-
Objects (1 modified, 0 added, 0 removed)
Details
- Page properties
-
- Title
-
... ... @@ -1,1 +1,1 @@ 1 - Co-Simulation The Virtual Brain-multiscale 1 + The Virtual Brain Multiscale - Content
-
... ... @@ -2,9 +2,13 @@ 2 2 ((( 3 3 (% class="container" %) 4 4 ((( 5 - = My Collab's Extended Title = 5 += (% style="color:inherit" %)TVB Co-Simulation [[image:https://github.com/the-virtual-brain/tvb-multiscale/blob/master/docs/documented_example_notebook/ConceptGraph.png?raw=true||alt="ConceptGraph.png" height="197" width="255"]] (%%) = 6 6 7 - My collab's subtitle 7 +(% style="color:inherit" %)Multiscale: TVB - NEST 8 + 9 +(% style="color:inherit" %)Authors: (%%)Dionysios Perdikis, Lia Domide, Jochen Mersmann, 10 + 11 + Michael Schirner, Petra Ritter(% style="color:inherit" %) 8 8 ))) 9 9 ))) 10 10 ... ... @@ -12,15 +12,108 @@ 12 12 ((( 13 13 (% class="col-xs-12 col-sm-8" %) 14 14 ((( 15 - = What can I find here? = 19 +Main TVB wiki: [[https:~~/~~/wiki.ebrains.eu/bin/view/Collabs/the-virtual-brain/>>url:https://wiki.ebrains.eu/bin/view/Collabs/the-virtual-brain/]] 16 16 17 -* Notice how the table of contents on the right 18 -* is automatically updated 19 -* to hold this page's headers 21 +=== Who has access? === 20 20 21 - = Who has access? = 23 +* The TVB-multiscale co-simulator is open source, GPLv3 licensed: [[https:~~/~~/github.com/the-virtual-brain/tvb-multiscale>>url:https://github.com/the-virtual-brain/tvb-multiscale]], so you are free to use it locally 24 +* Within the HBP infrastructure you only need an HBP account to access TVB 22 22 23 - Describe the audience of this collab. 26 +== Use our JupyterHub setup online == 27 + 28 +We have set up a JupyterHub service with tvb-nest already prepared as a backend. You only need an HBP account to access it: [[https:~~/~~/tvb-nest.apps.hbp.eu/>>url:https://tvb-nest.apps.hbp.eu/]] 29 + 30 +This JupyterHub installation works smoothly with HBP Collab user credentials (log in once at HBP and get access here too). We use a custom Docker Hub tvb-nest image as a backend, and thus a ready-to-use environment is available immediately, without the need for any local installation or download.
This should be the ideal environment for demos, presentations or even workshops with tvb-nest. 31 + 32 +**[[image:https://lh6.googleusercontent.com/ytx9eYpMcL3cCScX2_Sxm4CeBW0xbKW3xKsfO2zSId10bW0gw1kiN2_SkexyYBCsF-sKsu0MaJC4cZvGVfQPjMoPBLiePbkvXOZd8BgY3Q0kFzSkRCqQ183lgDQv_6PYoqS3s7uJ||height="149" width="614"]]** 33 + 34 +Currently, users can access two folders: //TVB-NEST-Examples// and //Contributed-Notebooks//. 35 + 36 +The notebooks under **TVB-NEST-Examples** are public, shared by everyone accessing the instance. Periodically, we will clean all changes under the TVB-NEST-Examples folder (by redeploying the pod image), restoring the original example notebooks from our GitHub repo. Users who intend to contribute here are encouraged to submit their changes through Pull Requests ([[https:~~/~~/github.com/the-virtual-brain/tvb-multiscale>>url:https://github.com/the-virtual-brain/tvb-multiscale]]) 37 + 38 +**[[image:https://lh6.googleusercontent.com/nnsM0mhXQinmQsJwZwwwe5Sx7f-tZc8t4ELnCh9DwksyVEPUE-jixJTkhoP4l25VKwlDGoXACWtnuxQM9NMOCYbQOzDesgMDlT3sntow___vsEqRVd4OwqMY4BPyBiLJ32BnUbmM||height="267" width="614"]]** 39 + 40 +The **Contributed-Notebooks** folder is not shared. Here, users can experiment with their own private examples. This folder is persisted across restarts in the user's HBP Collab personal space, so users can still access their work even after a redeploy (e.g. during a workshop, every participant could keep their own exercise solution here).
41 + 42 +== Running TVB-NEST locally == 43 + 44 +See more on GitHub [[https:~~/~~/github.com/the-virtual-brain/tvb-multiscale>>url:https://github.com/the-virtual-brain/tvb-multiscale]] and check this notebook example: [[https:~~/~~/drive.ebrains.eu/f/b3ea5740fcc34f12af7a/?dl=1>>url:https://drive.ebrains.eu/f/b3ea5740fcc34f12af7a/?dl=1]] 45 + 46 +You can download this notebook and try it yourself locally, after you have also prepared and launched a local Docker environment: [[https:~~/~~/hub.docker.com/r/thevirtualbrain/tvb-nest>>url:https://hub.docker.com/r/thevirtualbrain/tvb-nest]] 47 + 48 +This is the path recommended for people working closely with tvb-nest: they can download it into their local working environment and code with it freely and fast. 49 + 50 +== Running TVB-NEST jobs on CSCS infrastructure from the HBP Collab == 51 + 52 +The CSCS and HBP Collab deployment of tvb-nest is a good example of how tvb-nest can run with an HPC backend. This is efficient when the simulation jobs are very large. In our experience, with small jobs the stage-in/out time is considerable, and the user may be better off with a local run. Also, this deployment requires that **the user has an active CSCS personal account**. More details on how to use this deployment can be found in this movie: [[https:~~/~~/drive.google.com/open?id=1osF263FK_NjhZcBJfpSy-F7qkbYs3Q-E>>url:https://drive.google.com/open?id=1osF263FK_NjhZcBJfpSy-F7qkbYs3Q-E]] 53 + 54 +* Create a collab space of your own 55 +* Clone and run in your HBP Collab Hub ([[https:~~/~~/lab.ebrains.eu/>>url:https://lab.ebrains.eu/]]) the notebooks from here: [[https:~~/~~/drive.ebrains.eu/d/245e6c13082f45bcacfa/>>url:https://drive.ebrains.eu/d/245e6c13082f45bcacfa/]] 56 +** //test_tvb-nest_installation.ipynb// runs the //cosimulate_tvb_nest.sh// script on the CSCS Daint supercomputer. In this example we basically run the //installation_test.py// file, which is in the docker folder.
** //run_custom_cosimulation.ipynb// uses the //cosimulate_with_staging.sh// script to pull the tvb-nest docker image, together with a custom simulation script (from the GitHub page) which is uploaded during the stage-in phase 58 +** //run_custom_cosimulation_from_notebook.ipynb// runs the same simulation as the example above, but instead of using an external file with the simulation code, we build a simulation file from a few notebook cells and pass this file to the CSCS server. 59 + 60 +A few technical details about what we do in these notebooks: 61 + 62 +1. Prepare the UNICORE client API. 63 + 64 +The pyunicore client library is available on PyPI. Install it with: 65 + 66 +>pip install pyunicore 67 + 68 +The next step is to configure the client registry and choose which supercomputer to use: 69 + 70 +>tr = unicore_client.Transport(oauth.get_token()) 71 +>r = unicore_client.Registry(tr, unicore_client._HBP_REGISTRY_URL) 72 +># use "DAINT-CSCS"; change if another supercomputer is prepared for usage 73 +>client = r.site('DAINT-CSCS') 74 + 75 +1. Prepare the job submission 76 + 77 +In this step we prepare a JSON object (a Python dict, starting from an empty //my_job//) which will be used in the job submission process. 78 + 79 +># What the job will execute (command/executable) 80 +>my_job['Executable'] = 'job.sh' 81 +> 82 +># Import files from remote sites into the job's working directory 83 +>my_job['Imports'] = [{ 84 +> "From": "https://raw.githubusercontent.com/the-virtual-brain/tvb-multiscale/update-collab-examples/docker/cosimulate_tvb_nest.sh", 85 +> "To": "job.sh" 86 +>}] 87 +> 88 +># Specify the resources to request on the remote system 89 +>my_job['Resources'] = { 90 +> "CPUs": "1"} 91 + 92 +1. 
Actual job submission 93 + 94 +In order to submit a job, we use the JSON object built in the previous step; if we also have local files, we pass their paths as a list of strings (the //inputs// argument), so that the UNICORE library uploads them into the job's working directory during the stage-in phase, before launching the job. 95 + 96 +>job = client.new_job(job_description=my_job, inputs=['/path1', '/path2']) 97 +>job.properties 98 + 99 +1. Wait until the job is completed and check the results 100 + 101 +Wait until the job is completed using the following command: 102 + 103 +># True or False 104 +>job.is_running() 105 + 106 +Check the job's working directory for the output files/directories using: 107 + 108 +>wd = job.working_dir 109 +>wd.listdir() 110 + 111 +From the working directory you can preview file contents and download files: 112 + 113 +># Read the 'stdout' file 114 +>out = wd.stat("stdout") 115 +>f = out.raw() 116 +>all_lines = f.read().splitlines() 117 +>all_lines[-20:] 118 +> 119 +># Download the 'outputs/res/results.npy' file 120 +>wd.stat("outputs/res/results.npy").download("results.npy") 24 24 ))) 25 25 26 26
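The notebook steps above (build a job description, submit, poll until completion, tail stdout) can be condensed into one sketch. A real run needs CSCS credentials and network access, so //StubJob// and //wait_for// below are hypothetical stand-ins written only for this illustration; what mirrors the real pyunicore usage is the shape of the //my_job// description dict and the call order.

```python
import time

# 1. Job description, exactly as assembled in the notebook steps above.
my_job = {
    "Executable": "job.sh",
    "Imports": [{
        "From": "https://raw.githubusercontent.com/the-virtual-brain/"
                "tvb-multiscale/update-collab-examples/docker/cosimulate_tvb_nest.sh",
        "To": "job.sh",
    }],
    "Resources": {"CPUs": "1"},
}

class StubJob:
    """Hypothetical stand-in for a pyunicore job object (no network needed)."""
    def __init__(self, description):
        self.properties = {"status": "QUEUED", "description": description}
        self._polls = 0

    def is_running(self):
        # A real job finishes on the HPC side; this stub finishes after 2 polls.
        self._polls += 1
        return self._polls < 3

def wait_for(job, poll_seconds=0.01):
    """Poll until the job has finished, as the notebook does with is_running()."""
    while job.is_running():
        time.sleep(poll_seconds)

# 2. Submit (real code: client.new_job(job_description=my_job, inputs=[...]))
job = StubJob(my_job)
wait_for(job)

# 3. Tail the last 20 lines of stdout, mirroring all_lines[-20:] above,
#    here on a fabricated stdout instead of wd.stat("stdout").
fake_stdout = "\n".join(f"line {i}" for i in range(100))
last_20 = fake_stdout.splitlines()[-20:]
print(last_20[0], "...", last_20[-1])  # -> line 80 ... line 99
```

The point of the sketch is the control flow: the description dict is plain data, submission returns a handle, and everything after submission is polling and file inspection on that handle.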
- Collaboratory.Apps.Collab.Code.CollabClass[0]
-
- Description
-
... ... @@ -1,1 +1,1 @@ 1 -This space contains tutorials for configuring and running in HBP Collab simulations with TVB simulator in combination with the NEST simulator, and do a co-simulation. Full Brain Modeling. Human. Animal. EEG, MEG, SEEG, iEEG, BOLD. Large Scale Connectivity. Surface Simulations. TVB-multiscale co-simulation 1 +Co-Simulation The Virtual Brain-multiscale. This space contains tutorials for configuring and running, in the HBP Collab, simulations with the TVB simulator in combination with the NEST simulator, i.e. a co-simulation. Full Brain Modeling. Human. Animal. EEG, MEG, SEEG, iEEG, BOLD. Large Scale Connectivity. Surface Simulations. TVB-multiscale co-simulation