Changes for page Co-Simulation The Virtual Brain Multiscale
Last modified by ldomide on 2024/04/08 12:55
Summary

- Page properties (3 modified, 0 added, 0 removed)
- Objects (1 modified, 2 added, 3 removed)
Details

- Page properties

- Title

@@ -1,1 +1,1 @@
-Co-Simulation The Virtual Brain Multiscale
+The Virtual Brain Multiscale

- Author

@@ -1,1 +1,1 @@
-XWiki.dionperd
+XWiki.ldomide

- Content
@@ -2,12 +2,9 @@
 (((
 (% class="container" %)
 (((
-= (% style="color:inherit" %)TVB Co-Simulation {{html}}<iframe width="302" height="170" src="https://www.youtube.com/embed/6hEuvxD7IDk?list=PLVtblERyzDeLcVv4BbW3BvmO8D-qVZxKf" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>{{/html}}(%%) =
+= My Collab's Extended Title =
 
-
-(% style="color:inherit" %)Multiscale: TVB, NEST, (%%)ANNarchy, NetPyNE, Elephant, PySpike
-
-(% style="color:inherit" %)Authors: (%%)D. Perdikis, A. Blickensdörfer, V. Bragin, L. Domide, J. Mersmann, M. Schirner, P. Ritter(% style="color:inherit" %)
+My collab's subtitle
 )))
 )))
@@ -15,111 +15,15 @@
 (((
 (% class="col-xs-12 col-sm-8" %)
 (((
-For more details on TVB see:
+= What can I find here? =
 
-* TVB Dedicated Wiki [[https:~~/~~/wiki.ebrains.eu/bin/view/Collabs/the-virtual-brain/>>url:https://wiki.ebrains.eu/bin/view/Collabs/the-virtual-brain/]]
-* TVB in HBP User Story [[https:~~/~~/wiki.ebrains.eu/bin/view/Collabs/user-story-tvb/>>url:https://wiki.ebrains.eu/bin/view/Collabs/user-story-tvb/]]
+* Notice how the table of contents on the right
+* is automatically updated
+* to hold this page's headers
 
-= ===
+= Who has access? =
 
-== Running TVB-MULTISCALE at EBRAINS JupyterLab ==
-
-TVB-multiscale is made available at [[EBRAINS JupyterLab>>https://lab.ebrains.eu/]].
-
-All the user has to do is log in with their EBRAINS credentials and start a Python console or a Jupyter notebook using the kernel "EBRAINS-23.09" (or a more recent version), where TVB-multiscale can be imported (e.g., via "import tvb_multiscale"). All necessary TVB-multiscale dependencies (NEST, ANNarchy, NetPyNE (NEURON), Elephant, PySpike) are also installed and available.
-
-This collab contains various examples of using TVB-multiscale with all three supported spiking simulators. We suggest copying the contents of this collab to your Library or to any collab owned by you, and running them there (note that the user's drive offers persistent storage, i.e. users will find their files after logging out and in again), as follows:
-
-1. Select `Drive` on the left of the current page (or use [[this link>>https://wiki.ebrains.eu/bin/view/Collabs/the-virtual-brain-multiscale/Drive||rel="noopener noreferrer" target="_blank"]]).
-
-2. Check the `tvb-multiscale-collab` folder checkbox and copy it to your `My Library` (a "copy" icon will appear above the files/folders list).
-
-3. Select `Lab` (on the left) and navigate to the destination where you just copied the folder.
-
-4. Enter the `tvb-multiscale-collab` folder and open any of the example notebooks. Ensure you select the appropriate ipykernel (EBRAINS-23.09 or a more recent one).
-
-== Running TVB-MULTISCALE locally ==
-
-See more on GitHub: [[https:~~/~~/github.com/the-virtual-brain/tvb-multiscale>>url:https://github.com/the-virtual-brain/tvb-multiscale]]
-
-The documented notebooks and other examples can be downloaded and run locally, after you have also prepared and launched a local Docker environment: [[https:~~/~~/hub.docker.com/r/thevirtualbrain/tvb-multiscale>>https://hub.docker.com/r/thevirtualbrain/tvb-multiscale]]
-
-This is the path recommended for people working closely with tvb-multiscale: they can download it into their local working environment and code freely and fast with it.
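As a quick illustration of the JupyterLab workflow described in the removed section above, a minimal sanity-check cell could look like the following. This is an editor's sketch, not part of the original page; it only assumes that the EBRAINS-23.09 (or newer) kernel exposes the packages listed above under their usual import names.

># Minimal sanity check inside an EBRAINS JupyterLab notebook (EBRAINS-23.09 kernel or newer)
>import tvb_multiscale   # TVB-multiscale itself
>import nest             # NEST spiking simulator
>import elephant         # Elephant spike-train analysis
>import pyspike          # PySpike spike-train distances
>
>print("TVB-multiscale and its co-simulation dependencies import cleanly.")

If any of these imports fails, the most likely cause is a wrong kernel selection; switch the notebook kernel to EBRAINS-23.09 or a more recent release.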
-== Running TVB-MULTISCALE jobs on CSCS infrastructure from HBP collab ==
-
-The CSCS and HBP Collab deployment of tvb-multiscale is a good example of how tvb-multiscale can run with an HPC backend. This is efficient when the simulation jobs are very large. From our experience, with small jobs the stage-in/out time is considerable, and the user may be better off with a local run. Also, this deployment requires that **the user has an active CSCS personal account**. More details on how to use this deployment can be found in this movie: [[https:~~/~~/drive.google.com/open?id=1osF263FK_NjhZcBJfpSy-F7qkbYs3Q-E>>url:https://drive.google.com/open?id=1osF263FK_NjhZcBJfpSy-F7qkbYs3Q-E]]
-
-* Create a collab space of your own
-* Clone and run in your HBP Collab Hub ([[https:~~/~~/lab.ebrains.eu/>>url:https://lab.ebrains.eu/]]) the notebooks from here: [[https:~~/~~/drive.ebrains.eu/d/245e6c13082f45bcacfa/>>url:https://drive.ebrains.eu/d/245e6c13082f45bcacfa/]]
-** test_tvb-nest_installation.ipynb runs the cosimulate_tvb_nest.sh script on the CSCS Daint supercomputer. In this example we essentially run the //installation_test.py// file from the docker folder.
-** run_custom_cosimulation.ipynb uses the //cosimulate_with_staging.sh// script to pull the tvb-multiscale docker image, together with a custom simulation script (from the GitHub page) that is uploaded during the stage-in phase.
-** run_custom_cosimulation_from_notebook.ipynb runs the same simulation as the example above, but instead of using an external file with the simulation code, it builds the simulation file from a few notebook cells and passes that file to the CSCS server.
-
-A few technical details about what we do in these notebooks:
-
-1. Prepare the UNICORE client API.
-
-The PyUNICORE client library is available on PyPI. In order to use it, install it with:
-
->pip install pyunicore
-
-The next step is to configure the client registry and select which supercomputer to use:
-
->tr = unicore_client.Transport(oauth.get_token())
->r = unicore_client.Registry(tr, unicore_client._HBP_REGISTRY_URL)
-># use "DAINT-CSCS"; change it if another supercomputer is prepared for usage
->site_client = r.site('DAINT-CSCS')
-
-2. Prepare the job submission.
-
-In this step we prepare a JSON object which will be used in the job submission process.
-
-># What the job will execute (command/executable)
->my_job['Executable'] = 'job.sh'
->
-># Import files from remote sites into the job's working directory
->my_job['Imports'] = [{
->    "From": "https://raw.githubusercontent.com/the-virtual-brain/tvb-multiscale/update-collab-examples/docker/cosimulate_tvb_nest.sh",
->    "To": "job.sh"
->}]
->
-># Specify the resources to request on the remote system
->my_job['Resources'] = {"CPUs": "1"}
-
-3. Actual job submission.
-
-In order to submit the job, we use the JSON built in the previous step; if we also have some local files, we pass their paths as a list of strings (the inputs argument), so that the UNICORE library uploads them into the job's working directory during the stage-in phase, before launching the job.
-
->job = site_client.new_job(job_description=my_job, inputs=['/path1', '/path2'])
->job.properties
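Read together, steps 1-3 of the removed walkthrough amount to roughly the following single notebook cell. This is an editor's sketch rather than the original notebook code: it initialises `my_job` explicitly (the fragments above assume the dictionary already exists), and it assumes the Collab Hub's `clb_nb_utils` helper as the source of the EBRAINS OAuth token.

>import pyunicore.client as unicore_client
>
># Token helper available in HBP/EBRAINS Collab notebooks (an assumption here;
># any other way of obtaining a valid EBRAINS token works as well)
>from clb_nb_utils import oauth
>
># Step 1: connect to the registry and select the CSCS Piz Daint site
>tr = unicore_client.Transport(oauth.get_token())
>registry = unicore_client.Registry(tr, unicore_client._HBP_REGISTRY_URL)
>site_client = registry.site('DAINT-CSCS')
>
># Step 2: describe the job (executable, files staged in, requested resources)
>my_job = {
>    'Executable': 'job.sh',
>    'Imports': [{
>        'From': 'https://raw.githubusercontent.com/the-virtual-brain/tvb-multiscale/update-collab-examples/docker/cosimulate_tvb_nest.sh',
>        'To': 'job.sh',
>    }],
>    'Resources': {'CPUs': '1'},
>}
>
># Step 3: submit; local input files listed in `inputs` are uploaded during stage-in
>job = site_client.new_job(job_description=my_job, inputs=[])
>print(job.properties)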
-4. Wait until the job is completed and check the results.
-
-Wait until the job is completed using the following command:
-
-># TRUE or FALSE
->job.is_running()
-
-Check the job's working directory for the output files/directories using:
-
->wd = job.working_dir
->wd.listdir()
-
-From the job's working directory you can preview file contents and download files:
-
-># Read the 'stdout' file
->out = wd.stat("stdout")
->f = out.raw()
->all_lines = f.read().splitlines()
->all_lines[-20:]
->
-># Download the 'outputs/res/results.npy' file
->wd.stat("outputs/res/results.npy").download("results.npy")
+Describe the audience of this collab.
 )))
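Step 4 of the removed walkthrough, waiting for completion and fetching the outputs, can be condensed in the same spirit. The sketch below continues from the `job` object of the previous sketch; the 60-second polling interval is arbitrary, and the result path "outputs/res/results.npy" is the one used in the quoted text.

>import time
>
># Poll until the job has finished
>while job.is_running():
>    time.sleep(60)  # illustrative polling interval
>
># Inspect the job's working directory
>wd = job.working_dir
>print(wd.listdir())
>
># Preview the last lines of the 'stdout' file
>stdout_lines = wd.stat("stdout").raw().read().splitlines()
>print(stdout_lines[-20:])
>
># Download the results file produced by the example simulation
>wd.stat("outputs/res/results.npy").download("results.npy")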
- Collaboratory.Apps.Collab.Code.CollabClass[0]

- Description

@@ -1,2 +1,1 @@
-Multiscale Co-Simulation with The Virtual Brain (TVB) and NEST.
-This space contains tutorials for configuring and running, in the HBP Collab, simulations with the TVB simulator in combination with the NEST simulator, and for doing co-simulations.
+Co-Simulation The Virtual Brain-multiscale. This space contains tutorials for configuring and running, in the HBP Collab, simulations with the TVB simulator in combination with the NEST simulator, and for doing co-simulation. Full Brain Modeling. Human. Animal. EEG, MEG, SEEG, iEEG, BOLD. Large Scale Connectivity. Surface Simulations. TVB-multiscale co-simulation

- owner

@@ -1,1 +1,0 @@
-ldomide
- XWiki.XWikiComments[0]

- Author

@@ -1,1 +1,0 @@
-XWiki.ldomide

- Comment

@@ -1,1 +1,0 @@
-Check this movie on tvb-multiscale: https://www.youtube.com/watch?v=6hEuvxD7IDk&feature=youtu.be

- Date

@@ -1,1 +1,0 @@
-2020-07-27 21:56:05.0
- XWiki.XWikiRights[7]

- Allow/Deny

@@ -1,1 +1,0 @@
-Allow

- Levels

@@ -1,1 +1,0 @@
-view

- Users

@@ -1,1 +1,0 @@
-XWiki.XWikiGuest
- XWiki.XWikiRights[8]

- Allow/Deny

@@ -1,1 +1,0 @@
-Allow

- Groups

@@ -1,1 +1,0 @@
-XWiki.XWikiAllGroup

- Levels

@@ -1,1 +1,0 @@
-view
- XWiki.XWikiRights[3]

- Allow/Deny

@@ -1,0 +1,1 @@
+Allow

- Levels

@@ -1,0 +1,1 @@
+view

- Users

@@ -1,0 +1,1 @@
+XWiki.XWikiGuest
- XWiki.XWikiRights[4]

- Allow/Deny

@@ -1,0 +1,1 @@
+Allow

- Groups

@@ -1,0 +1,1 @@
+XWiki.XWikiAllGroup

- Levels

@@ -1,0 +1,1 @@
+view