
From version 36.3, edited by vbragin on 2024/03/02 12:30 (no change comment)
To version 31.1, edited by dionperd on 2023/09/26 18:42 (no change comment)

Summary

Details

Page properties
Author
@@ -1,1 +1,1 @@
-XWiki.vbragin
+XWiki.dionperd
Content
@@ -20,25 +20,15 @@
* TVB Dedicated Wiki [[https:~~/~~/wiki.ebrains.eu/bin/view/Collabs/the-virtual-brain/>>url:https://wiki.ebrains.eu/bin/view/Collabs/the-virtual-brain/]]
* TVB in HBP User Story [[https:~~/~~/wiki.ebrains.eu/bin/view/Collabs/user-story-tvb/>>url:https://wiki.ebrains.eu/bin/view/Collabs/user-story-tvb/]]

+(% class="wikigeneratedid" %)
== ==

+(% class="wikigeneratedid" %)
== Running TVB-MULTISCALE at EBRAINS JupyterLab ==

-TVB-multiscale is made available at [[EBRAINS JupyterLab>>https://lab.ebrains.eu/]].
+TVB-multiscale is made available at [[EBRAINS JupyterLab>>https://lab.ebrains.eu/]]. All the user has to do is log in with their EBRAINS credentials and start a Python console or a Jupyter notebook; TVB-multiscale is then available for import (e.g., via "import tvb_multiscale"). All necessary TVB-multiscale dependencies (NEST, ANNarchy, NetPyNE (NEURON), Elephant, Pyspike) are also installed and available. We suggest that users upload the [[documented notebooks>>https://github.com/the-virtual-brain/tvb-multiscale/tree/master/docs/notebooks]] and/or [[example scripts and notebooks>>https://github.com/the-virtual-brain/tvb-multiscale/tree/master/examples]] from the TVB-multiscale GitHub repository and run them there.

-All the user has to do is log in with their EBRAINS credentials, and start a Python console or a Jupyter notebook using the kernel "EBRAINS-23.09" (or a more recent version), where TVB-multiscale can be imported (e.g., via "import tvb_multiscale"). All necessary TVB-multiscale dependencies (NEST, ANNarchy, NetPyNE (NEURON), Elephant, Pyspike) are also installed and available.

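A minimal sketch of the session check that both versions above describe, for a notebook running the EBRAINS kernel; the dependency module names used here are assumptions for illustration, not taken from this page:

{{code language="python"}}
# Hypothetical sanity check for a fresh EBRAINS JupyterLab session
# (kernel "EBRAINS-23.09" or a more recent one, as described above).
import importlib

import tvb_multiscale  # the import named in the text above
print("tvb_multiscale:", getattr(tvb_multiscale, "__version__", "unknown"))

# Dependencies listed on this page; the module names below are guesses.
for name in ("nest", "ANNarchy", "netpyne", "neuron", "elephant", "pyspike"):
    try:
        importlib.import_module(name)
        print("available:", name)
    except ImportError as err:
        print("missing:  ", name, "-", err)
{{/code}}
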
-This collab contains various examples of using TVB-Multiscale with all three supported spiking simulators. We suggest copying the contents of this collab to your Library or to any collab you own, and running them there (note that the user's Drive offers persistent storage, i.e. users will find their files again after logging out and back in), as follows:
-
-~1. Select `Drive` on the left of the current page (or use [[this link>>https://wiki.ebrains.eu/bin/view/Collabs/the-virtual-brain-multiscale/Drive]]).
-
-2. Check the checkbox of the `tvb-multiscale-collab` folder, and copy it to your `My Library` (a "copy" icon will appear above the list of files and folders).
-
-3. Select `Lab` (on the left), and navigate to the destination where you just copied the folder.
-
-4. Enter the `tvb-multiscale-collab` folder, and open any of the example notebooks. Ensure you select the appropriate ipykernel (EBRAINS-23.09 or a more recent one); a quick check is sketched after these steps.
-
-
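After opening a copied notebook (step 4), one way to confirm that the expected kernel and folder are active; this check is a hypothetical illustration, not part of the original page:

{{code language="python"}}
# Run in the first cell of a copied example notebook.
import os
import sys

print("Python interpreter:", sys.executable)  # should belong to the EBRAINS-23.09 (or newer) kernel
print("Working directory: ", os.getcwd())     # should be inside the copied tvb-multiscale-collab folder

import tvb_multiscale  # raises ImportError if an unsuitable kernel was selected
{{/code}}
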
== Use our Jupyter Hub setup online ((% style="color:#c0392b" %)DEPRECATED(%%)) ==

(% style="color:#c0392b" %)**The TVB-multiscale app is deprecated and will stop being available after the end of 2023!**
@@ -66,6 +66,7 @@

This is the path recommended for people working closely with tvb-multiscale, as they can download it to their local working environment and code with it freely and quickly.

+(% class="wikigeneratedid" %)
== ==

== Running TVB-MULTISCALE jobs on CSCS infrastructure from HBP collab ==