Wiki source code of Widget 3D Head
Version 25.1 by reginafilange on 2025/05/30 13:47
Source code: [[https:~~/~~/github.com/the-virtual-brain/tvb-widgets>>https://github.com/the-virtual-brain/tvb-widgets]]

This is part of a PyPI release: [[https:~~/~~/pypi.org/project/tvb-widgets/>>https://pypi.org/project/tvb-widgets/]]

//**tvb-widgets**// is also preinstalled in the official image released for EBRAINS Lab, where you can test it directly.

== Purpose ==

This is a Jupyter widget intended for the visualization of the 3D Head data available for a patient:

* surfaces of different types (cortex, face, skull, etc.)
* connectivity region centers and edges
* sensor locations (SEEG, MEG, EEG)

On cortical surfaces, it can also display a region parcellation.

== Inputs ==

It supports the above data in the form of their corresponding TVB datatypes:

* Surface (CorticalSurface, FaceSurface, etc.)
* Parcellation (RegionMapping)
* Connectivity
* Sensors (SensorsInternal, SensorsMEG, SensorsEEG)

== Installation ==

(% class="box" %)
(((
pip install tvb-widgets
)))

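After installing, you can confirm which version is available in your environment before opening a notebook. A minimal sketch using only the Python standard library (the PyPI distribution name is //tvb-widgets//; the helper function here is just for illustration):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(package: str) -> str:
    """Return the installed version of a distribution, or '' if it is missing."""
    try:
        return version(package)
    except PackageNotFoundError:
        return ""

# Check the distribution installed by "pip install tvb-widgets":
print(installed_version("tvb-widgets") or "tvb-widgets is not installed")
```

On EBRAINS Lab, this should print the version bundled with the official image.
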
== API usage ==

First, we need to import the widget API from the __tvbwidgets__ package, together with the __TVB API__ and the __display__ function:

(% class="box" %)
(((
import tvbwidgets.api as api
from tvb.simulator.lab import *
from IPython.core.display_functions import display
)))

Then, there are two options to work with the widget:

1. Use a file browser to load the data and display it automatically
1. Use the API directly to load the data and display it

For the first option, run the following two lines of code in a notebook cell and then use the UI controls:

(% class="box" %)
(((
widget = api.HeadBrowser()
display(widget)
)))

{{html}}
<iframe width="840" height="480" src="https://www.youtube.com/embed/BCCh-wdcnVo" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
{{/html}}

For the second option, the API is described below.

In a new cell, we instantiate the **HeadWidget** and a **FaceSurface** datatype that we want to visualize. Using the //**add_datatype**// method, we add the surface to the widget and then __display__ it:

(% class="box" %)
(((
widget = api.HeadWidget()

face = surfaces.FaceSurface().from_file()
face.configure()

widget.add_datatype(face)
display(widget)
)))

{{html}}
<iframe width="840" height="480" src="https://www.youtube.com/embed/8bmjKp3BYFA" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
{{/html}}

Next, we can continue adding other datatypes to this widget by calling //**add_datatype**// multiple times.

In the code below, we add a **Connectivity** and SEEG **Sensors**:

(% class="box" %)
(((
conn = connectivity.Connectivity().from_file()
conn.configure()
widget.add_datatype(conn)

seeg = sensors.SensorsInternal().from_file()
seeg.configure()
widget.add_datatype(seeg)
)))

{{html}}
<iframe width="840" height="480" src="https://www.youtube.com/embed/6UQhL9gd1HM" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
{{/html}}

We can also provide a **RegionMapping** to be used as a parcellation for a surface:

(% class="box" %)
(((
reg_map = region_mapping.RegionMapping.from_file()

cortex = surfaces.CorticalSurface().from_file()
cortex.configure()

widget.add_datatype(cortex, reg_map)
display(widget)
)))

{{html}}
<iframe width="840" height="480" src="https://www.youtube.com/embed/aDC2TJm2NxM" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
{{/html}}