Wiki source code of Widget 3D Head
Source code: [[https:~~/~~/github.com/the-virtual-brain/tvb-widgets>>https://github.com/the-virtual-brain/tvb-widgets]]

It is part of a PyPI release: [[https:~~/~~/pypi.org/project/tvb-widgets/>>https://pypi.org/project/tvb-widgets/]]

//**tvb-widgets**// is also preinstalled in the official image released for the EBRAINS Lab, where you can test it directly.

== Purpose ==

It is a Jupyter widget intended for the visualization of the 3D head data available for a patient:

* surfaces of different types (cortex, face, skull, etc.)
* connectivity region centers and edges
* sensor locations (SEEG, MEG, EEG)

On cortical surfaces, it can also display a region parcellation.

== Inputs ==

It supports the above data in the form of their corresponding TVB datatypes (a short loading sketch follows the list):

* Surface (CorticalSurface, FaceSurface, etc.)
* Parcellation (RegionMapping)
* Connectivity
* Sensors (SensorsInternal, SensorsMEG, SensorsEEG)

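The snippet below is a minimal sketch, not part of the widget API, showing how these datatypes can be loaded from the demo files bundled with //**tvb-library**// and briefly inspected before being passed to the widget; the attribute names used here (//vertices//, //weights//, //locations//, ...) come from the TVB datatype API.

(% class="box" %)
(((
# Minimal sketch: load each supported datatype from TVB's bundled demo data.
from tvb.datatypes import surfaces, region_mapping, connectivity, sensors

cortex = surfaces.CorticalSurface().from_file()
cortex.configure()
print(cortex.vertices.shape, cortex.triangles.shape)  # (n_vertices, 3), (n_triangles, 3)

parcellation = region_mapping.RegionMapping.from_file()
print(parcellation.array_data.shape)  # one region index per surface vertex

conn = connectivity.Connectivity().from_file()
conn.configure()
print(conn.weights.shape, conn.centres.shape)  # (n_regions, n_regions), (n_regions, 3)

seeg = sensors.SensorsInternal().from_file()
seeg.configure()
print(seeg.labels.shape, seeg.locations.shape)  # (n_sensors,), (n_sensors, 3)
)))
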
== Installation ==

(% class="box" %)
(((
pip install tvb-widgets
)))
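
To check that the package is available in the current Jupyter kernel, one option (a small sketch using only the Python standard library) is to query the installed version:

(% class="box" %)
(((
from importlib.metadata import version

# Prints the installed tvb-widgets version, confirming the installation.
print(version("tvb-widgets"))
)))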

== API usage ==

First, we need to import the widget API from the __tvbwidgets__ package, together with the __TVB API__ and the __display__ function:

(% class="box" %)
(((
import tvbwidgets.api as api

from tvb.simulator.lab import *

from IPython.core.display_functions import display
)))

Then, there are two options for working with the widget:

1. Use a file browser to load the data and have it displayed automatically
1. Use the API directly to load the data and display it

For the first option, run the following two lines of code in a notebook cell and then use the UI controls:

(% class="box" %)
(((
widget = api.HeadBrowser()
display(widget)
)))

{{html}}
<iframe width="840" height="480" src="https://www.youtube.com/embed/BCCh-wdcnVo" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
{{/html}}

For the second option, the API is described below:

In a new cell, we instantiate the **HeadWidget** and a **FaceSurface** datatype that we want to visualize. Using the //**add_datatype**// method, we add the surface to our widget and __display__ it:

(% class="box" %)
(((
widget = api.HeadWidget()

face = surfaces.FaceSurface().from_file()

face.configure()

widget.add_datatype(face)
display(widget)
)))

{{html}}
<iframe width="840" height="480" src="https://www.youtube.com/embed/8bmjKp3BYFA" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
{{/html}}

Next, we can continue adding other datatypes to this widget by calling //**add_datatype**// multiple times.

In the code below, we add a **Connectivity** and SEEG **Sensors**:

(% class="box" %)
(((
conn = connectivity.Connectivity().from_file()

conn.configure()

widget.add_datatype(conn)


seeg = sensors.SensorsInternal().from_file()

seeg.configure()

widget.add_datatype(seeg)
)))

{{html}}
<iframe width="840" height="480" src="https://www.youtube.com/embed/6UQhL9gd1HM" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
{{/html}}

We can also provide a **RegionMapping** to be used as a parcellation for a surface:

(% class="box" %)
(((
reg_map = region_mapping.RegionMapping.from_file()


cortex = surfaces.CorticalSurface().from_file()

cortex.configure()


widget.add_datatype(cortex, reg_map)

display(widget)
)))

{{html}}
<iframe width="840" height="480" src="https://www.youtube.com/embed/aDC2TJm2NxM" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
{{/html}}
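
For reference, the second option can also be run as a single notebook cell. The sketch below simply collects the snippets above into one cell, using the demo files bundled with TVB:

(% class="box" %)
(((
import tvbwidgets.api as api
from tvb.simulator.lab import *
from IPython.core.display_functions import display

widget = api.HeadWidget()

# Face surface for anatomical context
face = surfaces.FaceSurface().from_file()
face.configure()
widget.add_datatype(face)

# Connectivity region centers and edges
conn = connectivity.Connectivity().from_file()
conn.configure()
widget.add_datatype(conn)

# SEEG sensor locations
seeg = sensors.SensorsInternal().from_file()
seeg.configure()
widget.add_datatype(seeg)

# Cortical surface with the region parcellation
reg_map = region_mapping.RegionMapping.from_file()
cortex = surfaces.CorticalSurface().from_file()
cortex.configure()
widget.add_datatype(cortex, reg_map)

display(widget)
)))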