= Widget 3D Head =
Source code: [[https:~~/~~/github.com/the-virtual-brain/tvb-widgets>>https://github.com/the-virtual-brain/tvb-widgets]]

This is part of a PyPI release: [[https:~~/~~/pypi.org/project/tvb-widgets/>>https://pypi.org/project/tvb-widgets/]]

//**tvb-widgets**// is also preinstalled in the official image released for the EBRAINS Lab, where you can test it directly.

== Purpose ==

This is a Jupyter widget intended for the visualization of the 3D Head data available for a patient:

* surfaces of different types (cortex, face, skull, etc.)
* connectivity region centers
* sensor locations (SEEG, MEG, EEG)

== Inputs ==

It supports the above data in the form of their corresponding TVB datatypes (a short loading sketch follows this list):

* Surface (CorticalSurface, FaceSurface, etc.)
* Connectivity
* Sensors (SensorsInternal, SensorsMEG, SensorsEEG)

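As a quick orientation, the snippet below is a minimal sketch showing how each of these datatypes can be loaded from the demo data bundled with //tvb-library//, using the same //**from_file**// pattern as the examples further down this page (the exact default files depend on your TVB version):

(% class="box" %)
(((
from tvb.simulator.lab import *

# Surface, Connectivity and Sensors loaded from TVB's bundled demo data;
# call configure() on each datatype before handing it to the widget
cortex = surfaces.CorticalSurface().from_file()
conn = connectivity.Connectivity().from_file()
seeg = sensors.SensorsInternal().from_file()

for dt in (cortex, conn, seeg):
    dt.configure()
)))
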
== Installation ==

(% class="box" %)
(((
pip install tvb-widgets
)))
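
To confirm that the package is visible from your notebook kernel, the import used throughout this page should succeed without errors:

(% class="box" %)
(((
# A quick sanity check: if this import works, tvb-widgets is installed
import tvbwidgets.api as api
)))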

== API usage ==

First, we need to import the widget API from the __tvbwidgets__ package, together with the __TVB API__ and the __display__ function:

(% class="box" %)
(((
import tvbwidgets.api as api

from tvb.simulator.lab import *

from IPython.core.display_functions import display
)))

Then, there are two options for working with the widget:

1. Use a file browser to load the data and automatically display it
1. Use the API directly to load the data and display it

For the first option, run the following two lines of code in a notebook cell and then just use the UI controls:

(% class="box" %)
(((
widget = api.HeadBrowser()
display(widget)
)))

{{html}}
<iframe src="https://drive.google.com/file/d/1lY3X5eqJfOLmkmHuBa2iq_Aas8mDPa1e/preview" width="840" height="480" allow="autoplay"></iframe>
{{/html}}

For the second option, the API is described below.

In a new cell, we instantiate the **HeadWidget** and a **FaceSurface** datatype that we want to visualize. Using the //**add_datatype**// method, we add the surface to our widget and then __display__ it:

(% class="box" %)
(((
widget = api.HeadWidget()

face = surfaces.FaceSurface().from_file()

face.configure()

widget.add_datatype(face)
display(widget)
)))

{{html}}
<iframe src="https://drive.google.com/file/d/1Egp9Lk-HGMATc9em6Kw_jSHmybTD2vzM/preview" width="840" height="480" allow="autoplay"></iframe>
{{/html}}

Next, we can continue adding other datatypes to this widget by calling //**add_datatype**// multiple times; the widget supports a maximum of 10 datatypes.

The **Config** object can be used to tweak the display options for each datatype.

In the code below, we add a **Connectivity** and SEEG **Sensors**:

(% class="box" %)
(((
conn = connectivity.Connectivity().from_file()

conn.configure()

widget.add_datatype(conn)


seeg = sensors.SensorsInternal().from_file()

seeg.configure()

widget.add_datatype(seeg, api.HeadWidgetConfig(name='SEEG'))
)))

{{html}}
<iframe src="https://drive.google.com/file/d/1RLwts75Hh31LoPdWLK7QOM61KIsOabF1/preview" width="840" height="480" allow="autoplay"></iframe>
{{/html}}
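
Following the same pattern, the other sensor types from the Inputs list can be added to the same widget; the sketch below assumes the EEG and MEG demo files shipped with //tvb-library// and labels each set through //**HeadWidgetConfig**//:

(% class="box" %)
(((
# EEG and MEG sensors, each with its own label in the widget controls
eeg = sensors.SensorsEEG().from_file()
eeg.configure()
widget.add_datatype(eeg, api.HeadWidgetConfig(name='EEG'))

meg = sensors.SensorsMEG().from_file()
meg.configure()
widget.add_datatype(meg, api.HeadWidgetConfig(name='MEG'))
)))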

We can also provide a **RegionMapping** to be used as a colormap for a surface:

(% class="box" %)
(((
reg_map = region_mapping.RegionMapping.from_file()

config = api.HeadWidgetConfig(name='Cortex')

config.add_region_mapping_as_cmap(reg_map)


cortex = surfaces.CorticalSurface().from_file()

cortex.configure()


widget = api.HeadWidget()

widget.add_datatype(cortex, config)

display(widget)
)))

{{html}}
<iframe src="https://drive.google.com/file/d/1zrbjdb8Y4V5rqg7Y7LDGlHHMS0RH8luz/preview" width="840" height="480" allow="autoplay"></iframe>
{{/html}}
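
Putting it together, the sketch below reuses only the calls shown above to display the cortical surface (colored by its region mapping), the connectivity region centers and the SEEG sensors in a single widget:

(% class="box" %)
(((
# One widget combining several datatypes, each configured separately
widget = api.HeadWidget()

config = api.HeadWidgetConfig(name='Cortex')
config.add_region_mapping_as_cmap(reg_map)
widget.add_datatype(cortex, config)

widget.add_datatype(conn)
widget.add_datatype(seeg, api.HeadWidgetConfig(name='SEEG'))

display(widget)
)))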