Wiki source code of Tools description
Version 2.3 by marissadiazpier on 2023/06/29 12:49
EBRAINS offers an extensive range of data and also provides compute resources (high-performance computing and neuromorphic hardware). SLU can guide your journey on EBRAINS, show you how to find tools, data, and related research projects appropriate for your research, and help you translate your work from the structured formalization process of science into technical requirements on EBRAINS.

EBRAINS support teams will help you connect to the community and give you tips for creating your simulations and models. If you are a neuroscientist, professor, or student, or are interested in brain-related research, simulation, AI, robotics, or brain-inspired hardware, check out what EBRAINS has for you.

As a researcher, you might find yourself immersed in a continuous flow of information, data, environments, and platforms that offer different tools and aids to carry out an investigation and publish your results. This new way of doing science helps us advance at a fast pace, reduces the effort spent searching for data and tests, and allows us to deploy state-of-the-art simulations that can improve the quality of our work and increase the capacity of neuroscientists to model multiscale neural activity of the human brain network.

On this page, you will find a quick overview of the different tools and services available in EBRAINS. It will address, in an interactive way, how to use EBRAINS for participants' specific use cases and will focus on exploring all the potential that EBRAINS, as a digital research infrastructure, provides to its users. Researchers will have the opportunity to get creative and combine the different EBRAINS components to answer existing questions and open new avenues based on collaboration, sharing, co-design, and innovation.

{{html}}
<html>
<body lang=EN-US link="#0563C1" vlink="#954F72">

<div class=WordSection1>

<p class=MsoTocHeading>HBP Tools list</p>
429 | <p class=MsoToc2><span lang=en-DE><span class=MsoHyperlink><a | ||
430 | href="#_Toc138932248">AngoraPy<span style='color:windowtext;display:none; | ||
431 | text-decoration:none'>. </span><span | ||
432 | style='color:windowtext;display:none;text-decoration:none'>5</span></a></span></span></p> | ||
433 | |||
434 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
435 | href="#_Toc138932249">AnonyMI<span style='color:windowtext;display:none; | ||
436 | text-decoration:none'> </span><span | ||
437 | style='color:windowtext;display:none;text-decoration:none'>5</span></a></span></span></p> | ||
438 | |||
439 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
440 | href="#_Toc138932250">Arbor<span style='color:windowtext;display:none; | ||
441 | text-decoration:none'> </span><span | ||
442 | style='color:windowtext;display:none;text-decoration:none'>6</span></a></span></span></p> | ||
443 | |||
444 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
445 | href="#_Toc138932251">Arbor GUI<span style='color:windowtext;display:none; | ||
446 | text-decoration:none'> </span><span | ||
447 | style='color:windowtext;display:none;text-decoration:none'>6</span></a></span></span></p> | ||
448 | |||
449 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
450 | href="#_Toc138932252">Bayesian Virtual Epileptic Patient (BVEP)<span | ||
451 | style='color:windowtext;display:none;text-decoration:none'> </span><span | ||
452 | style='color:windowtext;display:none;text-decoration:none'>6</span></a></span></span></p> | ||
453 | |||
454 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
455 | href="#_Toc138932253">BIDS Extension Proposal Computational Model | ||
456 | Specifications<span style='color:windowtext;display:none;text-decoration:none'>. </span><span | ||
457 | style='color:windowtext;display:none;text-decoration:none'>6</span></a></span></span></p> | ||
458 | |||
459 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
460 | href="#_Toc138932254">BioBB<span style='color:windowtext;display:none; | ||
461 | text-decoration:none'>. </span><span | ||
462 | style='color:windowtext;display:none;text-decoration:none'>6</span></a></span></span></p> | ||
463 | |||
464 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
465 | href="#_Toc138932255">BioExcel-CV19<span style='color:windowtext;display:none; | ||
466 | text-decoration:none'>. </span><span | ||
467 | style='color:windowtext;display:none;text-decoration:none'>7</span></a></span></span></p> | ||
468 | |||
469 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
470 | href="#_Toc138932256">BioNAR<span style='color:windowtext;display:none; | ||
471 | text-decoration:none'>. </span><span | ||
472 | style='color:windowtext;display:none;text-decoration:none'>7</span></a></span></span></p> | ||
473 | |||
474 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
475 | href="#_Toc138932257">BlueNaaS-single cell<span style='color:windowtext; | ||
476 | display:none;text-decoration:none'> </span><span | ||
477 | style='color:windowtext;display:none;text-decoration:none'>7</span></a></span></span></p> | ||
478 | |||
479 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
480 | href="#_Toc138932258">BlueNaaS-subcellular<span style='color:windowtext; | ||
481 | display:none;text-decoration:none'> </span><span | ||
482 | style='color:windowtext;display:none;text-decoration:none'>7</span></a></span></span></p> | ||
483 | |||
484 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
485 | href="#_Toc138932259">BluePyEfe<span style='color:windowtext;display:none; | ||
486 | text-decoration:none'>. </span><span | ||
487 | style='color:windowtext;display:none;text-decoration:none'>7</span></a></span></span></p> | ||
488 | |||
489 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
490 | href="#_Toc138932260">BluePyMM<span style='color:windowtext;display:none; | ||
491 | text-decoration:none'>... </span><span | ||
492 | style='color:windowtext;display:none;text-decoration:none'>8</span></a></span></span></p> | ||
493 | |||
494 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
495 | href="#_Toc138932261">BluePyOpt<span style='color:windowtext;display:none; | ||
496 | text-decoration:none'> </span><span | ||
497 | style='color:windowtext;display:none;text-decoration:none'>8</span></a></span></span></p> | ||
498 | |||
499 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
500 | href="#_Toc138932262">Brain Cockpit<span style='color:windowtext;display:none; | ||
501 | text-decoration:none'> </span><span | ||
502 | style='color:windowtext;display:none;text-decoration:none'>8</span></a></span></span></p> | ||
503 | |||
504 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
505 | href="#_Toc138932263">BrainScaleS<span style='color:windowtext;display:none; | ||
506 | text-decoration:none'>. </span><span | ||
507 | style='color:windowtext;display:none;text-decoration:none'>8</span></a></span></span></p> | ||
508 | |||
509 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
510 | href="#_Toc138932264">Brayns<span style='color:windowtext;display:none; | ||
511 | text-decoration:none'>. </span><span | ||
512 | style='color:windowtext;display:none;text-decoration:none'>8</span></a></span></span></p> | ||
513 | |||
514 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
515 | href="#_Toc138932265">Brion<span style='color:windowtext;display:none; | ||
516 | text-decoration:none'>. </span><span | ||
517 | style='color:windowtext;display:none;text-decoration:none'>9</span></a></span></span></p> | ||
518 | |||
519 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
520 | href="#_Toc138932266">BSB<span style='color:windowtext;display:none;text-decoration: | ||
521 | none'>. </span><span | ||
522 | style='color:windowtext;display:none;text-decoration:none'>9</span></a></span></span></p> | ||
523 | |||
524 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
525 | href="#_Toc138932267">BSP Service Account<span style='color:windowtext; | ||
526 | display:none;text-decoration:none'> </span><span | ||
527 | style='color:windowtext;display:none;text-decoration:none'>9</span></a></span></span></p> | ||
528 | |||
529 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
530 | href="#_Toc138932268">bsp-usecase-wizard<span style='color:windowtext; | ||
531 | display:none;text-decoration:none'>. </span><span | ||
532 | style='color:windowtext;display:none;text-decoration:none'>9</span></a></span></span></p> | ||
533 | |||
534 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
535 | href="#_Toc138932269">CGMD Platform<span style='color:windowtext;display:none; | ||
536 | text-decoration:none'>.. </span><span | ||
537 | style='color:windowtext;display:none;text-decoration:none'>9</span></a></span></span></p> | ||
538 | |||
539 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
540 | href="#_Toc138932270">CNS-ligands<span style='color:windowtext;display:none; | ||
541 | text-decoration:none'>. </span><span | ||
542 | style='color:windowtext;display:none;text-decoration:none'>9</span></a></span></span></p> | ||
543 | |||
544 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
545 | href="#_Toc138932271">Cobrawap<span style='color:windowtext;display:none; | ||
546 | text-decoration:none'>. </span><span | ||
547 | style='color:windowtext;display:none;text-decoration:none'>10</span></a></span></span></p> | ||
548 | |||
549 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
550 | href="#_Toc138932272">Collaboratory Bucket service<span style='color:windowtext; | ||
551 | display:none;text-decoration:none'>. </span><span | ||
552 | style='color:windowtext;display:none;text-decoration:none'>10</span></a></span></span></p> | ||
553 | |||
554 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
555 | href="#_Toc138932273">Collaboratory Drive<span style='color:windowtext; | ||
556 | display:none;text-decoration:none'>. </span><span | ||
557 | style='color:windowtext;display:none;text-decoration:none'>10</span></a></span></span></p> | ||
558 | |||
559 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
560 | href="#_Toc138932274">Collaboratory IAM<span style='color:windowtext; | ||
561 | display:none;text-decoration:none'>... </span><span | ||
562 | style='color:windowtext;display:none;text-decoration:none'>10</span></a></span></span></p> | ||
563 | |||
564 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
565 | href="#_Toc138932275">Collaboratory Lab<span style='color:windowtext; | ||
566 | display:none;text-decoration:none'>. </span><span | ||
567 | style='color:windowtext;display:none;text-decoration:none'>11</span></a></span></span></p> | ||
568 | |||
569 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
570 | href="#_Toc138932276">Collaboratory Office<span style='color:windowtext; | ||
571 | display:none;text-decoration:none'>. </span><span | ||
572 | style='color:windowtext;display:none;text-decoration:none'>11</span></a></span></span></p> | ||
573 | |||
574 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
575 | href="#_Toc138932277">Collaboratory Wiki<span style='color:windowtext; | ||
576 | display:none;text-decoration:none'> </span><span | ||
577 | style='color:windowtext;display:none;text-decoration:none'>11</span></a></span></span></p> | ||
578 | |||
579 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
580 | href="#_Toc138932278">CoreNEURON<span style='color:windowtext;display:none; | ||
581 | text-decoration:none'>.. </span><span | ||
582 | style='color:windowtext;display:none;text-decoration:none'>11</span></a></span></span></p> | ||
583 | |||
584 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
585 | href="#_Toc138932279">CxSystem2<span style='color:windowtext;display:none; | ||
586 | text-decoration:none'>. </span><span | ||
587 | style='color:windowtext;display:none;text-decoration:none'>11</span></a></span></span></p> | ||
588 | |||
589 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
590 | href="#_Toc138932280">DeepSlice<span style='color:windowtext;display:none; | ||
591 | text-decoration:none'>. </span><span | ||
592 | style='color:windowtext;display:none;text-decoration:none'>11</span></a></span></span></p> | ||
593 | |||
594 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
595 | href="#_Toc138932281">EBRAINS Ethics & Society Toolkit<span | ||
596 | style='color:windowtext;display:none;text-decoration:none'> </span><span | ||
597 | style='color:windowtext;display:none;text-decoration:none'>12</span></a></span></span></p> | ||
598 | |||
599 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
600 | href="#_Toc138932282">EBRAINS Image Service<span style='color:windowtext; | ||
601 | display:none;text-decoration:none'>. </span><span | ||
602 | style='color:windowtext;display:none;text-decoration:none'>12</span></a></span></span></p> | ||
603 | |||
604 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
605 | href="#_Toc138932283">EBRAINS Knowledge Graph<span style='color:windowtext; | ||
606 | display:none;text-decoration:none'>. </span><span | ||
607 | style='color:windowtext;display:none;text-decoration:none'>12</span></a></span></span></p> | ||
608 | |||
609 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
610 | href="#_Toc138932284">EDI Toolkit<span style='color:windowtext;display:none; | ||
611 | text-decoration:none'> </span><span | ||
612 | style='color:windowtext;display:none;text-decoration:none'>12</span></a></span></span></p> | ||
613 | |||
614 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
615 | href="#_Toc138932285">eFEL<span style='color:windowtext;display:none; | ||
616 | text-decoration:none'>. </span><span | ||
617 | style='color:windowtext;display:none;text-decoration:none'>12</span></a></span></span></p> | ||
618 | |||
619 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
620 | href="#_Toc138932286">Electrophysiology Analysis Toolkit<span style='color: | ||
621 | windowtext;display:none;text-decoration:none'> </span><span | ||
622 | style='color:windowtext;display:none;text-decoration:none'>13</span></a></span></span></p> | ||
623 | |||
624 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
625 | href="#_Toc138932287">FAConstructor<span style='color:windowtext;display:none; | ||
626 | text-decoration:none'> </span><span | ||
627 | style='color:windowtext;display:none;text-decoration:none'>13</span></a></span></span></p> | ||
628 | |||
629 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
630 | href="#_Toc138932288">fairgraph<span style='color:windowtext;display:none; | ||
631 | text-decoration:none'>. </span><span | ||
632 | style='color:windowtext;display:none;text-decoration:none'>13</span></a></span></span></p> | ||
633 | |||
634 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
635 | href="#_Toc138932289">Fast sampling with neuromorphic hardware<span | ||
636 | style='color:windowtext;display:none;text-decoration:none'>. </span><span | ||
637 | style='color:windowtext;display:none;text-decoration:none'>13</span></a></span></span></p> | ||
638 | |||
639 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
640 | href="#_Toc138932290">fastPLI<span style='color:windowtext;display:none; | ||
641 | text-decoration:none'> </span><span | ||
642 | style='color:windowtext;display:none;text-decoration:none'>13</span></a></span></span></p> | ||
643 | |||
644 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
645 | href="#_Toc138932291">Feed-forward LFP-MEG estimator from mean-field models<span | ||
646 | style='color:windowtext;display:none;text-decoration:none'>. </span><span | ||
647 | style='color:windowtext;display:none;text-decoration:none'>13</span></a></span></span></p> | ||
648 | |||
649 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
650 | href="#_Toc138932292">FIL<span style='color:windowtext;display:none;text-decoration: | ||
651 | none'>. </span><span | ||
652 | style='color:windowtext;display:none;text-decoration:none'>14</span></a></span></span></p> | ||
653 | |||
654 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
655 | href="#_Toc138932293">FMRALIGN<span style='color:windowtext;display:none; | ||
656 | text-decoration:none'>.. </span><span | ||
657 | style='color:windowtext;display:none;text-decoration:none'>14</span></a></span></span></p> | ||
658 | |||
659 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
660 | href="#_Toc138932294">Foa3D<span style='color:windowtext;display:none; | ||
661 | text-decoration:none'>.. </span><span | ||
662 | style='color:windowtext;display:none;text-decoration:none'>14</span></a></span></span></p> | ||
663 | |||
664 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
665 | href="#_Toc138932295">Frites<span style='color:windowtext;display:none; | ||
666 | text-decoration:none'>. </span><span | ||
667 | style='color:windowtext;display:none;text-decoration:none'>14</span></a></span></span></p> | ||
668 | |||
669 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
670 | href="#_Toc138932296">gridspeccer<span style='color:windowtext;display:none; | ||
671 | text-decoration:none'> </span><span | ||
672 | style='color:windowtext;display:none;text-decoration:none'>14</span></a></span></span></p> | ||
673 | |||
674 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
675 | href="#_Toc138932297">Hal-Cgp<span style='color:windowtext;display:none; | ||
676 | text-decoration:none'>. </span><span | ||
677 | style='color:windowtext;display:none;text-decoration:none'>14</span></a></span></span></p> | ||
678 | |||
679 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
680 | href="#_Toc138932298">Health Data Cloud<span style='color:windowtext; | ||
681 | display:none;text-decoration:none'>. </span><span | ||
682 | style='color:windowtext;display:none;text-decoration:none'>15</span></a></span></span></p> | ||
683 | |||
684 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
685 | href="#_Toc138932299">Hodgkin-Huxley Neuron Builder<span style='color:windowtext; | ||
686 | display:none;text-decoration:none'> </span><span | ||
687 | style='color:windowtext;display:none;text-decoration:none'>15</span></a></span></span></p> | ||
688 | |||
689 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
690 | href="#_Toc138932300">HPC Job Proxy<span style='color:windowtext;display:none; | ||
691 | text-decoration:none'>. </span><span | ||
692 | style='color:windowtext;display:none;text-decoration:none'>15</span></a></span></span></p> | ||
693 | |||
694 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
695 | href="#_Toc138932301">HPC Status Monitor<span style='color:windowtext; | ||
696 | display:none;text-decoration:none'> </span><span | ||
697 | style='color:windowtext;display:none;text-decoration:none'>15</span></a></span></span></p> | ||
698 | |||
699 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
700 | href="#_Toc138932302">Human Intracerebral EEG Platform<span style='color:windowtext; | ||
701 | display:none;text-decoration:none'>.. </span><span | ||
702 | style='color:windowtext;display:none;text-decoration:none'>15</span></a></span></span></p> | ||
703 | |||
704 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
705 | href="#_Toc138932303">Hybrid MM/CG Webserver<span style='color:windowtext; | ||
706 | display:none;text-decoration:none'> </span><span | ||
707 | style='color:windowtext;display:none;text-decoration:none'>16</span></a></span></span></p> | ||
708 | |||
709 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
710 | href="#_Toc138932304">Insite<span style='color:windowtext;display:none; | ||
711 | text-decoration:none'>. </span><span | ||
712 | style='color:windowtext;display:none;text-decoration:none'>16</span></a></span></span></p> | ||
713 | |||
714 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
715 | href="#_Toc138932305">Interactive Brain Atlas Viewer<span style='color:windowtext; | ||
716 | display:none;text-decoration:none'> </span><span | ||
717 | style='color:windowtext;display:none;text-decoration:none'>16</span></a></span></span></p> | ||
718 | |||
719 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
720 | href="#_Toc138932306">JuGEx<span style='color:windowtext;display:none; | ||
721 | text-decoration:none'>. </span><span | ||
722 | style='color:windowtext;display:none;text-decoration:none'>16</span></a></span></span></p> | ||
723 | |||
724 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
725 | href="#_Toc138932307">KnowledgeSpace<span style='color:windowtext;display:none; | ||
726 | text-decoration:none'>. </span><span | ||
727 | style='color:windowtext;display:none;text-decoration:none'>16</span></a></span></span></p> | ||
728 | |||
729 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
730 | href="#_Toc138932308">L2L<span style='color:windowtext;display:none;text-decoration: | ||
731 | none'>. </span><span | ||
732 | style='color:windowtext;display:none;text-decoration:none'>17</span></a></span></span></p> | ||
733 | |||
734 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
735 | href="#_Toc138932309">Leveltlab/SpectralSegmentation<span style='color:windowtext; | ||
736 | display:none;text-decoration:none'>. </span><span | ||
737 | style='color:windowtext;display:none;text-decoration:none'>17</span></a></span></span></p> | ||
738 | |||
739 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
740 | href="#_Toc138932310">LFPy<span style='color:windowtext;display:none; | ||
741 | text-decoration:none'>. </span><span | ||
742 | style='color:windowtext;display:none;text-decoration:none'>17</span></a></span></span></p> | ||
743 | |||
744 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
745 | href="#_Toc138932311">libsonata<span style='color:windowtext;display:none; | ||
746 | text-decoration:none'>. </span><span | ||
747 | style='color:windowtext;display:none;text-decoration:none'>17</span></a></span></span></p> | ||
748 | |||
749 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
750 | href="#_Toc138932312">Live Papers<span style='color:windowtext;display:none; | ||
751 | text-decoration:none'>. </span><span | ||
752 | style='color:windowtext;display:none;text-decoration:none'>17</span></a></span></span></p> | ||
753 | |||
754 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
755 | href="#_Toc138932313">Livre<span style='color:windowtext;display:none; | ||
756 | text-decoration:none'>. </span><span | ||
757 | style='color:windowtext;display:none;text-decoration:none'>18</span></a></span></span></p> | ||
758 | |||
759 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
760 | href="#_Toc138932314">LocaliZoom<span style='color:windowtext;display:none; | ||
761 | text-decoration:none'>.. </span><span | ||
762 | style='color:windowtext;display:none;text-decoration:none'>18</span></a></span></span></p> | ||
763 | |||
764 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
765 | href="#_Toc138932315">MD-IFP<span style='color:windowtext;display:none; | ||
766 | text-decoration:none'>. </span><span | ||
767 | style='color:windowtext;display:none;text-decoration:none'>18</span></a></span></span></p> | ||
768 | |||
769 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
770 | href="#_Toc138932316">MEDUSA<span style='color:windowtext;display:none; | ||
771 | text-decoration:none'>. </span><span | ||
772 | style='color:windowtext;display:none;text-decoration:none'>18</span></a></span></span></p> | ||
773 | |||
774 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
775 | href="#_Toc138932317">MeshView<span style='color:windowtext;display:none; | ||
776 | text-decoration:none'>.. </span><span | ||
777 | style='color:windowtext;display:none;text-decoration:none'>18</span></a></span></span></p> | ||
778 | |||
779 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
780 | href="#_Toc138932318">MIP<span style='color:windowtext;display:none;text-decoration: | ||
781 | none'>. </span><span | ||
782 | style='color:windowtext;display:none;text-decoration:none'>19</span></a></span></span></p> | ||
783 | |||
784 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
785 | href="#_Toc138932319">Model Validation Service<span style='color:windowtext; | ||
786 | display:none;text-decoration:none'>. </span><span | ||
787 | style='color:windowtext;display:none;text-decoration:none'>19</span></a></span></span></p> | ||
788 | |||
789 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
790 | href="#_Toc138932320">Model Validation Test Suites<span style='color:windowtext; | ||
791 | display:none;text-decoration:none'>. </span><span | ||
792 | style='color:windowtext;display:none;text-decoration:none'>19</span></a></span></span></p> | ||
793 | |||
794 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
795 | href="#_Toc138932321">MoDEL-CNS<span style='color:windowtext;display:none; | ||
796 | text-decoration:none'>. </span><span | ||
797 | style='color:windowtext;display:none;text-decoration:none'>19</span></a></span></span></p> | ||
798 | |||
799 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
800 | href="#_Toc138932322">Modular Science<span style='color:windowtext;display: | ||
801 | none;text-decoration:none'>. </span><span | ||
802 | style='color:windowtext;display:none;text-decoration:none'>19</span></a></span></span></p> | ||
803 | |||
804 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
805 | href="#_Toc138932323">Monsteer<span style='color:windowtext;display:none; | ||
806 | text-decoration:none'> </span><span | ||
807 | style='color:windowtext;display:none;text-decoration:none'>20</span></a></span></span></p> | ||
808 | |||
809 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
810 | href="#_Toc138932324">MorphIO<span style='color:windowtext;display:none; | ||
811 | text-decoration:none'>.. </span><span | ||
812 | style='color:windowtext;display:none;text-decoration:none'>20</span></a></span></span></p> | ||
813 | |||
814 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
815 | href="#_Toc138932325">Morphology alignment tool<span style='color:windowtext; | ||
816 | display:none;text-decoration:none'> </span><span | ||
817 | style='color:windowtext;display:none;text-decoration:none'>20</span></a></span></span></p> | ||
818 | |||
819 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
820 | href="#_Toc138932326">MorphTool<span style='color:windowtext;display:none; | ||
821 | text-decoration:none'> </span><span | ||
822 | style='color:windowtext;display:none;text-decoration:none'>20</span></a></span></span></p> | ||
823 | |||
824 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
825 | href="#_Toc138932327">Multi-Brain<span style='color:windowtext;display:none; | ||
826 | text-decoration:none'>. </span><span | ||
827 | style='color:windowtext;display:none;text-decoration:none'>20</span></a></span></span></p> | ||
828 | |||
829 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
830 | href="#_Toc138932328">Multi-Image-OSD<span style='color:windowtext;display: | ||
831 | none;text-decoration:none'>.. </span><span | ||
832 | style='color:windowtext;display:none;text-decoration:none'>21</span></a></span></span></p> | ||
833 | |||
834 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
835 | href="#_Toc138932329">MUSIC<span style='color:windowtext;display:none; | ||
836 | text-decoration:none'>. </span><span | ||
837 | style='color:windowtext;display:none;text-decoration:none'>21</span></a></span></span></p> | ||
838 | |||
839 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
840 | href="#_Toc138932330">NEAT<span style='color:windowtext;display:none; | ||
841 | text-decoration:none'>. </span><span | ||
842 | style='color:windowtext;display:none;text-decoration:none'>21</span></a></span></span></p> | ||
843 | |||
844 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
845 | href="#_Toc138932331">Neo<span style='color:windowtext;display:none;text-decoration: | ||
846 | none'>. </span><span | ||
847 | style='color:windowtext;display:none;text-decoration:none'>21</span></a></span></span></p> | ||
848 | |||
849 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
850 | href="#_Toc138932332">Neo Viewer<span style='color:windowtext;display:none; | ||
851 | text-decoration:none'> </span><span | ||
852 | style='color:windowtext;display:none;text-decoration:none'>21</span></a></span></span></p> | ||
853 | |||
854 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
855 | href="#_Toc138932333">NEST Desktop<span style='color:windowtext;display:none; | ||
856 | text-decoration:none'>. </span><span | ||
857 | style='color:windowtext;display:none;text-decoration:none'>22</span></a></span></span></p> | ||
858 | |||
859 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
860 | href="#_Toc138932334">NEST Simulator<span style='color:windowtext;display:none; | ||
861 | text-decoration:none'> </span><span | ||
862 | style='color:windowtext;display:none;text-decoration:none'>22</span></a></span></span></p> | ||
863 | |||
864 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
865 | href="#_Toc138932335">NESTML<span style='color:windowtext;display:none; | ||
866 | text-decoration:none'>. </span><span | ||
867 | style='color:windowtext;display:none;text-decoration:none'>22</span></a></span></span></p> | ||
868 | |||
869 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
870 | href="#_Toc138932336">NetPyNE<span style='color:windowtext;display:none; | ||
871 | text-decoration:none'>. </span><span | ||
872 | style='color:windowtext;display:none;text-decoration:none'>22</span></a></span></span></p> | ||
873 | |||
874 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
875 | href="#_Toc138932337">NEURO-CONNECT<span style='color:windowtext;display:none; | ||
876 | text-decoration:none'>. </span><span | ||
877 | style='color:windowtext;display:none;text-decoration:none'>22</span></a></span></span></p> | ||
878 | |||
879 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
880 | href="#_Toc138932338">NeuroFeatureExtract<span style='color:windowtext; | ||
881 | display:none;text-decoration:none'> </span><span | ||
882 | style='color:windowtext;display:none;text-decoration:none'>23</span></a></span></span></p> | ||
883 | |||
884 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
885 | href="#_Toc138932339">NeurogenPy<span style='color:windowtext;display:none; | ||
886 | text-decoration:none'>. </span><span | ||
887 | style='color:windowtext;display:none;text-decoration:none'>23</span></a></span></span></p> | ||
888 | |||
889 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
890 | href="#_Toc138932340">NeuroM<span style='color:windowtext;display:none; | ||
891 | text-decoration:none'>... </span><span | ||
892 | style='color:windowtext;display:none;text-decoration:none'>23</span></a></span></span></p> | ||
893 | |||
894 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
895 | href="#_Toc138932341">Neuromorphic Computing Job Queue<span style='color:windowtext; | ||
896 | display:none;text-decoration:none'>. </span><span | ||
897 | style='color:windowtext;display:none;text-decoration:none'>23</span></a></span></span></p> | ||
898 | |||
899 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
900 | href="#_Toc138932342">Neuronize v2<span style='color:windowtext;display:none; | ||
901 | text-decoration:none'>. </span><span | ||
902 | style='color:windowtext;display:none;text-decoration:none'>23</span></a></span></span></p> | ||
903 | |||
904 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
905 | href="#_Toc138932343">NeuroR<span style='color:windowtext;display:none; | ||
906 | text-decoration:none'>. </span><span | ||
907 | style='color:windowtext;display:none;text-decoration:none'>24</span></a></span></span></p> | ||
908 | |||
909 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
910 | href="#_Toc138932344">Neurorobotics Platform<span style='color:windowtext; | ||
911 | display:none;text-decoration:none'>.. </span><span | ||
912 | style='color:windowtext;display:none;text-decoration:none'>24</span></a></span></span></p> | ||
913 | |||
914 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
915 | href="#_Toc138932345">Neurorobotics Platform Robot Designer<span | ||
916 | style='color:windowtext;display:none;text-decoration:none'> </span><span | ||
917 | style='color:windowtext;display:none;text-decoration:none'>24</span></a></span></span></p> | ||
918 | |||
919 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
920 | href="#_Toc138932346">NeuroScheme<span style='color:windowtext;display:none; | ||
921 | text-decoration:none'>. </span><span | ||
922 | style='color:windowtext;display:none;text-decoration:none'>24</span></a></span></span></p> | ||
923 | |||
924 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
925 | href="#_Toc138932347">NeuroSuites<span style='color:windowtext;display:none; | ||
926 | text-decoration:none'>. </span><span | ||
927 | style='color:windowtext;display:none;text-decoration:none'>24</span></a></span></span></p> | ||
928 | |||
929 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
930 | href="#_Toc138932348">NeuroTessMesh<span style='color:windowtext;display:none; | ||
931 | text-decoration:none'>. </span><span | ||
932 | style='color:windowtext;display:none;text-decoration:none'>25</span></a></span></span></p> | ||
933 | |||
934 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
935 | href="#_Toc138932349">NMODL Framework<span style='color:windowtext;display: | ||
936 | none;text-decoration:none'>. </span><span | ||
937 | style='color:windowtext;display:none;text-decoration:none'>25</span></a></span></span></p> | ||
938 | |||
939 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
940 | href="#_Toc138932350">NSuite<span style='color:windowtext;display:none; | ||
941 | text-decoration:none'>. </span><span | ||
942 | style='color:windowtext;display:none;text-decoration:none'>25</span></a></span></span></p> | ||
943 | |||
944 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
945 | href="#_Toc138932351">ODE-toolbox<span style='color:windowtext;display:none; | ||
946 | text-decoration:none'>. </span><span | ||
947 | style='color:windowtext;display:none;text-decoration:none'>26</span></a></span></span></p> | ||
948 | |||
949 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
950 | href="#_Toc138932352">openMINDS<span style='color:windowtext;display:none; | ||
951 | text-decoration:none'>. </span><span | ||
952 | style='color:windowtext;display:none;text-decoration:none'>26</span></a></span></span></p> | ||
953 | |||
954 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
955 | href="#_Toc138932353">openMINDS metadata for TVB-ready data<span | ||
956 | style='color:windowtext;display:none;text-decoration:none'>. </span><span | ||
957 | style='color:windowtext;display:none;text-decoration:none'>26</span></a></span></span></p> | ||
958 | |||
959 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
960 | href="#_Toc138932354">PCI<span style='color:windowtext;display:none;text-decoration: | ||
961 | none'> </span><span | ||
962 | style='color:windowtext;display:none;text-decoration:none'>26</span></a></span></span></p> | ||
963 | |||
964 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
965 | href="#_Toc138932355">PIPSA<span style='color:windowtext;display:none; | ||
966 | text-decoration:none'>. </span><span | ||
967 | style='color:windowtext;display:none;text-decoration:none'>26</span></a></span></span></p> | ||
968 | |||
969 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
970 | href="#_Toc138932356">PoSCE<span style='color:windowtext;display:none; | ||
971 | text-decoration:none'>. </span><span | ||
972 | style='color:windowtext;display:none;text-decoration:none'>26</span></a></span></span></p> | ||
973 | |||
974 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
975 | href="#_Toc138932357">Provenance API<span style='color:windowtext;display:none; | ||
976 | text-decoration:none'> </span><span | ||
977 | style='color:windowtext;display:none;text-decoration:none'>26</span></a></span></span></p> | ||
978 | |||
979 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
980 | href="#_Toc138932358">PyNN<span style='color:windowtext;display:none; | ||
981 | text-decoration:none'>.. </span><span | ||
982 | style='color:windowtext;display:none;text-decoration:none'>27</span></a></span></span></p> | ||
983 | |||
984 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
985 | href="#_Toc138932359">Pyramidal Explorer<span style='color:windowtext; | ||
986 | display:none;text-decoration:none'> </span><span | ||
987 | style='color:windowtext;display:none;text-decoration:none'>27</span></a></span></span></p> | ||
988 | |||
989 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
990 | href="#_Toc138932360">QCAlign software<span style='color:windowtext;display: | ||
991 | none;text-decoration:none'>. </span><span | ||
992 | style='color:windowtext;display:none;text-decoration:none'>27</span></a></span></span></p> | ||
993 | |||
994 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
995 | href="#_Toc138932361">QuickNII<span style='color:windowtext;display:none; | ||
996 | text-decoration:none'> </span><span | ||
997 | style='color:windowtext;display:none;text-decoration:none'>27</span></a></span></span></p> | ||
998 | |||
999 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a | ||
1000 | href="#_Toc138932362">Quota Manager<span style='color:windowtext;display:none; | ||
1001 | text-decoration:none'> </span><span | ||
1002 | style='color:windowtext;display:none;text-decoration:none'>27</span></a></span></span></p> | ||
1003 | |||
1004 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932363">RateML</a></span></span></p> | ||
1008 | |||
1009 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932364">Region-wise CBPP using the Julich Brain Cytoarchitectonic Atlas</a></span></span></p> | ||
1013 | |||
1014 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932365">RRI Capacity Development Resources</a></span></span></p> | ||
1018 | |||
1019 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932366">rsHRF</a></span></span></p> | ||
1023 | |||
1024 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932367">RTNeuron</a></span></span></p> | ||
1028 | |||
1029 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932368">sbs: Spike-based Sampling</a></span></span></p> | ||
1033 | |||
1034 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932369">SDA 7</a></span></span></p> | ||
1038 | |||
1039 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932370">Shape & Appearance Modelling</a></span></span></p> | ||
1043 | |||
1044 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932371">siibra-api</a></span></span></p> | ||
1048 | |||
1049 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932372">siibra-explorer</a></span></span></p> | ||
1053 | |||
1054 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932373">siibra-python</a></span></span></p> | ||
1058 | |||
1059 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932374">Single Cell Model (Re)builder Notebook</a></span></span></p> | ||
1063 | |||
1064 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932375">Slurm Plugin for Co-allocation of Compute and Data Resources</a></span></span></p> | ||
1068 | |||
1069 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932376">Snudda</a></span></span></p> | ||
1073 | |||
1074 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932377">SomaSegmenter</a></span></span></p> | ||
1078 | |||
1079 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932378">SpiNNaker</a></span></span></p> | ||
1083 | |||
1084 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932379">SSB toolkit</a></span></span></p> | ||
1088 | |||
1089 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932380">Subcellular model building and calibration tool set</a></span></span></p> | ||
1093 | |||
1094 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932381">Synaptic Events Fitting</a></span></span></p> | ||
1098 | |||
1099 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932382">Synaptic Plasticity Explorer</a></span></span></p> | ||
1103 | |||
1104 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932383">Synaptic proteome database (SQLite)</a></span></span></p> | ||
1108 | |||
1109 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932384">Synaptome.db</a></span></span></p> | ||
1113 | |||
1114 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932385">Tide</a></span></span></p> | ||
1118 | |||
1119 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932386">TVB EBRAINS</a></span></span></p> | ||
1123 | |||
1124 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932387">TVB Image Processing Pipeline</a></span></span></p> | ||
1128 | |||
1129 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932388">TVB Inversion</a></span></span></p> | ||
1133 | |||
1134 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932389">TVB Web App</a></span></span></p> | ||
1138 | |||
1139 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932390">TVB Widgets</a></span></span></p> | ||
1143 | |||
1144 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932391">TVB-Multiscale</a></span></span></p> | ||
1148 | |||
1149 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932392">VIOLA</a></span></span></p> | ||
1153 | |||
1154 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932393">Vishnu 1.0</a></span></span></p> | ||
1158 | |||
1159 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932394">ViSimpl</a></span></span></p> | ||
1163 | |||
1164 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932395">VisuAlign</a></span></span></p> | ||
1168 | |||
1169 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932396">VMetaFlow</a></span></span></p> | ||
1173 | |||
1174 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932397">Voluba</a></span></span></p> | ||
1178 | |||
1179 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932398">WebAlign</a></span></span></p> | ||
1183 | |||
1184 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932399">Webilastik</a></span></span></p> | ||
1188 | |||
1189 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932400">WebWarp</a></span></span></p> | ||
1193 | |||
1194 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932401">ZetaStitcher</a></span></span></p> | ||
1198 | |||
1199 | <p class=MsoToc2><span class=MsoHyperlink><span lang=en-DE><a href="#_Toc138932402">TauRAMD</a></span></span></p> | ||
1203 | |||
1204 | <p class=MsoNormal><span lang=en-DE> </span></p> | ||
1205 | |||
1206 | <h2><a name="_Toc138932248"><span lang=en-DE>AngoraPy</span></a></h2> | ||
1207 | |||
1208 | <p class=MsoNormal><span lang=en-DE>AngoraPy is an open-source Python library | ||
1209 | that helps neuroscientists build and train goal-driven models of the sensorimotor | ||
1210 | system. The toolbox comprises state-of-the-art machine learning techniques | ||
1211 | under the hood of an easy-to-use API. With the help of deep reinforcement | ||
1212 | learning, the connectivity required for solving complex, ecologically valid | ||
1213 | tasks can be learned autonomously, obviating the need for hand-engineered or | ||
1214 | hypothesis-driven connectivity patterns. With AngoraPy, neuroscientists can | ||
1215 | train custom deep neural networks on custom sensorimotor tasks.</span></p> | ||
1216 | |||
1217 | <h2></h2> | ||
1218 | |||
1219 | <h2><a name="_Toc138932249"><span lang=en-DE>AnonyMI</span></a></h2> | ||
1220 | |||
1221 | <p class=MsoNormal><span lang=en-DE>AnonyMI is an MRI de-identification tool | ||
1222 | that uses 3D surface modelling in order to de-identify MRIs while retaining as | ||
1223 | much geometrical information as possible. It can be run automatically or | ||
1224 | manually, which allows precise tailoring for specific needs. AnonyMI is | ||
1225 | distributed as a plug-in of 3D Slicer, a widely used, open-source, stable and | ||
1226 | reliable image-processing software. It leverages the power of this platform for | ||
1227 | reading and saving images, which makes it applicable to almost any MRI file | ||
1228 | type, including all the most commonly used formats (e.g., DICOM, NIfTI and | ||
1229 | Analyze).</span></p> | ||
1230 | |||
1231 | <h2></h2> | ||
1232 | |||
1233 | <h2><a name="_Toc138932250"><span lang=en-DE>Arbor</span></a></h2> | ||
1234 | |||
1235 | <p class=MsoNormal><span lang=en-DE>Arbor is a simulation software library for | ||
1236 | neuron models with complex morphologies, from single cells to large | ||
1237 | distributed networks. Developed entirely inside HBP, it enables running | ||
1238 | large-scale simulations on any HPC, including those available through EBRAINS. | ||
1239 | Arbor provides performance portability for native execution on all HPC | ||
1240 | architectures. Optimized vectorized code is generated for Intel, AMD and ARM | ||
1241 | CPUs, NVIDIA and AMD GPUs, and support will be added for new architectures as | ||
1242 | they become available. Model portability is easier due to an interface for | ||
1243 | model description independent of how Arbor represents models internally. | ||
1244 | Interoperability with other simulation engines is enabled via an API for spike | ||
1245 | exchange and the output of voltages, currents and model state.</span></p> | ||
1246 | |||
1247 | <h2></h2> | ||
1248 | |||
1249 | <h2><a name="_Toc138932251"><span lang=en-DE>Arbor GUI</span></a></h2> | ||
1250 | |||
1251 | <p class=MsoNormal><span lang=en-DE>Arbor GUI strives to be self-contained, | ||
1252 | fast and easy to use. It lets you design morphologically detailed cells for | ||
1253 | simulation in Arbor: load morphologies from SWC (.swc), NeuroML (.nml) or | ||
1254 | NeuroLucida (.asc) files; define and highlight Arbor regions and locsets; paint | ||
1255 | ion dynamics and biophysical properties onto morphologies; place spike | ||
1256 | detectors and probes; export cable cells to Arbor's internal format (ACC) for | ||
1257 | direct simulation; and import cable cells in ACC format. The project is under | ||
1258 | active development and welcomes early feedback.</span></p> | ||
1259 | |||
1260 | <h2></h2> | ||
1261 | |||
1262 | <h2><a name="_Toc138932252"><span lang=en-DE>Bayesian Virtual Epileptic Patient | ||
1263 | (BVEP)</span></a></h2> | ||
1264 | |||
1265 | <p class=MsoNormal><span lang=en-DE>BVEP relies on the fusion of structural | ||
1266 | data of individuals, a generative model of epileptiform discharges, and the | ||
1267 | state-of-the-art probabilistic machine learning algorithms. It uses a self-tuning | ||
1268 | Monte Carlo sampling algorithm and deep neural density estimators for | ||
1269 | reliable and efficient model-based inference at the source and sensor levels. | ||
1270 | The Bayesian framework provides an appropriate patient-specific strategy for | ||
1271 | estimating the extent of epileptogenic and propagation zones of the brain | ||
1272 | regions to improve outcome after epilepsy surgery.</span></p> | ||
1273 | |||
1274 | <h2></h2> | ||
1275 | |||
1276 | <h2><a name="_Toc138932253"><span lang=en-DE>BIDS Extension Proposal | ||
1277 | Computational Model Specifications</span></a></h2> | ||
1278 | |||
1279 | <p class=MsoNormal><span lang=en-DE>A data structure schema for neural network | ||
1280 | computational models that aims to be generically applicable to all kinds of | ||
1281 | neural network simulation software, mathematical models, computational models, | ||
1282 | and data models, but with a focus on dynamic circuit models of brain activity. </span></p> | ||
1283 | |||
1284 | <h2></h2> | ||
1285 | |||
1286 | <h2><a name="_Toc138932254"><span lang=en-DE>BioBB</span></a></h2> | ||
1287 | |||
1288 | <p class=MsoNormal><span lang=en-DE>The BioExcel Building Blocks (BioBB) | ||
1289 | software library is a collection of Python wrappers on top of popular | ||
1290 | biomolecular simulation tools. The library offers a layer of interoperability | ||
1291 | between the wrapped tools, which makes them compatible and prepared to be | ||
1292 | directly interconnected to build complex biomolecular workflows. Building and | ||
1293 | sharing complex biomolecular simulation workflows just requires joining and | ||
1294 | connecting BioExcel Building Blocks together. Biomolecular simulation workflows | ||
1295 | built using the BioBB library are integrated in the Collaboratory Jupyter lab | ||
1296 | infrastructure, allowing the exploration of dynamics and flexibility of | ||
1297 | proteins related to the Central Nervous System.</span></p> | ||
1298 | |||
1299 | <h2></h2> | ||
1300 | |||
1301 | <h2><a name="_Toc138932255"><span lang=en-DE>BioExcel-CV19</span></a></h2> | ||
1302 | |||
1303 | <p class=MsoNormal><span lang=en-DE>BioExcel-CV19 is a platform designed to | ||
1304 | provide web access to atomistic-MD trajectories for macromolecules involved in | ||
1305 | the COVID-19 disease. The project is part of the open access initiatives | ||
1306 | promoted by the world-wide scientific community to share information about | ||
1307 | COVID-19 research. The BioExcel-CV19 web server interface presents the resulting | ||
1308 | trajectories, with a set of quality control analyses and system information. | ||
1309 | All data produced by the project is available to download from an associated | ||
1310 | programmatic access API.</span></p> | ||
1311 | |||
1312 | <h2></h2> | ||
1313 | |||
1314 | <h2><a name="_Toc138932256"><span lang=en-DE>BioNAR</span></a></h2> | ||
1315 | |||
1316 | <p class=MsoNormal><span lang=en-DE>BioNAR combines a selection of | ||
1317 | existing R protocols for network analysis with newly designed original | ||
1318 | methodological features to support step-by-step analysis of | ||
1319 | biological/biomedical networks. BioNAR supports a pipeline approach where many | ||
1320 | networks and iterative analyses can be performed. BioNAR helps to achieve a | ||
1321 | number of network analysis goals that are difficult to achieve anywhere else, | ||
1322 | e.g., choosing the optimal clustering algorithm from a range of options based on | ||
1323 | independent annotation enrichment, predicting a protein's influence | ||
1324 | within and across multiple sub-complexes in the network, and estimating the | ||
1325 | co-occurrence or linkage between meta-data at the network level.</span></p> | ||
1327 | |||
1328 | <h2></h2> | ||
1329 | |||
1330 | <h2><a name="_Toc138932257"><span lang=en-DE>BlueNaaS-single cell</span></a></h2> | ||
1331 | |||
1332 | <p class=MsoNormal><span lang=en-DE>BlueNaaS-SingleCell is an open-source web | ||
1333 | application. It enables users to quickly visualize single cell model | ||
1334 | morphologies in 3D or as a dendrogram. Using a simple web user interface, single | ||
1335 | cell simulations can be easily configured and launched, producing voltage | ||
1336 | traces from selected compartments.</span></p> | ||
1337 | |||
1338 | <h2></h2> | ||
1339 | |||
1340 | <h2><a name="_Toc138932258"><span lang=en-DE>BlueNaaS-subcellular</span></a></h2> | ||
1341 | |||
1342 | <p class=MsoNormal><span lang=en-DE>BlueNaaS-Subcellular is a web-based | ||
1343 | environment for creation and simulation of reaction-diffusion models. It allows | ||
1344 | the user to import, combine and simulate existing models derived from other | ||
1345 | parts of the pipeline. It is integrated with a number of solvers for | ||
1346 | reaction-diffusion systems of equations and can represent rule-based systems | ||
1347 | using BioNetGen. Additionally, it supports simulation of spatially distributed | ||
1348 | systems using STEPS (stochastic engine for pathway simulation), providing | ||
1349 | spatial stochastic and deterministic solvers for simulation of reactions and | ||
1350 | diffusion on tetrahedral meshes. It includes some visualisation tools such as a | ||
1351 | geometry viewer, a contact map and a reactivity network graph.</span></p> | ||
1352 | |||
1353 | <h2></h2> | ||
1354 | |||
1355 | <h2><a name="_Toc138932259"><span lang=en-DE>BluePyEfe</span></a></h2> | ||
1356 | |||
1357 | <p class=MsoNormal><span lang=en-DE>BluePyEfe eases the process of reading | ||
1358 | experimental recordings and extracting batches of electrical features from | ||
1359 | these recordings. It combines trace reading functions and features extraction | ||
1360 | functions from the eFel library. BluePyEfe outputs protocols and features files | ||
1361 | in the format used by BluePyOpt for neuron electrical model building.</span></p> | ||
1362 | |||
1363 | <h2></h2> | ||
1364 | |||
1365 | <h2><a name="_Toc138932260"><span lang=en-DE>BluePyMM</span></a></h2> | ||
1366 | |||
1367 | <p class=MsoNormal><span lang=en-DE>When building a network simulation, | ||
1368 | biophysically detailed electrical models (e-models) need to be tested for every | ||
1369 | morphology that is possibly used in the circuit. With current resources, | ||
1370 | e-models are not re-optimised for every morphology in the network. In a process | ||
1371 | called Cell Model Management (MM), we test if an existing e-model matches a | ||
1372 | particular morphology 'well enough'. It takes as input a morphology release, a | ||
1373 | circuit recipe and a set of e-models, then finds all possible (morphology, | ||
1374 | e-model)-combinations (me-combos) based on e-type, m-type, and layer as | ||
1375 | described by the circuit recipe, then calculates the scores for every | ||
1376 | combination. Finally, it writes out the resulting accepted me-combos to a | ||
1377 | database and produces a report with information on the number of matches.</span></p> | ||
1378 | |||
1379 | <h2></h2> | ||
1380 | |||
1381 | <h2><a name="_Toc138932261"><span lang=en-DE>BluePyOpt</span></a></h2> | ||
1382 | |||
1383 | <p class=MsoNormal><span lang=en-DE>BluePyOpt simplifies the task of creating | ||
1384 | and sharing optimisations of neuron model parameters, and the associated techniques and knowledge. | ||
1385 | This is achieved by abstracting the optimisation and evaluation tasks into | ||
1386 | various reusable and flexible discrete elements according to established best | ||
1387 | practices. Further, BluePyOpt provides methods for setting up both small- and | ||
1388 | large-scale optimisations on a variety of platforms, ranging from laptops to | ||
1389 | Linux clusters and cloud-based computer infrastructures.</span></p> | ||
1390 | |||
1391 | <h2></h2> | ||
1392 | |||
1393 | <h2><a name="_Toc138932262"><span lang=en-DE>Brain Cockpit</span></a></h2> | ||
1394 | |||
1395 | <p class=MsoNormal><span lang=en-DE>Brain Cockpit is a web app comprising a | ||
1396 | TypeScript front-end and a Python back-end. It is meant to help explore large | ||
1397 | surface fMRI datasets projected on surface meshes and alignments computed | ||
1398 | between brains, such as those computed with Fused Unbalanced Gromov-Wasserstein | ||
1399 | (fugw) for Python.</span></p> | ||
1400 | |||
1401 | <h2></h2> | ||
1402 | |||
1403 | <h2><a name="_Toc138932263"><span lang=en-DE>BrainScaleS</span></a></h2> | ||
1404 | |||
1405 | <p class=MsoNormal><span lang=en-DE>Emulate spiking neural networks in | ||
1406 | continuous time on the BrainScaleS analog neuromorphic computing system. Models | ||
1407 | and experiments can be described in Python using the PyNN modelling language, | ||
1408 | or in hxtorch, a PyTorch-based machine-learning-friendly API. The platform can | ||
1409 | be used interactively via the EBRAINS JupyterLab service or EBRAINS HPC; | ||
1410 | in addition, the NMPI web service provides batch-style access. The | ||
1412 | modelling APIs employ common data formats for input and output data, e.g., | ||
1413 | neo.</span></p> | ||
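<p class=MsoNormal><span lang=en-DE>A minimal PyNN-style sketch of the kind of
model description mentioned above. The NEST backend is used here only as a
generic stand-in, since the BrainScaleS-specific backend import differs; the
population sizes, spike times and weights are placeholders.</span></p>

<pre>
# Hedged sketch: describe and run a small spiking network with the PyNN API.
import pyNN.nest as sim  # stand-in backend; BrainScaleS provides its own PyNN backend

sim.setup(timestep=0.1)

# A spike source and a small population of conductance-based integrate-and-fire neurons
stim = sim.Population(1, sim.SpikeSourceArray(spike_times=[10.0, 20.0, 30.0]))
neurons = sim.Population(10, sim.IF_cond_exp())

# Connect the stimulus to all neurons with static synapses
sim.Projection(stim, neurons, sim.AllToAllConnector(),
               synapse_type=sim.StaticSynapse(weight=0.05))

neurons.record('spikes')
sim.run(100.0)               # simulate 100 ms
block = neurons.get_data()   # recorded data is returned as a Neo Block
sim.end()
</pre>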
1414 | |||
1415 | <h2></h2> | ||
1416 | |||
1417 | <h2><a name="_Toc138932264"><span lang=en-DE>Brayns</span></a></h2> | ||
1418 | |||
1419 | <p class=MsoNormal><span lang=en-DE>Brayns is a large-scale scientific | ||
1420 | visualization platform based on Intel OSPRAY to perform CPU Ray-tracing and | ||
1421 | uses an extension-plugin architecture. The core provides basic functionalities | ||
1422 | that can be reused and/or extended on plugins, which are independent and can be | ||
1423 | loaded or disabled at start-up. This simplifies the process of adding support | ||
1424 | for new scientific visualization use cases, without compromising the | ||
1425 | reliability of the rest of the software. Brayns includes braynsService, a | ||
1426 | rendering backend which can be accessed over the internet and streams images to | ||
1427 | connected clients. Ready-made plugins include CircuitExplorer, DTI, | ||
1428 | AtlasExplorer, CylindricCamera and MoleculeExplorer.</span></p> | ||
1429 | |||
1430 | <p class=MsoNormal></p> | ||
1431 | |||
1432 | <h2><a name="_Toc138932265"><span lang=en-DE>Brion</span></a></h2> | ||
1433 | |||
1434 | <p class=MsoNormal><span lang=en-DE>Brion is a C++ project for read and write | ||
1435 | access to Blue Brain data structures, including BlueConfig/CircuitConfig, | ||
1436 | Circuit, CompartmentReport, Mesh, Morphology, Synapse and Target files. It also | ||
1437 | offers an interface in Python.</span></p> | ||
1438 | |||
1439 | <h2></h2> | ||
1440 | |||
1441 | <h2><a name="_Toc138932266"><span lang=en-DE>BSB</span></a></h2> | ||
1442 | |||
1443 | <p class=MsoNormal><span lang=en-DE>The BSB reconstructs realistic neural | ||
1444 | circuits by placing and connecting fibres and neurons with detailed | ||
1445 | morphologies or only simplified geometrical features. Configure your model the | ||
1446 | way you need. Interfaces with several simulators (CoreNEURON, Arbor, NEST) | ||
1447 | allow simulation of the reconstructed network and investigation of the | ||
1448 | structure-function-dynamics relationships at different levels of resolution. | ||
1449 | The 'scaffold' design allows easy model reconfiguration, reflecting variants | ||
1450 | across brain regions, animal species and physio-pathological conditions without | ||
1451 | dismantling the basic network structure. The BSB provides effortless parallel | ||
1452 | computing both for the reconstruction and simulation phase.</span></p> | ||
1453 | |||
1454 | <h2></h2> | ||
1455 | |||
1456 | <h2><a name="_Toc138932267"><span lang=en-DE>BSP Service Account</span></a></h2> | ||
1457 | |||
1458 | <p class=MsoNormal><span lang=en-DE>The BSP Service Account is a REST API | ||
1459 | service that allows developers to submit users' jobs on HPC systems and | ||
1460 | retrieve results using the EBRAINS authentication, even if users don't have a | ||
1461 | personal account on the available HPC facilities.</span></p> | ||
1462 | |||
1463 | <h2></h2> | ||
1464 | |||
1465 | <h2><a name="_Toc138932268"><span lang=en-DE>bsp-usecase-wizard</span></a></h2> | ||
1466 | |||
1467 | <p class=MsoNormal><span lang=en-DE>The CLS interactive workflows and use cases | ||
1468 | application guides the users through the resolution of realistic scientific | ||
1469 | problems. They are implemented as either front-end or full stack web | ||
1470 | applications or Python-based Jupyter Notebooks that allow the user to | ||
1471 | interactively build, reconstruct or simulate data-driven brain models and | ||
1472 | perform data analysis and visualisation. Web applications are freely accessible and | ||
1473 | only require authentication to EBRAINS when specific actions are required | ||
1474 | (e.g., submitting a simulation job to an HBP HPC system). Jupyter Notebooks are | ||
1475 | cloned to the lab.ebrains.eu platform and require authentication via an EBRAINS | ||
1476 | account.</span></p> | ||
1477 | |||
1478 | <h2></h2> | ||
1479 | |||
1480 | <h2><a name="_Toc138932269"><span lang=en-DE>CGMD Platform</span></a></h2> | ||
1481 | |||
1482 | <p class=MsoNormal><span lang=en-DE>Recent advances in CGMD simulations have | ||
1483 | allowed longer and larger molecular dynamics simulations of biological | ||
1484 | macromolecules and their interactions. The CGMD platform is dedicated to the | ||
1485 | preparation, running, and analysis of CGMD simulations, and built on a | ||
1486 | completely revisited version of the Martini coarsE gRained MembrAne proteIn | ||
1487 | Dynamics (MERMAID) web server. In its current version, the platform expands the | ||
1488 | existing implementation of the Martini force field for membrane proteins to | ||
1489 | also allow the simulation of soluble proteins using the Martini and SIRAH force | ||
1490 | fields. Moreover, it offers an automated protocol for carrying out the | ||
1491 | backmapping of the coarse-grained description of the system into an atomistic | ||
1492 | one.</span></p> | ||
1493 | |||
1494 | <h2></h2> | ||
1495 | |||
1496 | <h2><a name="_Toc138932270"><span lang=en-DE>CNS-ligands</span></a></h2> | ||
1497 | |||
1498 | <p class=MsoNormal><span lang=en-DE>The project is part of the HBP work on parameter | ||
1499 | generation and mechanistic studies of neuronal cascades using multi-scale | ||
1500 | molecular simulations. CNS conformers are generated using a powerful | ||
1501 | multilevel strategy that combines a low-level (LL) method for sampling the | ||
1502 | conformational minima and high-level (HL) ab initio calculations for estimating | ||
1503 | their relative stability. CNS database presents the results in a graphical user | ||
1504 | interface, displaying small molecule properties, analyses and generated 3D | ||
1505 | conformers. All data produced by the project is available to download.</span></p> | ||
1506 | |||
1507 | <h2></h2> | ||
1508 | |||
1509 | <h2><a name="_Toc138932271"><span lang=en-DE>Cobrawap</span></a></h2> | ||
1510 | |||
1511 | <p class=MsoNormal><span lang=en-DE>Cobrawap is an adaptable and reusable | ||
1512 | software tool to study wave-like activity propagation in the cortex. It allows for | ||
1513 | the integration of heterogeneous data from different measurement techniques and | ||
1514 | simulations through alignment to common wave descriptions. Cobrawap provides an | ||
1515 | extendable collection of processing and analysis methods that can be combined | ||
1516 | and adapted to specific input data and research applications. It enables broad | ||
1517 | and rigorous comparisons of wave characteristics across multiple datasets, | ||
1518 | model calibration and validation applications, and its modular building blocks | ||
1519 | may serve to construct related analysis pipelines.</span></p> | ||
1520 | |||
1521 | <h2></h2> | ||
1522 | |||
1523 | <h2><a name="_Toc138932272"><span lang=en-DE>Collaboratory Bucket service</span></a></h2> | ||
1524 | |||
1525 | <p class=MsoNormal><span lang=en-DE>The Bucket service provides object storage | ||
1526 | to EBRAINS users without them having to request an account on Fenix (the | ||
1527 | EBRAINS infrastructure provider) and storage resources there. This is the | ||
1528 | recommended storage for datasets that are shared by data providers, on the | ||
1529 | condition that these do not contain sensitive personal data. For sharing | ||
1530 | datasets with personal data, users should refer to the Health Data Cloud. The | ||
1531 | Bucket service is better suited for larger files that are usually not edited, | ||
1532 | such as datasets and videos. For Docker images, users should refer to the | ||
1533 | EBRAINS Docker registry. For smaller files and files which are more likely to | ||
1534 | be edited, users should consider the Collaboratory Drive service.</span></p> | ||
1535 | |||
1536 | <h2></h2> | ||
1537 | |||
1538 | <h2><a name="_Toc138932273"><span lang=en-DE>Collaboratory Drive</span></a></h2> | ||
1539 | |||
1540 | <p class=MsoNormal><span lang=en-DE>The Drive service offers users cloud | ||
1541 | storage space for their files in each collab (workspace). The Drive storage is | ||
1542 | mounted in the Collaboratory Lab to provide persistent storage (as opposed to | ||
1543 | the Lab containers which are deleted after a few hours of inactivity). All | ||
1544 | files are under version control. The Drive is intended for smaller files | ||
1545 | (currently limited to 1 GB) that change more often. Users must not save files | ||
1546 | containing personal information in the Drive (i.e. data of living human subjects). | ||
1547 | The Drive is also integrated with the Collaboratory Office service to offer | ||
1548 | easy collaborative editing of Office files online.</span></p> | ||
1549 | |||
1550 | <h2></h2> | ||
1551 | |||
1552 | <h2><a name="_Toc138932274"><span lang=en-DE>Collaboratory IAM</span></a></h2> | ||
1553 | |||
1554 | <p class=MsoNormal><span lang=en-DE>The EBRAINS Collaboratory IAM allows the | ||
1555 | developers of different EBRAINS services to benefit from a single sign-on | ||
1556 | solution. End users will benefit from a seamless experience, whereby they can | ||
1557 | access a specific service and have direct access from it to resources in other | ||
1558 | EBRAINS services without re-authentication. For the developer, it is a good way | ||
1559 | to separate concerns and offload much of the identification and | ||
1560 | authentication to a central service. The EBRAINS IAM is recognised as an | ||
1561 | identity provider at Fenix supercomputing sites. The IAM service also provides | ||
1562 | three ways of managing groups of users: Units, Groups and Teams.</span></p> | ||
1563 | |||
1564 | <h2></h2> | ||
1565 | |||
1566 | <h2><a name="_Toc138932275"><span lang=en-DE>Collaboratory Lab</span></a></h2> | ||
1567 | |||
1568 | <p class=MsoNormal><span lang=en-DE>The Collaboratory Lab provides EBRAINS | ||
1569 | users with a user-friendly programming environment for reproducible science. | ||
1570 | EBRAINS tools are pre-installed for the user. The latest release is selected by | ||
1571 | default, but users can choose to run an older release to reuse an older | ||
1572 | notebook, or try out the very latest features in the weekly experimental | ||
1573 | deployment. Official releases are produced by EBRAINS every few months. End | ||
1574 | users do not need to build and install the tools, and, more importantly, they | ||
1575 | do not need to resolve dependency conflicts among tools as this has been | ||
1576 | handled for them.</span></p> | ||
1577 | |||
1578 | <h2></h2> | ||
1579 | |||
1580 | <h2><a name="_Toc138932276"><span lang=en-DE>Collaboratory Office</span></a></h2> | ||
1581 | |||
1582 | <p class=MsoNormal><span lang=en-DE>With the Office service, EBRAINS users can | ||
1583 | collaboratively edit Office documents (Word, PowerPoint or Excel) with most of | ||
1584 | the key features of the MS Office tools. It uses the open standard formats | ||
1585 | .docx, .pptx and .xlsx so that files can alternately be edited in the | ||
1586 | Collaboratory Office service and in other compatible tools including the MS | ||
1587 | Office suite.</span></p> | ||
1588 | |||
1589 | <h2></h2> | ||
1590 | |||
1591 | <h2><a name="_Toc138932277"><span lang=en-DE>Collaboratory Wiki</span></a></h2> | ||
1592 | |||
1593 | <p class=MsoNormal><span lang=en-DE>The Wiki service offers user-friendly | ||
1594 | wiki functionality for publishing web content. It acts as the central user | ||
1595 | interface and API to access the other Collaboratory services. EBRAINS | ||
1596 | developers can integrate their services as apps which can be instantiated by | ||
1597 | users in their collabs. The Wiki is a good place to create tutorials and | ||
1598 | documentation and it is also the place to publish your work on the internet if | ||
1599 | you choose to do so.</span></p> | ||
1600 | |||
1601 | <h2></h2> | ||
1602 | |||
1603 | <h2><a name="_Toc138932278"><span lang=en-DE>CoreNEURON</span></a></h2> | ||
1604 | |||
1605 | <p class=MsoNormal><span lang=en-DE>In order to adapt NEURON to evolving | ||
1606 | computer architectures, the compute engine of the NEURON simulator was | ||
1607 | extracted and optimised as a library called CoreNEURON. CoreNEURON is a compute | ||
1608 | engine library for the NEURON simulator optimised for both memory usage and | ||
1609 | computational speed on modern CPU/GPU architectures. Some of its key goals are | ||
1610 | to: 1) Efficiently simulate large network models, 2) Support execution on | ||
1611 | accelerators such as GPU, 3) Support optimisations such as vectorisation and | ||
1612 | cache-efficient memory layout.</span></p> | ||
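<p class=MsoNormal><span lang=en-DE>As an illustration, a hedged sketch of how the
CoreNEURON engine is typically switched on from NEURON's Python interface. It
assumes a NEURON build with CoreNEURON support; the model construction itself is
omitted, and the 100 ms run time is a placeholder.</span></p>

<pre>
# Hedged sketch: run an existing NEURON model through the CoreNEURON engine.
from neuron import h, coreneuron

h.load_file('stdrun.hoc')
pc = h.ParallelContext()

# ... build sections, insert mechanisms, add stimuli and recordings here ...

h.cvode.cache_efficient(1)   # memory layout required by CoreNEURON
coreneuron.enable = True     # hand the simulation over to CoreNEURON
coreneuron.gpu = False       # set True to offload to a GPU-enabled build, if available

pc.set_maxstep(10)
h.stdinit()
pc.psolve(100.0)             # simulate 100 ms with the CoreNEURON compute engine
</pre>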
1613 | |||
1614 | <h2></h2> | ||
1615 | |||
1616 | <h2><a name="_Toc138932279"><span lang=en-DE>CxSystem2</span></a></h2> | ||
1617 | |||
1618 | <p class=MsoNormal><span lang=en-DE>CxSystem is a cerebral cortex simulation | ||
1619 | framework, which operates on personal computers. The CxSystem enables easy | ||
1620 | testing and build-up of diverse models at single-cell resolution and is | ||
1621 | implemented on top of the Python-based Brian2 simulator. The CxSystem | ||
1622 | interface comprises two CSV files: one for anatomy and technical details, the | ||
1623 | other for physiological parameters.</span></p> | ||
1624 | |||
1625 | <h2></h2> | ||
1626 | |||
1627 | <h2><a name="_Toc138932280"><span lang=en-DE>DeepSlice</span></a></h2> | ||
1628 | |||
1629 | <p class=MsoNormal><span lang=en-DE>DeepSlice is a deep neural network that | ||
1630 | aligns histological sections of mouse brain to the Allen Mouse Brain Common | ||
1631 | Coordinate Framework, adjusting for anterior-posterior position, angle, | ||
1632 | rotation and scale. At present, DeepSlice only works with tissue cut in the | ||
1633 | coronal plane, although future versions will be compatible with sagittal and | ||
1634 | horizontal sections.</span></p> | ||
1635 | |||
1636 | <h2></h2> | ||
1637 | |||
1638 | <h2><a name="_Toc138932281"><span lang=en-DE>EBRAINS Ethics & Society | ||
1639 | Toolkit</span></a></h2> | ||
1640 | |||
1641 | <p class=MsoNormal><span lang=en-DE>The aim of the toolkit is to offer | ||
1642 | researchers who carry out cross-disciplinary brain research a possibility to | ||
1643 | engage with ethical and societal issues within brain health and brain disease. | ||
1644 | The user is presented with short introductory texts, scenario-based dilemmas, | ||
1645 | animations and quizzes, all tailored to specific areas of ethics and society in | ||
1646 | a setting of brain research. All exercises are reflection-oriented, with an | ||
1647 | interactive approach to inspire users to incorporate these reflections into | ||
1648 | their own research practices. Moreover, it is possible to gain further | ||
1649 | knowledge by utilising the links for relevant publications, teaching modules | ||
1650 | and the EBRAINS Community Space.</span></p> | ||
1651 | |||
1652 | <h2></h2> | ||
1653 | |||
1654 | <h2><a name="_Toc138932282"><span lang=en-DE>EBRAINS Image Service</span></a></h2> | ||
1655 | |||
1656 | <p class=MsoNormal><span lang=en-DE>The Image Service takes large 2D (and 3D) | ||
1657 | images and preprocesses them to generate small 2D tiles (or 3D chunks). | ||
1658 | Applications consuming image data (viewers or other) can then access regions of | ||
1659 | interest by downloading a few tiles rather than the entire large image. Tiles | ||
1660 | are also generated at coarser resolutions to support zooming out of large | ||
1661 | images. The service supports multiple input image formats. The serving of tiles | ||
1662 | to apps is provided by the Collaboratory Bucket (based on OpenStack Swift | ||
1663 | object storage), which provides significantly higher network bandwidth than | ||
1664 | could be provided by any VM.</span></p> | ||
1665 | |||
1666 | <h2></h2> | ||
1667 | |||
1668 | <h2><a name="_Toc138932283"><span lang=en-DE>EBRAINS Knowledge Graph</span></a></h2> | ||
1669 | |||
1670 | <p class=MsoNormal><span lang=en-DE>The EBRAINS Knowledge Graph (KG) is the | ||
1671 | metadata management system of the EBRAINS Data and Knowledge services. It | ||
1672 | provides fundamental services and tools to make neuroscientific data, models | ||
1673 | and related software FAIR. The KG Editor and API (incl. Python SDKs) allow users to | ||
1674 | annotate scientific resources in a semantically correct way. The KG Search | ||
1675 | exposes the research information via an intuitive user interface and makes the | ||
1676 | information publicly available to any user. For advanced users, the KG Query | ||
1677 | Builder and KG Core API provide the necessary means to execute detailed queries | ||
1678 | on the graph database whilst enforcing fine-grained permission control.</span></p> | ||
1679 | |||
1680 | <h2></h2> | ||
1681 | |||
1682 | <h2><a name="_Toc138932284"><span lang=en-DE>EDI Toolkit</span></a></h2> | ||
1683 | |||
1684 | <p class=MsoNormal><span lang=en-DE>The EDI Toolkit supports projects in | ||
1685 | integrating EDI in their research content and as guiding principles for team | ||
1686 | collaboration. It is designed for everyday usage by offering: basic information; | ||
1687 | guiding questions, templates and tools to design responsible research; quick | ||
1688 | checklists and guidance for suitable structures and standard procedures; and measures | ||
1689 | to support EDI-based leadership, fair teams and events.</span></p> | ||
1690 | |||
1691 | <h2></h2> | ||
1692 | |||
1693 | <h2><a name="_Toc138932285"><span lang=en-DE>eFEL</span></a></h2> | ||
1694 | |||
1695 | <p class=MsoNormal><span lang=en-DE>eFEL allows neuroscientists to | ||
1696 | automatically extract features from time series data recorded from neurons | ||
1697 | (both in vitro and in silico). Examples include action potential width and | ||
1698 | amplitude in voltage traces recorded during whole-cell patch clamp experiments. | ||
1699 | Users can provide a set of traces and select which features to calculate. The | ||
1700 | library will then extract the requested features and return the values.</span></p> | ||
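<p class=MsoNormal><span lang=en-DE>For example, a minimal sketch of this
provide-traces-and-select-features workflow. The voltage trace below is synthetic
placeholder data, and feature names may vary slightly between eFEL versions.</span></p>

<pre>
# Hedged sketch: extract two features from a synthetic voltage trace with eFEL.
import numpy as np
import efel

t = np.arange(0, 1000.0, 0.1)       # time in ms
spike = np.zeros_like(t)
spike[2000:2020] = 90.0             # a crude, artificial depolarisation near t = 200 ms
v = -70.0 + spike                   # membrane voltage in mV

trace = {
    'T': t,
    'V': v,
    'stim_start': [100.0],          # ms
    'stim_end': [900.0],            # ms
}

features = efel.getFeatureValues([trace], ['Spikecount', 'peak_voltage'])
print(features[0]['Spikecount'], features[0]['peak_voltage'])
</pre>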
1701 | |||
1702 | <h2></h2> | ||
1703 | |||
1704 | <h2><a name="_Toc138932286"><span lang=en-DE>Electrophysiology Analysis Toolkit</span></a></h2> | ||
1705 | |||
1706 | <p class=MsoNormal><span lang=en-DE>The Electrophysiology Analysis Toolkit | ||
1707 | (Elephant) is a Python library that provides a modular framework for the | ||
1708 | analysis of experimental and simulated neuronal activity data, such as spike | ||
1709 | trains, local field potentials, and intracellular data. Elephant builds on the | ||
1710 | Neo data model to facilitate usability, enable interoperability, and support | ||
1711 | data from dozens of file formats and network simulation tools. Its analysis | ||
1712 | functions are continuously validated against reference implementations and | ||
1713 | reports in the literature. Visualisations of analysis results are made | ||
1714 | available via the Viziphant companion library. Elephant aims to act as a | ||
1715 | platform for sharing analysis methods across the field.</span></p> | ||
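<p class=MsoNormal><span lang=en-DE>A minimal sketch of the Neo-based workflow
described above, using toy spike times and assuming the elephant, neo and
quantities packages are installed.</span></p>

<pre>
# Hedged sketch: basic spike-train statistics with Elephant on Neo data objects.
import quantities as pq
import neo
from elephant.statistics import mean_firing_rate, isi

spiketrain = neo.SpikeTrain([0.1, 0.5, 0.9, 1.4, 2.0] * pq.s,
                            t_start=0.0 * pq.s, t_stop=3.0 * pq.s)

print(mean_firing_rate(spiketrain))   # average firing rate over [t_start, t_stop]
print(isi(spiketrain))                # inter-spike intervals
</pre>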
1716 | |||
1717 | <h2></h2> | ||
1718 | |||
1719 | <h2><a name="_Toc138932287"><span lang=en-DE>FAConstructor</span></a></h2> | ||
1720 | |||
1721 | <p class=MsoNormal><span lang=en-DE>FAConstructor allows a simple and effective | ||
1722 | creation of fibre models based on mathematical functions or the manual input of | ||
1723 | data points. Models are visualised during creation and can be interacted with | ||
1724 | by translating them in 3D space.</span></p> | ||
1725 | |||
1726 | <h2></h2> | ||
1727 | |||
1728 | <h2><a name="_Toc138932288"><span lang=en-DE>fairgraph</span></a></h2> | ||
1729 | |||
1730 | <p class=MsoNormal><span lang=en-DE>fairgraph is a Python library for working | ||
1731 | with metadata in the EBRAINS Knowledge Graph (KG), with a particular focus on | ||
1732 | data reuse, although it is also useful in registering and curating metadata. | ||
1733 | The library represents metadata nodes (also known as openMINDS instances) from | ||
1734 | the KG as Python objects. fairgraph supports querying the KG, following links | ||
1735 | in the graph, downloading data and metadata, and creating new nodes in the KG. | ||
1736 | It builds on openMINDS and on the KG Core Python library.</span></p> | ||
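<p class=MsoNormal><span lang=en-DE>An illustrative sketch of querying the KG with
fairgraph. An EBRAINS account is required; the token string below is a placeholder,
and the chosen metadata type and result count are arbitrary examples.</span></p>

<pre>
# Hedged sketch: list a few dataset versions from the EBRAINS Knowledge Graph.
from fairgraph import KGClient
import fairgraph.openminds.core as omcore

client = KGClient(token="PASTE-YOUR-EBRAINS-TOKEN-HERE")   # placeholder token

datasets = omcore.DatasetVersion.list(client, size=5)      # fetch up to 5 metadata nodes
for dataset in datasets:
    print(dataset)   # each result is a Python object representing an openMINDS instance
</pre>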
1737 | |||
1738 | <h2></h2> | ||
1739 | |||
1740 | <h2><a name="_Toc138932289"><span lang=en-DE>Fast sampling with neuromorphic | ||
1741 | hardware</span></a></h2> | ||
1742 | |||
1743 | <p class=MsoNormal><span lang=en-DE>Compared to conventional neural networks, | ||
1744 | physical model devices offer a fast, efficient, and inherently parallel | ||
1745 | substrate capable of related forms of Markov chain Monte Carlo sampling. This | ||
1746 | software suite enables the use of a neuromorphic chip to replicate the | ||
1747 | properties of quantum systems through spike-based sampling.</span></p> | ||
1748 | |||
1749 | <h2></h2> | ||
1750 | |||
1751 | <h2><a name="_Toc138932290"><span lang=en-DE>fastPLI</span></a></h2> | ||
1752 | |||
1753 | <p class=MsoNormal><span lang=en-DE>fastPLI is an open-source toolbox based on | ||
1754 | Python and C++ for modelling myelinated axons, i.e., nerve fibres, and | ||
1755 | simulating the results of measurement of fibre orientations with a polarisation | ||
1756 | microscope using 3D-PLI. The fastPLI package includes the following modules: | ||
1757 | nerve fibre modelling, simulation, and analysis. All computationally intensive | ||
1758 | calculations are optimised either with Numba on the Python side or with | ||
1759 | multithreading C++ algorithms, which can be accessed via pybind11 inside the | ||
1760 | Python package. Additionally, the simulation module supports the Message | ||
1761 | Passing Interface (MPI) to facilitate the simulation of very large volumes on | ||
1762 | multiple computer nodes.</span></p> | ||
1763 | |||
1764 | <h2></h2> | ||
1765 | |||
1766 | <h2><a name="_Toc138932291"><span lang=en-DE>Feed-forward LFP-MEG estimator | ||
1767 | from mean-field models</span></a></h2> | ||
1768 | |||
1769 | <p class=MsoNormal><span lang=en-DE>This tool was developed to calculate the | ||
1770 | local field potentials (LFP) and magnetoencephalogram (MEG) signals generated | ||
1771 | by a population of neurons described by a mean-field model. The calculation of | ||
1772 | LFP is done via a kernel method based on unitary LFPs (the LFP generated by a | ||
1773 | single axon), which was recently introduced for spiking-network simulations and | ||
1774 | is adapted here for mean-field models. The calculation of the magnetic field | ||
1775 | is based on current-dipole and volume-conductor models, where the secondary | ||
1776 | currents (due to the conducting extracellular medium) are estimated using the | ||
1777 | LFP calculated via the kernel method and where the effects of | ||
1778 | medium inhomogeneities are incorporated.</span></p> | ||
1779 | |||
1780 | <h2></h2> | ||
1781 | |||
1782 | <h2><a name="_Toc138932292"><span lang=en-DE>FIL</span></a></h2> | ||
1783 | |||
1784 | <p class=MsoNormal><span lang=en-DE>FIL is a scheme for training and applying | ||
1785 | an image-labelling framework. Some functionality from SPM12 is required for handling | ||
1786 | images. After training, labelling a new image is relatively fast, because | ||
1787 | optimising the latent variables can be formulated as a scheme similar to a recurrent | ||
1788 | Residual Network (ResNet).</span></p> | ||
1789 | |||
1790 | <h2></h2> | ||
1791 | |||
1792 | <h2><a name="_Toc138932293"><span lang=en-DE>FMRALIGN</span></a></h2> | ||
1793 | |||
1794 | <p class=MsoNormal><span lang=en-DE>FMRALIGN is meant to be a light-weight | ||
1795 | Python library that handles functional alignment tasks (also known as | ||
1796 | hyperalignment). It is compatible with and inspired by Nilearn. Alternative | ||
1797 | implementations of these ideas can be found in the pymvpa or brainiak packages.</span></p> | ||
1798 | |||
1799 | <h2></h2> | ||
1800 | |||
1801 | <h2><a name="_Toc138932294"><span lang=en-DE>Foa3D</span></a></h2> | ||
1802 | |||
1803 | <p class=MsoNormal><span lang=en-DE>Foa3D is a tool for multiscale nerve fibre | ||
1804 | enhancement and orientation analysis in high-resolution volume images acquired | ||
1805 | by two-photon scanning or light-sheet fluorescence microscopy, exploiting the | ||
1806 | brain tissue autofluorescence or exogenous myelin stains. Its image processing | ||
1807 | pipeline is built around a 3D Frangi filter that enables the enhancement of | ||
1808 | fibre structures of varying diameters, and the generation of accurate 3D | ||
1809 | orientation maps in both grey and white matter. Foa3D features the computation | ||
1810 | of multiscale orientation distribution functions that facilitate the comparison | ||
1811 | with orientations assessed via 3D-PLI or 3D PS-OCT, and the validation of | ||
1812 | mesoscale dMRI-based connectivity information.</span></p> | ||
1813 | |||
1814 | <h2></h2> | ||
1815 | |||
1816 | <h2><a name="_Toc138932295"><span lang=en-DE>Frites</span></a></h2> | ||
1817 | |||
1818 | <p class=MsoNormal><span lang=en-DE>Frites allows the characterisation of | ||
1819 | task-related cognitive brain networks. Neural correlates of cognitive functions | ||
1820 | can be extracted both at the single brain area (or channel) and network level. | ||
1821 | The toolbox includes time-resolved directed (e.g., Granger causality) and | ||
1822 | undirected (e.g., Mutual Information) functional connectivity metrics. In | ||
1823 | addition, it includes cluster-based and permutation-based statistical methods | ||
1824 | for single-subject and group-level inference.</span></p> | ||
1825 | |||
1826 | <h2></h2> | ||
1827 | |||
1828 | <h2><a name="_Toc138932296"><span lang=en-DE>gridspeccer</span></a></h2> | ||
1829 | |||
1830 | <p class=MsoNormal><span lang=en-DE>gridspeccer is a plotting tool that makes figures with many | ||
1831 | subfigures easier to lay out, especially for publications. After installation, gridspeccer | ||
1832 | can be used from the command line to create plots.</span></p> | ||
1833 | |||
1834 | <h2></h2> | ||
1835 | |||
1836 | <h2><a name="_Toc138932297"><span lang=en-DE>Hal-Cgp</span></a></h2> | ||
1837 | |||
1838 | <p class=MsoNormal><span lang=en-DE>Hal-Cgp is an extensible pure Python | ||
1839 | library implementing Cartesian genetic programming (CGP) to represent, mutate and evaluate populations of | ||
1840 | individuals encoding symbolic expressions targeting applications with | ||
1841 | computationally expensive fitness evaluations. It supports the translation from | ||
1842 | a CGP genotype, a two-dimensional Cartesian graph, into the corresponding | ||
1843 | phenotype, a computational graph implementing a particular mathematical expression. | ||
1844 | These computational graphs can be exported as pure Python functions, in a | ||
1845 | NumPy-compatible format, SymPy expressions or PyTorch modules. The library | ||
1846 | implements a mu + lambda evolution strategy to evolve a population of | ||
1847 | individuals to optimise an objective function.</span></p> | ||
1848 | |||
1849 | <h2></h2> | ||
1850 | |||
1851 | <h2><a name="_Toc138932298"><span lang=en-DE>Health Data Cloud</span></a></h2> | ||
1852 | |||
1853 | <p class=MsoNormal><span lang=en-DE>The Health Data Cloud (HDC) provides | ||
1854 | EBRAINS services for sensitive data as a federated research data ecosystem that | ||
1855 | enables scientists across Europe and beyond to collect, process and share | ||
1856 | sensitive data in compliance with the EU General Data Protection Regulation | ||
1857 | (GDPR). The HDC is a federation of interoperable nodes. Nodes share a common | ||
1858 | system architecture based on the Charité Virtual Research Environment (VRE), | ||
1859 | enabling research consortia to manage and process data, and making data | ||
1860 | discoverable and sharable via the EBRAINS Knowledge Graph.</span></p> | ||
1861 | |||
1862 | <p class=MsoNormal></p> | ||
1863 | |||
1864 | <h2><a name="_Toc138932299"><span lang=en-DE>Hodgkin-Huxley Neuron | ||
1865 | Builder</span></a></h2> | ||
1867 | |||
1868 | <p class=MsoNormal><span lang=en-DE>The Hodgkin-Huxley Neuron Builder is a | ||
1869 | web application that allows users to interactively go through an entire NEURON | ||
1870 | model building pipeline: 1. building individual biophysically detailed cell models; 2. model | ||
1871 | parameter optimisation via HPC systems; 3. in silico experiments using the | ||
1872 | optimised model cell.</span></p> | ||
1873 | |||
1874 | <h2></h2> | ||
1875 | |||
1876 | <h2><a name="_Toc138932300"><span lang=en-DE>HPC Job Proxy</span></a></h2> | ||
1877 | |||
1878 | <p class=MsoNormal><span lang=en-DE>The HPC Job Proxy provides a simplified way | ||
1879 | for EBRAINS service providers to launch jobs on Fenix supercomputers on behalf | ||
1880 | of EBRAINS end users. The proxy offers a wrapper over the UNICORE service, which | ||
1881 | adds logging, access to stdout/stderr/status, verification of user quota, and | ||
1882 | updating of user quota at the end of the job.</span></p> | ||
1883 | |||
1884 | <h2></h2> | ||
1885 | |||
1886 | <h2><a name="_Toc138932301"><span lang=en-DE>HPC Status Monitor</span></a></h2> | ||
1887 | |||
1888 | <p class=MsoNormal><span lang=en-DE>The HPC Status Monitor allows a real-time | ||
1889 | check of the availability status of the HPC Systems accessible from HBP tools | ||
1890 | and services and provides an instant snapshot of the resource quotas available | ||
1891 | to individual users on each system.</span></p> | ||
1892 | |||
1893 | <h2></h2> | ||
1894 | |||
1895 | <h2><a name="_Toc138932302"><span lang=en-DE>Human Intracerebral EEG Platform</span></a></h2> | ||
1896 | |||
1897 | <p class=MsoNormal><span lang=en-DE>The HIP is an open-source platform designed | ||
1898 | for collecting, managing, analysing and sharing multi-scale iEEG data at an | ||
1899 | international level. Its mission is to assist clinicians and researchers in | ||
1900 | improving research capabilities by simplifying iEEG data analysis and | ||
1901 | interpretation. The HIP integrates different software, modules and services | ||
1902 | necessary for investigating spatio-temporal dynamics of neural processes in a | ||
1903 | secure and optimised fashion. The interface is browser-based and allows | ||
1904 | selecting sets of tools according to specific research needs.</span></p> | ||
1905 | |||
1906 | <h2></h2> | ||
1907 | |||
1908 | <h2><a name="_Toc138932303"><span lang=en-DE>Hybrid MM/CG Webserver</span></a></h2> | ||
1909 | |||
1910 | <p class=MsoNormal><span lang=en-DE>MM/CG simulations help predict ligand poses | ||
1911 | in hGPCRs for pharmacological applications. This approach allows for the | ||
1912 | description of the ligand, the binding cavity and the surrounding water | ||
1913 | molecules at atomistic resolution, while coarse-graining the rest of the | ||
1914 | receptor. The webserver automates and speeds up the simulation set-up of | ||
1915 | hGPCR/ligand complexes. It also allows for equilibration of the systems, either | ||
1916 | fully automatically or interactively. The results are visualised online, | ||
1917 | helping the user identify possible issues and modify the set-up parameters. | ||
1918 | This framework allows for the automatic preparation and running of hybrid | ||
1919 | molecular dynamics simulations of molecules and their cognate receptors.</span></p> | ||
1920 | |||
1921 | <h2></h2> | ||
1922 | |||
1923 | <h2><a name="_Toc138932304"><span lang=en-DE>Insite</span></a></h2> | ||
1924 | |||
1925 | <p class=MsoNormal><span lang=en-DE>Insite enables users to access data via the | ||
1926 | in transit paradigm for NEST, TVB and Arbor simulations. Compared to the | ||
1927 | traditional approach of offline processing, in transit paradigms allow | ||
1928 | data to be accessed while the simulation runs. This is especially useful for | ||
1929 | simulations that produce large amounts of data and are running for a long time. | ||
1930 | In transit access lets the user retrieve only parts of the data and removes the | ||
1931 | need to store all of it. It also gives the user early insights into the data | ||
1932 | even before the simulation finishes. Insite provides an easy-to-use and | ||
1933 | easy-to-integrate architecture to enable in transit features in other tools.</span></p> | ||
1934 | |||
1935 | <h2></h2> | ||
1936 | |||
1937 | <h2><a name="_Toc138932305"><span lang=en-DE>Interactive Brain Atlas Viewer</span></a></h2> | ||
1938 | |||
1939 | <p class=MsoNormal><span lang=en-DE>The Interactive Brain Atlas Viewer provides | ||
1940 | various kinds of interactive visualisations for multi-modal brain and head | ||
1941 | image data: different parcellations, degrees of transparency and overlays. The | ||
1942 | Viewer provides the following functions and supports data from the following | ||
1943 | sources: EEG, white matter tracts, MRI and PET 3D volumes, 2D slices, | ||
1944 | intracranial electrodes, brain activity, multiscale brain network models, | ||
1945 | supplementary information for brain regions and functional brain networks in | ||
1946 | multiple languages. It comes as a web app, mobile app and desktop app.</span></p> | ||
1947 | |||
1948 | <h2></h2> | ||
1949 | |||
1950 | <h2><a name="_Toc138932306"><span lang=en-DE>JuGEx</span></a></h2> | ||
1951 | |||
1952 | <p class=MsoNormal><span lang=en-DE>Decoding the chain from genes to cognition | ||
1953 | requires detailed insights into how areas with specific gene activities and | ||
1954 | microanatomical architectures contribute to brain function and dysfunction. The | ||
1955 | Allen Human Brain Atlas contains regional gene expression data, while the | ||
1956 | Julich Brain Atlas, which can be accessed via siibra, offers 3D | ||
1957 | cytoarchitectonic maps reflecting the interindividual variability. JuGEx offers | ||
1958 | an integrated framework that combines the analytical benefits of both | ||
1959 | repositories towards a multilevel brain atlas of adult humans. JuGEx is a new | ||
1960 | method for integrating tissue transcriptome and cytoarchitectonic segregation.</span></p> | ||
1961 | |||
1962 | <h2></h2> | ||
1963 | |||
1964 | <h2><a name="_Toc138932307"><span lang=en-DE>KnowledgeSpace</span></a></h2> | ||
1965 | |||
1966 | <p class=MsoNormal><span lang=en-DE>KnowledgeSpace (KS) is a globally-used, | ||
1967 | data-driven encyclopaedia and search engine for the neuroscience community. As | ||
1968 | an encyclopaedia, KS provides curated definitions of brain research concepts | ||
1969 | found in different neuroscience community ontologies, Wikipedia and | ||
1970 | dictionaries. The dataset discovery in KS makes research datasets across many | ||
1971 | large-scale brain initiatives universally accessible and useful. It also | ||
1972 | promotes FAIR data principles that will help data publishers to follow best | ||
1973 | practices for data storage and publication. As more and more data publishers | ||
1974 | follow data standards like OpenMINDS or DATS, the quality of data discovery | ||
1975 | through KS will improve. The related publications are also curated from PubMed | ||
1976 | and linked to the concepts in KS to provide an improved search capability.</span></p> | ||
1977 | |||
1978 | <h2></h2> | ||
1979 | |||
1980 | <h2><a name="_Toc138932308"><span lang=en-DE>L2L</span></a></h2> | ||
1981 | |||
1982 | <p class=MsoNormal><span lang=en-DE>L2L is an easy-to-use and flexible | ||
1983 | framework to perform parameter and hyper-parameter space exploration of | ||
1984 | mathematical models on HPC infrastructure. L2L is an implementation of the | ||
1985 | learning-to-learn concept written in Python. This open-source software allows | ||
1986 | several instances of an optimisation target to be executed with different | ||
1987 | parameters in a massively parallel fashion on HPC. L2L provides a set of | ||
1988 | built-in optimiser algorithms, which make adaptive and efficient exploration of | ||
1989 | parameter spaces possible. Different from other optimisation toolboxes, L2L | ||
1990 | provides maximum flexibility for the way the optimisation target can be | ||
1991 | executed.</span></p> | ||
1992 | |||
1993 | <h2></h2> | ||
1994 | |||
1995 | <h2><a name="_Toc138932309"><span lang=en-DE>Leveltlab/SpectralSegmentation</span></a></h2> | ||
1996 | |||
1997 | <p class=MsoNormal><span lang=en-DE>SpecSeg is a toolbox that segments neurons | ||
1998 | and neurites in chronic calcium imaging datasets based on low-frequency | ||
1999 | cross-spectral power. The pipeline includes a graphical user interface to edit | ||
2000 | the automatically extracted ROIs, to add new ones or delete ROIs by further | ||
2001 | constraining their properties.</span></p> | ||
2002 | |||
2003 | <h2></h2> | ||
2004 | |||
2005 | <h2><a name="_Toc138932310"><span lang=en-DE>LFPy</span></a></h2> | ||
2006 | |||
2007 | <p class=MsoNormal><span lang=en-DE>LFPy is an open-source Python module linking | ||
2008 | simulated neural activity with measurable brain signals. This is done by | ||
2009 | enabling calculation of brain signals from neural activity simulated with | ||
2010 | multi-compartment neuron models (single cells or networks). LFPy can be used to | ||
2011 | simulate brain signals like extracellular action potentials, local field | ||
2012 | potentials (LFP), and in vitro MEA recordings, as well as ECoG, EEG, and MEG | ||
2013 | signals. LFPy is well-integrated with the NEURON simulator and can, through | ||
2014 | LFPykit, also be used with other simulators like Arbor. Through the recently | ||
2015 | developed extensions hybridLFPy and LFPykernels, LFPy can also be used to | ||
2016 | calculate brain signals directly from point-neuron network models or | ||
2017 | population-based models.</span></p> | ||
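<p class=MsoNormal><span lang=en-DE>A minimal sketch of typical LFPy usage; the morphology path is hypothetical, parameter values are illustrative, and some attribute names (e.g. the probes interface) may differ between LFPy versions:</span></p>

<pre>
import numpy as np
import LFPy

# Load a NEURON-compatible morphology (hypothetical path) into an LFPy Cell.
cell = LFPy.Cell(morphology='morphologies/example_cell.hoc', tstop=100.)

# Attach one excitatory synapse near the soma and give it two spike times.
syn = LFPy.Synapse(cell, idx=cell.get_closest_idx(x=0., y=0., z=0.),
                   syntype='ExpSyn', weight=0.002, record_current=True)
syn.set_spike_times(np.array([20., 60.]))

# Extracellular recording point 50 um from the soma; sigma is the conductivity (S/m).
electrode = LFPy.RecExtElectrode(cell, x=np.array([50.]), y=np.zeros(1),
                                 z=np.zeros(1), sigma=0.3)

cell.simulate(probes=[electrode], rec_vmem=True)
print(electrode.data.shape)   # (n_contacts, n_timesteps) of extracellular potential
</pre>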
2018 | |||
2019 | <h2></h2> | ||
2020 | |||
2021 | <h2><a name="_Toc138932311"><span lang=en-DE>libsonata</span></a></h2> | ||
2022 | |||
2023 | <p class=MsoNormal><span lang=en-DE>libsonata allows circuit and simulation | ||
2024 | config loading, node set materialisation, and access to node and edge | ||
2025 | populations in an efficient manner. It is generally a read-only library, but | ||
2026 | support for writing edge indices has been added.</span></p> | ||
2027 | |||
2028 | <h2></h2> | ||
2029 | |||
2030 | <h2><a name="_Toc138932312"><span lang=en-DE>Live Papers</span></a></h2> | ||
2031 | |||
2032 | <p class=MsoNormal><span lang=en-DE>EBRAINS Live Papers are structured and | ||
2033 | interactive documents that complement published scientific articles. Live | ||
2034 | Papers feature integrated tools and services that allow users to download, | ||
2035 | visualise or simulate data, models and results presented in the corresponding | ||
2036 | publications. With Live Papers you can build interactive documents to showcase your data and the | ||
2037 | simulation or data analysis code used in your research; easily link to | ||
2038 | resources in community databases such as EBRAINS, NeuroMorpho.org, ModelDB, and the | ||
2039 | Allen Brain Atlas; embed interactive visualisations of electrophysiology | ||
2040 | data and neuronal reconstructions; launch EBRAINS simulation tools to explore | ||
2041 | single neuron models in your browser; and share live papers pre-publication with | ||
2042 | anonymous reviewers during peer review of your manuscript. You can also explore already | ||
2043 | published live papers, or develop your own live paper with the authoring tool.</span></p> | ||
2044 | |||
2045 | <h2></h2> | ||
2046 | |||
2047 | <h2><a name="_Toc138932313"><span lang=en-DE>Livre</span></a></h2> | ||
2048 | |||
2049 | <p class=MsoNormal><span lang=en-DE>Livre is an out-of-core, multi-node, | ||
2050 | multi-GPU, OpenGL volume rendering engine to visualise large volumetric | ||
2051 | datasets. It provides the following major features to facilitate rendering of | ||
2052 | large volumetric datasets: visualisation of pre-processed UVF-format volume | ||
2053 | datasets; real-time voxelisation of different data sources (surface meshes, BBP | ||
2054 | morphologies, local field potentials, etc.) through the use of plugins; and | ||
2055 | multi-node, multi-GPU rendering (sort-first rendering only).</span></p> | ||
2056 | |||
2057 | <h2></h2> | ||
2058 | |||
2059 | <h2><a name="_Toc138932314"><span lang=en-DE>LocaliZoom</span></a></h2> | ||
2060 | |||
2061 | <p class=MsoNormal><span lang=en-DE>LocaliZoom is a pan-and-zoom viewer that | ||
2062 | displays high-resolution image series with overlaid atlas delineations. It has | ||
2063 | three operating modes: display series with atlas overlay, supporting both | ||
2064 | linear and nonlinear alignments (created with QuickNII or VisuAlign); create or | ||
2065 | edit nonlinear alignments; and create markup that can be | ||
2066 | exported as MeshView point clouds or to Excel for further numerical analysis.</span></p> | ||
2068 | |||
2069 | <h2></h2> | ||
2070 | |||
2071 | <h2><a name="_Toc138932315"><span lang=en-DE>MD-IFP</span></a></h2> | ||
2072 | |||
2073 | <p class=MsoNormal><span lang=en-DE>MD-IFP is a Python workflow for the | ||
2074 | generation and analysis of protein-ligand interaction fingerprints from | ||
2075 | molecular dynamics trajectories. If used for the analysis of Random | ||
2076 | Acceleration Molecular Dynamics (RAMD) trajectories, it can help to investigate | ||
2077 | dissociation mechanisms by characterising transition states as well as the | ||
2078 | determinants and hot-spots for dissociation. As such, the combined use of | ||
2079 | RAMD and MD-IFP may assist the early stages of drug discovery campaigns for the | ||
2080 | design of new molecules or ligand optimisation.</span></p> | ||
2081 | |||
2082 | <h2></h2> | ||
2083 | |||
2084 | <h2><a name="_Toc138932316"><span lang=en-DE>MEDUSA</span></a></h2> | ||
2085 | |||
2086 | <p class=MsoNormal><span lang=en-DE>MEDUSA uses a spherical meshing technique that | ||
2087 | decomposes each microstructural item into a set of overlapping spheres, making | ||
2088 | phantom construction very fast while reliably avoiding collisions | ||
2089 | between items in the scene. This novel method is applied to the construction of | ||
2090 | human brain white matter microstructural components, namely axonal fibers, | ||
2091 | oligodendrocytes and astrocytes. The algorithm reaches high values of packing | ||
2092 | density and angular dispersion for the axonal fibers, even in the case of | ||
2093 | multiple white matter fiber populations, and enables the construction of complex | ||
2094 | biomimicking geometries including myelinated axons, beaded axons and glial | ||
2095 | cells.</span></p> | ||
2096 | |||
2097 | <h2></h2> | ||
2098 | |||
2099 | <h2><a name="_Toc138932317"><span lang=en-DE>MeshView</span></a></h2> | ||
2100 | |||
2101 | <p class=MsoNormal><span lang=en-DE>MeshView is a web application for real-time | ||
2102 | 3D display of surface mesh data representing structural parcellations from | ||
2103 | volumetric atlases, such as the Waxholm Space atlas of the Sprague Dawley rat | ||
2104 | brain. Key features: orbiting view with toggleable opaque/transparent/hidden | ||
2105 | parcellation meshes, rendering user-defined cut surface as if meshes were solid | ||
2106 | objects, rendering point-clouds (simple type-in, or loaded from JSON). The | ||
2107 | coordinate system is compatible with QuickNII.</span></p> | ||
2108 | |||
2109 | <h2></h2> | ||
2110 | |||
2111 | <h2><a name="_Toc138932318"><span lang=en-DE>MIP</span></a></h2> | ||
2112 | |||
2113 | <p class=MsoNormal><span lang=en-DE>MIP is an open-source platform enabling | ||
2114 | federated data analysis in a secure environment for centres involved in | ||
2115 | collaborative initiatives. It allows users to initiate or join disease-oriented | ||
2116 | federations with the aim of analysing large-scale distributed clinical | ||
2117 | datasets. For each federation, users can create specific data models based on | ||
2118 | well-accepted common data elements, approved by all participating centres. MIP | ||
2119 | experts assist in creating the data models and facilitate coordination and | ||
2120 | communication among centres. They provide advice and support for data curation, | ||
2121 | harmonisation, and anonymisation, as well as data governance, especially with | ||
2122 | regards to Data Sharing Agreements and general ethical considerations.</span></p> | ||
2123 | |||
2124 | <h2></h2> | ||
2125 | |||
2126 | <h2><a name="_Toc138932319"><span lang=en-DE>Model Validation Service</span></a></h2> | ||
2127 | |||
2128 | <p class=MsoNormal><span lang=en-DE>The HBP/EBRAINS Model Validation Service is | ||
2129 | a set of tools for performing and tracking validation of models with respect to | ||
2130 | experimental data. It consists of a web API, a GUI client (the Model Catalog | ||
2131 | app) and a Python client. The service enables users to store, query, view and | ||
2132 | download: (i) model descriptions/scripts, (ii) validation test definitions and | ||
2133 | (iii) validation results. In a typical workflow, users will find models and | ||
2134 | validation tests by searching the Model Catalog (or upload their own), run the | ||
2135 | tests using the Python client in a Jupyter notebook, with simulations running | ||
2136 | locally or on HPC, and then upload the results.</span></p> | ||
2137 | |||
2138 | <h2></h2> | ||
2139 | |||
2140 | <h2><a name="_Toc138932320"><span lang=en-DE>Model Validation Test Suites</span></a></h2> | ||
2141 | |||
2142 | <p class=MsoNormal><span lang=en-DE>As part of the HBP/EBRAINS model validation | ||
2143 | framework, we provide a Python Software Development Kit (SDK) for model | ||
2144 | validation, which provides: (i) validation test definitions and (ii) interface | ||
2145 | definitions intended to decouple model validation from the details of model | ||
2146 | implementation. This more formal approach to model validation aims to make it | ||
2147 | quicker and easier to compare models, to provide validation test suites for | ||
2148 | models and to develop new validations of existing models. The SDK consists of a | ||
2149 | collection of Python packages all using the sciunit framework: HippoUnit, | ||
2150 | MorphoUnit, NetworkUnit, BasalUnit, CerebUnit, eFELUnit, HippoNetworkUnit.</span></p> | ||
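<p class=MsoNormal><span lang=en-DE>The test suites are built on the sciunit framework. The following self-contained sketch (a toy capability, model and test, not taken from any of the listed packages) illustrates the general pattern these suites follow:</span></p>

<pre>
import sciunit
from sciunit.scores import ZScore

class ProducesRestingVm(sciunit.Capability):
    """Capability: the model can report a resting membrane potential (in mV)."""
    def get_resting_vm(self):
        raise NotImplementedError()

class ToyNeuronModel(sciunit.Model, ProducesRestingVm):
    """A stand-in model that simply stores a resting potential."""
    def __init__(self, vm_rest, name=None):
        self.vm_rest = vm_rest
        super().__init__(name=name)
    def get_resting_vm(self):
        return self.vm_rest

class RestingVmTest(sciunit.Test):
    """Compare a model's resting potential against an experimental mean/std."""
    required_capabilities = (ProducesRestingVm,)
    score_type = ZScore
    def generate_prediction(self, model):
        return model.get_resting_vm()
    def compute_score(self, observation, prediction):
        return ZScore.compute(observation, prediction)

observation = {"mean": -70.0, "std": 2.0}          # illustrative experimental data
score = RestingVmTest(observation, name="Resting Vm").judge(ToyNeuronModel(-68.0))
print(score)                                        # here a Z score of 1.0
</pre>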
2151 | |||
2152 | <h2></h2> | ||
2153 | |||
2154 | <h2><a name="_Toc138932321"><span lang=en-DE>MoDEL-CNS</span></a></h2> | ||
2155 | |||
2156 | <p class=MsoNormal><span lang=en-DE>MoDEL-CNS is a database and server platform | ||
2157 | designed to provide web access to atomistic MD trajectories for relevant signal | ||
2158 | transduction proteins. The project is part of the service for providing | ||
2159 | molecular simulation-based predictions for systems neurobiology of the HBP. | ||
2160 | MoDEL-CNS expands the MD Extended Library database of atomistic MD trajectories | ||
2161 | with proteins involved in CNS processes, including membrane proteins. MoDEL-CNS | ||
2162 | web server interface presents the resulting trajectories, analyses and protein | ||
2163 | properties. All data produced are available to download.</span></p> | ||
2164 | |||
2165 | <h2></h2> | ||
2166 | |||
2167 | <h2><a name="_Toc138932322"><span lang=en-DE>Modular Science</span></a></h2> | ||
2168 | |||
2169 | <p class=MsoNormal><span lang=en-DE>Modular Science is a middleware that | ||
2170 | provides robust deployment of complex multi-application workflows. It contains | ||
2171 | protocols and interfaces for multi-scale co-simulation workloads on | ||
2172 | high-performance computers and local hardware. It allows for synchronisation | ||
2173 | and coordination of individual components and contains dedicated and | ||
2174 | parallelised modules for data transformations between scales. Modular Science | ||
2175 | offers insight into both the overall system and the individual subsystems, allowing users to | ||
2176 | steer the execution and to monitor resource usage, system health and status | ||
2177 | with little performance overhead. Modular Science comes with a number of | ||
2178 | neuroscience co-simulation use cases including NEST-TVB, NEST-Arbor, LFPy and neurorobotics.</span></p> | ||
2179 | |||
2180 | <h2></h2> | ||
2181 | |||
2182 | <h2><a name="_Toc138932323"><span lang=en-DE>Monsteer</span></a></h2> | ||
2183 | |||
2184 | <p class=MsoNormal><span lang=en-DE>Monsteer is a library for interactive | ||
2185 | supercomputing in the neuroscience domain. It facilitates the coupling of | ||
2186 | running simulations (currently NEST) with interactive visualization and | ||
2187 | analysis applications. Monsteer supports streaming of simulation data to | ||
2188 | clients (currently limited to spikes) as well as control of the simulator from | ||
2189 | the clients (also known as computational steering). Monsteer's main components | ||
2190 | are a C++ library, a MUSIC-based application and Python helpers.</span></p> | ||
2191 | |||
2192 | <h2></h2> | ||
2193 | |||
2194 | <h2><a name="_Toc138932324"><span lang=en-DE>MorphIO</span></a></h2> | ||
2195 | |||
2196 | <p class=MsoNormal><span lang=en-DE>MorphIO is a library for reading and | ||
2197 | writing neuron morphology files. It supports the following formats: SWC, ASC | ||
2198 | (also known as neurolucida), H5. There are two APIs: mutable, for creating or | ||
2199 | editing morphologies, and immutable, for read-only operations. Both are | ||
2200 | represented in C++ and Python. Extended formats include glia, mitochondria and | ||
2201 | endoplasmic reticulum.</span></p> | ||
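<p class=MsoNormal><span lang=en-DE>A minimal sketch of the two MorphIO APIs (the file names are hypothetical):</span></p>

<pre>
import morphio
from morphio.mut import Morphology as MutableMorphology

# Immutable (read-only) API: load an SWC/ASC/H5 file and inspect it.
morph = morphio.Morphology("example_cell.swc")
print(len(morph.points), len(morph.root_sections))

# Mutable API: load the same file, then write it back out in another format.
editable = MutableMorphology("example_cell.swc")
editable.write("example_cell.h5")
</pre>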
2202 | |||
2203 | <h2></h2> | ||
2204 | |||
2205 | <h2><a name="_Toc138932325"><span lang=en-DE>Morphology alignment tool</span></a></h2> | ||
2206 | |||
2207 | <p class=MsoNormal><span lang=en-DE>Starting with serial sections of a brain in | ||
2208 | which a complete single morphology has been labelled, the pieces of neurite | ||
2209 | (axons/dendrites) in each section are traced with Neurolucida or similar | ||
2210 | microscope-attached software. The slices are then aligned, first using an | ||
2211 | automated algorithm that tries to find matching pieces in adjacent sections | ||
2212 | (Python script), and second using a GUI-driven tool (web-based, JavaScript). | ||
2213 | Finally, the pieces are stitched into a complete neuron (Python script). The | ||
2214 | neuron and tissue volume are then registered to one of the EBRAINS-supported | ||
2215 | reference templates (Python script). The web-based tool can also be used to align | ||
2216 | slices without a neuron being present.</span></p> | ||
2217 | |||
2218 | <h2></h2> | ||
2219 | |||
2220 | <h2><a name="_Toc138932326"><span lang=en-DE>MorphTool</span></a></h2> | ||
2221 | |||
2222 | <p class=MsoNormal><span lang=en-DE>MorphTool is a Python toolkit designed for | ||
2223 | editing morphological skeletons of cell reconstructions. It has been developed | ||
2224 | to provide helper programmes that perform simple tasks such as morphology | ||
2225 | diffing, file conversion, soma area calculation, skeleton simplification, | ||
2226 | process resampling, morphology repair and spatial transformations. It allows | ||
2227 | neuroscientists to curate and manipulate morphological reconstruction and | ||
2228 | correct morphological artifacts due to the manual reconstruction process.</span></p> | ||
2229 | |||
2230 | <h2></h2> | ||
2231 | |||
2232 | <h2><a name="_Toc138932327"><span lang=en-DE>Multi-Brain</span></a></h2> | ||
2233 | |||
2234 | <p class=MsoNormal><span lang=en-DE>The Multi-Brain (MB) model has the | ||
2235 | general aim of integrating a number of disparate image analysis components | ||
2236 | within a single unified generative modelling framework. Its objective is to | ||
2237 | achieve diffeomorphic alignment of a wide variety of medical image modalities | ||
2238 | into a common anatomical space. This involves the ability to construct a | ||
2239 | "tissue probability template" from a population of scans | ||
2240 | through group-wise alignment. The MB model has been shown to provide accurate | ||
2241 | modelling of the intensity distributions of different imaging modalities.</span></p> | ||
2242 | |||
2243 | <h2></h2> | ||
2244 | |||
2245 | <h2><a name="_Toc138932328"><span lang=en-DE>Multi-Image-OSD</span></a></h2> | ||
2246 | |||
2247 | <p class=MsoNormal><span lang=en-DE>Multi-Image-OSD provides classic browser-based pan and zoom | ||
2248 | capabilities. A collection of images can be displayed as a filmstrip (Filmstrip | ||
2249 | Mode) or as a table (Collection Mode) with adjustable number of rows and | ||
2250 | columns. The tool supports keyboard or/and mouse navigation options, as well as | ||
2251 | touch devices. Utilising the open standard Deep Zoom Image (DZI) format, it is | ||
2252 | able to efficiently visualise very large brain images in the gigapixel range, | ||
2253 | allowing users to zoom from common, display-sized overview resolutions down to the | ||
2254 | microscopic resolution without downloading the underlying, very large image | ||
2255 | dataset.</span></p> | ||
2256 | |||
2257 | <h2></h2> | ||
2258 | |||
2259 | <h2><a name="_Toc138932329"><span lang=en-DE>MUSIC</span></a></h2> | ||
2260 | |||
2261 | <p class=MsoNormal><span lang=en-DE>MUSIC is a communication framework in the | ||
2262 | domain of computational neuroscience and neuromorphic computing which enables | ||
2263 | co-simulations, where components of a model are simulated by different | ||
2264 | simulators or hardware. It consists of an API and C++ library which can be | ||
2265 | linked into existing software with minor modifications. MUSIC enables the | ||
2266 | communication of neuronal spike events, continuous values and text messages | ||
2267 | while hiding the complexity of data distribution over ranks, as well as | ||
2268 | scheduling of communication in the face of loops. MUSIC is light-weight with a | ||
2269 | simple API.</span></p> | ||
2270 | |||
2271 | <h2></h2> | ||
2272 | |||
2273 | <h2><a name="_Toc138932330"><span lang=en-DE>NEAT</span></a></h2> | ||
2274 | |||
2275 | <p class=MsoNormal><span lang=en-DE>NEAT allows for the convenient definition | ||
2276 | of morphological neuron models. These models can be simulated through an | ||
2277 | interface with the NEURON simulator or analysed with two classical methods: (i) | ||
2278 | the separation-of-variables method to obtain impedance kernels as a | ||
2279 | superposition of exponentials and (ii) Koch's method to compute impedances with | ||
2280 | linearised ion channels analytically in the frequency domain. NEAT also | ||
2281 | implements the neural evaluation tree framework and an associated C++ simulator | ||
2282 | to analyse sub-unit independence. Finally, NEAT implements a new method to | ||
2283 | simplify morphological neuron models into models with few compartments, which | ||
2284 | can also be simulated with NEURON.</span></p> | ||
2285 | |||
2286 | <h2></h2> | ||
2287 | |||
2288 | <h2><a name="_Toc138932331"><span lang=en-DE>Neo</span></a></h2> | ||
2289 | |||
2290 | <p class=MsoNormal><span lang=en-DE>Neo implements a hierarchical data model | ||
2291 | well adapted to intracellular and extracellular electrophysiology and EEG data. | ||
2292 | It improves interoperability between Python tools for analysing, visualising | ||
2293 | and generating electrophysiology data by providing a common, shared object | ||
2294 | model. It reads a wide range of neurophysiology file formats, including Spike2, | ||
2295 | NeuroExplorer, AlphaOmega, Axon, Blackrock, Plexon, Tdt and Igor Pro and writes | ||
2296 | to open formats such as NWB and NIX. Neo objects behave just like normal NumPy | ||
2297 | arrays, but with additional metadata, checks for dimensional consistency and | ||
2298 | automatic unit conversion. Neo has been endorsed as a community standard by the | ||
2299 | International Neuroinformatics Coordinating Facility (INCF).</span></p> | ||
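<p class=MsoNormal><span lang=en-DE>A minimal sketch of building a Neo object hierarchy in memory and writing it to the open NIX format (array sizes and names are illustrative; NixIO requires the nixio package):</span></p>

<pre>
import numpy as np
import quantities as pq
import neo
from neo.io import NixIO

# One Segment holding an AnalogSignal and a SpikeTrain, grouped in a Block.
signal = neo.AnalogSignal(np.random.randn(1000, 1) * pq.mV,
                          sampling_rate=10 * pq.kHz, name="Vm")
spikes = neo.SpikeTrain([5.0, 20.0, 43.0] * pq.ms, t_stop=100 * pq.ms)

segment = neo.Segment(name="trial 1")
segment.analogsignals.append(signal)
segment.spiketrains.append(spikes)

block = neo.Block(name="example session")
block.segments.append(segment)

# Write to the NIX format; reading other formats works analogously via neo.io.
NixIO("example_session.nix", mode="ow").write_block(block)
</pre>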
2300 | |||
2301 | <h2></h2> | ||
2302 | |||
2303 | <h2><a name="_Toc138932332"><span lang=en-DE>Neo Viewer</span></a></h2> | ||
2304 | |||
2305 | <p class=MsoNormal><span lang=en-DE>Neo Viewer consists of a REST-API and a | ||
2306 | Javascript component that can be embedded in any web page. Electrophysiology | ||
2307 | traces can be zoomed, scrolled and saved as images. Individual points can be | ||
2308 | measured off the graphs. Neo Viewer can visualise data from most of the | ||
2309 | widely-used file formats in neurophysiology, including community standards such | ||
2310 | as NWB.</span></p> | ||
2311 | |||
2312 | <h2></h2> | ||
2313 | |||
2314 | <h2><a name="_Toc138932333"><span lang=en-DE>NEST Desktop</span></a></h2> | ||
2315 | |||
2316 | <p class=MsoNormal><span lang=en-DE>NEST Desktop comprises GUI components | ||
2317 | for creating and configuring network models, running simulations, and | ||
2318 | visualising and analysing simulation results. NEST Desktop allows students to | ||
2319 | explore important concepts in computational neuroscience without the need to | ||
2320 | first learn a simulator control language. This is done by offering a | ||
2321 | server-side NEST simulator, which can also be installed as a package together | ||
2322 | with a web server providing NEST Desktop as visual front-end. Besides local | ||
2323 | installations, distributed setups can be installed, and direct use through | ||
2324 | EBRAINS is possible. NEST Desktop has also been used as a modelling front-end | ||
2325 | of the Neurorobotics Platform.</span></p> | ||
2326 | |||
2327 | <h2></h2> | ||
2328 | |||
2329 | <h2><a name="_Toc138932334"><span lang=en-DE>NEST Simulator</span></a></h2> | ||
2330 | |||
2331 | <p class=MsoNormal><span lang=en-DE>NEST is used in computational neuroscience | ||
2332 | to model and study behaviour of large networks of neurons. The models describe | ||
2333 | single neuron and synapse behaviour and their connections. Different mechanisms | ||
2334 | of plasticity can be used to investigate artificial learning and help to shed | ||
2335 | light on the fundamental principles of how the brain works. NEST offers | ||
2336 | convenient and efficient commands to define and connect large networks, ranging | ||
2337 | from algorithmically determined connections to data-driven connectivity. Create | ||
2338 | connections between neurons using numerous synapse models from STDP to gap | ||
2339 | junctions.</span></p> | ||
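<p class=MsoNormal><span lang=en-DE>A minimal sketch of the NEST 3.x Python interface (the model and parameter choices are arbitrary illustrations):</span></p>

<pre>
import nest

nest.ResetKernel()

# 100 integrate-and-fire neurons driven by Poisson noise, with spike recording.
neurons = nest.Create("iaf_psc_alpha", 100)
noise = nest.Create("poisson_generator", params={"rate": 8000.0})
recorder = nest.Create("spike_recorder")       # called "spike_detector" in NEST 2.x

nest.Connect(noise, neurons, syn_spec={"weight": 10.0})
nest.Connect(neurons, recorder)

nest.Simulate(1000.0)                          # simulate 1000 ms
print(recorder.get("n_events"))                # total number of recorded spikes
</pre>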
2340 | |||
2341 | <h2></h2> | ||
2342 | |||
2343 | <h2><a name="_Toc138932335"><span lang=en-DE>NESTML</span></a></h2> | ||
2344 | |||
2345 | <p class=MsoNormal><span lang=en-DE>NESTML is a domain-specific language for | ||
2346 | neuron and synapse models. These dynamical models can be used in simulations of | ||
2347 | brain activity on several platforms, in particular NEST Simulator. NESTML | ||
2348 | combines an easy to understand, yet powerful syntax with good simulation | ||
2349 | performance by means of code generation (C++ for NEST Simulator), but flexibly | ||
2350 | supports other simulation engines including neuromorphic hardware.</span></p> | ||
2351 | |||
2352 | <h2></h2> | ||
2353 | |||
2354 | <h2><a name="_Toc138932336"><span lang=en-DE>NetPyNE</span></a></h2> | ||
2355 | |||
2356 | <p class=MsoNormal><span lang=en-DE>NetPyNE provides programmatic and graphical | ||
2357 | interfaces to develop data-driven multiscale brain neural circuit models using | ||
2358 | Python and NEURON. Users can define models using a standardised | ||
2359 | JSON-compatible, rule-based, declarative format. Based on these specifications, | ||
2360 | NetPyNE will generate the network in CoreNEURON, enabling users to run | ||
2361 | parallel simulations, optimise and explore network parameters through automated | ||
2362 | batch runs, and use built-in functions for visualisation and analysis (e.g., | ||
2363 | generate connectivity matrices, voltage traces, spike raster plots, local field | ||
2364 | potentials and information theoretic measures). NetPyNE also facilitates model | ||
2365 | sharing by exporting and importing standardised formats: NeuroML and SONATA.</span></p> | ||
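<p class=MsoNormal><span lang=en-DE>A minimal sketch of NetPyNE's declarative specification style (the values follow the spirit of the NetPyNE tutorials and are illustrative only; running it requires NEURON):</span></p>

<pre>
from netpyne import specs, sim

netParams = specs.NetParams()
simConfig = specs.SimConfig()

# One population of 20 Hodgkin-Huxley cells, defined declaratively.
netParams.popParams["E"] = {"cellType": "PYR", "numCells": 20, "cellModel": "HH"}
netParams.cellParams["PYRrule"] = {
    "conds": {"cellType": "PYR"},
    "secs": {"soma": {"geom": {"diam": 18.8, "L": 18.8, "Ra": 123.0},
                      "mechs": {"hh": {"gnabar": 0.12, "gkbar": 0.036,
                                       "gl": 0.003, "el": -70}}}}}
netParams.synMechParams["exc"] = {"mod": "Exp2Syn", "tau1": 0.1, "tau2": 5.0, "e": 0}

# Background drive onto the population.
netParams.stimSourceParams["bkg"] = {"type": "NetStim", "rate": 10, "noise": 0.5}
netParams.stimTargetParams["bkg->E"] = {"source": "bkg", "conds": {"pop": "E"},
                                        "weight": 0.01, "delay": 5, "synMech": "exc"}

simConfig.duration = 500          # ms
simConfig.analysis["plotRaster"] = True

# Build the network, run it in NEURON and produce the requested analysis plots.
sim.createSimulateAnalyze(netParams=netParams, simConfig=simConfig)
</pre>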
2366 | |||
2367 | <h2></h2> | ||
2368 | |||
2369 | <h2><a name="_Toc138932337"><span lang=en-DE>NEURO-CONNECT</span></a></h2> | ||
2370 | |||
2371 | <p class=MsoNormal><span lang=en-DE>The NEURO-CONNECT platform provides | ||
2372 | functions to integrate multimodal brain imaging information in a unifying | ||
2373 | feature space. Thus, Surface Based Morphometry (SBM), Functional Magnetic | ||
2374 | Resonance Imaging (fMRI) and Diffusion Tensor Imaging (DTI) can be combined and | ||
2375 | visualised at the whole-brain scale. Moreover, multiple brain atlases are | ||
2376 | aligned to match research outcomes to neuroanatomical entities. The datasets | ||
2377 | are appended with openMINDS metadata and thus enable integrative data analysis | ||
2378 | and machine learning.</span></p> | ||
2379 | |||
2380 | <h2></h2> | ||
2381 | |||
2382 | <h2><a name="_Toc138932338"><span lang=en-DE>NeuroFeatureExtract</span></a></h2> | ||
2383 | |||
2384 | <p class=MsoNormal><span lang=en-DE>The NeuroFeatureExtract is a web | ||
2385 | application that allows users to extract an ensemble of | ||
2386 | electrophysiological properties from voltage traces recorded upon electrical | ||
2387 | stimulation of neuronal cells. The main outcome of the application is the | ||
2388 | generation of two files, features.json and protocol.json, that can be used | ||
2389 | for later analysis and model parameter optimisations via the Hodgkin-Huxley | ||
2390 | Neuron Builder application.</span></p> | ||
2391 | |||
2392 | <h2></h2> | ||
2393 | |||
2394 | <h2><a name="_Toc138932339"><span lang=en-DE>NeurogenPy</span></a></h2> | ||
2395 | |||
2396 | <p class=MsoNormal><span lang=en-DE>NeurogenPy is a Python package for working | ||
2397 | with Bayesian networks. It is focused on the analysis of gene expression data | ||
2398 | and learning of gene regulatory networks, modelled as Bayesian networks. For | ||
2399 | that reason, at the moment, only the Gaussian and fully discrete cases are | ||
2400 | supported. The package provides different structure learning algorithms, | ||
2401 | parameters estimation and input/output formats. For some of them, already | ||
2402 | existing implementations have been used, with bnlearn, pgmpy, networkx and | ||
2403 | igraph being the most relevant packages used. This project has been conceived | ||
2404 | to be included as a plugin in the EBRAINS interactive atlas viewer, but it may | ||
2405 | be used for other purposes.</span></p> | ||
2406 | |||
2407 | <h2></h2> | ||
2408 | |||
2409 | <h2><a name="_Toc138932340"><span lang=en-DE>NeuroM</span></a></h2> | ||
2410 | |||
2411 | <p class=MsoNormal><span lang=en-DE>NeuroM is a Python toolkit for the analysis | ||
2412 | and processing of neuron morphologies. It allows the extraction of various | ||
2413 | information about morphologies, e.g., the segment lengths of a morphology via | ||
2414 | the segment_lengths feature. More than 50 features can be extracted.</span></p> | ||
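<p class=MsoNormal><span lang=en-DE>A minimal sketch of feature extraction with NeuroM (hypothetical file path; function names follow NeuroM 3.x and may differ in older releases):</span></p>

<pre>
import neurom
from neurom import features

# Load a reconstruction (SWC/ASC/H5) and extract a few morphometrics.
morph = neurom.load_morphology("example_cell.swc")

segment_lengths = features.get("segment_lengths", morph)
n_sections = features.get("number_of_sections", morph)

print(len(segment_lengths), n_sections)
</pre>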
2415 | |||
2416 | <h2></h2> | ||
2417 | |||
2418 | <h2><a name="_Toc138932341"><span lang=en-DE>Neuromorphic Computing Job Queue</span></a></h2> | ||
2419 | |||
2420 | <p class=MsoNormal><span lang=en-DE>The Neuromorphic Computing Job Queue allows | ||
2421 | users to run simulations/emulations on the SpiNNaker and BrainScaleS systems by | ||
2422 | submitting a PyNN script and associated job configuration information to a | ||
2423 | central queue. The system consists of a web API, a GUI client (the Job Manager | ||
2424 | app) and a Python client. Users can submit scripts stored locally on their own | ||
2425 | machine, in a Git repository, in the KG, or in EBRAINS Collaboratory storage | ||
2426 | (Drive/Bucket). Users can track the progress of their job, and view and/or | ||
2427 | download the results, log files, and provenance information.</span></p> | ||
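<p class=MsoNormal><span lang=en-DE>A minimal sketch, assuming the nmpi Python client for the job queue; the account name, repository URL and collab identifier are placeholders, and keyword names may differ between client versions:</span></p>

<pre>
import nmpi

client = nmpi.Client("my_ebrains_username")        # placeholder account name

# Submit a PyNN script kept in a Git repository (hypothetical URL) to SpiNNaker.
job_id = client.submit_job(
    source="https://github.com/example/my-pynn-model.git",
    platform=nmpi.SPINNAKER,                       # or nmpi.BRAINSCALES
    collab_id="my-collab")

print(client.job_status(job_id))                   # poll until finished or error
job = client.get_job(job_id, with_log=True)        # retrieve results and log output
</pre>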
2428 | |||
2429 | <h2></h2> | ||
2430 | |||
2431 | <h2><a name="_Toc138932342"><span lang=en-DE>Neuronize v2</span></a></h2> | ||
2432 | |||
2433 | <p class=MsoNormal><span lang=en-DE>Neuronize v2 has been developed to generate | ||
2434 | a connected neural 3D mesh. If the input is a neuron tracing, it generates a 3D | ||
2435 | mesh from it, including the shape of the soma. If the input is data extracted | ||
2436 | with Imaris Filament Tracer (a set of unconnected meshes of a neuron), | ||
2437 | Neuronize v2 generates a single connected 3D mesh of the whole neuron (also | ||
2438 | generating the soma) and provides its neural tracing, which can then be | ||
2439 | imported into tools such as Neurolucida, facilitating the interoperability of | ||
2440 | two of the most widely used proprietary tools.</span></p> | ||
2441 | |||
2442 | <h2></h2> | ||
2443 | |||
2444 | <h2><a name="_Toc138932343"><span lang=en-DE>NeuroR</span></a></h2> | ||
2445 | |||
2446 | <p class=MsoNormal><span lang=en-DE>NeuroR is a collection of tools to repair | ||
2447 | morphologies. This includes cut plane detection, sanitisation (removing | ||
2448 | unifurcations, invalid soma counts, short segments) and 'unravelling': the | ||
2449 | action of 'stretching' the cell that has been shrunk due to the dehydration | ||
2450 | caused by the slicing.</span></p> | ||
2451 | |||
2452 | <h2></h2> | ||
2453 | |||
2454 | <h2><a name="_Toc138932344"><span lang=en-DE>Neurorobotics Platform</span></a></h2> | ||
2455 | |||
2456 | <p class=MsoNormal><span lang=en-DE>The Neurorobotics Platform (NRP) is an | ||
2457 | integrative simulation framework that enables in silico experimentation and | ||
2458 | embodiment of brain models inside virtual agents interacting with realistic | ||
2459 | simulated environments. Entirely Open Source, it offers a browser-based | ||
2460 | graphical user interface for online access. It can be installed locally (Docker | ||
2461 | or source install). It can be interfaced with multiple spike-based neuromorphic | ||
2462 | chips (SpiNNaker, Intel Loihi). You can download and install the NRP locally | ||
2463 | for maximum experimental convenience or access it online in order to leverage | ||
2464 | the HBP High Performance Computing infrastructure for large-scale experiments.</span></p> | ||
2465 | |||
2466 | <h2></h2> | ||
2467 | |||
2468 | <h2><a name="_Toc138932345"><span lang=en-DE>Neurorobotics Platform Robot | ||
2469 | Designer</span></a></h2> | ||
2470 | |||
2471 | <p class=MsoNormal><span lang=en-DE>The Robot Designer is a plugin for the 3D | ||
2472 | modeling suite Blender that enables researchers to design morphologies for | ||
2473 | simulation experiments in, particularly but not restricted to, the | ||
2474 | Neurorobotics Platform. This plugin helps researchers design and parameterize | ||
2475 | models with a Graphical User Interface, simplifying and speeding up the design | ||
2476 | process. It includes design capabilities for musculoskeletal bodies as | ||
2477 | well as robotic systems, fostering not only the understanding of biological | ||
2478 | motions and enabling better robot designs, but also enabling true Neurorobotic | ||
2479 | experiments that consist of biomimetic models such as tendon-driven robots or a | ||
2480 | transition between biology and technology.</span></p> | ||
2481 | |||
2482 | <h2></h2> | ||
2483 | |||
2484 | <h2><a name="_Toc138932346"><span lang=en-DE>NeuroScheme</span></a></h2> | ||
2485 | |||
2486 | <p class=MsoNormal><span lang=en-DE>NeuroScheme uses schematic | ||
2487 | representations, such as icons and glyphs, to encode attributes of neural | ||
2488 | structures (neurons, columns, layers, populations, etc.), alleviating problems | ||
2489 | with displaying, navigating and analysing large datasets. It manages | ||
2490 | hierarchically organised neural structures: users can | ||
2491 | navigate through the levels of the hierarchy and home in on and explore the | ||
2492 | data at their desired level of detail. NeuroScheme currently has two built-in | ||
2493 | "domains", which specify the entities, attributes and | ||
2494 | relationships used for specific use cases: the 'cortex' domain, designed for | ||
2495 | navigating and analysing cerebral cortex structures, and the | ||
2496 | 'congen' domain, used to define the properties of cells and connections, create | ||
2497 | circuits of neurons and build populations.</span></p> | ||
2500 | |||
2501 | <h2></h2> | ||
2502 | |||
2503 | <h2><a name="_Toc138932347"><span lang=en-DE>NeuroSuites</span></a></h2> | ||
2504 | |||
2505 | <p class=MsoNormal><span lang=en-DE>NeuroSuites is a web-based platform | ||
2506 | designed to handle large-scale, high-dimensional data in the field of | ||
2507 | neuroscience. It offers neuroscience-oriented applications and tools for data | ||
2508 | analysis, machine learning and visualisation, while also providing | ||
2509 | general-purpose tools for data scientists in other research fields. NeuroSuites | ||
2510 | requires no software installation and runs on the backend of a server, making | ||
2511 | it accessible from various devices. The platform's main strengths include its | ||
2512 | defined architecture, ability to handle complex neuroscience data and the | ||
2513 | variety of available tools.</span></p> | ||
2514 | |||
2515 | <h2></h2> | ||
2516 | |||
2517 | <h2><a name="_Toc138932348"><span lang=en-DE>NeuroTessMesh</span></a></h2> | ||
2518 | |||
2519 | <p class=MsoNormal><span lang=en-DE>NeuroTessMesh takes morphological tracings | ||
2520 | of cells acquired by neuroscientists and generates 3D models that approximate | ||
2521 | the neuronal membrane. The resolution of the models can be adapted at the time | ||
2522 | of visualisation. You can colour-code different parts of a morphology, | ||
2523 | differentiating relevant morphological variables or even neuronal activity. | ||
2524 | NeuroTessMesh copes with many of the problems associated with the visualisation | ||
2525 | of neural circuits consisting of large numbers of cells. It facilitates the | ||
2526 | recovery and visualisation of the 3D geometry of cells included in databases, | ||
2527 | such as NeuroMorpho, and allows users to approximate missing information such as the | ||
2528 | soma's morphology.</span></p> | ||
2529 | |||
2530 | <h2></h2> | ||
2531 | |||
2532 | <h2><a name="_Toc138932349"><span lang=en-DE>NMODL Framework</span></a></h2> | ||
2533 | |||
2534 | <p class=MsoNormal><span lang=en-DE>NMODL Framework is designed with | ||
2535 | modern compiler and code generation techniques. It provides modular tools for | ||
2536 | parsing, analysing and transforming NMODL; it provides an easy-to-use, high-level | ||
2537 | Python API; it generates optimised code for modern compute architectures | ||
2538 | including CPUs and GPUs; it provides flexibility to implement new simulator | ||
2539 | backends; and it supports the full NMODL specification.</span></p> | ||
2542 | |||
2543 | <h2></h2> | ||
2544 | |||
2545 | <h2><a name="_Toc138932350"><span lang=en-DE>NSuite</span></a></h2> | ||
2546 | |||
2547 | <p class=MsoNormal><span lang=en-DE>NSuite is a framework for maintaining and | ||
2548 | running benchmarks and validation tests for multi-compartment neural network | ||
2549 | simulations on HPC systems. NSuite automates the process of building simulation | ||
2550 | engines, and running benchmarks and validation tests. NSuite is specifically | ||
2551 | designed to allow easy deployment on HPC systems in testing workflows, such as | ||
2552 | benchmark-driven development or continuous integration. The development of | ||
2553 | NSuite has been driven by the need (1) for a definitive resource for comparing | ||
2554 | performance and correctness of simulation engines on HPC systems, (2) to verify | ||
2555 | the performance and correctness of individual simulation engines as they change | ||
2556 | over time and (3) to test that changes to an HPC system do not cause | ||
2557 | performance or correctness regressions in simulation engines. The framework | ||
2558 | currently supports the simulation engines Arbor, NEURON, and CoreNEURON, while | ||
2559 | allowing other simulation engines to be added.</span></p> | ||
2560 | |||
2561 | <h2></h2> | ||
2562 | |||
2563 | <h2><span lang=en-DE>Nutil</span></h2> | ||
2564 | |||
2565 | <p class=MsoNormal><span lang=en-DE>Nutil is a pre- and post-processing toolbox | ||
2566 | that enables analysis of large collections of histological images of rodent | ||
2567 | brain sections. The software is open source and has both a graphical user | ||
2568 | interface for specifying the input and output parameters and a command-line | ||
2569 | execution option for batch processing. Nutil includes a transformation tool for | ||
2570 | automated scaling, rotation, mirroring and renaming of image files, a file | ||
2571 | format converter, a simple resize tool and a post-processing method for | ||
2572 | quantifying and localising labelled features based on a reference atlas of the | ||
2573 | brain (mouse or rat). The quantification method requires input from customised | ||
2574 | brain atlas maps generated with the QuickNII software, and segmentations | ||
2575 | generated with ilastik or another image analysis tool. The output from Nutil | ||
2576 | includes CSV reports, 3D point cloud coordinate files and atlas map images | ||
2577 | superimposed with colour-coded objects.</span></p> | ||
2578 | |||
2579 | <h2></h2> | ||
2580 | |||
2581 | <h2><a name="_Toc138932351"><span lang=en-DE>ODE-toolbox</span></a></h2> | ||
2582 | |||
2583 | <p class=MsoNormal><span lang=en-DE>ODE-toolbox is a Python package that | ||
2584 | assists in solver benchmarking, and recommends solvers on the basis of a set of | ||
2585 | user-configurable heuristics. For all dynamical equations that admit an | ||
2586 | analytic solution, ODE-toolbox generates propagator matrices that allow the | ||
2587 | solution to be calculated at machine precision. For all others, first-order | ||
2588 | update expressions are returned based on the Jacobian matrix. In addition to | ||
2589 | continuous dynamics, discrete events can be used to model instantaneous changes | ||
2590 | in system state, such as a neuronal action potential. These can be generated by | ||
2591 | the system under test as well as applied as external stimuli, making | ||
2592 | ODE-toolbox particularly well-suited for applications in computational | ||
2593 | neuroscience.</span></p> | ||
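<p class=MsoNormal><span lang=en-DE>A minimal sketch of the odetoolbox Python entry point; the input dictionary follows its JSON convention, and the expression and parameter values are illustrative:</span></p>

<pre>
import odetoolbox

# A leaky integrator with constant input current, given as a string expression.
indict = {
    "dynamics": [
        {"expression": "V_m' = -V_m / tau_m + I_e / C_m",
         "initial_value": "0."}
    ],
    "parameters": {"tau_m": "10.", "C_m": "250.", "I_e": "100."}
}

# Returns a list of solver recommendations; analytically solvable parts come with
# propagator matrices that allow exact integration at machine precision.
solvers = odetoolbox.analysis(indict)
print(solvers[0]["solver"])
</pre>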
2594 | |||
2595 | <h2></h2> | ||
2596 | |||
2597 | <h2><a name="_Toc138932352"><span lang=en-DE>openMINDS</span></a></h2> | ||
2598 | |||
2599 | <p class=MsoNormal><span lang=en-DE>openMINDS is composed of: (i) integrated | ||
2600 | metadata models adoptable by any graph database system (GDBS), (ii) a set of | ||
2601 | libraries of serviceable metadata instances with external resource references | ||
2602 | for local and global knowledge integration, and (iii) supportive tooling for | ||
2603 | handling the metadata models and instances. Moreover, the framework provides | ||
2604 | machine-readable mappings to other standardisation efforts (e.g., schema.org). | ||
2605 | With this, openMINDS is a unique and powerful metadata framework for flexible | ||
2606 | knowledge integration within and beyond any GDBS.</span></p> | ||
2607 | |||
2608 | <h2></h2> | ||
2609 | |||
2610 | <h2><a name="_Toc138932353"><span lang=en-DE>openMINDS metadata for TVB-ready | ||
2611 | data</span></a></h2> | ||
2612 | |||
2613 | <p class=MsoNormal><span lang=en-DE>A Jupyter Python notebook with code and | ||
2614 | commentary for creating openMINDS metadata (version 1.0) in JSON-LD format for | ||
2615 | ingestion of TVB-ready data into the EBRAINS Knowledge Graph.</span></p> | ||
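<p class=MsoNormal><span lang=en-DE>For illustration only, a hand-written Python snippet producing a record in the general shape of openMINDS JSON-LD; the exact @type IRI and required properties are defined by the openMINDS schemas and should be taken from the notebook, not from this sketch:</span></p>

<pre>
import json
import uuid

# Illustrative JSON-LD record; property names are assumptions, not a schema reference.
record = {
    "@context": {"@vocab": "https://openminds.ebrains.eu/vocab/"},
    "@type": "https://openminds.ebrains.eu/core/DatasetVersion",
    "@id": f"https://kg.ebrains.eu/api/instances/{uuid.uuid4()}",
    "fullName": "TVB-ready structural connectivity (example)",
    "shortName": "tvb-sc-example",
    "versionIdentifier": "v1.0",
}

with open("dataset_version.jsonld", "w") as f:
    json.dump(record, f, indent=2)
</pre>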
2616 | |||
2617 | <h2></h2> | ||
2618 | |||
2619 | <h2><a name="_Toc138932354"><span lang=en-DE>PCI</span></a></h2> | ||
2620 | |||
2621 | <p class=MsoNormal><span lang=en-DE>The notebook allows the computation of the | ||
2622 | PCI Lempel-Ziv and PCI state transitions. In order to run the examples, a wake | ||
2623 | and sleep data set needs to be provided in the Python-MNE format.</span></p> | ||
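<p class=MsoNormal><span lang=en-DE>The notebook's own API is not reproduced here; as background, the following self-contained NumPy sketch shows a simple Lempel-Ziv-style phrase count on a binarised response matrix, the kind of compressibility measure that PCI builds on (the actual PCI uses the LZ76 parsing and a normalisation step):</span></p>

<pre>
import numpy as np

def lz_phrase_count(sequence: str) -> int:
    """Count distinct phrases in an LZ78-style parsing of a binary string."""
    phrases, current, count = set(), "", 0
    for symbol in sequence:
        current += symbol
        if current not in phrases:
            phrases.add(current)
            count += 1
            current = ""
    return count

# Binarised significant-response matrix (channels x time), as used by PCI-like measures.
rng = np.random.default_rng(0)
binary_matrix = (rng.random((16, 300)) > 0.7).astype(int)

sequence = "".join(map(str, binary_matrix.flatten(order="F")))  # concatenate column-wise
print(lz_phrase_count(sequence))
</pre>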
2624 | |||
2625 | <h2></h2> | ||
2626 | |||
2627 | <h2><a name="_Toc138932355"><span lang=en-DE>PIPSA</span></a></h2> | ||
2628 | |||
2629 | <p class=MsoNormal><span lang=en-DE>PIPSA enables the comparison of the | ||
2630 | electrostatic interaction properties of proteins. It permits the classification | ||
2631 | of proteins according to their interaction properties. PIPSA may assist in | ||
2632 | function assignment, the estimation of binding properties and enzyme kinetic | ||
2633 | parameters.</span></p> | ||
2634 | |||
2635 | <h2></h2> | ||
2636 | |||
2637 | <h2><a name="_Toc138932356"><span lang=en-DE>PoSCE</span></a></h2> | ||
2638 | |||
2639 | <p class=MsoNormal><span lang=en-DE>PoSCE is a functional connectivity | ||
2640 | estimator of fMRI time-series. It relies on the Riemannian geometry of | ||
2641 | covariances and integrates prior knowledge of covariance distribution over a | ||
2642 | population.</span></p> | ||
2643 | |||
2644 | <h2></h2> | ||
2645 | |||
2646 | <h2><a name="_Toc138932357"><span lang=en-DE>Provenance API</span></a></h2> | ||
2647 | |||
2648 | <p class=MsoNormal><span lang=en-DE>The EBRAINS Provenance API is a web service | ||
2649 | to facilitate working with computational provenance metadata. Metadata are | ||
2650 | stored in the EBRAINS Knowledge Graph (KG) using openMINDS schemas. The | ||
2651 | Provenance API provides a somewhat simplified interface compared to accessing | ||
2652 | the KG directly and performs checks of metadata consistency. The service covers | ||
2653 | workflows involving simulation, data analysis, visualisation, optimisation, | ||
2654 | data movement and model validation.</span></p> | ||
2655 | |||
2656 | <h2></h2> | ||
2657 | |||
2658 | <h2><a name="_Toc138932358"><span lang=en-DE>PyNN</span></a></h2> | ||
2659 | |||
2660 | <p class=MsoNormal><span lang=en-DE>A model description written with the PyNN | ||
2661 | API and the Python programming language runs on any simulator that PyNN | ||
2662 | supports (currently NEURON, NEST and Brian 2) as well as on the BrainScaleS | ||
2663 | and SpiNNaker neuromorphic hardware systems. PyNN provides a library of | ||
2664 | standard neuron, synapse and synaptic plasticity models, verified to work the | ||
2665 | same on different simulators. PyNN also provides commonly used connectivity | ||
2666 | algorithms (e.g. all-to-all, random, distance-dependent, small-world) but makes | ||
2667 | it easy to provide your own connectivity in a simulator-independent way. PyNN | ||
2668 | transparently supports distributed simulations using MPI.</span></p> | ||
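<p class=MsoNormal><span lang=en-DE>A minimal sketch of simulator-independent PyNN usage (backend, cell counts and parameter values are arbitrary illustrations):</span></p>

<pre>
import pyNN.nest as sim      # swap for pyNN.neuron or pyNN.brian2 without code changes

sim.setup(timestep=0.1)

# Two populations of standard cells plus a Poisson spike source.
exc = sim.Population(80, sim.IF_cond_exp(), label="excitatory")
inh = sim.Population(20, sim.IF_cond_exp(), label="inhibitory")
noise = sim.Population(80, sim.SpikeSourcePoisson(rate=20.0))

# Simulator-independent connectivity with static synapses.
sim.Projection(noise, exc, sim.OneToOneConnector(),
               sim.StaticSynapse(weight=0.01, delay=1.0))
sim.Projection(exc, inh, sim.FixedProbabilityConnector(0.1),
               sim.StaticSynapse(weight=0.005, delay=1.0))

exc.record("spikes")
sim.run(500.0)

data = exc.get_data().segments[0]     # results come back as Neo objects
print(sum(len(st) for st in data.spiketrains))
sim.end()
</pre>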
2669 | |||
2670 | <h2></h2> | ||
2671 | |||
2672 | <h2><a name="_Toc138932359"><span lang=en-DE>Pyramidal Explorer</span></a></h2> | ||
2673 | |||
2674 | <p class=MsoNormal><span lang=en-DE>PyramidalExplorer is a tool to | ||
2675 | interactively explore and reveal the detailed organisation of the microanatomy | ||
2676 | of pyramidal neurons with functionally related models. Possible regional | ||
2677 | differences in the pyramidal cell architecture can be interactively discovered | ||
2678 | by combining quantitative morphological information about the structure of the | ||
2679 | cell with implemented functional models. The key contribution of this tool is the | ||
2680 | morpho-functional oriented design, allowing the user to navigate within the 3D | ||
2681 | dataset, filter and perform content-based retrieval operations to find the | ||
2682 | spines that are alike and dissimilar within the neuron, according to particular | ||
2683 | morphological or functional variables.</span></p> | ||
2684 | |||
2685 | <h2></h2> | ||
2686 | |||
2687 | <h2><a name="_Toc138932360"><span lang=en-DE>QCAlign software</span></a></h2> | ||
2688 | |||
2689 | <p class=MsoNormal><span lang=en-DE>The QUINT workflow enables spatial analysis | ||
2690 | of labelling in series of brain sections from mouse and rat based on | ||
2691 | registration to a reference brain atlas. The QCAlign software supports the use | ||
2692 | of QUINT for high-throughput studies by providing information about (1) the | ||
2693 | quality of the section images used as input to the QUINT workflow and (2) the | ||
2694 | quality of the atlas registration performed in the QUINT workflow. QCAlign | ||
2695 | also makes it easier for the user to explore the atlas hierarchy and decide on | ||
2696 | a customised hierarchy level to use for the investigation.</span></p> | ||
2697 | |||
2698 | <h2></h2> | ||
2699 | |||
2700 | <h2><a name="_Toc138932361"><span lang=en-DE>QuickNII</span></a></h2> | ||
2701 | |||
2702 | <p class=MsoNormal><span lang=en-DE>QuickNII is a tool for user-guided affine | ||
2703 | registration (anchoring) of 2D experimental image data, typically high | ||
2704 | resolution microscopic images, to 3D atlas reference space, facilitating data | ||
2705 | integration through standardised coordinate systems. Key features: Generate | ||
2706 | user-defined cut planes through the atlas templates, matching the orientation | ||
2707 | of the cut plane of the 2D experimental image data, as a first step towards | ||
2708 | anchoring of images to the relevant atlas template. Propagate spatial | ||
2709 | transformations across series of sections following anchoring of selected | ||
2710 | images.</span></p> | ||
2711 | |||
2712 | <h2></h2> | ||
2713 | |||
2714 | <h2><a name="_Toc138932362"><span lang=en-DE>Quota Manager</span></a></h2> | ||
2715 | |||
2716 | <p class=MsoNormal><span lang=en-DE>The Quota Manager enables each EBRAINS | ||
2717 | service to manage user quotas for resources EBRAINS users consume in their | ||
2718 | respective services. The goal is to encourage the responsible use of resources. | ||
2719 | It is recommended that all users (except possibly guest accounts) are provided | ||
2720 | with a default quota, and that specific users have the option of receiving | ||
2721 | larger quotas based on their affiliation, role or motivated requests.</span></p> | ||
2722 | |||
2723 | <h2></h2> | ||
2724 | |||
2725 | <h2><a name="_Toc138932363"><span lang=en-DE>RateML</span></a></h2> | ||
2726 | |||
2727 | <p class=MsoNormal><span lang=en-DE>RateML enables users to generate | ||
2728 | whole-brain network models from a succinct declarative description, in which | ||
2729 | the mathematics of the model are described without specifying how their | ||
2730 | simulation should be implemented. RateML builds on NeuroML's Low Entropy Model | ||
2731 | Specification (LEMS), an XML-based language for specifying models of dynamical systems, | ||
2732 | allowing descriptions of neural mass and discretized neural field models, as | ||
2733 | implemented by the TVB simulator. The end user describes their model's | ||
2734 | mathematics once and generates and runs code for different languages, targeting | ||
2735 | both CPUs for fast single simulations and GPUs for parallel ensemble | ||
2736 | simulations.</span></p> | ||
2737 | |||
2738 | <h2></h2> | ||
2739 | |||
2740 | <h2><a name="_Toc138932364"><span lang=en-DE>Region-wise CBPP using the Julich | ||
2741 | Brain Cytoarchitectonic Atlas</span></a></h2> | ||
2742 | |||
2743 | <p class=MsoNormal><span lang=en-DE>Many studies have investigated the | ||
2744 | relationships between interindividual variability in brain regions' | ||
2745 | connectivity and behavioural phenotypes by utilising connectivity-based | ||
2746 | prediction models. Recently, we demonstrated that an approach based on the | ||
2747 | combination of whole-brain and region-wise CBPP can provide important insight | ||
2748 | into the predictive model, and hence in brain-behaviour relationships, by | ||
2749 | offering interpretable patterns. Here, we applied this approach using the | ||
2750 | Julich Brain Cytoarchitectonic Atlas with the resting-state functional | ||
2751 | connectivity and psychometric variables from the Human Connectome Project | ||
2752 | dataset, illustrating each brain region's predictive power for a range of | ||
2753 | psychometric variables. As a result, a psychometric prediction profile was | ||
2754 | established for each brain region, which can be validated against brain mapping | ||
2755 | literature.</span></p> | ||
2756 | |||
2757 | <h2></h2> | ||
2758 | |||
2759 | <h2><a name="_Toc138932365"><span lang=en-DE>RRI Capacity Development Resources</span></a></h2> | ||
2760 | |||
2761 | <p class=MsoNormal><span lang=en-DE>A series of training resources developed to | ||
2762 | enable anticipation, critical reflection and public engagement/deliberation of | ||
2763 | societal consequences of brain research and innovation activities. These | ||
2764 | resources were designed primarily for HBP researchers and EBRAINS leadership | ||
2765 | and management, involving EBRAINS data and infrastructure providers. However, | ||
2766 | they are also useful for engaging the wider public with RRI. The resources are | ||
2767 | based on the legacy of over 10 years of research and activities of the ethics | ||
2768 | and society team in the HBP. They cover important RRI-related topics on | ||
2769 | neuroethics, data governance, dual-use, public engagement and foresight, | ||
2770 | diversity, research integrity, etc.</span></p> | ||
2771 | |||
2772 | <h2></h2> | ||
2773 | |||
2774 | <h2><a name="_Toc138932366"><span lang=en-DE>rsHRF</span></a></h2> | ||
2775 | |||
2776 | <p class=MsoNormal><span lang=en-DE>The rsHRF toolbox aims to retrieve the | ||
2777 | onsets of pseudo-events triggering a hemodynamic response from resting-state | ||
2778 | fMRI BOLD signals. It is based on point process theory and fits a model to | ||
2779 | retrieve the optimal lag between the events and the HRF onset, as well as the | ||
2780 | HRF shape, using different shape parameters or combinations of basis functions. | ||
2781 | Once the HRF has been retrieved for each voxel/vertex, it can be deconvolved | ||
2782 | from the time series (for example, to improve lag-based connectivity | ||
2783 | estimates), or one can map the shape parameters everywhere in the brain | ||
2784 | (including white matter) and use it as a pathophysiological indicator.</span></p> | ||
2785 | |||
2786 | <h2></h2> | ||
2787 | |||
2788 | <h2><a name="_Toc138932367"><span lang=en-DE>RTNeuron</span></a></h2> | ||
2789 | |||
2790 | <p class=MsoNormal><span lang=en-DE>The main utility of RTNeuron is twofold: | ||
2791 | (i) the interactive visual inspection of structural and functional features of | ||
2792 | the cortical column model and (ii) the generation of high-quality movies and | ||
2793 | images for presentations and publications. RTNeuron provides a C++ library with | ||
2794 | an OpenGL-based rendering backend, a Python wrapping and a Python application | ||
2795 | called rtneuron. RTNeuron is only supported in GNU/Linux systems. However, it | ||
2796 | should also be possible to build it on Windows systems. For OS/X it may be | ||
2797 | quite challenging and require changes in OpenGL-related code to get it working.</span></p> | ||
2798 | |||
2799 | <h2></h2> | ||
2800 | |||
2801 | <h2><a name="_Toc138932368"><span lang=en-DE>sbs: Spike-based Sampling</span></a></h2> | ||
2802 | |||
2803 | <p class=MsoNormal><span lang=en-DE>Spike-based sampling, sbs, is a software | ||
2804 | suite that takes care of calibrating spiking neurons for given target | ||
2805 | distributions and allows the evaluation of these distributions as they are | ||
2806 | produced by stochastic spiking networks.</span></p> | ||
2807 | |||
2808 | <h2></h2> | ||
2809 | |||
2810 | <h2><a name="_Toc138932369"><span lang=en-DE>SDA 7</span></a></h2> | ||
2811 | |||
2812 | <p class=MsoNormal><span lang=en-DE>SDA 7 can be used to carry out Brownian | ||
2813 | dynamics simulations of the diffusional association in a continuum aqueous | ||
2814 | solvent of two solute molecules, e.g., proteins, or of a solute molecule to an | ||
2815 | inorganic surface. SDA 7 can also be used to simulate the diffusion of multiple | ||
2816 | proteins, in dilute or concentrated solutions, e.g., to study the effects of | ||
2817 | macromolecular crowding.</span></p> | ||
2818 | |||
2819 | <h2></h2> | ||
2820 | |||
2821 | <h2><a name="_Toc138932370"><span lang=en-DE>Shape & Appearance Modelling</span></a></h2> | ||
2822 | |||
2823 | <p class=MsoNormal><span lang=en-DE>A framework for automatically learning | ||
2824 | shape and appearance models for medical (and certain other) images. The | ||
2825 | algorithm was developed with the aim of eventually enabling distributed | ||
2826 | privacy-preserving analysis of brain image data, such that shared information | ||
2827 | (shape and appearance basis functions) may be passed across sites, whereas | ||
2828 | latent variables that encode individual images remain secure within each site. | ||
2829 | These latent variables are proposed as features for privacy-preserving data | ||
2830 | mining applications.</span></p> | ||
2831 | |||
2832 | <h2></h2> | ||
2833 | |||
2834 | <h2><a name="_Toc138932371"><span lang=en-DE>siibra-api</span></a></h2> | ||
2835 | |||
2836 | <p class=MsoNormal><span lang=en-DE>siibra-api provides an HTTP wrapper around | ||
2837 | siibra-python, allowing developers to access atlas (meta)data over the HTTP | ||
2838 | protocol. Because the service is deployed on the EBRAINS infrastructure, | ||
2839 | developers can access the centralised atlas (meta)data provided by | ||
2840 | siibra-python regardless of their programming language.</span></p> | ||
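| | ||
| <p class=MsoNormal><span lang=en-DE>For illustration only, the sketch below queries the | ||
| service from Python over plain HTTP. The base URL and the /atlases route are assumptions | ||
| made for this example and may not match the deployed routes; the siibra-api documentation | ||
| lists the actual endpoints.</span></p> | ||
| | ||
| <pre> | ||
| # Hypothetical sketch: listing atlases via siibra-api over plain HTTP. | ||
| # The base URL and route below are assumptions, not verified endpoints. | ||
| import requests | ||
| | ||
| BASE_URL = "https://siibra-api-stable.apps.hbp.eu/v3_0"  # assumed deployment URL | ||
| | ||
| resp = requests.get(f"{BASE_URL}/atlases", timeout=30) | ||
| resp.raise_for_status() | ||
| | ||
| # The response is assumed to be a JSON list of atlas records. | ||
| for atlas in resp.json(): | ||
|     print(atlas.get("name"), atlas.get("@id")) | ||
| </pre> | ||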
2841 | |||
2842 | <h2></h2> | ||
2843 | |||
2844 | <h2><a name="_Toc138932372"><span lang=en-DE>siibra-explorer</span></a></h2> | ||
2845 | |||
2846 | <p class=MsoNormal><span lang=en-DE>The interactive atlas viewer | ||
2847 | siibra-explorer allows exploring the different EBRAINS atlases for the human, | ||
2848 | monkey and rodent brains together with a comprehensive set of linked multimodal | ||
2849 | data features. It provides a 3-planar view of a parcellated reference volume | ||
2850 | combined with a rotatable overview of the 3D surface. Several templates can be | ||
2851 | selected to navigate through the brain from MRI-scale to microscopic | ||
2852 | resolution, allowing inspection of terabyte-size image data. Anatomically | ||
2853 | anchored datasets reflecting aspects of cellular and molecular organisation, | ||
2854 | fibres, function and connectivity can be discovered by selecting brain regions | ||
2855 | from parcellations, or zooming and panning the reference brain. siibra-explorer | ||
2856 | also allows annotation of brain locations as points and polygons and is | ||
2857 | extensible via interactive plugins.</span></p> | ||
2858 | |||
2859 | <h2></h2> | ||
2860 | |||
2861 | <h2><a name="_Toc138932373"><span lang=en-DE>siibra-python</span></a></h2> | ||
2862 | |||
2863 | <p class=MsoNormal><span lang=en-DE>siibra-python is a Python client to a brain | ||
2864 | atlas framework that integrates brain parcellations and reference spaces at | ||
2865 | different spatial scales and connects them with a broad range of multimodal | ||
2866 | regional data features. It aims to facilitate programmatic and reproducible | ||
2867 | incorporation of brain parcellations and brain region features from different | ||
2868 | sources into neuroscience workflows. It also provides well-structured, easy | ||
2869 | access to data features in the EBRAINS Knowledge Graph. Users can preconfigure | ||
2870 | their own data for use within siibra-python.</span></p> | ||
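| | ||
| <p class=MsoNormal><span lang=en-DE>A minimal usage sketch, assuming the keyword-lookup | ||
| pattern described in the siibra-python documentation; the registry keys, method names and | ||
| feature modality string are illustrative and may differ between releases.</span></p> | ||
| | ||
| <pre> | ||
| # Illustrative sketch of programmatic atlas access with siibra-python. | ||
| # Keys, method names and the modality string are assumptions; check the docs. | ||
| import siibra | ||
| | ||
| atlas = siibra.atlases["human"]                 # multilevel human atlas (keyword lookup) | ||
| parcellation = siibra.parcellations["julich"]   # Julich-Brain parcellation | ||
| region = parcellation.get_region("V1") | ||
| | ||
| # Query multimodal data features anchored to the region. | ||
| features = siibra.features.get(region, "receptor density fingerprint") | ||
| for feature in features: | ||
|     print(feature.name) | ||
| </pre> | ||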
2871 | |||
2872 | <h2></h2> | ||
2873 | |||
2874 | <h2><a name="_Toc138932374"><span lang=en-DE>Single Cell Model (Re)builder | ||
2875 | Notebook</span></a></h2> | ||
2876 | |||
2877 | <p class=MsoNormal><span lang=en-DE>The Single Cell Model (Re)builder Notebook | ||
2878 | is a web application, implemented via a Jupyter Notebook on EBRAINS, which | ||
2879 | allows users to configure BluePyOpt to re-run an optimisation with their | ||
2880 | own choice of parameter ranges. The optimisation jobs are submitted | ||
2881 | through the Neuroscience Gateway.</span></p> | ||
2882 | |||
2883 | <h2></h2> | ||
2884 | |||
2885 | <h2><a name="_Toc138932375"><span lang=en-DE>Slurm Plugin for Co-allocation of | ||
2886 | Compute and Data Resources</span></a></h2> | ||
2887 | |||
2888 | <p class=MsoNormal><span lang=en-DE>This Simple Linux Utility for Resource | ||
2889 | Management (Slurm) plugin enables the co-allocation of compute and data resources | ||
2890 | on a shared multi-tiered storage cluster by estimating waiting times until the | ||
2891 | high-performance storage (burst buffers) becomes available to submitted | ||
2892 | jobs. Based on the current job queue and the estimated waiting time, the plugin | ||
2893 | decides whether scheduling the job on the high-performance or on the | ||
2894 | lower-performance storage system (parallel file system) benefits the job's | ||
2895 | turnaround time. The estimation depends on additional information the user | ||
2896 | provides at submission time.</span></p> | ||
2897 | |||
2898 | <h2></h2> | ||
2899 | |||
2900 | <h2><a name="_Toc138932376"><span lang=en-DE>Snudda</span></a></h2> | ||
2901 | |||
2902 | <p class=MsoNormal><span lang=en-DE>Snudda ('touch' in Swedish) allows the user | ||
2903 | to set up and generate microcircuits where the connectivity between neurons is | ||
2904 | based on reconstructed neuron morphologies. The touch detection algorithm looks | ||
2905 | for overlaps of axons and dendrites, and places putative synapses where they | ||
2906 | touch. The putative synapses are pruned, removing a fraction to match | ||
2907 | statistics from pairwise connectivity experiments. If needed, Snudda can also | ||
2908 | use probability functions to create realistic microcircuits. The Snudda | ||
2909 | software is written in Python and includes support for supercomputers. It uses | ||
2910 | ipyparallel to parallelise network creation, and NEURON as the backend for | ||
2911 | simulations. It can be installed using pip or by downloading it directly.</span></p> | ||
2912 | |||
2913 | <h2></h2> | ||
2914 | |||
2915 | <h2><a name="_Toc138932377"><span lang=en-DE>SomaSegmenter</span></a></h2> | ||
2916 | |||
2917 | <p class=MsoNormal><span lang=en-DE>SomaSegmenter allows neuronal soma | ||
2918 | segmentation in fluorescence microscopy imaging datasets using a parametrised | ||
2919 | version of the U-Net segmentation model, with additional features such as | ||
2920 | residual links and tile-based frame reconstruction.</span></p> | ||
2921 | |||
2922 | <h2></h2> | ||
2923 | |||
2924 | <h2><a name="_Toc138932378"><span lang=en-DE>SpiNNaker</span></a></h2> | ||
2925 | |||
2926 | <p class=MsoNormal><span lang=en-DE>SpiNNaker is a neuromorphic computer with | ||
2927 | over a million low-power, small-memory ARM cores arranged in chips, connected | ||
2928 | by a unique brain-like mesh network and designed to simulate networks of | ||
2929 | spiking point neurons. Software is provided to compile networks | ||
2930 | described with PyNN into running simulations, to extract and convert | ||
2931 | results into the Neo data format, and to support live | ||
2932 | interaction with running simulations. This allows integration with external | ||
2933 | devices such as real or virtual robotics as well as live simulation | ||
2934 | visualisation. Scripts can be written and executed using Jupyter for | ||
2935 | interactive access.</span></p> | ||
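| | ||
| <p class=MsoNormal><span lang=en-DE>As a rough illustration of the PyNN front end described | ||
| above, the sketch below builds and runs a small network using the sPyNNaker backend import; | ||
| it is a generic PyNN script for illustration rather than a verified SpiNNaker job, and the | ||
| population sizes and parameters are arbitrary.</span></p> | ||
| | ||
| <pre> | ||
| # Sketch of a PyNN network compiled for SpiNNaker via the sPyNNaker backend. | ||
| import pyNN.spiNNaker as sim | ||
| | ||
| sim.setup(timestep=1.0)  # ms | ||
| | ||
| # A Poisson source driving a small population of integrate-and-fire neurons. | ||
| stimulus = sim.Population(100, sim.SpikeSourcePoisson(rate=10.0)) | ||
| neurons = sim.Population(100, sim.IF_curr_exp()) | ||
| sim.Projection(stimulus, neurons, sim.OneToOneConnector(), | ||
|                synapse_type=sim.StaticSynapse(weight=0.5, delay=1.0)) | ||
| | ||
| neurons.record(["spikes"]) | ||
| sim.run(1000)  # ms | ||
| | ||
| # Results come back as a Neo Block, as described above. | ||
| block = neurons.get_data("spikes") | ||
| print(block.segments[0].spiketrains[:3]) | ||
| sim.end() | ||
| </pre> | ||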
2936 | |||
2937 | <h2></h2> | ||
2938 | |||
2939 | <h2><a name="_Toc138932379"><span lang=en-DE>SSB toolkit</span></a></h2> | ||
2940 | |||
2941 | <p class=MsoNormal><span lang=en-DE>The SSB toolkit is an open-source Python | ||
2942 | library to simulate mathematical models of the signal transduction pathways of | ||
2943 | G-protein coupled receptors (GPCRs). By merging structural macromolecular data | ||
2944 | with systems biology simulations, the framework allows simulation of the signal | ||
2945 | transduction kinetics induced by ligand-GPCR interactions, as well as the consequent | ||
2946 | changes in concentration of signalling molecular species, as a function of time | ||
2947 | and ligand concentration. The tool therefore makes it possible to | ||
2948 | investigate the subcellular effects of ligand binding upon receptor activation, | ||
2949 | deepening the understanding of the relationship between the molecular level of | ||
2950 | ligand-target interactions and higher-level cellular and physiological or | ||
2951 | pathological response mechanisms.</span></p> | ||
2952 | |||
2953 | <h2></h2> | ||
2954 | |||
2955 | <h2><a name="_Toc138932380"><span lang=en-DE>Subcellular model building and | ||
2956 | calibration tool set</span></a></h2> | ||
2957 | |||
2958 | <p class=MsoNormal><span lang=en-DE>The toolset includes interoperable modules | ||
2959 | for model building, calibration (parameter estimation) and model analysis. All | ||
2960 | information needed to perform these tasks (models, experimental calibration | ||
2961 | data and prior assumptions on parameter distributions) is stored in a | ||
2962 | structured, human- and machine-readable file format based on SBtab. The toolset | ||
2963 | enables simulations of the same model in simulators with different | ||
2964 | characteristics, e.g., STEPS, NEURON, MATLAB's SimBiology and R, via automatic | ||
2965 | code generation. Parameter estimation can include uncertainty | ||
2966 | quantification and is done by optimisation or Bayesian approaches. Model | ||
2967 | analysis includes global sensitivity analysis and functionality for analysing | ||
2968 | thermodynamic constraints and conserved moieties.</span></p> | ||
2969 | |||
2970 | <h2></h2> | ||
2971 | |||
2972 | <h2><a name="_Toc138932381"><span lang=en-DE>Synaptic Events Fitting</span></a></h2> | ||
2973 | |||
2974 | <p class=MsoNormal><span lang=en-DE>Synaptic Events Fitting is a web | ||
2975 | application, implemented in a Jupyter Notebook on EBRAINS, that allows users to | ||
2976 | fit synaptic events using data and models from the EBRAINS Knowledge Graph | ||
2977 | (KG). Users select, download and visualise experimental data from the KG and | ||
2978 | then choose the data to be fitted. A mod file is then selected (local or default) | ||
2979 | together with the corresponding configuration file (including the protocol and the | ||
2980 | names of the parameters to be fitted, their initial values and allowed variation | ||
2981 | ranges, exclusion rules and an optional set of dependencies). The fitting procedure | ||
2982 | can run on the Neuroscience Gateway. The fitting results can then be fetched from the | ||
2983 | storage of the HPC system to the storage of the Collab, where the optimised parameters can be analysed.</span></p> | ||
2984 | |||
2985 | <h2></h2> | ||
2986 | |||
2987 | <h2><a name="_Toc138932382"><span lang=en-DE>Synaptic Plasticity Explorer</span></a></h2> | ||
2988 | |||
2989 | <p class=MsoNormal><span lang=en-DE>The Synaptic Plasticity Explorer is a web | ||
2990 | application, implemented via a Jupyter Notebook on EBRAINS, which allows users to | ||
2991 | configure and test, through an intuitive GUI, different synaptic plasticity | ||
2992 | models and protocols on optimised single-cell models available in the EBRAINS | ||
2993 | Model Catalog. It consists of two tabs: 'Config', where the user specifies | ||
2994 | the plasticity model to use and the synaptic parameters, and 'Sim', where the | ||
2995 | recording location, the evolution of the synaptic weight and the number of | ||
2996 | simulations to run are defined. The results are plotted at the end of the | ||
2997 | simulation and the traces are available for download.</span></p> | ||
2998 | |||
2999 | <h2></h2> | ||
3000 | |||
3001 | <h2><a name="_Toc138932383"><span lang=en-DE>Synaptic proteome database | ||
3002 | (SQLite)</span></a></h2> | ||
3003 | |||
3004 | <p class=MsoNormal><span lang=en-DE>Integration of 57 published synaptic | ||
3005 | proteomic datasets reveals a stunningly complex picture involving over 7000 | ||
3006 | proteins. Molecular complexes were reconstructed using evidence-based | ||
3007 | protein-protein interaction data available from public databases. The | ||
3008 | constructed molecular interaction network model is embedded into an SQLite | ||
3009 | implementation, allowing queries that generate custom network models based on | ||
3010 | meta-data including species, synaptic compartment, brain region, and method of | ||
3011 | extraction.</span></p> | ||
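| | ||
| <p class=MsoNormal><span lang=en-DE>Because the database ships as a single SQLite file, it | ||
| can be queried from any language with SQLite bindings. The sketch below uses Python's | ||
| standard sqlite3 module; the file name and the table and column names are placeholders | ||
| chosen for this example and do not reflect the actual schema.</span></p> | ||
| | ||
| <pre> | ||
| # Hypothetical sketch: querying the synaptic proteome SQLite file from Python. | ||
| # File, table and column names below are placeholders, not the real schema. | ||
| import sqlite3 | ||
| | ||
| conn = sqlite3.connect("synaptic_proteome.sqlite")  # path to the downloaded database | ||
| cursor = conn.execute( | ||
|     """ | ||
|     SELECT p.gene_name, d.brain_region, d.compartment   -- placeholder columns | ||
|     FROM proteins AS p JOIN dataset_annotations AS d    -- placeholder tables | ||
|       ON p.protein_id = d.protein_id | ||
|     WHERE d.compartment = ? | ||
|     """, | ||
|     ("postsynaptic density",), | ||
| ) | ||
| for gene, region, compartment in cursor.fetchmany(10): | ||
|     print(gene, region, compartment) | ||
| conn.close() | ||
| </pre> | ||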
3012 | |||
3013 | <h2></h2> | ||
3014 | |||
3015 | <h2><a name="_Toc138932384"><span lang=en-DE>Synaptome.db</span></a></h2> | ||
3016 | |||
3017 | <p class=MsoNormal><span lang=en-DE>The Synaptome.db Bioconductor package | ||
3018 | contains a local copy of the Synaptic proteome database. On top of this, it | ||
3019 | provides a set of utility R functions to query and analyse its content. It | ||
3020 | allows extraction of information for specific genes and construction of the | ||
3021 | protein-protein interaction graph for gene sets, synaptic compartments and | ||
3022 | brain regions.</span></p> | ||
3023 | |||
3024 | <h2></h2> | ||
3025 | |||
3026 | <h2><a name="_Toc138932385"><span lang=en-DE>Tide</span></a></h2> | ||
3027 | |||
3028 | <p class=MsoNormal><span lang=en-DE>BlueBrain's Tide provides multi-window, | ||
3029 | multi-user touch interaction on large surfaces: think of a giant collaborative | ||
3030 | wall-mounted tablet. Tide is a distributed application that can run on multiple | ||
3031 | machines to power display walls or projection systems of any size. Its user interface | ||
3032 | is designed to offer an intuitive experience on touch walls. It works just as | ||
3033 | well on non-touch-capable installations by using its web interface from any web | ||
3034 | browser.</span></p> | ||
3035 | |||
3036 | <h2></h2> | ||
3037 | |||
3038 | <h2><a name="_Toc138932386"><span lang=en-DE>TVB EBRAINS</span></a></h2> | ||
3039 | |||
3040 | <p class=MsoNormal><span lang=en-DE>TVB EBRAINS is the principal full brain | ||
3041 | network simulation engine in EBRAINS and covers every aspect of realising | ||
3042 | personalised whole-brain simulations on the EBRAINS platform. It consists of | ||
3043 | the simulation tools and adaptors connecting the data, atlas and computing | ||
3044 | services to the rest of the TVB ecosystem and Cloud services available in | ||
3045 | EBRAINS. As such it allows the user to find and fetch relevant datasets through | ||
3046 | the EBRAINS Knowledge Graph and Atlas services, construct the personalised TVB | ||
3047 | models and use the HPC systems to perform parameter exploration, optimisation and | ||
3048 | inference studies. The user can orchestrate the workflow from the JupyterLab | ||
3049 | interactive computing environment of the EBRAINS Collaboratory or use the | ||
3050 | dedicated web application of TVB.</span></p> | ||
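| | ||
| <p class=MsoNormal><span lang=en-DE>A minimal sketch of a whole-brain simulation with the | ||
| open-source TVB Python library that underlies the service is shown below; module paths | ||
| follow tvb-library conventions, the bundled default connectome is used, and the parameter | ||
| values are arbitrary.</span></p> | ||
| | ||
| <pre> | ||
| # Sketch of a minimal region-level brain network simulation with the TVB library. | ||
| from tvb.simulator.simulator import Simulator | ||
| from tvb.simulator import models, coupling, integrators, monitors | ||
| from tvb.datatypes import connectivity | ||
| | ||
| sim = Simulator( | ||
|     connectivity=connectivity.Connectivity.from_file(),  # bundled default connectome | ||
|     model=models.Generic2dOscillator(), | ||
|     coupling=coupling.Linear(), | ||
|     integrator=integrators.HeunDeterministic(dt=0.1),    # ms | ||
|     monitors=(monitors.TemporalAverage(period=1.0),), | ||
| ) | ||
| sim.configure() | ||
| | ||
| (time, data), = sim.run(simulation_length=250.0)  # ms | ||
| print(time.shape, data.shape) | ||
| </pre> | ||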
3051 | |||
3052 | <h2></h2> | ||
3053 | |||
3054 | <h2><a name="_Toc138932387"><span lang=en-DE>TVB Image Processing Pipeline</span></a></h2> | ||
3055 | |||
3056 | <p class=MsoNormal><span lang=en-DE>TVB Image Processing Pipeline takes multimodal | ||
3057 | MRI data sets (anatomical, functional and diffusion-weighted MRI) as input and | ||
3058 | generates structural connectomes, region-average fMRI time series, functional | ||
3059 | connectomes, brain surfaces, electrode positions, lead field matrices and atlas | ||
3060 | parcellations as output. The pipeline performs preprocessing and | ||
3061 | distortion-correction on MRI data as well as white matter fibre bundle | ||
3062 | tractography on diffusion data. Outputs are formatted according to two data | ||
3063 | standards: a TVB-ready data set that can be directly used to simulate brain | ||
3064 | network models and the same output in BIDS format.</span></p> | ||
3065 | |||
3066 | <h2></h2> | ||
3067 | |||
3068 | <h2><a name="_Toc138932388"><span lang=en-DE>TVB Inversion</span></a></h2> | ||
3069 | |||
3070 | <p class=MsoNormal><span lang=en-DE>The TVB Inversion package implements the | ||
3071 | machinery required to perform parameter exploration and inference over | ||
3072 | parameters of The Virtual Brain simulator. It implements Simulation-Based | ||
3073 | Inference (SBI), which is a Bayesian inference method for complex models where | ||
3074 | calculation of the likelihood function is either analytically or | ||
3075 | computationally intractable. As such, it allows the user to formulate with | ||
3076 | great expressive power both the model and the inference scenario in terms of | ||
3077 | observed data features and model parameters. Part of the integration with TVB | ||
3078 | entails the option to perform numerous simulations in parallel, which can be | ||
3079 | used for parameter space exploration.</span></p> | ||
3080 | |||
3081 | <h2></h2> | ||
3082 | |||
3083 | <h2><a name="_Toc138932389"><span lang=en-DE>TVB Web App</span></a></h2> | ||
3084 | |||
3085 | <p class=MsoNormal><span lang=en-DE>TVB Web App provides The Virtual Brain | ||
3086 | Simulator as an EBRAINS Cloud Service with an HPC back-end. Scientists can run | ||
3087 | intense personalised brain simulations without having to deploy software. Users | ||
3088 | can access the service with their EBRAINS credentials (single sign-on). TVB Web | ||
3089 | App uses private/public key cryptography, sandboxing, and access control to | ||
3090 | protect personalised health information contained in digital human brain twins | ||
3091 | while being processed on HPC. Users can upload their connectomes or use | ||
3092 | TVB-ready image-derived data discoverable via the EBRAINS Knowledge Graph. | ||
3093 | Users can also use containerised processing workflows available on EBRAINS to | ||
3094 | render image raw data into simulation-ready formats.</span></p> | ||
3095 | |||
3096 | <h2></h2> | ||
3097 | |||
3098 | <h2><a name="_Toc138932390"><span lang=en-DE>TVB Widgets</span></a></h2> | ||
3099 | |||
3100 | <p class=MsoNormal><span lang=en-DE>To support the usability of | ||
3101 | EBRAINS workflows, TVB-widgets has been developed as a set of modular graphical | ||
3102 | components and software solutions that are easy to use in the Collaboratory within | ||
3103 | JupyterLab. These GUI components are open source, supporting | ||
3104 | open neuroscience, and provide features such as: setup of models and | ||
3105 | region-specific or cohort simulations; selection of data sources and their | ||
3106 | links to models; querying of data from siibra and the EBRAINS Knowledge Graph; | ||
3107 | deployment and monitoring of jobs on HPC resources; analysis and visualisation; | ||
3108 | and a visual workflow builder for configuring and launching TVB simulations.</span></p> | ||
3109 | |||
3110 | <h2></h2> | ||
3111 | |||
3112 | <h2><a name="_Toc138932391"><span lang=en-DE>TVB-Multiscale</span></a></h2> | ||
3113 | |||
3114 | <p class=MsoNormal><span lang=en-DE>TVB-Multiscale is a Python toolbox aimed at | ||
3115 | facilitating the configuration of multiscale brain models and their | ||
3116 | co-simulation with TVB and spiking network simulators (currently NEST, | ||
3117 | NetPyNE (NEURON) and ANNarchy). A multiscale brain model consists of a full | ||
3118 | brain model formulated at the coarse scale of networks of tens up to thousands | ||
3119 | of brain regions, and an additional model of networks of spiking neurons | ||
3120 | describing selected brain regions at a finer scale. The toolbox has a | ||
3121 | user-friendly interface for configuring different kinds of models for | ||
3122 | transforming and exchanging data between the two scales during co-simulation.</span></p> | ||
3123 | |||
3124 | <h2></h2> | ||
3125 | |||
3126 | <h2><a name="_Toc138932392"><span lang=en-DE>VIOLA</span></a></h2> | ||
3127 | |||
3128 | <p class=MsoNormal><span lang=en-DE>VIOLA is an interactive, web-based tool to | ||
3129 | visualise activity data in multiple 2D layers such as the simulation output of | ||
3130 | neuronal networks with 2D geometry. As a reference implementation of a | ||
3131 | set of interactive visualisation concepts, the tool combines and | ||
3132 | adapts modern interactive visualisation paradigms, such as coordinated multiple | ||
3133 | views, for massively parallel neurophysiological data. The software allows for | ||
3134 | an explorative and qualitative assessment of the spatiotemporal features of | ||
3135 | neuronal activity, which can be performed prior to a detailed quantitative data | ||
3136 | analysis of specific aspects of the data.</span></p> | ||
3137 | |||
3138 | <h2></h2> | ||
3139 | |||
3140 | <h2><a name="_Toc138932393"><span lang=en-DE>Vishnu 1.0</span></a></h2> | ||
3141 | |||
3142 | <p class=MsoNormal><span lang=en-DE>DC Explorer, Pyramidal Explorer and Clint | ||
3143 | Explorer are the core of an application suite designed to help scientists | ||
3144 | explore their data. Vishnu 1.0 is a communication framework that allows them to | ||
3145 | exchange information and cooperate in real time. It provides a unique access | ||
3146 | point to the three applications and manages a database with the users' | ||
3147 | datasets. Vishnu was originally designed to integrate data for Espina.</span></p> | ||
3149 | |||
3150 | <h1><span lang=en-DE>Whole-brain-scale tools</span></h1> | ||
3151 | |||
3152 | <h2><a name="_Toc138932394"><span lang=en-DE>ViSimpl</span></a></h2> | ||
3153 | |||
3154 | <p class=MsoNormal><span lang=en-DE>ViSimpl integrates a set of visualisation | ||
3155 | and interaction components that provide a semantic view of brain data with the | ||
3156 | aim of improving its analysis procedures. ViSimpl provides 3D particle-based | ||
3157 | rendering that visualises simulation data with their associated spatial and | ||
3158 | temporal information, enhancing the knowledge extraction process. It also | ||
3159 | provides abstract representations of the time-varying magnitudes, supporting | ||
3160 | different data aggregation and disaggregation operations and giving focus and | ||
3161 | context clues. In addition, ViSimpl provides synchronised playback control of | ||
3162 | the simulation being analysed.</span></p> | ||
3163 | |||
3164 | <h2></h2> | ||
3165 | |||
3166 | <h2><a name="_Toc138932395"><span lang=en-DE>VisuAlign</span></a></h2> | ||
3167 | |||
3168 | <p class=MsoNormal><span lang=en-DE>VisuAlign is a tool for user-guided | ||
3169 | nonlinear registration of 2D experimental image data, typically | ||
3170 | high-resolution microscopic images, to 3D atlas reference space after QuickNII, | ||
3171 | facilitating data integration through standardised coordinate systems. Key features: | ||
3172 | generation of user-defined cut planes through the atlas templates, matching the | ||
3173 | orientation of the cut plane of the 2D experimental image data, as a first step | ||
3174 | towards anchoring images to the relevant atlas template; and propagation of spatial | ||
3175 | transformations across series of sections following anchoring of selected | ||
3176 | images.</span></p> | ||
3177 | |||
3178 | <h2></h2> | ||
3179 | |||
3180 | <h2><a name="_Toc138932396"><span lang=en-DE>VMetaFlow</span></a></h2> | ||
3181 | |||
3182 | <p class=MsoNormal><span lang=en-DE>VMetaFlow is an abstraction layer placed | ||
3183 | over existing visual grammars and visualisation declarative languages, | ||
3184 | providing them with interoperability mechanisms. The main contribution of this | ||
3185 | research is to provide a user-friendly system to design visualisation and data | ||
3186 | processing operations that can be interconnected to form data analysis | ||
3187 | workflows. Visualisations and data processes can be saved as cards. Cards and | ||
3188 | workflows can be saved, distributed and reused between users.</span></p> | ||
3189 | |||
3190 | <h2></h2> | ||
3191 | |||
3192 | <h2><a name="_Toc138932397"><span lang=en-DE>Voluba</span></a></h2> | ||
3193 | |||
3194 | <p class=MsoNormal><span lang=en-DE>A common problem in high-resolution brain | ||
3195 | atlasing is spatial anchoring of volumes of interest from imaging experiments | ||
3196 | into the detailed anatomical context of an ultrahigh-resolution reference model | ||
3197 | like BigBrain. The interactive volumetric alignment tool voluba is implemented | ||
3198 | as a web service and allows anchoring of volumetric image data to reference | ||
3199 | volumes at microscopic spatial resolutions. It enables interactive | ||
3200 | manipulation of image position, scale, and orientation, flipping of coordinate | ||
3201 | axes, and entering of anatomical point landmarks in 3D. The resulting | ||
3202 | transformation parameters can, amongst others, be downloaded or used to view | ||
3203 | the anchored image volume in the interactive atlas viewer siibra-explorer.</span></p> | ||
3204 | |||
3205 | <h2></h2> | ||
3206 | |||
3207 | <h2><a name="_Toc138932398"><span lang=en-DE>WebAlign</span></a></h2> | ||
3208 | |||
3209 | <p class=MsoNormal><span lang=en-DE>WebAlign is the web version of QuickNII. | ||
3210 | Presently, it is available as a community app in the Collaboratory. Features | ||
3211 | include: Spatial registration of sectional image data. Generation of customised | ||
3212 | atlas maps for your sectional image data.</span></p> | ||
3213 | |||
3214 | <h2></h2> | ||
3215 | |||
3216 | <h2><a name="_Toc138932399"><span lang=en-DE>Webilastik</span></a></h2> | ||
3217 | |||
3218 | <p class=MsoNormal><span lang=en-DE>webilastik brings the popular machine | ||
3219 | learning-based image analysis tool ilastik from the desktop into the browser. | ||
3220 | Users can perform semantic segmentation tasks on their data in the cloud. | ||
3221 | webilastik runs computations on federated EBRAINS HPC resources and uses | ||
3222 | EBRAINS infrastructure for data access and storage. webilastik makes machine | ||
3223 | learning-based image analysis workflows accessible to users without deep | ||
3224 | knowledge of image analysis and machine learning. webilastik is part of the | ||
3225 | QUINT workflow for extraction, quantification and analysis of features from | ||
3226 | rodent histological images.</span></p> | ||
3227 | |||
3228 | <h2></h2> | ||
3229 | |||
3230 | <h2><a name="_Toc138932400"><span lang=en-DE>WebWarp</span></a></h2> | ||
3231 | |||
3232 | <p class=MsoNormal><span lang=en-DE>WebWarp is the web version of VisuAlign. | ||
3233 | Presently, it is available as a community app in the Collaboratory. Features | ||
3234 | include: nonlinear refinement of atlas registrations of sectional image data | ||
3235 | created with WebAlign, and generation of customised atlas maps for your sectional image data.</span></p> | ||
3236 | |||
3237 | <h2></h2> | ||
3238 | |||
3239 | <h2><a name="_Toc138932401"><span lang=en-DE>ZetaStitcher</span></a></h2> | ||
3240 | |||
3241 | <p class=MsoNormal><span lang=en-DE>ZetaStitcher is a Python package designed | ||
3242 | to stitch large volumetric images, such as those produced by light-sheet | ||
3243 | fluorescence microscopes. It is able to quickly compute the optimal alignment | ||
3244 | of large mosaics of tiles by sampling along the tile depth, i.e., pairwise | ||
3245 | alignment is computed only at certain depths along the thickness of the tile. | ||
3246 | This greatly reduces the amount of data that needs to be read and transferred, | ||
3247 | thus making the process much faster. ZetaStitcher comes with an API that can be | ||
3248 | used to programmatically access the aligned volume in a virtual fashion, as if | ||
3249 | it were a big NumPy array, without having to produce the fused 3D image of the | ||
3250 | whole sample.</span></p> | ||
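| | ||
| <p class=MsoNormal><span lang=en-DE>A brief sketch of the virtual-volume access described | ||
| above, following the VirtualFusedVolume usage pattern from the ZetaStitcher documentation; | ||
| the stitch.yml file name is an assumption and the indexing is purely illustrative.</span></p> | ||
| | ||
| <pre> | ||
| # Sketch of virtual access to a stitched volume with ZetaStitcher's API. | ||
| # The YAML file name is an assumption from the stitching step. | ||
| from zetastitcher import VirtualFusedVolume | ||
| | ||
| vfv = VirtualFusedVolume("stitch.yml")  # alignment produced by the stitch step | ||
| print(vfv.shape)                         # full mosaic shape, no fused file on disk | ||
| | ||
| # Read a single plane lazily, as if indexing a big NumPy array. | ||
| plane = vfv[100, ...] | ||
| print(plane.shape, plane.dtype) | ||
| </pre> | ||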
3252 | |||
3253 | <h1><span lang=en-DE>Cellular- and subcellular-scale tools</span></h1> | ||
3254 | |||
3255 | <h2><a name="_Toc138932402"><span lang=en-DE>TauRAMD</span></a></h2> | ||
3256 | |||
3257 | <p class=MsoNormal><span lang=en-DE>The TauRAMD technique makes use of RAMD | ||
3258 | simulations to compute relative residence times (or dissociation rates) of | ||
3259 | protein-ligand complexes. In the RAMD method, the egress of a molecule from a | ||
3260 | target receptor is accelerated by the application of an adaptive randomly | ||
3261 | oriented force on the ligand. This enables ligand egress events to be observed | ||
3262 | in short, nanosecond timescale simulations without imposing any bias regarding | ||
3263 | the ligand egress route taken. If coupled to the MD-IFP tool, the TauRAMD | ||
3264 | method can be used to investigate dissociation mechanisms and characterize | ||
3265 | transition states.</span></p> | ||
3266 | |||
3267 | </div> | ||
3268 | |||
3269 | </body> | ||
3270 | |||
3271 | </html> | ||
3272 | |||
3273 | {{/html}} |