Changes for page data-curation-copy

Last modified by eapapp on 2023/07/04 16:46

From version 191.1
edited by eapapp
on 2023/06/08 21:33
Change comment: There is no comment for this version
To version 195.1
edited by adavison
on 2023/06/14 15:35
Change comment: There is no comment for this version

Summary

Details

Page properties
Author
... ... @@ -1,1 +1,1 @@
1 -XWiki.eapapp
1 +XWiki.adavison
Content
... ... @@ -211,11 +211,92 @@
211 211  === Step by Step - Models ===
212 212  
213 213  
214 -~1. Request curation using the [[Curation request form>>https://nettskjema.no/a/277393#/]]. You will be contacted by a curator with more information.
215 215  
216 216  
217 -//Additional information will be added soon.//
216 +==== 1. Start early ====
218 218  
218 +You do not need to wait until you are ready to publish before registering your model with EBRAINS.
219 +
220 +By registering a model early in your project, you can take advantage of EBRAINS tools
221 +to keep track of simulations and to share them with your collaborators.
222 +
223 +==== 2. Create/choose a Collab workspace ====
224 +
225 +We use EBRAINS Collaboratory "collab" workspaces to help manage the model curation process.
226 +
227 +In particular, we use collab membership (the "Team") to control who can view or edit your model metadata prior to publication.
228 +
229 +It is up to you whether you create a new collab for each model, or reuse an existing collab
230 +(it is no problem to have multiple models associated with a single collab).
231 +
232 +Collabs are also useful for storing simulation results, adding documentation for your model,
233 +and/or providing tutorials in Jupyter notebooks.
234 +
235 +==== 3. Upload code ====
236 +
237 +We recommend storing model code and/or configuration files in an online Git repository, for example on GitHub.
238 +This repository should be public when you publish the model, but a private repository can be used for model development.
239 +
240 +Alternatively, you can upload code to the Collab Drive or Bucket storage.
241 +
242 +==== 4. Submit metadata ====
243 +
244 +We recommend submitting metadata using the Model Catalog app, installed in your collab.
245 +
246 +To install it:
247 +
248 +1. Click the "+ Create" button.
249 +1. In the "Create Page" form, add a title such as "Model Catalog", select "Community App", then click "Create".
250 +1. Scroll down until you find the "Model Catalog" app, click "Select", then "Save & View".
251 +
252 +You will then see a table of all the models and validation tests associated with this collab.
253 +If this is your first time using the app, the table will probably be empty.
254 +To add your model, click "+", fill in the form, then click "Add model".
255 +
256 +As development of your model proceeds, you can easily register new versions of the code
257 +and new parameterizations by clicking "Add new version".
258 +
259 +If you prefer not to use the app, you can instead fill in the [[Curation request form>>https://nettskjema.no/a/277393#/]],
260 +and you will be contacted by e-mail with further instructions.
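Before filling in the form or the Model Catalog app, it can help to draft the model entry as a simple structure. The sketch below is purely illustrative: the field names and values approximate the information the Model Catalog asks for, but they are assumptions, not an official schema.

```python
import json

# Illustrative draft of a Model Catalog entry.
# Field names are assumptions, not an official schema; the repository
# URL and all values are hypothetical examples.
model_entry = {
    "name": "Hippocampus CA1 pyramidal cell model",
    "alias": "ca1-pyramidal-example",   # short, unique identifier
    "description": "Example description of the model's purpose and use.",
    "species": "Rattus norvegicus",
    "model_scope": "single cell",
    "code_repository": "https://github.com/example/ca1-model",
    "version": "v1.0",                  # should match a Git tag or release
}

print(json.dumps(model_entry, indent=2))
```

Drafting the entry this way makes it easy to keep the same metadata consistent across the form, the app, and any later versions.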
261 +
262 +==== 5. Provide a reference dataset ====
264 +
265 +Once you're ready to publish your model entry in the EBRAINS Knowledge Graph,
266 +we encourage you to provide a dataset containing the simulation results produced by your model,
267 +following the process under "Step by step - Data" above.
268 +
269 +These reference data will be linked to the model, and will be helpful to anyone trying to
270 +reuse your model.
271 +
272 +We will soon introduce a "Reproducible" badge for all models that include a reference dataset,
273 +and whose simulation results can be reproduced by an EBRAINS curator.
274 +
275 +==== 6. Request publication, preview and publish ====
276 +
277 +Until you request your model entry to be published in the EBRAINS Knowledge Graph,
278 +only members of the collab will be able to view the model entry, in the Model Catalog app
279 +or using the Model Validation Python client.
280 +
281 +After publication, the model will appear in the [[EBRAINS public search results>>https://search.kg.ebrains.eu/?category=Model||rel=" noopener noreferrer" target="_blank"]], and will receive a DOI.
282 +
283 +To request publication, [[contact EBRAINS support>>https://ebrains.eu/support||rel=" noopener noreferrer" target="_blank"]], providing the collab name and the model name or ID.
284 +
285 +
286 +Curators will then perform a number of checks:
287 +
288 +1. Does the model description provide sufficient context to understand the purpose and use of the model?
289 +1. Does the code repository contain a licence file, explaining the conditions for reusing the code?
290 +1. Does the model have a clearly defined version identifier (e.g. v1.0)? For models in a Git repository, the version identifier should match the name of a tag or release.
291 +
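As a quick self-check for the last point, a tag-style version identifier can be distinguished from an ambiguous label with a simple pattern test. This is a rough sketch, not an official EBRAINS validation rule:

```python
import re

# Rough heuristic (not an official EBRAINS check): a version identifier
# such as "v1.0" or "2.3.1" maps cleanly onto a Git tag or release name,
# whereas labels like "latest" or "final" do not.
VERSION_PATTERN = re.compile(r"^v?\d+(\.\d+)+$")

def is_clear_version(identifier: str) -> bool:
    """Return True if the identifier looks like a tag-style version."""
    return bool(VERSION_PATTERN.match(identifier))

print(is_clear_version("v1.0"))    # True
print(is_clear_version("latest"))  # False
```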
292 +The curators will also take a snapshot of your model code.
293 +
294 +* For models in public Git repositories, we archive a copy of the repository in Software Heritage.
295 +* For models in a collab Bucket or Drive, we make a read-only copy of the code in a public container in the EBRAINS repository.
296 +
297 +Once this is done, you will be invited to review a preview of how the model entry will appear in the KG Search,
298 +and will have the opportunity to request modifications prior to approval and publication.
299 +
219 219  ----
220 220  
221 221  === Step by Step - Software ===
... ... @@ -244,6 +244,11 @@
244 244  
245 245  == **The openMINDS metadata framework** ==
246 246  
328 +(% class="box floatinginfobox" %)
329 +(((
330 +[[[[image:https://github.com/HumanBrainProject/openMINDS/raw/main/img/light_openMINDS-logo.png||alt="openMINDS logo" height="87" width="164"]]>>https://github.com/HumanBrainProject/openMINDS]]
331 +)))
332 +
247 247  openMINDS (open Metadata Initiative for Neuroscience Data Structures) is a community-driven, open-source metadata framework for graph database systems, such as the EBRAINS Knowledge Graph. It is composed of linked metadata models, libraries of serviceable metadata instances, and supportive tooling ([[openMINDS Python>>url:https://pypi.org/project/openMINDS/]], openMINDS Matlab). For exploring the openMINDS schemas, go to the [[HTML documentation>>url:https://humanbrainproject.github.io/openMINDS/]]. For a full overview of the framework, go to [[the openMINDS collab>>url:https://wiki.ebrains.eu/bin/view/Collabs/openminds/]] or the [[GitHub repository>>https://github.com/HumanBrainProject/openMINDS]].
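To give a feel for what an openMINDS metadata instance looks like, here is a minimal, hand-written JSON-LD sketch. The type and property names follow the published openMINDS naming conventions, but verify them against the HTML documentation before use; the person named is a hypothetical example.

```python
import json

# Minimal hand-written sketch of an openMINDS-style JSON-LD instance.
# Type IRI and property names follow openMINDS conventions but should be
# checked against the official schema documentation; the data is invented.
person = {
    "@context": {"@vocab": "https://openminds.ebrains.eu/vocab/"},
    "@type": "https://openminds.ebrains.eu/core/Person",
    "givenName": "Jane",
    "familyName": "Doe",
}

print(json.dumps(person, indent=2))
```

In practice such instances are generated and validated with the openMINDS tooling rather than written by hand.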
248 248  
249 249  For feedback, requests, or contributions, please get in touch with the openMINDS development team via
... ... @@ -350,12 +350,16 @@
350 350  
351 351  Below is a list of additional services that data, models or software shared via EBRAINS can benefit from. EBRAINS is continuously looking to increase the number of interoperable services.
352 352  
353 -(% cellpadding="15" style="margin-right:auto" %)
354 -|(% colspan="2" %)**Viewer for 2D images**|(% colspan="2" rowspan="1" style="white-space:nowrap; width:265px" %)**Viewer for sequential atlas-registered 2D images with annotation options**
355 -|(% style="white-space:nowrap; width:320px" %)[[image:MIO_screenshot.PNG||alt="MIO viewer" height="202" style="float:left" width="250"]]|(% style="width:450px" %)Integrate image data with //the Mio viewer//: EBRAINS Multi-Image OpenSeadragon viewer provides an intuitive way of navigating high-resolution 2D image series. It has browser-based classic pan and zoom capabilities. A collection can be displayed as a filmstrip (Filmstrip Mode) or as a table (Collection Mode) with adjustable number of row and columns. See [[Mio viewer links available for this dataset>>https://search.kg.ebrains.eu/?category=Dataset&q=nr2f1#9677359c-73fa-4425-b8fa-3de794e9017a]] as an example. MioViewer user manual is found [[here>>https://multi-image-osd.readthedocs.io/en/latest/index.html]].|(% style="white-space:nowrap; width:283px" %)**[[image:LZ_screenshot.PNG||alt="LocaliZoom viewer" height="208" style="float:left" width="250"]]**|(% style="width:435px" %)Integrate atlas-registered 2D image data with //the LocaliZoom viewer//: The EBRAINS LocaliZoom serial section viewer displays series of registered 2D section images with atlas overlay, allowing the users to zoom into high-resolution images and have information about the brain regions. See the [[LocaliZoom links available for this dataset>>https://doi.org/10.25493/T686-7BX]] as an example. LocaliZoom user manual is found [[here>>https://localizoom.readthedocs.io/en/latest/index.html]].
356 -|(% colspan="2" rowspan="1" %)**Use your research product in an interactive publication**|(% colspan="2" rowspan="1" style="white-space:nowrap; width:265px" %)**Interactive 3D atlas viewer with options for data visualization**
357 -|[[image:LivePaper_screenshot.PNG||alt="LivePaper" height="284" style="float:left" width="250"]]|Add your data, models or software to a// Live paper. //Read more about [[Live papers on ebrains.eu>>https://www.ebrains.eu/data/live-papers/live-papers]].|(% style="width:283px" %)[[image:3Datlas_screenshot.PNG||alt="Siibra explorer" height="170" style="float:left" width="250"]]|(% style="width:435px" %)Upload your data to the //Siibra-explorer//: The siibra-explorer is used for visualizing volumetric brain data in all the brain atlases provided by EBRAINS (Human, Monkey, Rat and Mouse). The siibra-explorer viewer uses siibra-api to enable navigation of brain region hierarchies, maps in different coordinate spaces, and linked regional data features. Furthermore, it is connected with the siibra toolsuite providing several analytical workflows. To learn more about how to register your data to atlases, read about the [[Atlas services on ebrains.eu>>https://ebrains.eu/services/atlases#Integratedatatoanatlas]].
358 358  
440 +|(% colspan="2" %)**Viewer for 2D images**
441 +|[[image:MIO_screenshot.PNG]]|Integrate image data with //the Mio viewer//: the EBRAINS Multi-Image OpenSeadragon viewer provides an intuitive way of navigating high-resolution 2D image series, with browser-based pan and zoom. A collection can be displayed as a filmstrip (Filmstrip Mode) or as a table (Collection Mode) with an adjustable number of rows and columns. See the [[Mio viewer links available for this dataset>>https://search.kg.ebrains.eu/?category=Dataset&q=nr2f1#9677359c-73fa-4425-b8fa-3de794e9017a]] as an example. The Mio viewer user manual is available [[here>>https://multi-image-osd.readthedocs.io/en/latest/index.html]].
442 +|(% colspan="2" %)**Viewer for sequential atlas-registered 2D images with annotation options**
443 +|[[image:LZ_screenshot.PNG]]|Integrate atlas-registered 2D image data with //the LocaliZoom viewer//: the EBRAINS LocaliZoom serial section viewer displays series of registered 2D section images with atlas overlay, allowing users to zoom into high-resolution images and view information about the brain regions. See the [[LocaliZoom links available for this dataset>>https://doi.org/10.25493/T686-7BX]] as an example. The LocaliZoom user manual is available [[here>>https://localizoom.readthedocs.io/en/latest/index.html]].
444 +|(% colspan="2" %)**Interactive 3D atlas viewer with options for data visualization**
445 +|[[image:3Datlas_screenshot.PNG]]|Upload your data to //the Siibra-explorer//: the siibra-explorer is used for visualizing volumetric brain data in all the brain atlases provided by EBRAINS (Human, Monkey, Rat and Mouse). It uses siibra-api to enable navigation of brain region hierarchies, maps in different coordinate spaces, and linked regional data features. Furthermore, it is connected with the siibra toolsuite, which provides several analytical workflows. To learn more about how to register your data to atlases, read about the [[Atlas services on ebrains.eu>>https://ebrains.eu/services/atlases#Integratedatatoanatlas]].
446 +|(% colspan="2" %)**Use your research product in an interactive publication**
447 +|[[image:LivePaper_screenshot.PNG]]|Add your data, models or software to a //Live paper//. Read more about [[Live papers on ebrains.eu>>https://www.ebrains.eu/data/live-papers/live-papers]].
448 +
359 359  ----
360 360  
361 361  ==== **Add a tutorial or learning resource ** ====