
From version 67.1, edited by robing on 2020/04/30 14:24 (no change comment)
To version 64.1, edited by robing on 2020/04/29 13:47 (no change comment)

Page properties: Content
... ... @@ -75,7 +75,12 @@
75 75  
76 76  * **Run the notebook**
77 77  In the JupyterHub, navigate to //drive/My Libraries/My Library/pipeline/showcase_notebooks/run_snakemake_in_collab.ipynb//, or to wherever you copied the //pipeline// folder (see the sketch after this list).
78 -Follow the notebook to install the required packages into your Python kernel, set the output path, and execute the pipeline with snakemake
78 +Follow the notebook to install the required packages into your Python kernel, set the output path, and execute the pipeline with snakemake.
79 +
80 +* **Coming soon**
81 +** Use of KnowledgeGraph API
82 +** Provenance Tracking
83 +** HPC support
79 79  
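
The exact cell contents depend on your collab setup; as a minimal sketch (assuming a //requirements.txt// file inside the copied //pipeline// folder, which is an assumption rather than something stated on this page), the notebook steps roughly correspond to:

{{code language="python"}}
# Sketch only: paths and file names are placeholders; adapt them to your copy of the pipeline folder.
import subprocess
import sys

# Install the required packages into the active Python kernel.
subprocess.run(
    [sys.executable, "-m", "pip", "install", "-r", "pipeline/requirements.txt"],
    check=True,
)

# The output path is set in the pipeline's settings.py (see the results description below).

# Execute the pipeline with Snakemake on a single core.
subprocess.run(["snakemake", "--cores", "1"], cwd="pipeline", check=True)
{{/code}}
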
80 80  === ii) Local execution ===
81 81  
... ... @@ -114,14 +114,6 @@
114 114  
115 115  All results are stored in the path specified in the //settings.py// file. The folder structure reflects the structuring of the pipeline into stages and blocks. All intermediate results are stored as //.nix// files using the [[Neo data format>>https://neo.readthedocs.io/en/stable/]] and can be loaded with ##neo.NixIO('/path/to/file.nix').read_block()##. Additionally, most blocks produce a figure, and each stage produces a report file, to give an overview of the execution log, parameters, and intermediate results, and to help with debugging. The final stage (//stage05_wave_characterization//) stores the results as [[//pandas.DataFrames//>>https://pandas.pydata.org/]] in //.csv// files, separately for each measure as well as in a combined dataframe for all measures.
116 116  
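A short, hypothetical loading example in Python (the file paths are placeholders, not actual output names):

{{code language="python"}}
# Load an intermediate result: a Neo Block stored as a Nix file.
import neo
import pandas as pd

block = neo.NixIO('/path/to/file.nix').read_block()
print(block.segments[0].analogsignals)  # inspect the signals contained in the block

# Load a final result of stage05_wave_characterization:
# one .csv file per measure, plus a combined dataframe for all measures.
df = pd.read_csv('/path/to/stage05_wave_characterization/measure.csv')
print(df.head())
{{/code}}
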
117 -== Outlook ==
118 -
119 -* Using the **KnowledgeGraph API** to insert data directly from the Knowledge Graph into the pipeline and also register and store the corresponding results as Analysis Objects. Such Analysis Objects are to incorporate **Provenance Tracking**, using [[fairgraph>>https://github.com/HumanBrainProject/fairgraph]], to record the details of the processing and analysis steps.
120 -* Adding support for the pipeline to make use of **HPC** resources when running on the collab.
121 -* Further extending the available **methods** to address a wider variety of analysis objectives and support the processing of other datatypes. Additional documentation and guides should also make it easier for non-developers to contribute new method blocks.
122 -* Extending the **application** of the pipeline to the analysis of other types of activity waves and oscillations.
123 -* Integrating and co-developing new features of the underlying **software tools** [[Elephant>>https://elephant.readthedocs.io/en/latest/]], [[Neo>>https://neo.readthedocs.io/en/stable/]], [[Nix>>https://github.com/G-Node/nix]], and [[Snakemake>>https://snakemake.readthedocs.io/en/stable/]].
124 -
125 125  == References ==
126 126  
127 127  * [[Celotto, Marco, et al. "Analysis and Model of Cortical Slow Waves Acquired with Optical Techniques." //Methods and Protocols// 3.1 (2020): 14.>>https://doi.org/10.3390/mps3010014]]
... ... @@ -132,7 +132,7 @@
132 132  
133 133  == License (to discuss) ==
134 134  
135 -All text and example data in this collab are licensed under the Creative Commons CC-BY 4.0 license. Software code is licensed under the GNU General Public License v3.0.
132 +All text and example data in this collab are licensed under the Creative Commons CC-BY 4.0 license. Software code is licensed under a modified BSD license.
136 136  
137 137  [[image:https://i.creativecommons.org/l/by/4.0/88x31.png||style="float:left"]]
138 138  
... ... @@ -147,7 +147,7 @@
147 147  )))
148 148  
149 149  
150 -== ==
147 +== Executing the pipeline ==
151 151  
152 152  (% class="col-xs-12 col-sm-4" %)
153 153  (((