Changes for page SGA2 SP3 UC002 KR3.2 - Slow Wave Analysis Pipeline
Last modified by robing on 2022/03/25 09:55
Summary
- Page properties (1 modified, 0 added, 0 removed)
- Objects (1 modified, 0 added, 2 removed)

Details
- Page properties
  - Content
@@ -73,13 +73,11 @@
 Each stage has config files (//pipeline/<stage_name>/configs/config_<profile>.yaml//) to specify which analysis/processing blocks to execute and which parameters to use. General and specific information about the blocks and parameters can be found in the README and config files of each stage. There are preset configuration profiles for the benchmark datasets IDIBAPS ([[ECoG, anesthetized mouse>>https://kg.ebrains.eu/search/?facet_type[0]=Dataset&q=sanchez-vives#Dataset/2ead029b-bba5-4611-b957-bb6feb631396]]) and LENS ([[Calcium Imaging, anesthetized mouse>>https://kg.ebrains.eu/search/instances/Dataset/71285966-8381-48f7-bd4d-f7a66afa9d79]]).
 
 * **Run the notebook**
-In the jupyter hub, navigate to //drive/My Libraries/My Library/run_snakemake_in_collab.ipynb//, or where you copied the file to.
-Follow the notebook to install the required packages into your Python kernel, set the output path, and execute the pipeline with snakemake.
+In the jupyter hub, navigate to //drive/My Libraries/My Library/pipeline/showcase_notebooks/run_snakemake_in_collab.ipynb//, or where you copied the //pipeline// folder to.
+Follow the notebook to install the required packages into your Python kernel, set the output path, and execute the pipeline with snakemake.
 
 === ii) Local execution ===
 
-//tested only with Mac OS and Linux!//
-
 * **Get the code**
 The source code of the pipeline is available via Github: [[INM-6/wavescalephant>>https://github.com/INM-6/wavescalephant]] and can be cloned to your machine ([[how to get started with Github>>https://guides.github.com/activities/hello-world/]]).
 

@@ -88,10 +88,6 @@
 In the wavescalephant git repository, there is an environment file ([[pipeline/environment.yaml>>https://drive.ebrains.eu/smart-link/1a0b15bb-be87-46ee-b838-4734bc320d20/]]) specifying the required packages and versions. To build the environment, we recommend using conda ([[how to get started with conda>>https://docs.conda.io/projects/conda/en/latest/user-guide/getting-started.html]]).
 ##conda env create ~-~-file environment.yaml
 conda activate wavescalephant_env##
-
-Make sure that neo and elephant were installed as their Github development version, and if necessary add them manually to the environment.
-##pip install git+https:~/~/github.com/NeuralEnsemble/elephant.git
-pip install git+https:~/~/github.com/NeuralEnsemble/python-neo.git##
 
 )))
 * **Edit the settings**

@@ -131,7 +131,7 @@
 * Stage 05 - [[planar velocities>>https://drive.ebrains.eu/smart-link/f4de8073-cb40-47a7-bc82-f97d36dbae25/]]
 * Stage 05 - [[directionality>>https://drive.ebrains.eu/smart-link/5485032d-0121-4cde-9ea2-3e0af3f12178/]]
 
-== Outlook ==
+=== Outlook ===
 
 * Using the **KnowledgeGraph API **to insert data directly from the Knowledge Graph into the pipeline and also register and store the corresponding results as Analysis Objects. Such Analysis Objects are to incorporate **Provenance Tracking, **using [[fairgraph>>https://github.com/HumanBrainProject/fairgraph]],** **to record the details of the processing and analysis steps.
 * Adding support for the pipeline to make use of **HPC** resources when running on the collab.
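For orientation, the local-execution steps referenced in the hunks above (clone the repository, build the conda environment, run snakemake) amount to roughly the following shell session. This is a sketch assembled from the commands quoted in the diff, not the documented procedure: the working directory and the ##~-~-cores## value are assumptions, and the exact snakemake target is not specified on this page.

##~## clone the pipeline repository linked in the diff above
git clone https:~/~/github.com/INM-6/wavescalephant.git
cd wavescalephant/pipeline
~## build and activate the conda environment from pipeline/environment.yaml
conda env create ~-~-file environment.yaml
conda activate wavescalephant_env
~## run the workflow; the core count (and default target) are assumptions
snakemake ~-~-cores 1##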
- Collaboratory.Apps.Collab.Code.CollabClass[0] (modified)
  - Public: Yes → No
- XWiki.XWikiRights[3] (removed)
  - Allow/Deny: Allow
  - Levels: view
  - Users: XWiki.XWikiGuest
- XWiki.XWikiRights[4] (removed)
  - Allow/Deny: Allow
  - Groups: XWiki.XWikiAllGroup
  - Levels: view