=== Opening a Dataset from the data-proxy ===

You can also load Neuroglancer Precomputed Chunks data from the data-proxy (e.g. the [[ana-workshop-event bucket>>https://wiki.ebrains.eu/bin/view/Collabs/ana-workshop-event/Bucket]]). The URLs for this kind of data follow this scheme:
\\##precomputed:~/~/https:~/~/data-proxy.ebrains.eu/api/v1/buckets/(% style="background-color:#3498db; color:#ffffff" %)my-bucket-name(%%)/(% style="background-color:#9b59b6; color:#ffffff" %)path/inside/your/bucket(%%)##

where (% style="background-color:#9b59b6; color:#ffffff" %)path/inside/your/bucket(%%) should be the path to the folder containing the dataset's "info" file.

So, for example, to load the sample data inside the (% style="background-color:#3498db; color:#ffffff" %)ana-workshop-event(%%) bucket, under the path (% style="background-color:#9b59b6; color:#ffffff" %)tg-ArcSwe_mice_precomputed/hbp-00138_122_381_423_s001.precomputed(%%), like in the example below:

[[image:webilastik_bucket_paths.png]]

you would type a URL like this:

{{{precomputed://https://data-proxy.ebrains.eu/api/v1/buckets/ana-workshop-event/tg-ArcSwe_mice_precomputed/hbp-00138_122_381_423_s001.precomputed}}}

This scheme is the same whether you're loading data into the Neuroglancer viewer or specifying an input URL in the export applet.

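If you prefer to assemble these URLs programmatically, here is a minimal Python sketch (not part of webilastik; the ##requests## dependency and the sanity check are assumptions) that builds the ##precomputed## URL from a bucket name and a path, and checks that the "info" file is reachable, which only works for buckets you are allowed to read:

{{code language="python"}}
# Illustrative sketch (not part of webilastik): build the precomputed:// URL for a
# dataset stored in an EBRAINS data-proxy bucket and check that its "info" file exists.
import requests  # assumption: third-party dependency, not shipped with webilastik

DATA_PROXY_BUCKETS = "https://data-proxy.ebrains.eu/api/v1/buckets"

bucket_name = "ana-workshop-event"
path_inside_bucket = "tg-ArcSwe_mice_precomputed/hbp-00138_122_381_423_s001.precomputed"

# This is the https URL of the dataset folder inside the bucket...
dataset_url = f"{DATA_PROXY_BUCKETS}/{bucket_name}/{path_inside_bucket}"
# ...and this is what you paste into the viewer or into the export applet:
print("precomputed://" + dataset_url)

# Sanity check (for buckets you can read): the folder must contain an "info" file.
info = requests.get(dataset_url + "/info").json()
print(info.get("type"), info.get("num_channels"))  # e.g. "image" and the number of channels
{{/code}}
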
Normal ilastik operation can be computationally intensive, requiring dedicated compute resources to be allocated to every user working with it.

The "Session Management" widget allows you to request a compute session where webilastik will run; select a session duration and click 'Create' to create a new compute session. Eventually the compute session will be allocated, opening up the other workflow widgets.

Don't forget to close your compute session by clicking the "Close Session" button once you're done, to avoid wasting your HPC quota. If you have a long-running job, though, you can leave the session and rejoin it later by pasting its session ID into the "Session Id" field of the "Session Management" widget and clicking "Rejoin Session".

== Training the Pixel Classifier ==

=== Selecting Image Features ===

Pixel Classification uses different characteristics ("features") of each pixel in your image to determine which class that pixel should belong to. These take into account, for example, the color and texture of each pixel as well as that of its neighboring pixels. Each of these characteristics requires some computational power, which is why you should select only the ones that are sensible for your particular dataset.

Use the checkboxes in the "Select Image Features" applet to select image features and their corresponding sigmas. The higher the sigma, the bigger the neighborhood considered when computing values for each pixel, and the bigger its influence over the final value of that feature. Higher sigmas also require more computation and can increase the time needed to do predictions.

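To get an intuition for what the sigma does, the sketch below uses ##scipy##'s Gaussian filter as a stand-in for ilastik's "Gaussian Smoothing" feature (an illustration only, not webilastik's own implementation) and shows how a larger sigma lets a wider neighborhood influence each pixel's feature value:

{{code language="python"}}
# Illustration only: how sigma changes the neighborhood a feature "sees".
# Uses scipy's Gaussian filter as a stand-in for ilastik's "Gaussian Smoothing" feature.
import numpy as np
from scipy.ndimage import gaussian_filter

image = np.zeros((9, 9), dtype=float)
image[4, 4] = 1.0  # a single bright pixel

small = gaussian_filter(image, sigma=0.7)  # influence stays close to the pixel
large = gaussian_filter(image, sigma=3.0)  # influence spreads over a wide neighborhood

# Fraction of the response that falls outside the immediate 3x3 neighborhood:
def spread(feature: np.ndarray) -> float:
    center = feature[3:6, 3:6].sum()
    return 1.0 - center / feature.sum()

print(f"sigma=0.7 -> {spread(small):.0%} outside the 3x3 block")  # only a few percent
print(f"sigma=3.0 -> {spread(large):.0%} outside the 3x3 block")  # most of the response
{{/code}}
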
You can read more about image features in [[ilastik's documentation>>https://www.ilastik.org/documentation/pixelclassification/pixelclassification]].

=== Labeling the image ===

In order to classify the pixels of an image into different classes (e.g. 'foreground' and 'background'), ilastik needs you to provide it with examples of each class.

==== Picking an Image Resolution (for multi-resolution images only) ====

If your data has multiple resolutions (**not the case in any of the sample datasets**), you'll have to pick one of them in the "Training" widget. Neuroglancer interpolates between multiple scales of the dataset, but ilastik operates on a single resolution:

[[image:image-20220911155827-1.png]]

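If you are not sure which resolutions a Precomputed dataset offers, its "info" file lists one entry per scale. The sketch below (same assumptions as above: a readable bucket and the ##requests## library) prints the scales declared for the sample dataset:

{{code language="python"}}
# Sketch: list the scales (resolutions) declared in a Precomputed dataset's "info" file.
# The "scales", "resolution" and "size" keys are part of the Neuroglancer Precomputed format.
import requests  # assumption: not part of webilastik

dataset_url = (
    "https://data-proxy.ebrains.eu/api/v1/buckets/ana-workshop-event/"
    "tg-ArcSwe_mice_precomputed/hbp-00138_122_381_423_s001.precomputed"
)
info = requests.get(dataset_url + "/info").json()

for scale in info["scales"]:
    # "resolution" is the voxel size in nanometers, "size" the extent in voxels (x, y, z)
    print(scale["key"], scale["resolution"], scale["size"])
{{/code}}
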
Once you've selected a resolution to train on, you should see a new "training" tab at the top of the viewer:

[[image:image-20220125165832-2.png]]

[[image:image-20220222151117-1.png]]

==== Painting Labels ====

The status display in the "Training" applet will show "training on [datasource url]" when it's ready for you to start painting.

Now you can start adding brush strokes. Select a color from the color picker, check the "Enable Brushing" checkbox to enable brushing (and disable navigation), and click and drag over the image to add brush strokes. Ilastik will map each used color to a "class", and will try to figure out a class for every pixel in the image based on the examples provided by the brush strokes. By painting, you provide ilastik with samples of what a pixel in that particular class should look like. The following image shows an example with two classes: teal, representing the "foreground" or "cell" class, and magenta, representing the "background" class.

[[image:image-20220222153157-4.png]]
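Conceptually, your brush strokes give ilastik a small table of labeled pixels (their feature values plus the class of the color you painted with); ilastik then trains a random forest on that table and predicts a class for every remaining pixel. The sketch below mimics this idea with ##scikit-learn## on a toy image; it is an analogy only, not webilastik's actual code:

{{code language="python"}}
# Analogy only: train a classifier on a few "brushed" pixels and predict the rest.
# ilastik's Pixel Classification uses a random forest on the selected image features in a similar spirit.
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
image = rng.random((64, 64))
image[20:40, 20:40] += 1.0  # a bright square acting as "foreground"

# Two per-pixel features: raw intensity and a smoothed version (sigma controls the neighborhood).
features = np.stack([image, gaussian_filter(image, sigma=3.0)], axis=-1).reshape(-1, 2)

# Sparse labels standing in for brush strokes: 1 = "foreground", 2 = "background".
labels = np.zeros((64, 64), dtype=int)
labels[30, 25:35] = 1   # a stroke inside the bright square
labels[5, 5:15] = 2     # a stroke in the background
labeled = labels.reshape(-1) > 0

classifier = RandomForestClassifier(n_estimators=50, random_state=0)
classifier.fit(features[labeled], labels.reshape(-1)[labeled])

prediction = classifier.predict(features).reshape(64, 64)
print((prediction == 1).sum(), "pixels predicted as foreground")  # roughly the 20x20 square
{{/code}}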