Changes for page 5. How to segment your objects with Webilastik
Last modified by puchades on 2022/09/30 16:01
From version 15.1
edited by tomazvieira
on 2022/01/25 17:22
Change comment:
Uploaded new attachment "image-20220125172238-8.png", version {1}
To version 32.1
edited by tomazvieira
on 2022/02/22 15:47
Change comment:
There is no comment for this version
Summary
- Page properties (2 modified, 0 added, 0 removed)
- Attachments (0 modified, 9 added, 0 removed)
Details
- Page properties
- Title
... ... @@ -1,1 +1,1 @@
-4. How to use webilastik
+4. How to use Webilastik
- Content
-
== What is Webilastik? ==

Classic [[ilastik>>https://www.ilastik.org/]] is a simple, user-friendly desktop tool for **interactive image classification, segmentation and analysis**. It is built as a modular software framework, which currently has workflows for automated (supervised) pixel- and object-level classification, automated and semi-automated object tracking, semi-automated segmentation, and object counting without detection. Most analysis operations are performed **lazily**, which enables targeted interactive processing of data subvolumes, followed by complete volume analysis in offline batch mode. Using it requires no experience in image processing.

[[webilastik>>https://app.ilastik.org/]] is a web version of ilastik's Pixel Classification Workflow, integrated with the ebrains ecosystem. It can access the data-proxy buckets for reading and writing (though reading still suffers from latency issues). It uses Neuroglancer as a 3D viewer, and compute sessions allocated from the CSCS infrastructure.

== How to use Webilastik ==

=== Opening a sample Dataset ===

Go to [[https:~~/~~/app.ilastik.org/>>https://app.ilastik.org/]] and load a [[Neuroglancer Precomputed Chunks dataset>>https://github.com/google/neuroglancer/tree/master/src/neuroglancer/datasource/precomputed]]. You can, for example, use a sample dataset that is already on the server by pasting the following URL into Neuroglancer's prompt:

precomputed:~/~/https:~/~/app.ilastik.org/public/images/c_cells_2.precomputed

[[image:image-20220125164204-2.png]]

=== Opening a Dataset from the data-proxy ===

You can also load Neuroglancer Precomputed Chunks data from the data-proxy. The URLs for this kind of data follow this scheme:
\\##precomputed:~/~/https:~/~/data-proxy.ebrains.eu/api/buckets/(% style="background-color:#3498db; color:#ffffff" %)my-bucket-name(% style="background-color:#9b59b6; color:#ffffff" %)/path/inside/your/bucket(%%)##

So, for example, to load the sample data inside the (% style="background-color:#3498db; color:#ffffff" %)quint-demo(%%) bucket, under the path (% style="background-color:#9b59b6; color:#ffffff" %)tg-ArcSwe_mice_precomputed/hbp-00138_122_381_423_s001.precomputed(%%), like in the example below:

[[image:image-20220128142757-1.png]]

you would type a URL like this:

##precomputed:~/~/https:~/~/data-proxy.ebrains.eu/api/buckets/(% style="background-color:#3498db; color:#ffffff" %)quint-demo(%%)/(% style="background-color:#9b59b6; color:#ffffff" %)tg-ArcSwe_mice_precomputed/hbp-00138_122_381_423_s001.precomputed(%%)##

This scheme is the same whether you're loading data into the Neuroglancer viewer or specifying an input URL in the export applet.
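The URL scheme above can be sketched as a small helper. This is just an illustration of how the pieces (bucket name and path inside the bucket) combine; the function name is made up for this example, and the bucket and path are the sample values from this page.

```python
# Sketch: compose a "precomputed://" URL for a dataset stored in an
# ebrains data-proxy bucket, following the scheme described above.
def precomputed_url(bucket: str, path: str) -> str:
    return (
        "precomputed://https://data-proxy.ebrains.eu/api/buckets/"
        f"{bucket}/{path}"
    )

url = precomputed_url(
    "quint-demo",
    "tg-ArcSwe_mice_precomputed/hbp-00138_122_381_423_s001.precomputed",
)
print(url)
```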
=== Viewing 2D Data ===

If your dataset is 2D like in the example, you can click the "switch to xy layout" button at the top-right corner of the top-left quadrant of the viewport to use a single, 2D viewport:

[[image:image-20220125164416-3.png]]

which will change the view to something like this:

[[image:image-20220125164557-4.png]]

== Training the Pixel Classifier ==

=== Selecting Image Features ===

Pixel Classification uses different characteristics ("features") of your image to determine which class each pixel should belong to. These take into account, for example, the color and texture of each pixel as well as that of the neighboring pixels. Each one of these characteristics requires some computational power, which is why you should select only the ones that make sense for your particular dataset.

Use the checkboxes in the "Select Image Features" applet to select some image features and their corresponding sigma (which determines the radius around the pixel that will be considered when computing that feature).

You can read more about image features in [[ilastik's documentation>>https://www.ilastik.org/documentation/pixelclassification/pixelclassification]].

The following is an arbitrary selection of image features. Notice that the checkboxes marked in orange haven't been committed yet; click "Ok" to send your feature selections (or deselections) to the server.
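To give an intuition for what a feature and its sigma mean, here is a minimal sketch (not webilastik's actual implementation) of a "Gaussian Smoothing" feature computed at several sigmas: each sigma produces one feature channel, and a larger sigma averages over a larger neighborhood around each pixel.

```python
# Illustrative only: compute a Gaussian-smoothed feature channel per
# sigma, the kind of per-pixel characteristic a pixel classifier uses.
import numpy as np
from scipy.ndimage import gaussian_filter

image = np.random.rand(128, 128)  # stand-in for one 2D dataset scale

# One feature channel per selected sigma, stacked for the classifier.
sigmas = [0.7, 1.6, 3.5]
features = np.stack(
    [gaussian_filter(image, sigma=s) for s in sigmas], axis=-1
)
print(features.shape)  # (128, 128, 3): one channel per sigma
```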
[[image:image-20220125171850-7.png]]

=== Labeling the image ===

In order to classify the pixels of an image into different classes (e.g. 'foreground' and 'background'), ilastik needs you to provide it with samples of each class.

To do so, first select a particular resolution of your dataset (your viewer might interpolate between multiple scales of the dataset, but ilastik operates on a single resolution):

[[image:image-20220125165642-1.png]]

Once you've selected a resolution to train on, you should see a new "training" tab at the top of the viewer:

[[image:image-20220125165832-2.png]]

You must have the "training" tab as the frontmost visible tab in order to start adding brush strokes (in Neuroglancer you can click the name of the raw data tab to hide it, for example):

[[image:image-20220222151117-1.png]]
The status display in this applet will show "training on [datasource url]" when you're in training mode.

Now you can start adding brush strokes. Select a color from the color picker, check the "Enable Brushing" checkbox to enable brushing (and disable navigation), then click and drag over the image to add brush strokes. Ilastik will map each used color to a "class", and will try to figure out a class for every pixel in the image based on the examples provided by the brush strokes. By painting, you provide ilastik with samples of what a pixel in that particular class should look like. The following image shows an example with two classes: teal, representing the "foreground" or "cell" class, and magenta, representing the "background" class.

[[image:image-20220222153157-4.png]]
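Conceptually, the brush strokes supply training samples for a classifier: classic ilastik trains a Random Forest on the feature vectors of the brushed pixels and then predicts a class probability for every other pixel. The sketch below illustrates that idea with scikit-learn; the array names and shapes are invented for the example and are not webilastik's API.

```python
# Conceptual sketch of what the brush strokes feed into: a Random
# Forest trained on the feature vectors of the labeled pixels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_features = 3  # e.g. one channel per selected feature/sigma pair

# Feature vectors of the brushed pixels, with their class (0 or 1),
# here generated synthetically instead of taken from brush strokes.
labeled_pixels = rng.random((200, n_features))
labels = (labeled_pixels[:, 0] > 0.5).astype(int)

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(labeled_pixels, labels)

# Predict a class probability for every remaining pixel.
all_pixels = rng.random((10_000, n_features))
probabilities = clf.predict_proba(all_pixels)
print(probabilities.shape)  # (10000, 2): one column per class
```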
Once you have some image features selected and brush annotations of at least two colors, you can check "Live Update" and ilastik will automatically use your examples to predict which class the rest of your dataset should belong to, displaying the results in a "predictions" tab.

[[image:image-20220222153610-5.png]]

You can keep adding or removing brush strokes to improve your predictions.

You can adjust the display settings of the predictions overlay layer as you would in vanilla Neuroglancer:

1. right-click the predictions Neuroglancer tab to reveal the "rendering" options;
1. adjust the layer opacity to better view the predictions or the underlying raw data;
1. advanced users: edit the shader to render the predictions in any arbitrary way.

The image below shows the "predictions" tab with the opacity set to 0.68 using the steps described above:

[[image:image-20220125172238-8.png]]

You can keep adding or removing features to your model, as well as adding and removing annotations, which will automatically update the predictions tab.

=== Exporting Results and Running Jobs ===

Once you have trained your pixel classifier with the previous applets, you can apply it to other datasets, or even to the same dataset that was used for training.

To do so, select a data source by typing the URL of the data source into the Data Source URL field, and pick a scale from those that appear beneath the URL field.

Then configure a Data Sink, i.e. a destination that will receive the results of the pixel classification. For now, webilastik will only export to ebrains' data-proxy buckets; fill in the name of the bucket and then the prefix (i.e. the path within the bucket) where the results, in Neuroglancer's precomputed chunks format, should be written.

[[image:image-20220125190311-2.png]]

Finally, click the export button; a new job will be created if all the parameters were filled in correctly.

You'll be able to find your results in the data-proxy GUI, at a URL that looks something like this:

https:~/~/data-proxy.ebrains.eu/your-bucket-name?prefix=your/selected/prefix

[[image:image-20220125191847-3.png]]
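The results URL above can be assembled programmatically. This is a hedged sketch using the placeholder bucket name and prefix from this page; note that the standard library percent-encodes the slashes in the prefix, which is an equivalent form of the same query.

```python
# Sketch: build the data-proxy GUI URL where exported results appear,
# from a bucket name and an export prefix (placeholder values only).
from urllib.parse import urlencode

def results_gui_url(bucket: str, prefix: str) -> str:
    query = urlencode({"prefix": prefix})  # percent-encodes the slashes
    return f"https://data-proxy.ebrains.eu/{bucket}?{query}"

print(results_gui_url("your-bucket-name", "your/selected/prefix"))
```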
- Attachments (9 added, all by XWiki.tomazvieira)
- image-20220125185730-1.png (43.7 KB)
- image-20220125190311-2.png (61.0 KB)
- image-20220125191847-3.png (74.6 KB)
- image-20220128142757-1.png (53.7 KB)
- image-20220222151117-1.png (225.3 KB)
- image-20220222151750-2.png (143.4 KB)
- image-20220222153044-3.png (135.2 KB)
- image-20220222153157-4.png (146.3 KB)
- image-20220222153610-5.png (171.1 KB)