Last modified by puchades on 2022/11/02 10:16

From version 55.1, edited by puchades on 2020/09/24 11:43 (no change comment)
To version 62.2, edited by annedevismes on 2021/06/08 10:23 (no change comment)

Summary

Details

Page properties
Author
... ... @@ -1,1 +1,1 @@
1 -XWiki.puchades
1 +XWiki.annedevismes
Content
... ... @@ -2,18 +2,18 @@
2 2  
3 3  == (% style="color:#c0392b" %)**Description**(%%) ==
4 4  
5 -**The QUINT workflow enables an atlas based analysis of extracted features from histological image sections from the rodent brain using 3D reference atlases. **
5 +**The QUINT workflow enables atlas-based analysis of features extracted from histological images of rodent brain sections, using 3D reference atlases.**
6 6  
7 -**Examples of use are: cell counting and spatial distributions; determination of projection areas in connectivity experiments; exploration of pathological hallmarks in brain disease models. Integration of various data to the same reference space enables new exploration strategies and re-use of experimental data.**
7 +**Examples of use include cell counting and analysis of spatial distributions, determination of projection areas in connectivity experiments, and exploration of pathological hallmarks in brain-disease models. Integration of different datasets into the same reference space enables new exploration strategies and reuse of experimental data.**
8 8  
9 -The workflow is built on the following open access software:
9 +The workflow is built on the following open-access software:
10 10  
11 -* [[(% style="color:#2980b9" %)//ilastik//>>doc:.3\. Image segmentation with ilastik.WebHome]](%%) allows the extraction of labelled features such as cells, using machine learning image segmentation.
11 +* [[(% style="color:#2980b9" %)//ilastik//>>doc:.3\. Image segmentation with ilastik.WebHome]](%%) allows the extraction of labelled features such as cells, by using machine-learning image segmentation.
12 12  * [[(% style="color:#2980b9" %)//QuickNII//>>doc:.Image registration to reference atlas using QuickNII.WebHome]](%%) generates custom-angle slices from volumetric brain atlases to match the proportions and cutting plane of histological sections.
13 -* //[[(% style="color:#3498db" %)VisuAlign>>doc:.Image registration to reference atlas using QuickNII.WebHome]]//(%%) is then used for nonlinear alignment of the reference atlas slice to the section image..
13 +* //[[(% style="color:#3498db" %)VisuAlign>>doc:.Image registration to reference atlas using QuickNII.WebHome]]//(%%) is then used for non-linear alignment of the reference-atlas slice to the section image.
14 14  * (% style="color:#2980b9" %)//Nutil//(%%) enables image [[transformations>>doc:.1\. Preparing the images.WebHome]], in addition to [[quantification and spatial analysis>>doc:.4\. Quantification and spatial analysis with Nutil.WebHome]] of features by drawing on the output of //ilastik// and //QuickNII//.
15 15  
16 -In combination, the tools facilitate semi-automated quantification, eliminating the need for more time consuming methods such as stereological analysis with manual delineation of brain regions.
16 +In combination, the tools facilitate semi-automated quantification, eliminating the need for more time-consuming methods such as stereological analysis with manual delineation of brain regions.
17 17  
18 18  [[[[image:Youtube_QUINT.PNG||height="282" style="float:left" width="500"]]>>https://www.youtube.com/watch?v=8oeg3qTzLnE]]
19 19  
... ... @@ -37,28 +37,23 @@
37 37  
38 38  (% class="box successmessage" %)
39 39  (((
40 -The semi-automated QUINT workflow uses open access software that can be operated without any scripting knowledge.
40 +The semi-automated QUINT workflow uses open-access software that can be operated without any scripting knowledge.
41 41  )))
42 42  
43 43  (((
44 44  (% class="box successmessage" %)
45 45  (((
46 -As the quantifications are performed in regions defined by a reference atlas, the region definitions are standardized, allowing comparisons of data from different laboratories.
46 +Because the quantifications are performed in regions defined by a reference atlas, the region definitions are standardised, allowing comparisons of data from different laboratories.
47 47  )))
48 48  
49 49  ==== (% style="color:#c0392b" %)**References**(%%) ====
50 50  
51 -* Yates SC et al. 2019. QUINT: Workflow for Quantification and Spatial Analysis of Features in Histological Images From Rodent Brain. Front. Neuroinform. 13:75. doi: [[10.3389/fninf.2019.00075>>https://www.frontiersin.org/articles/10.3389/fninf.2019.00075/full]]
51 +* Yates SC et al. (2019) QUINT: Workflow for Quantification and Spatial Analysis of Features in Histological Images From Rodent Brain. Front. Neuroinform. 13:75. doi: [[10.3389/fninf.2019.00075>>https://www.frontiersin.org/articles/10.3389/fninf.2019.00075/full]]
52 52  * Groeneboom NE, Yates SC, Puchades MA and Bjaalie JG (2020) Nutil: A Pre- and Post-processing Toolbox for Histological Rodent Brain Section Images. //Front. Neuroinform.// 14:37. doi: [[10.3389/fninf.2020.00037>>https://www.frontiersin.org/articles/10.3389/fninf.2020.00037/full]]
53 -* Berg S, Kutra D, Kroeger T, et al. & Kreshuk A (2019) ilastik: interactive machine learning for (bio)image analysis. Nat Methods. 16:1226-1232. doi: [[10.1038/s41592-019-0582-9>>attach:https://www.nature.com/articles/s41592-019-0582-9]]
53 +* Berg S, Kutra D, Kroeger T, et al. & Kreshuk A (2019) ilastik: interactive machine learning for (bio)image analysis. Nat Methods. 16:1226-1232. doi: [[10.1038/s41592-019-0582-9>>https://www.nature.com/articles/s41592-019-0582-9]]
54 54  * (((
55 55  Puchades MA et al. (2019) Spatial registration of serial microscopic brain images to three-dimensional reference atlases with the QuickNII tool. PlosOne. 14(5): e0216796. doi: [[10.1371/journal.pone.0216796>>https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0216796]]
56 56  )))
57 57  
58 -==== (% style="color:#c0392b" %)**User support**(%%) ====
59 -
60 -* [[Nutil>>https://github.com/Neural-Systems-at-UIO/nutil]]
61 -* [[QuickNII>>https://www.nitrc.org/projects/quicknii]]
62 -* [[VisuAlign>>https://www.nitrc.org/projects/visualign/]]
63 -* [[ilastik>>https://www.ilastik.org/]]
58 +==== ====
64 64  )))
XWiki.XWikiRights[5]
Allow/Deny
... ... @@ -1,1 +1,0 @@
1 -Allow
Levels
... ... @@ -1,1 +1,0 @@
1 -view
Users
... ... @@ -1,1 +1,0 @@
1 -XWiki.XWikiGuest
XWiki.XWikiRights[6]
Allow/Deny
... ... @@ -1,1 +1,0 @@
1 -Allow
Groups
... ... @@ -1,1 +1,0 @@
1 -XWiki.XWikiAllGroup
Levels
... ... @@ -1,1 +1,0 @@
1 -view
XWiki.XWikiRights[7]
Allow/Deny
... ... @@ -1,0 +1,1 @@
1 +Allow
Levels
... ... @@ -1,0 +1,1 @@
1 +view
Users
... ... @@ -1,0 +1,1 @@
1 +XWiki.XWikiGuest
XWiki.XWikiRights[8]
Allow/Deny
... ... @@ -1,0 +1,1 @@
1 +Allow
Groups
... ... @@ -1,0 +1,1 @@
1 +XWiki.XWikiAllGroup
Levels
... ... @@ -1,0 +1,1 @@
1 +view