API Catalogue

This page catalogues the [[web API>>https://en.wikipedia.org/wiki/Web_API]]s available for users and developers of [[EBRAINS tools and services>>https://ebrains.eu]].

(% class="box warningmessage" %)
(((
Please note this is a work in progress.
)))

----

{{toc/}}

== Core services / Collaboratory ==

=== Authentication / Authorization (IAM) ===

[[[[image:documentation-icon.png]]>>https://www.keycloak.org/documentation.html||rel="noopener noreferrer" target="_blank"]]

IAM is the EBRAINS **I**dentity and **A**ccess **M**anagement service. It is delivered by the Collaboratory and handles user identification and permission management for all EBRAINS users and services.
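
IAM is built on Keycloak and therefore exposes standard OAuth 2.0 / OpenID Connect endpoints. The snippet below is a minimal sketch of obtaining an access token with the client-credentials grant; it assumes the usual Keycloak token-endpoint layout under the "hbp" realm and a client already registered with IAM (the client ID and secret are placeholders).

{{code language="python"}}
# Minimal sketch: obtain an EBRAINS IAM (Keycloak) access token using the
# standard OAuth 2.0 client-credentials grant.
# Assumptions: the usual Keycloak token-endpoint layout under the "hbp" realm,
# and a client registered with IAM (client ID and secret are placeholders).
import requests

TOKEN_URL = "https://iam.ebrains.eu/auth/realms/hbp/protocol/openid-connect/token"

response = requests.post(
    TOKEN_URL,
    data={
        "grant_type": "client_credentials",
        "client_id": "my-registered-client",   # placeholder
        "client_secret": "my-client-secret",   # placeholder
    },
)
response.raise_for_status()
access_token = response.json()["access_token"]

# The token is then passed as a Bearer token to the other EBRAINS APIs:
headers = {"Authorization": f"Bearer {access_token}"}
{{/code}}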

=== Collaboratory wiki ===

[[[[image:documentation-icon.png]]>>https://www.xwiki.org/xwiki/bin/view/Documentation/UserGuide/Features/XWikiRESTfulAPI||rel="noopener noreferrer" target="_blank"]]

The [[Wiki service>>url:https://wiki.ebrains.eu/bin/view/Collabs/the-collaboratory/Getting%20Started/#HWikipages]] of the Collaboratory hosts the main interface for accessing all the other Collaboratory services. As such, it embodies the full concept of collab workspaces, and most users consider it to be the collab service. The Wiki itself also offers a convenient way for users to document their work through a simple wiki interface.
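
The Wiki is an XWiki instance, so the XWiki RESTful API linked above applies. A minimal sketch of reading a page is shown below; the space and page names are placeholders, and it assumes the wiki accepts an EBRAINS IAM bearer token.

{{code language="python"}}
# Minimal sketch: read a wiki page through the XWiki RESTful API.
# Assumptions: the standard XWiki /rest/... path layout and bearer-token
# authentication; "MyCollab" and "WebHome" are placeholder space/page names.
import requests

WIKI_REST = "https://wiki.ebrains.eu/rest"
headers = {
    "Authorization": f"Bearer {access_token}",  # IAM token, see the IAM example above
    "Accept": "application/json",
}

url = f"{WIKI_REST}/wikis/xwiki/spaces/MyCollab/pages/WebHome"
page = requests.get(url, headers=headers)
page.raise_for_status()
print(page.json()["title"])
{{/code}}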

=== Drive ===

[[[[image:documentation-icon.png]]>>https://download.seafile.com/published/web-api/home.md||rel="noopener noreferrer" target="_blank"]]

The Collaboratory offers two storage solutions for collabs: the Drive and the Bucket. The Drive is especially well suited for files and documents that need to be worked on in a more agile manner, possibly also collaboratively. For larger files, datasets, and videos the [[Bucket>>url:https://wiki.ebrains.eu/bin/view/Collabs/the-collaboratory/Documentation%20Bucket/]] offers a better solution.
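
The Drive is based on Seafile, so the Seafile Web API linked above applies. Below is a minimal sketch of listing your Drive libraries; it assumes the Drive is reachable at drive.ebrains.eu and that you have obtained a Seafile API token (the token value is a placeholder).

{{code language="python"}}
# Minimal sketch: list Drive libraries via the Seafile Web API v2.
# Assumptions: the Drive host drive.ebrains.eu and a Seafile API token
# obtained beforehand (placeholder below).
import requests

DRIVE = "https://drive.ebrains.eu"
headers = {"Authorization": "Token <your-seafile-api-token>"}  # placeholder

repos = requests.get(f"{DRIVE}/api2/repos/", headers=headers)
repos.raise_for_status()
for repo in repos.json():
    print(repo["name"], repo["id"])
{{/code}}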

=== Bucket (Data Proxy) ===

[[[[image:documentation-icon.png]]>>https://data-proxy-ppd.ebrains.eu/api/docs||rel="noopener noreferrer" target="_blank"]]

The [[Bucket>>url:https://wiki.ebrains.eu/bin/view/Collabs/the-collaboratory/Documentation%20Bucket/]] is the second of the Collaboratory's two storage solutions and is provided by the Data Proxy service. It is the better choice for larger files, datasets, and videos, whereas the Drive is better suited to files and documents that need to be worked on in a more agile manner, possibly also collaboratively.
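
A sketch of listing a bucket's contents is shown below. It is illustrative only: the authoritative endpoint paths and response schema are in the Data Proxy OpenAPI documentation linked above, and both the bucket name and the /v1/buckets/... path used here are assumptions.

{{code language="python"}}
# Illustrative sketch only: the /v1/buckets/... path and the response fields
# are assumptions; consult the Data Proxy OpenAPI docs linked above for the
# authoritative API. The bucket name is a placeholder.
import requests

DATA_PROXY = "https://data-proxy-ppd.ebrains.eu/api"  # host taken from the docs link above
headers = {"Authorization": f"Bearer {access_token}"}

resp = requests.get(f"{DATA_PROXY}/v1/buckets/my-collab-bucket", headers=headers)
resp.raise_for_status()
for obj in resp.json().get("objects", []):
    print(obj.get("name"), obj.get("bytes"))
{{/code}}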

----

== Find and share data ==

=== Knowledge Graph core ===

[[[[image:documentation-icon.png]]>>https://core.kg.ebrains.eu/swagger-ui/index.html?configUrl=/v3/api-docs/swagger-config||rel="noopener noreferrer" target="_blank"]]

The EBRAINS Knowledge Graph is a multi-modal metadata store which brings together information from different fields of brain research. At its core, a graph database tracks the links between experimental data and neuroscientific data science, supporting more extensive data reuse and more complex computational research than would otherwise be possible.
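
A minimal sketch of querying the KG core API v3 is shown below. The /instances route and its parameters are intended to follow the Swagger documentation linked above; the openMINDS type IRI and the query parameters shown are illustrative.

{{code language="python"}}
# Minimal sketch: list released instances of a given openMINDS type from the
# KG core API v3. The query parameters and the type IRI are illustrative;
# see the Swagger docs linked above for the authoritative parameter list.
import requests

KG = "https://core.kg.ebrains.eu/v3"
headers = {"Authorization": f"Bearer {access_token}"}

resp = requests.get(
    f"{KG}/instances",
    headers=headers,
    params={
        "stage": "RELEASED",
        "type": "https://openminds.ebrains.eu/core/DatasetVersion",  # illustrative type
    },
)
resp.raise_for_status()
print(len(resp.json().get("data", [])), "instances in this page of results")
{{/code}}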

=== Knowledge Space ===

[[[[image:documentation-icon.png]]>>https://api.knowledge-space.org/docs||rel="noopener noreferrer" target="_blank"]]

KnowledgeSpace aims to be a globally used, community-based, data-driven encyclopedia for neuroscience that links brain research concepts to data, models, and the literature that support them. Further, it aims to serve as a framework through which large-scale neuroscience projects can expose their data to the neuroscience community at large. KS combines general descriptions of neuroscience concepts found in Wikipedia with more detailed content from InterLex. It then integrates the content from those two sources with the latest neuroscience citations found in PubMed and data found in some of the world’s leading neuroscience repositories. KS is a joint development of the Human Brain Project (HBP), the International Neuroinformatics Coordinating Facility (INCF), and the Neuroscience Information Framework (NIF).

=== Neural Activity Resource ===

[[[[image:documentation-icon.png]]>>https://neural-activity-resource.brainsimulation.eu/docs||rel="noopener noreferrer" target="_blank"]]

The Neural Activity Resource provides a simplified interface to data and metadata from the KG about neural activity recordings.

=== Provenance API ===

[[[[image:documentation-icon.png]]>>https://prov.brainsimulation.eu/docs||rel="noopener noreferrer" target="_blank"]]

The Provenance API provides a simplified interface to computational provenance information in the KG.

----

== Brain atlases ==

=== Image service ===

[[[[image:documentation-icon.png]]>>https://img-svc.apps.hbp.eu/api-docs||rel="noopener noreferrer" target="_blank"]]

The Image Service was designed to let users process imaging data so that it can be used with interactive HBP tools. These tools often need special formats and data descriptors, and the Image Service's goal is to provide streamlined workflows through which datasets can be made available to these tools with the simplest possible input, on demand, using resources allocated in the FENIX Research Infrastructure.

=== siibra ===

[[[[image:documentation-icon.png]]>>https://siibra-api.readthedocs.io||rel="noopener noreferrer" target="_blank"]]

The siibra toolsuite provides both interactive and programmatic user interfaces for working with “multilevel” brain atlases, that is, brain atlases composed of multiple reference spaces, parcellation maps, and data modalities. siibra-API is a RESTful API service exposing the core functionalities for integration with other applications.
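
An illustrative sketch of a siibra-API call is shown below. Both the base URL and the /atlases route are placeholders; the deployed base URL and the full set of routes are described in the siibra-API documentation linked above.

{{code language="python"}}
# Illustrative sketch of a siibra-API request. The base URL and the /atlases
# route are placeholders; see the siibra-API documentation for the deployed
# service URL and the available routes.
import requests

SIIBRA_API = "https://siibra-api.apps.hbp.eu/v1_0"  # placeholder base URL

resp = requests.get(f"{SIIBRA_API}/atlases")
resp.raise_for_status()
for atlas in resp.json():
    print(atlas.get("name"))
{{/code}}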

----

== Data analysis ==

=== Neo-Viewer ===

[[[[image:documentation-icon.png]]>>https://neo-viewer.brainsimulation.eu/||rel="noopener noreferrer" target="_blank"]]

Neo Viewer provides web-based visualisation of electrophysiology data, with support for most of the widely used file formats in neurophysiology, including community standards such as NIX and NWB. It consists of a REST API for transforming electrophysiology data files into JSON format and a JavaScript component that can be embedded in any web page.

----

== Simulation ==

=== Model Validation service ===

[[[[image:documentation-icon.png]]>>https://validation-v2.brainsimulation.eu/docs||rel="noopener noreferrer" target="_blank"]]

The Model Validation Service provides web-based tools for working with computational models and for validating such models against experimental data. It consists of a REST web service and two clients: the Model Catalog Collaboratory app and a Python client. The underlying metadata are stored in the Knowledge Graph. The service allows users (i) to create, edit, search and view metadata about models and validation tests, and (ii) to register, search, view and compare the results of validation tests.
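
A minimal sketch of querying the REST service directly is shown below; the /models/ route and the filter parameter are illustrative, and the full schema is in the interactive documentation linked above. For most workflows the Python client is the more convenient interface.

{{code language="python"}}
# Minimal sketch: list model metadata from the Model Validation Service.
# The /models/ route and the "species" filter are illustrative; consult the
# interactive API docs linked above for the authoritative routes and schema.
import requests

VALIDATION_SERVICE = "https://validation-v2.brainsimulation.eu"
headers = {"Authorization": f"Bearer {access_token}"}

resp = requests.get(
    f"{VALIDATION_SERVICE}/models/",
    headers=headers,
    params={"species": "Mus musculus"},  # illustrative filter
)
resp.raise_for_status()
for model in resp.json():
    print(model.get("name"))
{{/code}}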

=== The Virtual Brain ===

[[[[image:documentation-icon.png]]>>https://thevirtualbrain-rest.apps.hbp.eu/doc||rel="noopener noreferrer" target="_blank"]]

A REST API for The Virtual Brain (TVB) brain network simulation platform.

=== MoDEL-CNS ===

[[[[image:documentation-icon.png]]>>https://mmb.irbbarcelona.org/MoDEL-CNS/api/rest/docs/||rel="noopener noreferrer" target="_blank"]]

Molecular Dynamics Extended Library: Central Nervous System (MoDEL-CNS) is a platform designed to provide web access to **atomistic-MD trajectories** for relevant **signal transduction proteins**.

=== Cellular Level Simulation ===

[[[[image:documentation-icon.png]]>>https://humanbrainproject.github.io/hbp-bsp-service-account/introduction/introduction.html||rel="noopener noreferrer" target="_blank"]]

Web applications that allow running simulations at different scales (single cell, multiple cells, regions).

----

== Neurorobotics ==

=== NRP Frontend-Backend comms ===

[[[[image:documentation-icon.png]]>>https://neurorobotics.net/Documentation/nrp/developer_manual/ExDbackend/index.html||rel="noopener noreferrer" target="_blank"]]

=== NRP-core ===

[[[[image:documentation-icon.png]]>>https://hbpneurorobotics.bitbucket.io/index.html||rel="noopener noreferrer" target="_blank"]]

The Neurorobotics Platform core (referred to throughout this document as NRP-core) is the mechanism through which NRP users can implement simulations in which multiple pieces of simulation software coexist, synchronize their execution, and exchange data in a tightly ordered fashion. In previous versions of the NRP, NRP-core was known as the "Closed Loop Engine" (CLE), whose task was to orchestrate the dialogue between the Gazebo robotic simulator and brain models implemented in NEST, Nengo, etc. For users familiar with the CLE, NRP-core is a generalization of it, with new generic mechanisms for integrating additional simulation engines into NRP simulations. NRP-core is still built on the so-called Transfer Function framework, although the latter has been adapted and renamed the [[Transceiver Functions>>url:https://hbpneurorobotics.bitbucket.io/transceiver_function.html]] framework. This renaming is not only cosmetic: users familiar with the NRP up to v3.2 should be aware of some limited but meaningful differences between these two frameworks.

----

== Medical data analytics ==

No API documentation for this service category is available at the present time.

----

== Neuromorphic computing ==

=== Job Queue service ===

[[[[image:documentation-icon.png]]>>https://nmpi.hbpneuromorphic.eu/api/v2/||rel="noopener noreferrer" target="_blank"]]

REST API for submitting jobs to the BrainScaleS and SpiNNaker platforms, and for retrieving job results.
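
A minimal sketch using the nmpi Python client for this service is shown below (the raw REST endpoints are documented at the link above). The repository URL, platform choice, and collab identifier are placeholders, and the client interface shown follows the platform's own client documentation rather than this page.

{{code language="python"}}
# Minimal sketch using the nmpi Python client for the Job Queue service.
# Assumptions: the nmpi client interface as described in the platform's own
# documentation; the repository URL and collab identifier are placeholders.
import nmpi

client = nmpi.Client("my-ebrains-username")  # placeholder username

job_id = client.submit_job(
    source="https://github.com/example/my-pynn-model.git",  # placeholder repository
    platform=nmpi.SPINNAKER,                                 # or nmpi.BRAINSCALES
    collab_id="my-collab",                                   # placeholder collab
)
print(client.job_status(job_id))
{{/code}}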

=== Quotas service ===

[[[[image:documentation-icon.png]]>>]]

REST API for requesting and managing compute quotas for the BrainScaleS and SpiNNaker platforms.

----

== High-performance computing ==

=== UNICORE ===

[[[[image:documentation-icon.png]]>>https://sourceforge.net/p/unicore/wiki/REST_API/||rel="noopener noreferrer" target="_blank"]]

UNICORE (UNiform Interface to COmputing REsources) provides tools and services for making high-performance computing and data resources accessible in a seamless and secure way for a wide variety of applications.

UNICORE offers RESTful APIs for job submission, job management, data access, data movement, and workflows, and is already integrated with the HBP AAI and with user/project management at the HPC sites. UNICORE can help solve many integration issues, provides the necessary building blocks and APIs for creating cross-site workflows, and adds value such as site-to-site data movement (between POSIX filesystems), access to external data, and sharing of HPC data sets.
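
A minimal sketch of submitting a job through UNICORE's REST API with the pyunicore client library is shown below. The site URL is a placeholder for the REST endpoint of whichever HPC site you have access to, and the job description follows the UNICORE job-description format.

{{code language="python"}}
# Minimal sketch: submit a small job via UNICORE's REST API using pyunicore.
# Assumptions: a placeholder site endpoint, and an EBRAINS IAM token used for
# authentication (see the IAM example near the top of this page).
import pyunicore.client as unicore_client

SITE_URL = "https://unicore.example.org:8080/SITE/rest/core"  # placeholder site endpoint
transport = unicore_client.Transport(access_token)
client = unicore_client.Client(transport, SITE_URL)

job_description = {
    "Executable": "/bin/echo",
    "Arguments": ["Hello from UNICORE"],
}
job = client.new_job(job_description=job_description, inputs=[])
job.poll()                              # wait for the job to finish
print(job.working_dir.listdir())        # inspect the job's working directory
{{/code}}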

=== Supercomputing Proxy ===

[[[[image:documentation-icon.png]]>>https://unicore-job-proxy.apps.hbp.eu/docs||rel="noopener noreferrer" target="_blank"]]

The Supercomputing Proxy gives EBRAINS services a means of running jobs on supercomputers on behalf of an EBRAINS user without requiring the user to have an HPAC account. Jobs are launched using a service account in the name of the service provider. The service is limited to running jobs that use pre-determined executables, so end users cannot use the supercomputing resources for purposes not intended by the service provider.

----

== Cloud computing ==

=== OpenStack ===

[[[[image:documentation-icon.png]]>>https://wiki.ebrains.eu/bin/view/Collabs/openstack/||rel="noopener noreferrer" target="_blank"]]

Red Hat OpenStack Platform provides the foundation to build a private or public Infrastructure-as-a-Service (IaaS) cloud on top of Red Hat Enterprise Linux. It offers a highly scalable, fault-tolerant platform for the development of cloud-enabled workloads.

=== OpenShift ===

[[[[image:documentation-icon.png]]>>https://wiki.ebrains.eu/bin/view/Collabs/kubernetes/openshift/||rel="noopener noreferrer" target="_blank"]]

OpenShift is a family of containerization software products developed by Red Hat.

----

== Developer tools ==

=== GitLab ===

[[[[image:documentation-icon.png]]>>https://docs.gitlab.com/ee/api/||rel="noopener noreferrer" target="_blank"]]

EBRAINS hosts a self-managed instance of GitLab for source code management and CI/CD.
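
A minimal sketch of calling the GitLab REST API v4 is shown below; it assumes the instance is reachable at gitlab.ebrains.eu and that you have created a personal access token in your GitLab profile (the token value is a placeholder).

{{code language="python"}}
# Minimal sketch: list your projects on the EBRAINS GitLab instance via the
# GitLab REST API v4. Assumptions: the host gitlab.ebrains.eu and a personal
# access token created in your GitLab profile (placeholder below).
import requests

GITLAB_API = "https://gitlab.ebrains.eu/api/v4"
headers = {"PRIVATE-TOKEN": "<your-personal-access-token>"}  # placeholder

resp = requests.get(f"{GITLAB_API}/projects", headers=headers, params={"membership": True})
resp.raise_for_status()
for project in resp.json():
    print(project["path_with_namespace"])
{{/code}}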

=== Docker registry (Harbor) ===

[[[[image:documentation-icon.png]]>>https://docker-registry.ebrains.eu/devcenter-api-2.0||rel="noopener noreferrer" target="_blank"]]

The EBRAINS Docker registry uses Harbor, an open source trusted cloud native registry project that stores, signs, and scans content.
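
A minimal sketch of calling the Harbor v2.0 REST API is shown below; it assumes basic authentication with your registry username and CLI secret (both values are placeholders).

{{code language="python"}}
# Minimal sketch: list projects in the EBRAINS Docker registry via the
# Harbor v2.0 REST API. Assumptions: basic authentication with a registry
# username and CLI secret (placeholders below).
import requests

REGISTRY_API = "https://docker-registry.ebrains.eu/api/v2.0"

resp = requests.get(
    f"{REGISTRY_API}/projects",
    auth=("my-username", "my-cli-secret"),  # placeholders
)
resp.raise_for_status()
for project in resp.json():
    print(project["name"])
{{/code}}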

=== Logging/monitoring APIs ===

[[[[image:documentation-icon.png]]>>]]

TO DO

----

== Administration ==

=== PLUS ===

[[[[image:documentation-icon.png]]>>https://lab.ch.ebrains.eu/user-redirect/lab/tree/drive/Shared%20with%20all/PLUS-LAB/PLUS%20KPI-PrI%20API.ipynb||rel="noopener noreferrer" target="_blank"]]

PLUS is a project management tool for the Human Brain Project.

----

The book icon used on this page is from a [[set of icons by Boca Tutor>>https://www.iconfinder.com/iconsets/tutor-icon-set]] and is licensed under the [[Creative Commons>>https://en.wikipedia.org/wiki/en:Creative_Commons]] [[Attribution-Share Alike 3.0 Unported>>https://creativecommons.org/licenses/by-sa/3.0/deed.en]] license.