This page catalogues the [[web API>>https://en.wikipedia.org/wiki/Web_API]]s available for users and developers of [[EBRAINS tools and services>>https://ebrains.eu]].

(% class="box warningmessage" %)
(((
Please note this is a work in progress.
)))

----

{{toc/}}

----

== Core services / Collaboratory ==

=== Authentication / Authorization (IAM) ===

[[[[image:documentation-icon.png]]>>https://www.keycloak.org/documentation.html||rel="noopener noreferrer" target="_blank"]]

IAM is the EBRAINS **I**dentity and **A**ccess **M**anagement service. It is delivered by the Collaboratory and handles user identification and permission management for all EBRAINS users and services.
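
IAM is built on Keycloak, so the standard OpenID Connect flows apply. Below is a minimal sketch of obtaining an access token with the client credentials grant using Python and the requests library; the token endpoint, realm and client credentials shown are placeholders to be replaced with the values issued for your registered OIDC client.

{{code language="python"}}
# Minimal sketch: obtain an access token from the Keycloak-based IAM service.
# The host, realm and client credentials below are placeholders; substitute
# the values issued for your own registered OIDC client.
import requests

TOKEN_URL = "https://iam.ebrains.eu/auth/realms/hbp/protocol/openid-connect/token"  # assumed endpoint

response = requests.post(
    TOKEN_URL,
    data={
        "grant_type": "client_credentials",
        "client_id": "my-service-client",      # placeholder
        "client_secret": "my-client-secret",   # placeholder
        "scope": "openid",
    },
)
response.raise_for_status()
access_token = response.json()["access_token"]
print(access_token[:20], "...")
{{/code}}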

=== Collaboratory wiki ===

[[[[image:documentation-icon.png]]>>https://www.xwiki.org/xwiki/bin/view/Documentation/UserGuide/Features/XWikiRESTfulAPI||rel="noopener noreferrer" target="_blank"]]

The [[Wiki service>>url:https://wiki.ebrains.eu/bin/view/Collabs/the-collaboratory/Getting%20Started/#HWikipages]] of the Collaboratory provides the main interface for accessing all the other Collaboratory services. As such, it embodies the full concept of collab workspaces, and most users consider it to be the collab service. The Wiki itself offers a convenient way of documenting a user's work through a simple wiki user interface.
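
Since the wiki runs on XWiki, its content can be read and written through the XWiki RESTful API. The sketch below fetches a page as JSON with Python's requests library; the REST root and page coordinates are assumptions for illustration only.

{{code language="python"}}
# Minimal sketch: read a wiki page through the XWiki RESTful API.
# The REST root and page coordinates are assumptions for illustration;
# see the XWiki REST documentation for the full resource tree.
import requests

REST_ROOT = "https://wiki.ebrains.eu/rest"                       # assumed REST entry point
PAGE = f"{REST_ROOT}/wikis/xwiki/spaces/Collabs/pages/WebHome"   # hypothetical page

response = requests.get(
    PAGE,
    headers={
        "Accept": "application/json",
        "Authorization": "Bearer <access-token>",   # token obtained from IAM
    },
)
response.raise_for_status()
page = response.json()
print(page["title"])
{{/code}}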

=== Drive ===

[[[[image:documentation-icon.png]]>>https://download.seafile.com/published/web-api/home.md||rel="noopener noreferrer" target="_blank"]]

The Collaboratory offers two storage solutions for collabs: the Drive and the Bucket. The Drive is especially well suited for files and documents that need to be worked on in a more agile manner, possibly also collaboratively. For larger files, datasets, and videos the [[Bucket>>url:https://wiki.ebrains.eu/bin/view/Collabs/the-collaboratory/Documentation%20Bucket/]] offers a better solution.
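
The Drive is backed by Seafile, so the Seafile Web API linked above applies. The sketch below lists a user's libraries with Python's requests library; the Drive host and the way the API token is obtained are assumptions, so check the linked documentation for authentication details.

{{code language="python"}}
# Minimal sketch: list libraries (repos) in the Drive via the Seafile Web API.
# The host and the way the API token is obtained are assumptions; consult the
# linked Seafile documentation for authentication details.
import requests

DRIVE_HOST = "https://drive.ebrains.eu"   # assumed Drive endpoint
API_TOKEN = "<seafile-api-token>"         # placeholder

response = requests.get(
    f"{DRIVE_HOST}/api2/repos/",
    headers={"Authorization": f"Token {API_TOKEN}"},
)
response.raise_for_status()
for repo in response.json():
    print(repo["name"], repo["id"])
{{/code}}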

=== Bucket (Data Proxy) ===

[[[[image:documentation-icon.png]]>>https://data-proxy-ppd.ebrains.eu/api/docs||rel="noopener noreferrer" target="_blank"]]

The Collaboratory offers two storage solutions for collabs: the Drive and the Bucket. The Bucket, served by the Data Proxy API, is the better choice for larger files, datasets, and videos, while the Drive is better suited to documents that need to be edited collaboratively. See the [[Bucket documentation>>url:https://wiki.ebrains.eu/bin/view/Collabs/the-collaboratory/Documentation%20Bucket/]] for details.
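
Buckets are accessed through the Data Proxy REST API. The sketch below lists the contents of a bucket with Python's requests library; the base URL, route and response shape are assumptions for illustration, and the linked interactive docs are the authoritative reference.

{{code language="python"}}
# Minimal sketch: list the contents of a collab's Bucket through the Data
# Proxy REST API. The base URL, route and response shape are assumptions for
# illustration only; the authoritative routes are in the linked API docs.
import requests

DATA_PROXY = "https://data-proxy.ebrains.eu/api"   # assumed base URL
BUCKET = "my-collab"                               # placeholder bucket name

response = requests.get(
    f"{DATA_PROXY}/v1/buckets/{BUCKET}",
    headers={"Authorization": "Bearer <access-token>"},   # token from IAM
)
response.raise_for_status()
for obj in response.json().get("objects", []):   # response shape assumed
    print(obj["name"], obj["bytes"])
{{/code}}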

----

== Find and share data ==

=== Knowledge Graph core ===

[[[[image:documentation-icon.png]]>>https://core.kg.ebrains.eu/swagger-ui/index.html?configUrl=/v3/api-docs/swagger-config||rel="noopener noreferrer" target="_blank"]]

The EBRAINS Knowledge Graph is a multi-modal metadata store which brings together information from different fields of brain research. At its core, a graph database tracks the linkage between experimental data and neuroscientific data science, supporting more extensive data reuse and more complex computational research than would otherwise be possible.
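
The core API is documented in the Swagger UI linked above. The sketch below queries released instances of one openMINDS type with Python's requests library; the route, query parameters and type IRI are assumptions for illustration.

{{code language="python"}}
# Minimal sketch: query the Knowledge Graph core API for instances of a given
# metadata type. The route, query parameters and openMINDS type IRI are
# assumptions; the Swagger UI linked above is the authoritative reference.
import requests

KG_CORE = "https://core.kg.ebrains.eu/v3"   # versioned API root (see docs)

response = requests.get(
    f"{KG_CORE}/instances",
    params={
        "stage": "RELEASED",                                          # assumed parameter
        "type": "https://openminds.ebrains.eu/core/DatasetVersion",   # assumed type filter
        "size": 5,
    },
    headers={"Authorization": "Bearer <access-token>"},               # token from IAM
)
response.raise_for_status()
for instance in response.json().get("data", []):   # response shape assumed
    print(instance.get("@id"))
{{/code}}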

=== Knowledge Space ===

[[[[image:documentation-icon.png]]>>https://api.knowledge-space.org/docs||rel="noopener noreferrer" target="_blank"]]

KnowledgeSpace aims to be a globally-used, community-based, data-driven encyclopedia for neuroscience that links brain research concepts to the data, models, and literature that support them. Further, it aims to serve as a framework through which large-scale neuroscience projects can expose their data to the neuroscience community at large. KS combines general descriptions of neuroscience concepts found in Wikipedia with more detailed content from InterLex, and integrates the content from those two sources with the latest neuroscience citations found in PubMed and data found in some of the world’s leading neuroscience repositories. KS is a joint development between the Human Brain Project (HBP), the International Neuroinformatics Coordinating Facility (INCF), and the Neuroscience Information Framework (NIF).

=== Neural Activity Resource ===

[[[[image:documentation-icon.png]]>>https://neural-activity-resource.brainsimulation.eu/docs||rel="noopener noreferrer" target="_blank"]]

The Neural Activity Resource provides a simplified interface to data and metadata from the KG about neural activity recordings.

=== Provenance API ===

[[[[image:documentation-icon.png]]>>https://prov.brainsimulation.eu/docs||rel="noopener noreferrer" target="_blank"]]

The Provenance API provides a simplified interface to computational provenance information in the KG.

----

== Brain atlases ==

=== Image service ===

[[[[image:documentation-icon.png]]>>https://img-svc.apps.hbp.eu/api-docs||rel="noopener noreferrer" target="_blank"]]

The Image Service was designed to allow users to process imaging data so that it can be used with interactive HBP tools. These tools often need special formats and data descriptors to work, and the Image Service's goal is to provide streamlined workflows through which datasets can be made available to these tools with the simplest possible input, on demand, using resources allocated in the FENIX Research Infrastructure.

=== siibra ===

[[[[image:documentation-icon.png]]>>https://siibra-api.readthedocs.io||rel="noopener noreferrer" target="_blank"]]

The siibra toolsuite provides both interactive and programmatic user interfaces for working with “multilevel” brain atlases, that is, brain atlases composed of multiple reference spaces, parcellation maps, and data modalities. siibra-API is a RESTful API service exposing the core functionalities for integration with other applications.
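
As a rough illustration of calling siibra-API over HTTP, the sketch below lists the available atlases with Python's requests library; the deployment URL and route are assumptions, so consult the siibra-API documentation for the actual endpoints and versions.

{{code language="python"}}
# Minimal sketch: list the atlases known to siibra-API. The service URL and
# route are assumptions for illustration; the siibra-API documentation linked
# above describes the actual endpoints and versions.
import requests

SIIBRA_API = "https://siibra-api.apps.ebrains.eu/v3_0"   # assumed deployment URL

response = requests.get(f"{SIIBRA_API}/atlases")
response.raise_for_status()
for atlas in response.json().get("items", []):   # response shape assumed
    print(atlas.get("name"))
{{/code}}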

----

== Data analysis ==

=== Neo-Viewer ===

[[[[image:documentation-icon.png]]>>https://neo-viewer.brainsimulation.eu/||rel="noopener noreferrer" target="_blank"]]

Neo Viewer provides web-based visualisation of electrophysiology data, with support for most of the widely-used file formats in neurophysiology, including community standards such as NIX and NWB. It consists of a REST API for transforming electrophysiology data files into JSON format and a JavaScript component that can be embedded in any web page.
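
As a rough illustration of the REST side, the sketch below asks the service to convert a publicly accessible data file into JSON using Python's requests library; the route and query parameter are assumptions, and the linked documentation describes the actual interface.

{{code language="python"}}
# Minimal sketch: ask the Neo Viewer REST API to convert a publicly accessible
# electrophysiology file into JSON. The route and query parameter are
# assumptions for illustration; see the linked documentation for the actual
# interface.
import requests

NEO_VIEWER_API = "https://neo-viewer.brainsimulation.eu/api"   # assumed API root
DATA_FILE_URL = "https://example.org/recording.nwb"            # placeholder file URL

response = requests.get(f"{NEO_VIEWER_API}/blocks/", params={"url": DATA_FILE_URL})
response.raise_for_status()
block = response.json()
print(block)
{{/code}}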

----

== Simulation ==

=== Model Validation service ===

[[[[image:documentation-icon.png]]>>https://validation-v2.brainsimulation.eu/docs||rel="noopener noreferrer" target="_blank"]]

The Model Validation Service provides web-based tools for working with computational models and for validating such models against experimental data. It consists of a REST web service and two clients: the Model Catalog Collaboratory app and a Python client. The underlying metadata are stored in the Knowledge Graph. The service allows users (i) to create, edit, search and view metadata about models and validation tests, and (ii) to register, search, view and compare the results of validation tests.
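
As a rough illustration of the REST service, the sketch below lists model catalogue entries with Python's requests library; the route and paging parameter are assumptions, and the interactive docs linked above list the real ones.

{{code language="python"}}
# Minimal sketch: list model entries from the Model Validation Service's REST
# API. The route and paging parameter are assumptions for illustration; the
# interactive docs linked above list the real ones.
import requests

VALIDATION_API = "https://validation-v2.brainsimulation.eu"

response = requests.get(
    f"{VALIDATION_API}/models/",
    params={"size": 5},                                   # assumed paging parameter
    headers={"Authorization": "Bearer <access-token>"},   # token from IAM
)
response.raise_for_status()
for model in response.json():   # response assumed to be a list of model entries
    print(model.get("name"))
{{/code}}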

=== The Virtual Brain ===

[[[[image:documentation-icon.png]]>>https://thevirtualbrain-rest.apps.hbp.eu/doc||rel="noopener noreferrer" target="_blank"]]

A REST API for The Virtual Brain (TVB) simulation platform.

=== MoDEL-CNS ===

[[[[image:documentation-icon.png]]>>https://mmb.irbbarcelona.org/MoDEL-CNS/api/rest/docs/||rel="noopener noreferrer" target="_blank"]]

Molecular Dynamics Extended Library: Central Nervous System (MoDEL-CNS) is a platform designed to provide web access to **atomistic-MD trajectories** for relevant **signal transduction proteins**.

=== Cellular Level Simulation ===

[[[[image:documentation-icon.png]]>>https://humanbrainproject.github.io/hbp-bsp-service-account/introduction/introduction.html||rel="noopener noreferrer" target="_blank"]]

Web applications that allow running simulations at different scales (single cell, multiple cells, regions).

----

== Neurorobotics ==

=== NRP Frontend-Backend comms ===

[[[[image:documentation-icon.png]]>>https://neurorobotics.net/Documentation/nrp/developer_manual/ExDbackend/index.html||rel="noopener noreferrer" target="_blank"]]

=== NRP-core ===

[[[[image:documentation-icon.png]]>>https://hbpneurorobotics.bitbucket.io/index.html||rel="noopener noreferrer" target="_blank"]]

The Neurorobotics Platform core (referred to throughout this document as NRP-core) is the mechanism through which NRP users can implement simulations in which multiple pieces of simulation software coexist, synchronize their execution, and exchange data in a tightly ordered fashion. In previous versions of the NRP, NRP-core was referred to as the "Closed Loop Engine" (CLE), whose task was to orchestrate the dialogue between the Gazebo robotic simulator and brain models implemented in NEST, Nengo, etc. For users familiar with the CLE, NRP-core is a generalization of it, with new generic mechanisms provided for users who want to integrate new simulation engines into their NRP simulations. NRP-core is still built on the so-called Transfer Function framework, although the latter has been adapted and renamed the [[Transceiver Functions>>url:https://hbpneurorobotics.bitbucket.io/transceiver_function.html]] framework. This renaming is not merely cosmetic: users familiar with the NRP up to v3.2 should be aware of some limited but meaningful differences between the two frameworks.

----

== Medical data analytics ==

No API documentation for this service category is available at the present time.

----

== Neuromorphic computing ==

=== Job Queue service ===

[[[[image:documentation-icon.png]]>>https://nmpi.hbpneuromorphic.eu/api/v2/||rel="noopener noreferrer" target="_blank"]]
REST API for submitting jobs to the BrainScaleS and SpiNNaker platforms, and for retrieving job results.
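
The sketch below submits a job with the nmpi Python client that wraps this REST API; the collab ID and code repository URL are placeholders, and the exact client calls should be verified against the API documentation linked above, which can also be driven directly with any HTTP library.

{{code language="python"}}
# Sketch: submit a job with the nmpi Python client that wraps this REST API.
# The collab ID and repository URL are placeholders, and the client calls
# shown are assumptions to be checked against the platform documentation;
# the REST endpoints can also be called directly with any HTTP library.
import nmpi

client = nmpi.Client("my-ebrains-username")   # placeholder username

job_id = client.submit_job(
    source="https://github.com/example/my_pynn_model.git",   # placeholder repository
    platform=nmpi.SPINNAKER,                                  # or nmpi.BRAINSCALES
    collab_id="my-collab",                                    # placeholder collab
)
job = client.get_job(job_id, with_log=True)
print(job["status"])
{{/code}}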

=== Quotas service ===

[[[[image:documentation-icon.png]]>>]]
REST API for requesting and managing compute quotas for the BrainScaleS and SpiNNaker platforms.

----

== High-performance computing ==

=== UNICORE ===

[[[[image:documentation-icon.png]]>>https://sourceforge.net/p/unicore/wiki/REST_API/||rel="noopener noreferrer" target="_blank"]]

UNICORE (UNiform Interface to COmputing REsources) provides tools and services for making high-performance computing and data resources accessible in a seamless and secure way for a wide variety of applications.

UNICORE offers RESTful APIs for job submission, job management, data access, data movement and workflows, and is already integrated with the HBP AAI and the user/project management at the HPC sites. UNICORE can help solve many integration issues and provides the necessary building blocks and APIs for creating cross-site workflows, as well as added value such as site-to-site data movement (between POSIX filesystems), access to external data, and sharing of HPC data sets.
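
The sketch below submits a small job through a UNICORE REST endpoint with the pyunicore client library, authenticating with an EBRAINS access token; the site URL is a placeholder for one of the HPC sites reachable with your account.

{{code language="python"}}
# Sketch: submit a job through a UNICORE REST endpoint with the pyunicore
# client library. The site URL is a placeholder; sites reachable from EBRAINS
# accounts are listed in the HPC documentation.
import pyunicore.client as uc_client
import pyunicore.credentials as uc_credentials

BASE_URL = "https://example-hpc-site.eu/SITE/rest/core"   # placeholder site URL

credential = uc_credentials.OIDCToken("<access-token>")   # token from IAM
site = uc_client.Client(credential, BASE_URL)

job = site.new_job(job_description={"Executable": "/bin/date"})
job.poll()                                   # wait for the job to finish
print(job.working_dir.listdir())             # inspect the job's working directory
{{/code}}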

=== Supercomputing Proxy ===

[[[[image:documentation-icon.png]]>>https://unicore-job-proxy.apps.hbp.eu/docs||rel="noopener noreferrer" target="_blank"]]

The Supercomputing Proxy provides services with a means of running jobs on supercomputers on behalf of an EBRAINS user without requiring the user to have an HPAC account. The job is launched using a service account in the name of the service provider. The service is limited to running jobs which use pre-determined executables, so the end-user cannot use the supercomputing resources for purposes not intended by the service provider.

----

== Cloud computing ==

=== OpenStack ===

[[[[image:documentation-icon.png]]>>https://wiki.ebrains.eu/bin/view/Collabs/openstack/||rel="noopener noreferrer" target="_blank"]]

Red Hat OpenStack Platform provides the foundation to build a private or public Infrastructure-as-a-Service (IaaS) cloud on top of Red Hat Enterprise Linux. It offers a highly scalable, fault-tolerant platform for the development of cloud-enabled workloads.
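
The sketch below connects to an OpenStack deployment with the openstacksdk library and lists the project's virtual machines; the cloud profile name is a placeholder referring to an entry in a local clouds.yaml file.

{{code language="python"}}
# Sketch: talk to an OpenStack deployment with the openstacksdk library.
# The cloud profile name "ebrains" is a placeholder referring to an entry in
# your local clouds.yaml; credentials and endpoints come from there.
import openstack

conn = openstack.connect(cloud="ebrains")   # placeholder clouds.yaml profile

# List the virtual machines visible to the authenticated project.
for server in conn.compute.servers():
    print(server.name, server.status)
{{/code}}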

=== OpenShift ===

[[[[image:documentation-icon.png]]>>https://wiki.ebrains.eu/bin/view/Collabs/kubernetes/openshift/||rel="noopener noreferrer" target="_blank"]]

OpenShift is a family of containerization software products developed by Red Hat.

----

== Developer tools ==

=== GitLab ===

[[[[image:documentation-icon.png]]>>https://docs.gitlab.com/ee/api/||rel="noopener noreferrer" target="_blank"]]

EBRAINS hosts a self-managed instance of GitLab for source code management and CI/CD.
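
The GitLab REST API can be used directly or through the python-gitlab library, as in the sketch below; the instance URL is an assumption, and the personal access token is created in the GitLab profile settings.

{{code language="python"}}
# Sketch: use the python-gitlab library against the EBRAINS GitLab instance.
# The instance URL is an assumption; the personal access token is created in
# your GitLab profile settings.
import gitlab

gl = gitlab.Gitlab(
    "https://gitlab.ebrains.eu",               # assumed instance URL
    private_token="<personal-access-token>",   # placeholder
)

# List the projects the authenticated user is a member of.
for project in gl.projects.list(membership=True):
    print(project.path_with_namespace)
{{/code}}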

=== Docker registry (Harbor) ===

[[[[image:documentation-icon.png]]>>https://docker-registry.ebrains.eu/devcenter-api-2.0||rel="noopener noreferrer" target="_blank"]]

The EBRAINS Docker registry uses Harbor, an open source trusted cloud native registry project that stores, signs, and scans content.
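
The sketch below lists registry projects through Harbor's v2.0 REST API with Python's requests library; the credentials are placeholders, and the registry URL follows the documentation link above.

{{code language="python"}}
# Sketch: list projects in the Harbor registry through its v2.0 REST API,
# authenticating with basic credentials. The account details are placeholders.
import requests

HARBOR = "https://docker-registry.ebrains.eu/api/v2.0"

response = requests.get(
    f"{HARBOR}/projects",
    auth=("<username>", "<password-or-cli-secret>"),   # placeholder credentials
    params={"page_size": 10},
)
response.raise_for_status()
for project in response.json():
    print(project["name"])
{{/code}}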

=== Logging/monitoring APIs ===

[[[[image:documentation-icon.png]]>>]]

TO DO

----

== Administration ==

=== PLUS ===

[[[[image:documentation-icon.png]]>>https://lab.ch.ebrains.eu/user-redirect/lab/tree/drive/Shared%20with%20all/PLUS-LAB/PLUS%20KPI-PrI%20API.ipynb||rel="noopener noreferrer" target="_blank"]]

PLUS is a project management tool for the Human Brain Project.

----

The book icon used on this page is from a [[set of icons by Boca Tutor>>https://www.iconfinder.com/iconsets/tutor-icon-set]] and is licensed under the [[Creative Commons>>https://en.wikipedia.org/wiki/en:Creative_Commons]] [[Attribution-Share Alike 3.0 Unported>>https://creativecommons.org/licenses/by-sa/3.0/deed.en]] license.