

Changes for page HPC Resources in EBRAINS

Last modified by korculanin on 2025/01/29 10:46

From version 10.1
edited by korculanin
on 2024/06/10 13:08
Change comment: There is no comment for this version
To version 4.1
edited by korculanin
on 2024/06/10 09:53
Change comment: There is no comment for this version

Summary

Details

Page properties
Content
... ... @@ -2,9 +2,9 @@
2 2  (((
3 3  (% class="container" %)
4 4  (((
5 -= HPC Resources =
5 += My Collab's Extended Title =
6 6  
7 -for EBRAINS Users
7 +My collab's subtitle
8 8  )))
9 9  )))
10 10  
... ... @@ -12,47 +12,15 @@
12 12  (((
13 13  (% class="col-xs-12 col-sm-8" %)
14 14  (((
15 -The high-performance computing services for EBRAINS include:
15 += What can I find here? =
16 16  
17 -* Scalable Computing Services
18 -** Massively parallel HPC systems that are suitable for highly parallel brain simulations or for high-throughput data analysis tasks.
19 -* Interactive Computing Services
20 -** Quick access to single compute servers to analyse and visualise data interactively, or to connect to simulations running on the scalable computing services.
21 -* Active Data Repositories
22 -** Site-local data repositories close to computational and/or visualisation resources that are used for storing temporary replicas of data sets. In the near future they will typically be realised using parallel file systems.
23 -* (((
24 -Archival Data Repositories
17 +* Notice how the table of contents on the right
18 +* is automatically updated
19 +* to hold this page's headers
25 25  
26 -* Federated data storage, optimised for capacity, reliability, and availability, used for long-term storage of large data sets that cannot easily be regenerated. These data stores allow the sharing of data with other researchers inside and outside of HBP.
27 -)))
21 += Who has access? =
28 28  
29 -Currently, we have in-kind resources available for EBRAINS users at:
30 -
31 -* CINECA: Galileo100 (scalable + interactive)
32 -** 3 M core-h, for EBRAINS members (developers) only
33 -* JSC: JUSUF Cluster (available until 03/25)
34 -** core-h according to availability 
35 -
36 -
37 -Further HPC resources are available through regular national calls at:
38 -
39 -* France: [[CEA>>https://www.edari.fr]]
40 -* Germany: [[JSC>>https://www.fz-juelich.de/en/ias/jsc/systems/supercomputers/apply-for-computing-time]]
41 -* Italy: [[CINECA>>https://www.hpc.cineca.it/hpc-access/access-cineca-resources/iscra-projects/]]
42 -* Switzerland: [[CSCS>>https://www.cscs.ch/user-lab/allocation-schemes]] 
43 -
44 -
45 -In most cases, a prerequisite for the national calls is that the PI be affiliated with an academic institution located in the same country as the hosting site.
46 -
47 -
48 -= (% style="font-family:inherit" %)How to apply for the EBRAINS in-kind resources(%%) =
49 -
50 -Applying for EBRAINS in-kind resources is straightforward and requires only a technical review. Use the provided [[template>>https://drive.ebrains.eu/f/46b93975bde64a2e92e9/]] and send it to [[base-infra-resources@ebrains.eu>>mailto:base-infra-resources@ebrains.eu]].
51 -
52 -
53 -= (% style="font-family:inherit" %)Contact(%%) =
54 -
55 -For further information concerning HPC resources, please contact us at [[base-infra-resources@ebrains.eu>>mailto:base-infra-resources@ebrains.eu]].
23 +Describe the audience of this collab.
56 56  )))
57 57  
58 58