Changes for page HPC Resources in EBRAINS

Last modified by korculanin on 2025/01/29 10:46

From version 22.1
edited by korculanin
on 2024/09/16 09:33
Change comment: There is no comment for this version
To version 33.1
edited by korculanin
on 2024/09/16 14:57
Change comment: There is no comment for this version

Summary

Details

Page properties
Title
... ... @@ -1,1 +1,1 @@
1 -HPC Resources
1 +HPC Resources in EBRAINS
Content
... ... @@ -1,13 +11,3 @@
1 -(% class="jumbotron" %)
2 -(((
3 -(% class="container" %)
4 -(((
5 -= HPC Resources =
6 -
7 -for EBRAINS Users
8 -)))
9 -)))
10 -
11 11  (% class="row" %)
12 12  (((
13 13  (% class="col-xs-12 col-sm-8" %)
... ... @@ -19,19 +19,22 @@
19 19  * Scalable Computing Services, massively parallel HPC systems that can, e.g., be used for large-scale simulations or heavy machine-learning workloads.
20 20  * Interactive Computing Services that allow the use of high-end single compute servers, e.g., to analyse and visualise data interactively or to connect to running simulations, which use scalable computing services.
21 21  
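As an illustration of the scalable computing services listed above: access to these clusters is typically through a batch scheduler, and both JUSUF and Galileo100 use Slurm. The sketch below is only a minimal, assumed example of submitting a job with ##sbatch##; the account, partition, and executable names are placeholders, not values taken from this page, and must be replaced with the details of your granted allocation.

{{code language="python"}}
# Minimal sketch: submitting a batch job on a Slurm-managed cluster such as
# JUSUF or Galileo100. Every name marked "placeholder" is an assumption,
# not a value documented on this page.
import subprocess
import tempfile

job_script = """#!/bin/bash
#SBATCH --account=my_ebrains_project   # placeholder: project/account of your allocation
#SBATCH --partition=batch              # placeholder: cluster-specific partition name
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=4
#SBATCH --time=00:30:00

srun ./my_simulation                   # placeholder: your executable
"""

# Write the job script to a temporary file and submit it with sbatch.
with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as handle:
    handle.write(job_script)
    script_path = handle.name

result = subprocess.run(["sbatch", script_path], capture_output=True, text=True)
print(result.stdout or result.stderr)
{{/code}}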
12 += HPC Resources Available for EBRAINS 2.0 =
22 22  
23 -= (% style="font-family:inherit" %)HPC Resources Available for EBRAINS 2.0(%%) =
24 -
25 25  HPC resources are currently available to EBRAINS users at:
26 26  
27 27  * CINECA: Galileo100
28 28  ** 3 M core-h, for EBRAINS members (developers) only
29 29  * JSC: JUSUF Cluster (Available through at least 03/25)
30 -** core-h according to availability 
31 -
19 +** core-h according to availability
32 32  
33 -Further HPC resources are available through regular national calls at:
21 += How to Apply for HPC Resources =
34 34  
23 +Applying for HPC resources on either cluster (Galileo100 or JUSUF) is straightforward and requires only a technical review. Use the provided [[template>>https://drive.ebrains.eu/f/46b93975bde64a2e92e9/]] and send it to [[base-infra-resources@ebrains.eu>>mailto:base-infra-resources@ebrains.eu]].
24 +
25 +
26 +==== Further HPC resources are available through regular national calls at: ====
27 +
35 35  * France: [[CEA>>https://www.edari.fr]]
36 36  * Germany: [[JSC>>https://www.fz-juelich.de/en/ias/jsc/systems/supercomputers/apply-for-computing-time]]
37 37  * Italy: [[CINECA>>https://www.hpc.cineca.it/hpc-access/access-cineca-resources/iscra-projects/]]
... ... @@ -40,13 +40,8 @@
40 40  In most cases, a prerequisite for the national calls is that the PI be affiliated with an academic institution located in the same country as the hosting site.
41 41  
42 42  
43 -= (% style="font-family:inherit" %)How to Apply for HPC Resources(%%) =
36 += Contact =
44 44  
45 -Applying for HPC resources for both clusters (Galileo100 and JUSUF) is straightforward and requires only a technical review. Use the provided [[template>>https://drive.ebrains.eu/f/46b93975bde64a2e92e9/]] and send it to [[base-infra-resources@ebrains.eu>>mailto:base-infra-resources@ebrains.eu]].
46 -
47 -
48 -= (% style="font-family:inherit" %)Contact(%%) =
49 -
50 50  For further information concerning HPC resources, please contact us at [[base-infra-resources@ebrains.eu>>mailto:base-infra-resources@ebrains.eu]].
51 51  )))
52 52  
Collaboratory.Apps.Collab.Code.CollabClass[0]
Public
... ... @@ -1,1 +1,1 @@
1 -No
1 +Yes
XWiki.XWikiRights[3]
Allow/Deny
... ... @@ -1,0 +1,1 @@
1 +Allow
Levels
... ... @@ -1,0 +1,1 @@
1 +view
Users
... ... @@ -1,0 +1,1 @@
1 +XWiki.XWikiGuest
XWiki.XWikiRights[4]
Allow/Deny
... ... @@ -1,0 +1,1 @@
1 +Allow
Groups
... ... @@ -1,0 +1,1 @@
1 +XWiki.XWikiAllGroup
Levels
... ... @@ -1,0 +1,1 @@
1 +view