Changes for page HPC Resources in EBRAINS

Last modified by korculanin on 2025/01/29 10:46

From version 23.1
edited by korculanin
on 2024/09/16 09:35
Change comment: There is no comment for this version
To version 33.1
edited by korculanin
on 2024/09/16 14:57
Change comment: There is no comment for this version

Summary

Details

Page properties
Title
... ... @@ -1,1 +1,1 @@
1 -HPC Resources
1 +HPC Resources in EBRAINS
Content
... ... @@ -1,13 +11,3 @@
1 -(% class="jumbotron" %)
2 -(((
3 -(% class="container" %)
4 -(((
5 -= HPC Resources =
6 -
7 -for EBRAINS Users
8 -)))
9 -)))
10 -
11 11  (% class="row" %)
12 12  (((
13 13  (% class="col-xs-12 col-sm-8" %)
... ... @@ -19,7 +19,7 @@
19 19  * Scalable Computing Services, massively parallel HPC systems that can, e.g., be used for large-scale simulations or heavy machine-learning workloads.
20 20  * Interactive Computing Services that allow the use of high-end single compute servers, e.g., to analyse and visualise data interactively or to connect to running simulations, which use scalable computing services.
21 21  
22 -= (% style="font-family:inherit" %)HPC Resources Available for EBRAINS 2.0(%%) =
12 += HPC Resources Available for EBRAINS 2.0 =
23 23  
24 24  Currently available HPC resources for EBRAINS users at:
25 25  
... ... @@ -26,11 +26,15 @@
26 26  * CINECA: Galileo100
27 27  ** 3 M core-h, for EBRAINS members (developers) only
28 28  * JSC: JUSUF Cluster (Available through at least 03/25)
29 -** core-h according to availability 
30 -
19 +** core-h according to availability
31 31  
32 -Further HPC resources are available through regular national calls at:
21 += How to Apply for HPC Resources =
33 33  
23 +Applying for HPC resources for both clusters (Galileo100 and JUSUF) is straightforward and requires only a technical review. Use the provided [[template>>https://drive.ebrains.eu/f/46b93975bde64a2e92e9/]] and send it to [[base-infra-resources@ebrains.eu>>mailto:base-infra-resources@ebrains.eu]].
24 +
25 +
26 +==== Further HPC resources are available through regular national calls at: ====
27 +
34 34  * France: [[CEA>>https://www.edari.fr]]
35 35  * Germany: [[JSC>>https://www.fz-juelich.de/en/ias/jsc/systems/supercomputers/apply-for-computing-time]]
36 36  * Italy: [[CINECA>>https://www.hpc.cineca.it/hpc-access/access-cineca-resources/iscra-projects/]]
... ... @@ -39,11 +39,6 @@
39 39  In most cases, a prerequisite for the national calls is that the PI be affiliated with an academic institution located in the same country as the hosting site.
40 40  
41 41  
42 -= (% style="font-family:inherit" %)How to Apply for HPC Resources(%%) =
43 -
44 -Applying for HPC resources for both clusters (Galileo100 and JUSUF) is straightforward and requires only a technical review. Use the provided [[template>>https://drive.ebrains.eu/f/46b93975bde64a2e92e9/]] and send it to [[base-infra-resources@ebrains.eu>>mailto:base-infra-resources@ebrains.eu]].
45 -
46 -
47 47  = Contact =
48 48  
49 49  For further information concerning HPC resources, please contact us at [[base-infra-resources@ebrains.eu>>mailto:base-infra-resources@ebrains.eu]].
Collaboratory.Apps.Collab.Code.CollabClass[0]
Public
... ... @@ -1,1 +1,1 @@
1 -No
1 +Yes
XWiki.XWikiRights[3]
Allow/Deny
... ... @@ -1,0 +1,1 @@
1 +Allow
Levels
... ... @@ -1,0 +1,1 @@
1 +view
Users
... ... @@ -1,0 +1,1 @@
1 +XWiki.XWikiGuest
XWiki.XWikiRights[4]
Allow/Deny
... ... @@ -1,0 +1,1 @@
1 +Allow
Groups
... ... @@ -1,0 +1,1 @@
1 +XWiki.XWikiAllGroup
Levels
... ... @@ -1,0 +1,1 @@
1 +view