Changes for page User documentation

Last modified by alexisdurieux on 2022/01/27 15:00

From version 1.1
edited by evareill
on 2021/04/28 11:57
Change comment: There is no comment for this version
To version 4.1
edited by alexisdurieux
on 2022/01/27 14:42
Change comment: There is no comment for this version

Summary

Details

Page properties
Author
... ... @@ -1,1 +1,1 @@
1 -XWiki.evareill
1 +XWiki.alexisdurieux
Content
... ... @@ -17,7 +17,7 @@
17 17  
18 18  Why does each service provider need a Fenix service account? The reason is that the Principal Investigator who gets the Fenix service account is legally responsible for the jobs being run on the supercomputers and for not enabling the end-user to run unintended executables.
19 19  
20 -[[image:HPC Job Proxy diagram.jpg]]
20 +[[image:Collabs.ebrains-unicore-job-proxy.User documentation.WebHome@ebrains-job-proxy-Job sequence.png]]
21 21  
22 22  == Use case ==
23 23  
... ... @@ -40,7 +40,6 @@
40 40  1. The HPC Job Proxy logs the job results, and pushes the actual cost of the job run for Bob to the EBRAINS Quota Manager.
41 41  1. The HPC Job Proxy notifies the Application of the results of the job.
42 42  
43 -
44 44  == Sample transaction diagram ==
45 45  
46 46  [[image:ebrains-job-proxy-Job sequence.png]]
... ... @@ -52,7 +52,7 @@
52 52  
53 53  * Submit a job: **POST /api/jobs/**
54 54  
55 -​​​​​​​To submit a job to the proxy, you need to provide the following information as part of the POST JSON body:
54 +To submit a job to the proxy, you need to provide the following information as part of the POST JSON body:
56 56  
57 57  * **job_def** - JSON: The Unicore job definition. For more information, please visit the [[__Unicore documentation__>>url:https://sourceforge.net/p/unicore/wiki/Job_Description/]]
58 58  * **site** - string: The Fenix site on which to run the job.
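The snippet below is a minimal sketch of submitting a job to this endpoint. The base URL, the bearer-token authentication, and the response shape are illustrative assumptions, not taken from this page; JURON is used only because it appears among the site names shown below.

{{code language="python"}}
import requests

# Hypothetical values - neither the proxy base URL nor the token handling is specified on this page.
BASE_URL = "https://<hpc-job-proxy-host>"
TOKEN = "<your EBRAINS access token>"

# Minimal Unicore job definition; see the linked Unicore Job_Description page for the full schema.
payload = {
    "job_def": {
        "Executable": "/bin/echo",
        "Arguments": ["hello"],
    },
    "site": "JURON",  # one of the available Fenix sites
}

response = requests.post(
    f"{BASE_URL}/api/jobs/",
    json=payload,
    headers={"Authorization": f"Bearer {TOKEN}"},  # assumed authentication scheme
)
response.raise_for_status()
print(response.json())  # assumed to contain the identifier of the submitted job
{{/code}}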
... ... @@ -72,7 +72,6 @@
72 72  JURON
73 73  )))
74 74  
75 -
76 76  * Fetch a job's details: **GET /api/jobs/<job_id>**
77 77  
78 78  The proxy will query Unicore on-the-fly for the job’s latest details. Fenix sites retain the information of past jobs for a set number of days (30 days at CSCS). If the request is made after that period, the information returned is what has been stored in the HPC Job Proxy. The full information retrieved and stored is the following:
... ... @@ -96,9 +96,9 @@
96 96  {{/code}}
97 97  {{/info}}
98 98  
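As a usage illustration for the endpoint above, here is a hedged sketch of fetching a job's details; the base URL, token and job identifier are assumptions carried over from the submission sketch.

{{code language="python"}}
import requests

BASE_URL = "https://<hpc-job-proxy-host>"  # hypothetical, as in the submission sketch
TOKEN = "<your EBRAINS access token>"
job_id = "<id of a previously submitted job>"

# The proxy queries Unicore on-the-fly while the site still holds the job,
# and answers from its own stored record after the retention period.
response = requests.get(
    f"{BASE_URL}/api/jobs/{job_id}",
    headers={"Authorization": f"Bearer {TOKEN}"},  # assumed authentication scheme
)
response.raise_for_status()
print(response.json())
{{/code}}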
99 -* Fetch a job's file: **GET /api/jobs/<job_id>/<filename>**
97 +* Fetch a file present in the job's execution directory: **GET /api/jobs/<job_id>/<filename>**
100 100  
101 -​​​​​​​Out of convenience, you can fetch the output of a job (placed by Unicore in the job’s working directory). The filenames available are: stdout, stderr, UNICORE_SCRIPT_EXIT_CODE. If the job has pre commands or post commands, the following filenames are also available: stdout and stderr respectively under the folders .UNICORE_POST_0/ or .UNICORE_PRE_0/
99 +Out of convenience, you can fetch the files that are present in the job's execution directory. The output, error and exit code of a job are placed by Unicore in the job’s working directory. The filenames available are: stdout, stderr, UNICORE_SCRIPT_EXIT_CODE. If the job has pre commands or post commands, the following filenames are also available: stdout and stderr respectively under the folders .UNICORE_POST_0/ or .UNICORE_PRE_0/. If your job creates files, they are also available to be fetched.
102 102  
103 103  The proxy will respond with the //raw content// of the requested file. These files are available as long as the site does not delete them (30 days for CSCS PIZ DAINT).
104 104  
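For completeness, a sketch of retrieving one of the files named above (stdout in this case); as before, the base URL, token and job identifier are illustrative assumptions.

{{code language="python"}}
import requests

BASE_URL = "https://<hpc-job-proxy-host>"  # hypothetical, as in the submission sketch
TOKEN = "<your EBRAINS access token>"
job_id = "<id of a previously submitted job>"

# stdout, stderr and UNICORE_SCRIPT_EXIT_CODE are placed in the job's working directory by Unicore.
response = requests.get(
    f"{BASE_URL}/api/jobs/{job_id}/stdout",
    headers={"Authorization": f"Bearer {TOKEN}"},  # assumed authentication scheme
)
response.raise_for_status()
print(response.text)  # the proxy returns the raw content of the file
{{/code}}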
... ... @@ -106,7 +106,7 @@
106 106  
107 107  == Sample usage ==
108 108  
109 -A Jupyter Notebook provides sample code for the Application’s access to the HPC Job Proxy.
107 +[[A Jupyter Notebook provides sample code for the Application’s access to the HPC Job Proxy.>>https://lab.ch.ebrains.eu/user-redirect/lab/tree/shared/EBRAINS%20HPC%20job%20proxy/HPC_job_proxy_usage.ipynb]]
110 110  
111 111  == Source code ==
112 112