Institute for Advanced Simulation (IAS)
JSC introduced a new model for using the supercomputing systems in November 2018. The new model overcomes well-known obstacles in using the systems, such as handling several user accounts and sharing data between project members or even between different projects.
To achieve these goals, the new model is user-centered: each user has only one account on the systems, and all assigned projects can be accessed via this account. Particular attention was paid to data handling: user space is now separated from project space, which allows all project members to work on the same data. An important step for future needs is also the introduction of data projects. A data project is not connected to a CPU budget like the familiar compute projects; instead, it provides access to further storage layers. Data projects are independent of compute projects and allow data to be shared even between communities using different compute projects.
After you log in to a supercomputer, you will find yourself in a mostly empty $HOME directory. Each system now has its own $HOME directory. The shared subfolder links to the same directory on all systems (to share common files). The new $HOME is intended only for basic configuration (vim, Emacs, etc.); only a limited data quota is available.
Command to get an overview of all your projects (compute (C) and data (D)) and budgets:
jutil user projects -o columns
Your old $HOME data can now be found in:
$PROJECT_<name of compute project> e.g. $PROJECT_cjsc
Please note: for this transferred data, all access rights have been kept as before. You might want to set the setgid bit: chmod -R g+s <dir> (see the short sketch below). This ensures that data created in this directory is assigned to the project group instead of the group juser.
All other files or directories you create directly in $PROJECT_<name of compute project> are created with permissions -rwxrws---, as expected in the project directory.
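For example, a short sketch (the project name cjsc is taken from the example above; old_home_data is a placeholder for a transferred directory):

cd $PROJECT_cjsc
ls -l old_home_data
chmod -R g+s old_home_data

After setting the setgid bit, new files created inside that directory inherit the project group.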
Your old $WORK data can now be found in:
$SCRATCH_<name of compute project> e.g. $SCRATCH_cjsc
Your old $ARCH data can now be found in:
$ARCHIVE_<name of data project> e.g. $ARCHIVE_jsc
Please note regarding the shell variables for institutes with "-" in their name: for technical reasons the "-" is replaced by "_" in the variable name, e.g. ias-1 results in $PROJECT_ias_1, $SCRATCH_ias_1, $ARCHIVE_ias_1.
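For example, for a member of the institute project ias-1 mentioned above,

echo $PROJECT_ias_1

prints the path of the corresponding project directory.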
If you prefer you can also set a default $PROJECT and $SCRATCH by using:
jutil env activate -p <name of compute project>
and a default $ARCHIVE by using
jutil env activate -p <name of data project>
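A minimal sketch, assuming membership in the compute project cjsc used in the examples above:

jutil env activate -p cjsc
cd $PROJECT

After activation, $PROJECT and $SCRATCH point to the directories of cjsc without the project suffix; the same command with a data project name sets $ARCHIVE accordingly.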
You have to add -A/--account <name of budget> to your preferred SLURM submission command:
sbatch -A <name of budget> …
salloc -A <name of budget> …
Or directly within your jobscript by adding:
#SBATCH --account=<name of budget>
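A minimal job script sketch (the budget name, node count, wall time, and the program name my_app are placeholders):

#!/bin/bash
#SBATCH --account=<name of budget>
#SBATCH --nodes=1
#SBATCH --time=00:30:00
srun ./my_app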
If you prefer you can also set a default budget by using:
jutil env activate -p <name of compute project> -A <name of budget>
Date | News
---|---
04.12.2018 | Switch to the new model finalized.
27.11.2018 | Slides of the SC introduction course regarding the new usage model.
19.11.2018 | User tool jutil described.
29.10.2018 | Maintenance announcement: user information by email.
22.10.2018 | Maintenance announcement for Nov 30 - Dec 4.
17.10.2018 | Link to the JuDoor portal documentation included.
17.09.2018 | JSC news article published.
11.09.2018 | Documentation online.
The concept can be most easily explained by a comparison between the old and the current model:
Comparison between old and new usage model
As the comparison between the old and the new model shows, the file systems are now clearly assigned: $PROJECT and $SCRATCH belong to compute projects, while access to the familiar $ARCH is now handled via data projects. Additionally, there is no project data in the user's home anymore.
To achieve this, a data migration was needed. Users did not need to perform it themselves; it was done for them. But you should know how we proceeded:
Each compute project got a directory in the new SCRATCH file system. Data from WORK has been copied to $SCRATCH/<old_account>.
All users automatically got a new account for access to the systems; they were informed about the new account name after the transition. You can simply log in with this new account name as before, because all SSH keys have been transferred automatically to that account.
ACLs set by users were not transferred; access is now handled via project membership.
The maintenance to apply the new usage model took place from 30 November, starting at 08:00, until 4 December. Please note that queued jobs were cancelled during the maintenance, as they would not have run properly afterwards.
Tools are available to help users set up their environment under the new model, e.g. to activate a certain project (i.e. export its environment variables).
When a project is activated, its environment variables are exported. As an account can now be bound to several projects, the variables are suffixed with the project name, e.g. $PROJECT_chcp01.
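For example, for an account that is a member of the compute project chcp01 mentioned above, the suffixed variables can be used directly:

echo $PROJECT_chcp01
echo $SCRATCH_chcp01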
These tasks can be performed via "jutil"; the usage of jutil is described here.
At the moment, accounting information cannot be queried via jutil; please continue to use q_cpuquota. We have introduced a new option -l for q_cpuquota, which allows querying job information for your former accounts.
At the same time, we put JuDoor, our new portal for the supercomputer services, into production.
All users got access to the portal and can manage their personal data as well as their accounts and project memberships via JuDoor. A big advantage: account creation on an SC system is performed within minutes. This is made possible by a fully online process involving the user and the project's principal investigator (PI); no paper handling is involved anymore. Please note that JuDoor is the only way to get accounts on the JSC systems. PIs can manage their project members online themselves; they can also assign a project administrator (PA) who handles account management for them. The PA assignment can also be done in the portal itself.
Please have a look at the JuDoor Documentation.
Contact:
SC-Support Team
Jülich Supercomputing Centre
Forschungszentrum Jülich
52425 Jülich
Germany
Phone: 02461 61-2828
E-Mail: sc@fz-juelich.de