moved to https://docs.hpc.taltech.ee not changed to rocky yet

Visualization




Accessing the visualization system


The visualization computer viz.hpc.taltech.ee can be accessed by SSH from within the university network, from FortiVPN or using a jump-host:

ssh viz.hpc.taltech.ee
ssh -J base.hpc.taltech.ee viz

Access to viz is limited to SSH keys; password login is disabled. Therefore your SSH key must first be copied to base. More can be found here.
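A typical key setup might look like the following (a sketch assuming an OpenSSH client; replace `user` with your own username):

```shell
# Copy your public key to base (password login still works there)
ssh-copy-id user@base.hpc.taltech.ee

# Optional: configure the jump host in ~/.ssh/config so that
# "ssh viz" works directly from outside the cluster
cat >> ~/.ssh/config <<'EOF'
Host viz
    HostName viz.hpc.taltech.ee
    ProxyJump base.hpc.taltech.ee
EOF
```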

The base home directories are shared with viz.




Using GUI software remotely


The visualization software can be executed directly on the visualization node of the cluster, which removes the need to copy the data for analysis. There are several possibilities to use graphical programs remotely on a Linux/Unix server:

  • X-forwarding

  • Xpra

  • VNC

  • X2GO

  • VirtualGL

At least one of these methods should work for any user; which one depends on the configuration of the client computer (your desktop/laptop).

These methods also have different requirements regarding the client software that needs to be installed on your computer:
  • SSH, e.g. PuTTY on Windows (essential)

  • X-server (essential for X-forwarding, Xpra and VirtualGL; not needed for X2GO and VNC); installed by default on Linux, on Windows use Xming or VcXsrv, on macOS XQuartz

  • Xpra

  • TightVNCViewer

  • VirtualGL


SSH is essential on all platforms; an X-server and VirtualGL are highly recommended, and Xpra and VNC are recommended to have on the client computer.

Window manager or Desktop environment for remote use

The default window manager for VNC and X2GO is fvwm2, a configurable, lightweight and fast window manager.

You can get a context menu by clicking on the background (the left, middle and right mouse buttons give different menus). By the way, a nice X11 feature is easy copy-and-paste: marking text with the left mouse button automatically puts the selection into a copy buffer, and clicking the middle mouse button pastes it at the current mouse cursor position. No ctrl+c, ctrl+v necessary.

Within VNC or X2GO you are running a complete desktop session. Typical modern desktop environments require a lot of memory just for the desktop itself! For this reason only resource-efficient window managers are installed: jwm, fvwm2, awesome, lwm, fluxbox, blackbox, xmonad and tritium.

Software to be automatically started can be configured in .xsession (or .vnc/xstartup or .xsessionrc-x2go).
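As an illustration, a minimal `~/.vnc/xstartup` might start a terminal and the fvwm2 window manager (a sketch; adapt the programs to your session):

```shell
#!/bin/sh
# Minimal VNC session start-up: one terminal and the fvwm2 window manager
xsetroot -solid grey      # plain background
xterm &                   # a terminal to work in
exec fvwm2                # window manager; keeps the session alive
```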

Available Visualization software on compute nodes

  • ParaView

  • VisIt

  • Py-MayaVi

  • RasMol

  • VESPA


To use the GPU, run vglrun paraview.

Available Visualization software on viz

  • ParaView (native install, just run paraview, to use GPU run vglrun paraview)

  • VisIt (run /usr/local/VisIt/bin/visit)

  • COVISE (run /usr/local/covise/bin/covise)

  • MayaVi (native install, just run mayavi2)

  • OpenDX (native install, just run dx)

  • RasMol (native install, just run rasmol or rasmol-gtk)


In-situ visualization (in preparation)

In-situ visualization creates the visualization during the simulation instead of during the post-processing phase. The simulation code needs to be connected to in-situ visualization libraries, e.g. Catalyst (ParaView), LibSim (VisIt) and Ascent.
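The coupling pattern these libraries implement can be sketched in plain Python (a toy illustration only; `render_in_situ` stands in for a real Catalyst/Ascent call, and the "solver" is a trivial smoothing loop):

```python
# Minimal sketch of the in-situ pattern: the solver calls a
# visualization hook every few time steps instead of writing
# files for a separate post-processing phase.

def render_in_situ(step, field):
    """Stand-in for a call into Catalyst/Ascent: summarize the field."""
    return f"step {step}: min={min(field):.2f} max={max(field):.2f}"

def simulate(n_steps, viz_interval=2):
    field = [float(i) for i in range(8)]          # toy 1-D "solution"
    frames = []
    for step in range(n_steps):
        # toy update: neighbour smoothing with periodic boundaries
        field = [0.5 * (field[i - 1] + field[(i + 1) % len(field)])
                 for i in range(len(field))]
        if step % viz_interval == 0:              # the in-situ hook
            frames.append(render_in_situ(step, field))
    return frames

frames = simulate(6)   # visualization produced at steps 0, 2 and 4
```

The key point is that the hook runs inside the time loop, while the data is still in memory, which is exactly what Catalyst, LibSim and Ascent provide for real solvers.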

The following are installed on our cluster:

  • [Catalyst](https://www.paraview.org/hpc-insitu/)

  • [Ascent](https://github.com/Alpine-DAV/ascent)

  • LibSim

  • SENSEI

Ascent on all nodes (except amp*, viz)

module load rocky8-spack
module load ascent

Ascent on amp (CUDA accelerated)

module load amp-spack/0.19.0
module load ascent/0.8.0-gcc-9.3.0-txjp

Catalyst on amp (OSMESA)

module load amp-spack/0.19.0
module load catalyst/5.6.0-gcc-9.3.0-python-3.9.13-q64g 

Catalyst can be used within OpenFOAM and [NEK5000](https://github.com/KTH-Nek5000/InSituPackage) simulations.