
Accessing the Clusters

Shell Access

All IT4Innovations clusters are accessed via the SSH protocol through login nodes loginX at the address cluster-name.it4i.cz. A specific login node may be addressed by prepending its name to the cluster address.

Note

The alias cluster-name.it4i.cz is currently not available through VPN connection. Use loginX.cluster-name.it4i.cz when connected to VPN.
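For example, to reach the first login node while connected through the VPN (username is a placeholder for your login):

$ ssh username@login1.cluster-name.it4i.cz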

Anselm Cluster

Login address          Port  Protocol  Login node
anselm.it4i.cz         22    ssh       round-robin DNS record for login[1-2]
login1.anselm.it4i.cz  22    ssh       login1
login2.anselm.it4i.cz  22    ssh       login2

Salomon Cluster

Login address           Port  Protocol  Login node
salomon.it4i.cz         22    ssh       round-robin DNS record for login[1-4]
login1.salomon.it4i.cz  22    ssh       login1
login2.salomon.it4i.cz  22    ssh       login2
login3.salomon.it4i.cz  22    ssh       login3
login4.salomon.it4i.cz  22    ssh       login4

Authentication

Authentication is by private key only. Verify the SSH fingerprints during your first login:

Anselm:

    md5:
    29:b3:f4:64:b0:73:f5:6f:a7:85:0f:e0:0d:be:76:bf (DSA)
    d4:6f:5c:18:f4:3f:70:ef:bc:fc:cc:2b:fd:13:36:b7 (RSA)
    1a:19:75:31:ab:53:45:53:ce:35:82:13:29:e4:0d:d5 (ECDSA)

    sha256:
    LX2034TYy6Lf0Q7Zf3zOIZuFlG09DaSGROGBz6LBUy4 (DSA)
    +DcED3GDoA9piuyvQOho+ltNvwB9SJSYXbB639hbejY (RSA)
    2Keuu9gzrcs1K8pu7ljm2wDdUXU6f+QGGSs8pyrMM3M (ECDSA)

Salomon:

    md5:
    f6:28:98:e4:f9:b2:a6:8f:f2:f4:2d:0a:09:67:69:80 (DSA)
    70:01:c9:9a:5d:88:91:c7:1b:c0:84:d1:fa:4e:83:5c (RSA)
    66:32:0a:ef:50:01:77:a7:52:3f:d9:f8:23:7c:2c:3a (ECDSA)

    sha256:
    epkqEU2eFzXnMeMMkpX02CykyWjGyLwFj528Vumpzn4 (DSA)
    WNIrR7oeQDYpBYy4N2d5A6cJ2p0837S7gzzTpaDBZrc (RSA)
    cYO4UdtUBYlS46GEFUB75BkgxkI6YFQvjVuFxOlRG3g (ECDSA)

Note

SSH fingerprints are identical on all login nodes.
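One way to check a fingerprint before trusting the host is to fetch its public key with ssh-keyscan and hash it with ssh-keygen (a sketch assuming a standard OpenSSH installation; the -E option selects the hash algorithm):

$ ssh-keyscan -t rsa salomon.it4i.cz > salomon-rsa.pub
$ ssh-keygen -l -E md5 -f salomon-rsa.pub
$ ssh-keygen -l -E sha256 -f salomon-rsa.pub

Compare the printed fingerprints with the values listed above.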

Private key authentication:

On Linux or Mac, use:

$ ssh -i /path/to/id_rsa username@cluster-name.it4i.cz

If you see the warning message UNPROTECTED PRIVATE KEY FILE!, use this command to restrict the permissions on the private key file:

$ chmod 600 /path/to/id_rsa
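To avoid typing the key path and username on every connection, you may add an entry to your OpenSSH client configuration in ~/.ssh/config (a sketch; the Host alias, username, and key path are placeholders):

Host cluster
    HostName cluster-name.it4i.cz
    User username
    IdentityFile /path/to/id_rsa

Then connect simply with:

$ ssh cluster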

On Windows, use the PuTTY SSH client.

After logging in, you will see the command prompt:

  ___   _____   _  _     ___                                           _     _
 |_ _| |_   _| | || |   |_ _|  _ __    _ __     ___   __   __   __ _  | |_  (_)   ___    _ __    ___
  | |    | |   | || |_   | |  | '_ \  | '_ \   / _ \  \ \ / /  / _` | | __| | |  / _ \  | '_ \  / __|
  | |    | |   |__   _|  | |  | | | | | | | | | (_) |  \ V /  | (_| | | |_  | | | (_) | | | | | \__ \
 |___|   |_|      |_|   |___| |_| |_| |_| |_|  \___/    \_/    \__,_|  \__| |_|  \___/  |_| |_| |___/

                                     http://www.it4i.cz/?lang=en

Last login: Tue Jul 9 15:57:38 2013 from your-host.example.com
[username@login2.cluster-name ~]$

Note

The environment is not shared between login nodes, except for shared filesystems.

Data Transfer

Data in and out of the system may be transferred by the scp and sftp protocols.

Anselm Cluster

Address                Port  Protocol
anselm.it4i.cz         22    scp
login1.anselm.it4i.cz  22    scp
login2.anselm.it4i.cz  22    scp

Salomon Cluster

Address                 Port  Protocol
salomon.it4i.cz         22    scp, sftp
login1.salomon.it4i.cz  22    scp, sftp
login2.salomon.it4i.cz  22    scp, sftp
login3.salomon.it4i.cz  22    scp, sftp
login4.salomon.it4i.cz  22    scp, sftp

Authentication is by private key only.

Note

If you experience degraded data transfer performance, consult your local network provider.

$ scp -i /path/to/id_rsa my-local-file username@cluster-name.it4i.cz:directory/file
$ scp -i /path/to/id_rsa -r my-local-dir username@cluster-name.it4i.cz:directory

or

$ sftp -o IdentityFile=/path/to/id_rsa username@cluster-name.it4i.cz
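To copy data from the cluster back to your workstation with scp, reverse the source and destination arguments (the remote path is illustrative):

$ scp -i /path/to/id_rsa username@cluster-name.it4i.cz:directory/file my-local-file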

A very convenient way to transfer files in and out of the cluster is the FUSE filesystem sshfs:

$ sshfs -o IdentityFile=/path/to/id_rsa username@cluster-name.it4i.cz:. mountpoint
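Once mounted, the remote filesystem behaves like a local directory. To unmount it on Linux, assuming the standard FUSE utilities are installed, run:

$ fusermount -u mountpoint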

Learn more about ssh, scp, and sshfs by reading the manpages:

$ man ssh
$ man scp
$ man sshfs

On Windows, use the WinSCP client to transfer data. The win-sshfs client provides a way to mount the cluster filesystems directly as an external disk.

More information about the shared file systems is available here.

Connection Restrictions

Outgoing connections from cluster login nodes to the outside world are restricted to the following ports:

Port  Protocol
22    ssh
80    http
443   https
9418  git

Note

Use ssh port forwarding and proxy servers to connect from the cluster to any other remote port.

Outgoing connections from cluster compute nodes are restricted to the internal network. Direct connections from compute nodes to the outside world are blocked.

Port Forwarding

Port Forwarding From Login Nodes

Note

Port forwarding allows an application running on the cluster to connect to arbitrary remote hosts and ports.

It works by tunneling the connection from the cluster back to the user's workstation and forwarding it from the workstation to the remote host.

Pick some unused port on the cluster login node (for example 6000) and establish the port forwarding:

$ ssh -R 6000:remote.host.com:1234 cluster-name.it4i.cz

In this example, we establish port forwarding between port 6000 on the cluster and port 1234 on remote.host.com. By accessing localhost:6000 on the cluster, an application will see the response of remote.host.com:1234. The traffic will run via the user's local workstation.
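To verify the tunnel from a shell on the cluster, query the forwarded port; curl is shown here under the assumption that remote.host.com:1234 speaks HTTP:

$ curl http://localhost:6000/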

Port forwarding may be done using PuTTY as well. On the PuTTY Configuration screen, load your cluster configuration first. Then go to Connection->SSH->Tunnels to set up the port forwarding. Click the Remote radio button, insert 6000 into the Source port textbox and remote.host.com:1234 into the Destination textbox. Click the Add button, then Open.

Port forwarding may also be established directly to the remote host. However, this requires that the user has ssh access to remote.host.com:

$ ssh -L 6000:localhost:1234 remote.host.com

Note

Port number 6000 is chosen as an example only. Pick any free port.

Port Forwarding From Compute Nodes

Remote port forwarding from compute nodes allows applications running on the compute nodes to access hosts outside the cluster.

First, establish remote port forwarding from the login node, as described above.

Second, invoke port forwarding from the compute node to the login node. Insert the following line into your jobscript or interactive shell:

$ ssh -TN -f -L 6000:localhost:6000 login1

In this example, we assume that port forwarding from login1:6000 to remote.host.com:1234 has been established beforehand. By accessing localhost:6000, an application running on a compute node will see the response of remote.host.com:1234.
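Putting both steps together, a jobscript might contain something like the following sketch (my_app and its --server option are hypothetical placeholders for your application):

#!/bin/bash
# forward port 6000 on this compute node to port 6000 on login1;
# login1:6000 must already forward to remote.host.com:1234
ssh -TN -f -L 6000:localhost:6000 login1
# the application now reaches remote.host.com:1234 via localhost:6000
my_app --server localhost:6000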

Using Proxy Servers

Port forwarding is static: each port is mapped to a particular port on a particular remote host. Connecting to a different remote host requires a new forward.

Note

Applications with built-in proxy support gain unrestricted access to remote hosts via a single proxy server.

To establish a local proxy server on your workstation, install and run SOCKS proxy server software. On Linux, the ssh client in combination with the sshd daemon provides this functionality. To establish a SOCKS proxy server listening on port 1080, run:

$ ssh -D 1080 localhost

On Windows, install and run the free, open source Sock Puppet server.

Once the proxy server is running, establish ssh port forwarding from the cluster to the proxy server on port 1080, exactly as described above:

$ ssh -R 6000:localhost:1080 cluster-name.it4i.cz

Now configure your application's proxy settings to localhost:6000. Use port forwarding to access the proxy server from compute nodes as well.
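For instance, a SOCKS-aware tool such as curl can then reach an outside host through the tunnel (example.com is a placeholder):

$ curl --socks5 localhost:6000 http://example.com/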

Graphical User Interface

VPN Access

  • Access IT4Innovations internal resources via VPN.
