[Rocks-Discuss] ssh root@compute-node requires password

Greg Bruno greg.bruno at gmail.com 
Fri Jan 15 08:54:01 PST 2010
On Fri, Jan 15, 2010 at 7:35 AM, Tent Pig <tentpig at yahoo.com> wrote:
> I just finished a fresh install of Rocks 5.2 on a 500-node cluster, pxebooting all compute-nodes by hand, insert-ethers running, etc. to get a functional cluster.
>
> All of my nodes are up and running.
>
> My problem is that if I attempt, as root on the head-end, to ssh root@compute-whatever, I'm asked for a password... on every single compute node.
>
>
> This is annoyingly aggravating. I have several post-install scripts to run on each of the compute nodes, and I really don't feel like typing the root password several thousand times. It will take me months to complete this rollout, and I was supposed to be finished by the end of business today.
>
> I googled this problem and it seems pretty common, but I can't get a clear resolution. .ssh folder permissions, regening the ssh key, yada yada yada... all of which I've tried and nothing fixes it.
>
> I'm absolutely dead in the water here at the moment.
>
> Is there a simple solution to this problem? Reinstalling the compute nodes (one of the suggested solutions) doesn't seem a reasonable answer, since I just installed the compute nodes, haven't even touched them after they rebooted, and I have this issue. (Nor do I really feel like sitting there to reboot/PXEboot manually all over again.)


Here's a little background on how Rocks distributes the root user's
public key, and on SSH login in general.

To log in to a host over SSH without typing a password, the user's
public key must be in the file $HOME/.ssh/authorized_keys on the
remote host, and the user's private key must be available on the
client side (either as $HOME/.ssh/id_rsa or loaded into an ssh-agent).
You can see which private keys are currently loaded by executing:

    # ssh-add -l
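
If that shows no identities (or complains that no agent is running),
the key is not loaded. Note that if root's key has no passphrase, ssh
reads $HOME/.ssh/id_rsa directly and no agent is needed at all. To
load the key by hand, the standard OpenSSH steps (nothing
Rocks-specific) are:

    # eval $(ssh-agent)          # start an agent in this shell
    # ssh-add /root/.ssh/id_rsa  # load root's private key (asks for its passphrase, if any)
    # ssh-add -l                 # confirm the key is now listed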

In Rocks, when a user logs in and the file $HOME/.ssh/id_rsa doesn't
exist, the user is prompted to create their SSH keys. This is also
true for the root user, which is why you create SSH keys for root the
first time you log in to the frontend after it is installed. Also, in
Rocks, after your SSH keys are created, the public key is copied into
$HOME/.ssh/authorized_keys.
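
For reference, the manual equivalent of that first-login key setup is
roughly the following; these are standard OpenSSH commands, and the
exact steps Rocks runs under the hood may differ:

    # ssh-keygen -t rsa -f /root/.ssh/id_rsa                    # create the key pair
    # cat /root/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys   # authorize the new public key
    # chmod 600 /root/.ssh/authorized_keys                      # sshd (with StrictModes) refuses
                                                                # group/world-writable key files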

The root user's home directory on the frontend is not NFS-mounted on
the compute nodes, so in Rocks we distribute root's authorized_keys
file inside the kickstart file when a compute node installs.

So, if you change root's SSH keys on the frontend, the normal way
authorized_keys gets updated on the compute nodes is by reinstalling
them.
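
A quick way to confirm this is what happened to you: compare the
frontend's current public key against the authorized_keys file a
compute node received at install time (replace compute-0-0 with one
of your node names; you'll have to type the password this one time):

    # cat /root/.ssh/id_rsa.pub
    # ssh compute-0-0 'cat /root/.ssh/authorized_keys'

If the frontend's public key doesn't appear in the node's file, the
keys changed after the node was installed.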

But the good news is, I believe there is a workaround.

On the frontend, first remove root's existing SSH keys:

    # rm -f /root/.ssh/id_rsa*

Then log out and log back in. This will prompt you to generate new
SSH keys for root.
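
After logging back in, it's worth sanity-checking that the new key
pair exists and that the new public key is the one in authorized_keys;
the fingerprints below should match (assuming authorized_keys now
holds just the one key):

    # ssh-keygen -l -f /root/.ssh/id_rsa.pub       # fingerprint of the new public key
    # ssh-keygen -l -f /root/.ssh/authorized_keys  # should print the same fingerprint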

Now, let's distribute root's authorized_keys file with 411:

    # cd /var/411

Edit 'Files.mk' and change the commented example line (here the
leading '#' is a comment marker inside Files.mk, not a shell prompt):

    # FILES += /my/file

to:

    FILES += /root/.ssh/authorized_keys
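
If you'd rather script this step, appending the line at the end of
Files.mk should have the same effect as editing the commented example
in place, since Files.mk is just a make fragment:

    # echo 'FILES += /root/.ssh/authorized_keys' >> /var/411/Files.mk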

Now tell 411 to push all the files under its control out to the compute nodes:

    # rocks sync users

Now try to log in to a compute node; you should no longer be asked
for a password.
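
To check a node without risking a password prompt, use ssh's
BatchMode, which fails instead of asking:

    # ssh -o BatchMode=yes compute-0-0 hostname

And to sweep all 500 nodes at once (this loop assumes the usual
'rocks list host' output, with a header row and each hostname
followed by a colon):

    # for h in $(rocks list host compute | tail -n +2 | cut -d: -f1); do
          ssh -o BatchMode=yes $h true && echo "$h ok" || echo "$h FAILED"
      done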
