Monitoring through persistent SSH TCP Connections

(Duffkess) #1

Hi Everybody,

I just had the requirement to monitor some devices via SSH commands (the only option I have, because the systems are closed appliances where I can't configure anything).
For speed purposes I came across this post:
and I was wondering if it would be simpler and easier just to use this in my SSH configuration:

Host *
   ControlMaster auto
   ControlPath ~/.ssh/master-socket/%r@%h:%p
   ControlPersist 60s

So that when a new SSH connection is created and no socket exists yet, it automatically turns that connection into a new master.
Does anyone have experience with this and has used it in production environments?
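With that `Host *` block in place no wrapper logic is needed; the first connection becomes the master automatically. A quick way to see it working (the hostname and user are illustrative):

```shell
# ControlPath directories are not created by ssh itself,
# so the socket directory must exist before the first connection:
mkdir -p ~/.ssh/master-socket

# The first call creates ~/.ssh/master-socket/<user>@<host>:<port> and,
# thanks to ControlPersist, keeps the TCP session open for 60s after
# the last client exits:
ssh monitor@appliance01 uptime

# Later calls multiplex over the same socket; you can ask the master
# whether it is still alive via the control socket:
ssh -O check monitor@appliance01
```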

Thanks for any input.

(Duffkess) #2

I played around a little bit and found a well-working solution for the problem.

I've written a small script that runs SSH commands and checks the output.

The important thing is:

I'm using a specific path for the SSH sockets, let's say /etc/.ssh/sockets. In the script I check whether user@host:port exists in that path; if not, it is a new connection, so I use:

ssh -oControlMaster=yes -oControlPath=/etc/.ssh/sockets/user@host:port -oControlPersist=60s user@host 'remote command'

and if the socket exists, reuse the existing connection:

ssh -oControlMaster=no -oControlPath=/etc/.ssh/sockets/user@host:port user@host 'remote command'

This works perfectly for my scenario. ControlPersist keeps the SSH TCP session open for the next 60 seconds; every call that uses the socket restarts that timer. The SSH session only drops when there is no call for more than 60 seconds.

In the script you should of course build the socket path dynamically; I'm passing host, user, and port via Icinga2 custom vars.
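The steps above can be sketched as a small wrapper (a minimal sketch: the /etc/.ssh/sockets layout and the existence check follow the description above, but the function names and the `run_check` helper are illustrative, not the author's actual script):

```shell
#!/bin/sh
# Directory holding one control socket per user@host:port combination.
SOCKET_DIR=/etc/.ssh/sockets

# Build the per-connection socket path from the custom vars.
socket_path() {  # $1=user  $2=host  $3=port
    printf '%s/%s@%s:%s\n' "$SOCKET_DIR" "$1" "$2" "$3"
}

# Decide whether this call must open a new master (socket missing)
# or can reuse an existing one (socket present), as described above.
control_master_mode() {  # $1=socket path
    if [ -e "$1" ]; then echo no; else echo yes; fi
}

# Run one remote command, multiplexing over the socket when possible.
run_check() {  # $1=user  $2=host  $3=port  $4=remote command
    sock=$(socket_path "$1" "$2" "$3")
    mode=$(control_master_mode "$sock")
    ssh -oControlMaster="$mode" -oControlPath="$sock" -oControlPersist=60s \
        -p "$3" "$1@$2" "$4"
}
```

A monitoring check then boils down to something like `run_check monitor appliance01 22 uptime`: the first call opens the master, and any call within the persist window reuses it.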

You should play with the timings to see what is feasible for you.