Shell – error handling in expect

expect · linux · shell-script

I am updating my question because this is what I have achieved so far:

set username [lindex $argv 0]
set password [lindex $argv 1]
set hostname [lindex $argv 2]

if {[llength $argv] != 3} {
  send_user "Usage: scriptname username \'password\' hostname\n"
  exit 1
}

send_user "\n#####\n# $hostname\n#####\n"

spawn ssh -q -o StrictHostKeyChecking=no $username@$hostname

expect {
  timeout { send_user "\nFailed to get password prompt\n"; exit 1 }
  eof { send_user "\nSSH failure for $hostname\n"; exit 1 }
  "Password: "
}

send "$password\r"


expect {
  timeout { send_user "\nLogin failed. Password incorrect.\n"; exit 1}
  "{severname:mike} "
}
send "ls -lrt\n"

expect {
       "{severname:mike} " {
         send "uname\n"
       }
}
expect "{severname:mike} "

send "exit\r"
close

I hope I am correct so far, but how do I get the output of the commands logged locally into error logs and success logs, where command is a list of commands like ls, ls -lrt?

Moreover, when I put this:

 spawn ssh -q -o StrictHostKeyChecking=no $username@$hostname {ls -lrt;df -h}

it logs in and executes ls -lrt;df -h, but after that it throws an error (output shown below, with debugging enabled). The connection closes, perhaps because the commands were run as part of the ssh invocation itself.

[root@testgfs2 final]# ./workingscript.exp  mike bar01 10.38.164.103

#####
# 10.38.164.103
#####
spawn ssh -q -o StrictHostKeyChecking=no mike@10.38.164.103 ls -lrt
parent: waiting for sync byte
parent: telling child to go ahead
parent: now unsynchronized from child
spawn: returns {19901}

expect: does "" (spawn_id exp6) match glob pattern "Password: "? no
Password:
expect: does "Password: " (spawn_id exp6) match glob pattern "Password: "? yes
expect: set expect_out(0,string) "Password: "
expect: set expect_out(spawn_id) "exp6"
expect: set expect_out(buffer) "Password: "
send: sending "bar01\r" to { exp6 }

expect: does "" (spawn_id exp6) match glob pattern "{severname:mike} "? no


expect: does "\r\n" (spawn_id exp6) match glob pattern "{severname:mike} "? no
total 6
-rw-r--r--   1 mike   other        136 Feb 26 08:39 local.cshrc
-rw-r--r--   1 mike   other        157 Feb 26 08:39 local.login
-rw-r--r--   1 mike   other        174 Feb 26 08:39 local.profile

expect: does "\r\ntotal 6\r\n-rw-r--r--   1 mike   other        136 Feb 26 08:39 local.cshrc\r\n-rw-r--r--   1 mike   other        157 Feb 26 08:39 local.login\r\n-rw-r--r--   1 mike   other        174 Feb 26 08:39 local.profile\r\n" (spawn_id exp6) match glob pattern "{severname:mike} "? no
expect: read eof
expect: set expect_out(spawn_id) "exp6"
expect: set expect_out(buffer) "\r\ntotal 6\r\n-rw-r--r--   1 mike   other        136 Feb 26 08:39 local.cshrc\r\n-rw-r--r--   1 mike   other        157 Feb 26 08:39 local.login\r\n-rw-r--r--   1 mike   other        174 Feb 26 08:39 local.profile\r\n"
write() failed to write anything - will sleep(1) and retry...
send: sending "ls -lrt\n" to { exp6 send: spawn id exp6 not open
    while executing
"send "ls -lrt\n""
    (file "./workingscript.exp" line 30)

The command variable contains a list of commands separated by semicolons, like this:

   command 1; command 2; command 3    (this is the value of $command)

I don't know how I can execute them one by one inside the expect script. So, instead of putting the commands in the ssh invocation, I could do something like:

spawn ssh -q -o StrictHostKeyChecking=no $username@$hostname

IFS=';'
for i in $command
do
expect {
   "{severname:mike} " {
           send "$i\n"
           }  2>> errorlog 1>> succeslog
    }
done

I have just combined Unix and Expect syntax here to make clear what I am trying to do.

Best Answer

Expect's model of interaction does not include a separation of stdout and stderr streams. When you spawn a process, you create a new pseudo-TTY which behaves almost identically to what a user would see in their terminal, and a user cannot typically distinguish between the two. Try it:

$ echo "This is stdout"
This is stdout
$ echo "This is stderr" 1>&2
This is stderr

Without doing something to label the difference, Expect cannot distinguish them (just like a user on the command line). So Expect may be the wrong tool for what you need to do.
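To see the same thing from Expect's side, here is a minimal sketch (it only assumes a local /bin/sh): both lines end up in expect_out(buffer), and nothing in the buffer tells you which stream each one came from.

#!/usr/bin/expect -f
# Both streams of the spawned child arrive on the same pseudo-TTY,
# so the buffer holds them interleaved.
spawn sh -c {echo "This is stdout"; echo "This is stderr" 1>&2}
expect eof
send_user -- $expect_out(buffer)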

Now, without knowing exactly why you need this, it's hard to come up with an alternative method for you. For what you describe, it looks like you could accomplish it with a single ssh command (note that this assumes key-based authentication is set up, since ssh run this way will still prompt for the password on the terminal):

ssh -q -o StrictHostKeyChecking=no $username@$hostname "ls -lrt; df -h" 1>>successlog 2>>errorlog

For another scripting language with built-in support for spawning a child process and getting its stdout and stderr separately, try Python and its subprocess module.

import shlex
from subprocess import Popen, PIPE

# $username@$hostname is a placeholder -- substitute the real values before running
cmd = Popen(shlex.split('ssh -q -o StrictHostKeyChecking=no $username@$hostname "ls -lrt; df -h"'),
            stdout=PIPE, stderr=PIPE)
# communicate() waits for the command to finish and returns the two streams separately
success_output, error_output = cmd.communicate()

But if you really, really want to do this in Expect, I think the best approach may be to check the exit value of the last process and to write to your error log file when it is non-zero. Here's an example of something like that:

spawn ssh -q -o StrictHostKeyChecking=no $username@$hostname
# Receive password prompt and send the password here...
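# ($prompt is assumed to hold the literal remote shell prompt string, e.g. "{severname:mike} ")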
expect {
    -ex $prompt { send "ls -lrt\r" }
    timeout {
        send_user "Everything is terrible forever.\n"
        exit 1
    }
}
expect {
    -ex $prompt {
        set output $expect_out(buffer)
        send {echo RETVAL::$?::}
        send "\r"
    }
    timeout {
        send_user "Everything is terrible forever.\n"
        exit 1
    }
}
expect {
    -ex {RETVAL::0::} {
        set fp [open successlog a]
        puts $fp $output
        close $fp
    }
    -re {RETVAL::[1-9][0-9]*::} {
        set fp [open errorlog a]
        puts $fp $output
        close $fp
    }
    timeout {
        send_user "Everything is terrible forever.\n"
        exit 1
    }
}
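
If the commands come in as one semicolon-separated string (the $command variable from the question), the same prompt/RETVAL pattern can be wrapped in a loop. This is only a sketch under the same assumptions as above: $prompt holds the remote shell prompt, the password has already been sent, and the $command value shown is just an example.

# Example value only -- in practice this would come from the script's arguments
set command "ls -lrt;df -h;uname"

foreach c [split $command ";"] {
    set c [string trim $c]
    if {$c eq ""} continue

    # Wait for a prompt, then run the next command
    expect {
        -ex $prompt { send "$c\r" }
        timeout { send_user "\nTimed out waiting for prompt\n"; exit 1 }
    }

    # Capture the command's output, then ask the shell for its exit status
    expect {
        -ex $prompt {
            set output $expect_out(buffer)
            send {echo RETVAL::$?::}
            send "\r"
        }
        timeout { send_user "\nTimed out waiting for prompt\n"; exit 1 }
    }

    # Route the captured output to the right log based on the exit status
    expect {
        -ex {RETVAL::0::} {
            set fp [open successlog a]
            puts $fp $output
            close $fp
        }
        -re {RETVAL::[1-9][0-9]*::} {
            set fp [open errorlog a]
            puts $fp $output
            close $fp
        }
        timeout { send_user "\nTimed out waiting for prompt\n"; exit 1 }
    }
}

send "exit\r"
expect eof

One caveat: $? reports the exit status in POSIX-style shells. If the remote login shell is csh-based (the local.cshrc in your listing suggests it might be), you would use $status instead.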