The simplest way is to store the response and compare it:
$ response=$(curl -X POST -d@myfile.txt server-URL);
$ if [ "Upload successful" = "${response}" ]; then … fi;
I haven't tested that. The syntax might be off, but that's the idea. I'm sure there are more sophisticated ways of doing it such as checking curl's exit code or something.
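One of those more sophisticated ways is curl's --write-out option, which can report the HTTP status code instead of forcing you to match the response body. A sketch (untested against a real server; SERVER_URL and myfile.txt are placeholders):

```shell
check_status() {
    # Treat any 2xx HTTP status as success.
    case $1 in
        2??) return 0 ;;
        *)   return 1 ;;
    esac
}

# SERVER_URL and myfile.txt are placeholders; set them before running.
if [ -n "${SERVER_URL:-}" ]; then
    status=$(curl -s -o /dev/null -w '%{http_code}' -X POST -d @myfile.txt "$SERVER_URL")
    if check_status "$status"; then
        echo "upload ok"
    else
        echo "upload failed with HTTP status $status" >&2
    fi
fi
```

Comparing the status code is more robust than matching body text, since the success message can change while 2xx stays 2xx.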
Update
curl returns quite a few exit codes. I'm guessing a failed POST might result in 55 ("Failed sending network data").
So you could probably just make sure the exit code was zero by comparing it to $? (which expands to the exit status of the most recently executed foreground pipeline):
$ curl -X POST -d@myfile.txt server-URL;
$ if [ 0 -eq $? ]; then … fi;
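If you want more than a yes/no, you could map a few of curl's documented exit codes to messages. A sketch (the codes below come from curl's man page EXIT CODES section; the list is illustrative, not complete):

```shell
report_curl_exit() {
    # Map a few common curl exit codes to human-readable messages.
    case $1 in
        0)  echo "success" ;;
        6)  echo "could not resolve host" ;;
        7)  echo "failed to connect to host" ;;
        55) echo "failed sending network data" ;;
        *)  echo "curl failed with exit code $1" ;;
    esac
}

# Usage: run curl, then save $? immediately -- any later command overwrites it.
# curl -X POST -d @myfile.txt server-URL
# rc=$?; report_curl_exit "$rc"
```

Note the `rc=$?` step: $? is reset by every command, so capture it before doing anything else.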
Or if your command is relatively short and you want to do something when it fails, you could rely on the exit code as the condition in a conditional statement:
$ if curl --fail -X POST -d@myfile.txt server-URL; then
# …(success)
else
# …(failure)
fi;
I think this format is often preferred, but personally I find it less readable.
Your first command should work without the whitespace (and with --user rather than -user, which curl would parse as -u ser):
curl -T "{file1.txt,file2.txt}" ftp://XXX/ --user YYY
Also note the trailing "/" in the URLs above.
This is curl's manual entry about option "-T":
-T, --upload-file
This transfers the specified local file to the remote URL. If there is no file part in the specified URL, Curl will append the local file name. NOTE that you must use a trailing / on the last directory to really prove to Curl that there is no file name or curl will think that your last directory name is the remote file name to use. That will most likely cause the upload operation to fail. If this is used on an HTTP(S) server, the PUT command will be used.
Use the file name "-" (a single dash) to use stdin instead of a given file. Alternately, the file name "." (a single period) may be specified instead of "-" to use stdin in non-blocking mode to allow reading server output while stdin is being uploaded.
You can specify one -T for each URL on the command line. Each -T + URL pair specifies what to upload and to where. curl also supports "globbing" of the -T argument, meaning that you can upload multiple files to a single URL by using the same URL globbing style supported in the URL, like this:
curl -T "{file1,file2}" http://www.uploadtothissite.com
or even
curl -T "img[1-1000].png" ftp://ftp.picturemania.com/upload/
"*.txt" expansion does not work because curl supports only the same syntax as for URLs:
You can specify multiple URLs or parts of URLs by writing part sets within braces as in:
http://site.{one,two,three}.com
or you can get sequences of alphanumeric series by using [] as in:
ftp://ftp.numericals.com/file[1-100].txt
ftp://ftp.numericals.com/file[001-100].txt (with leading zeros)
ftp://ftp.letters.com/file[a-z].txt
[...]
When using [] or {} sequences when invoked from a command line prompt, you probably have to put the full URL within double quotes to avoid the shell from interfering with it. This also goes for other characters treated special, like for example '&', '?' and '*'.
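A small demonstration of why the quotes matter: in bash, an unquoted {one,two} is brace-expanded by the shell before curl ever sees it, while the quoted form survives intact.

```shell
# Quoted: the braces reach the command as literal characters.
pattern="http://site.{one,two}.com"
quoted=$(echo "$pattern")
echo "$quoted"    # prints the literal pattern, exactly what curl expects

# Unquoted, bash would instead expand the braces and pass two words:
#   echo http://site.{one,two}.com
#   -> http://site.one.com http://site.two.com
```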
But you could use the "normal" shell globbing like this:
curl -T "{$(echo *.txt | tr ' ' ',')}" ftp://XXX/ --user YYY
(The last example may not work in all shells or with exotic file names.)
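A slightly more robust way to build the same {a,b,c} list is with printf, which avoids echo's word-splitting quirks. This is still a sketch with the same fundamental limitation: file names containing spaces, commas, or braces will break it.

```shell
# Build a curl-style brace list ({a.txt,b.txt}) from the arguments.
# Still breaks on names containing spaces, commas, or braces.
build_brace_list() {
    list=$(printf '%s,' "$@")   # join arguments with trailing commas
    printf '{%s}' "${list%,}"   # strip the last comma and wrap in braces
}

# Usage (XXX and YYY are placeholders, as above):
# curl -T "$(build_brace_list *.txt)" ftp://XXX/ --user YYY
```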
Best Answer
Something like this should do the trick:
curl -d param="$(cat file1.txt)" ...
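One caveat with -d: it sends the file contents as-is, so characters like '&' and '=' inside file1.txt would corrupt the form encoding. curl's --data-urlencode name@file variant reads the file and URL-encodes it for you. A sketch (SERVER_URL is a placeholder):

```shell
# Sample content containing characters that need encoding.
printf 'a&b=c' > file1.txt

if [ -n "${SERVER_URL:-}" ]; then
    # --data-urlencode param@file1.txt reads the file after '@'
    # and URL-encodes its contents before sending.
    curl --data-urlencode param@file1.txt "$SERVER_URL"
fi
```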