Linux – Scripting an sFTP batch upload every 1 minute

amazon-web-services linux sftp webcam

I want to record video of part of my house with a webcam while I am away for a few days, to try to ensure any burglars are photographed and the photos uploaded to my server before they even realise it.

I have set up a webcam to stream footage using mjpg_streamer. That works and is streaming OK.

You can take a still capture from the stream whenever you want, saved as 'FileName.jpg':

wget "http://127.0.0.1:8080/?action=snapshot" -O FileName.jpg 

I have a server on Amazon Web Services with FTP access via sFTP. I currently connect using Firefox's FireFTP plugin, so that works. The idea is to leave the computer running with the connection live.

I'd like to script taking a picture from the stream, say, every minute, and have the picture uploaded to my server via the live FTP connection, and either have the original file deleted from my PC so the next one can save under the same name, or have a number appended to each file name before it is FTPed up, e.g. FileName1.jpeg, FileName2.jpeg.

I've Googled for hours, and although there are loads of posts about scripting an FTP upload, I can't find any about a constant stream of uploads, i.e. "watch this folder, upload the contents every minute, and a minute later upload whatever new content is in it".

I guess I need a bash script that will :

  • keep a counter so that each newly created file gets a different name
  • pass that filename to "wget http://127.0.0.1:8080/?action=snapshot -O FileNameXX.jpg" every 30 seconds or 1 minute
  • upload FileNameXX.jpg to the FTP server
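A minimal sketch of such a script, assuming the sFTP host, user, and remote path shown here are placeholders and that key-based login is set up so sftp never asks for a password (run it once per minute from cron rather than looping):

```shell
#!/bin/bash
# Sketch only: host, user, and all paths below are assumptions.
COUNTER_FILE="$HOME/webcam/counter"
PIC_DIR="$HOME/webcam/pics"

# Read, increment, and store the counter so each snapshot gets a new name.
next_count() {
  local c
  c=$(cat "$COUNTER_FILE" 2>/dev/null || echo 0)
  c=$((c + 1))
  echo "$c" > "$COUNTER_FILE"
  echo "$c"
}

snapshot_and_upload() {
  mkdir -p "$PIC_DIR"
  local file="$PIC_DIR/FileName$(next_count).jpg"
  # Quote the URL so the shell does not treat '?' as a glob character.
  wget -q "http://127.0.0.1:8080/?action=snapshot" -O "$file" || return 1
  # Batch-mode sftp; delete the local copy only if the upload succeeded.
  echo "put $file /remote/upload/dir/" \
    | sftp -b - user@your-server.example.com && rm -f "$file"
}

# Run once per invocation and let cron handle the "every minute" part:
#   * * * * * /home/you/snapshot.sh run
if [ "${1:-}" = "run" ]; then
  snapshot_and_upload
fi
```

Because the counter lives in a file, each cron invocation picks up where the last one left off.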

But I have no idea how to do that! Can anyone direct me? Or does anyone know of a way to do it with FileZilla or something similar (which can't watch a folder, AFAIK: https://forum.filezilla-project.org/viewtopic.php?t=41609)?

Best Answer

My first tip would be to name the files using the date and time they were taken. That way you won't need to keep a counter anywhere, which would be difficult in a script that doesn't run continuously, as its variables would be reset on each invocation. You could store the variables in files, but it's easier if you ensure the names won't collide. Something like wget "http://127.0.0.1:8080/?action=snapshot" -O "Snapshot-$(date).jpg" if you are using Bash. (Sorry if the syntax doesn't work, I'm no Bash expert and I'm typing this on my phone.)
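Since the plain output of date contains spaces, a format like the following is safer in filenames; the URL and port are taken from the question, and the fetch line is left as a comment so the naming part stands on its own:

```shell
# Build a filename-safe timestamp; plain `date` output contains spaces.
STAMP=$(date +%Y%m%d-%H%M%S)
FILE="Snapshot-$STAMP.jpg"
# Then grab the snapshot into it (URL from the question):
#   wget -q "http://127.0.0.1:8080/?action=snapshot" -O "$FILE"
```

This yields names like Snapshot-20240101-120000.jpg, which sort chronologically and never collide as long as you take at most one per second.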

As you mentioned, there are several tutorials about scripting FTP uploads available. At least one of them should include an example that uploads files by a pattern, such as "Snapshot-*.jpg", where the wildcard would match the timestamp. Or you could point the FTP program (such as lftp or ncftp, both of which have binaries meant for scripting) at a certain folder and have it upload everything in it, then wipe the folder if the program succeeded. That way you can run your script as often as you want using cron or a systemd timer, and it is flexible enough to always retry any files it failed to upload on a previous run.
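As a sketch of that "upload everything, then wipe" approach with lftp, where the user, host, and remote path are placeholders and key-based sFTP login is assumed:

```shell
#!/bin/bash
# Sketch only: user, host, and the remote path are placeholders.
PIC_DIR="$HOME/webcam/pics"

upload_all() {
  # mirror -R uploads the local directory to the remote one;
  # --Remove-source-files deletes each local file only after it has
  # been transferred successfully, so failed uploads are retried next run.
  lftp -e "mirror -R --Remove-source-files $PIC_DIR /remote/upload/dir; bye" \
       sftp://user@your-server.example.com
}
```

Calling upload_all from a crontab entry such as "* * * * * /home/you/upload-snapshots.sh" gives you the once-a-minute behaviour without any long-running process.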

There's also software designed to do this task, and more, on its own. One such program, which I've used myself, is simply called "motion" and is available for most distributions. It has built-in motion triggering (record and/or take snapshots) as well as continuous modes. It can be a bit CPU-intensive on systems like a Raspberry Pi, but it certainly works.
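For the "motion" route, a configuration fragment along these lines would cover the once-a-minute snapshot case; the directories are placeholders and the option names are from memory of an older motion release, so check them against the sample config shipped with your distribution's package:

```
# Fragment of a motion.conf (verify option names for your motion version).
videodevice /dev/video0
target_dir /home/you/webcam/pics
snapshot_interval 60                     # one snapshot every 60 seconds
snapshot_filename %Y%m%d-%H%M%S-snapshot
```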

If you want to step it up a bit, perhaps running multiple remote/local cameras with the motion detection offloaded to a more powerful central machine, look at ZoneMinder. It takes longer to set up, and in my experience it is picky about having the correct resolutions set manually on your camera feeds, but it can be scripted to some degree.
