Wget Voodoo

Filed under: tinkering,webcam — jaydublu @ 1:09 pm

I’m stumped by a supposedly simple problem with using wget to regularly fetch a snapshot from a webcam over a not-too-reliable network connection, and then push it to a website. If the connection fails, wget overwrites a good file with a 0-byte one – how can I get it to leave the original intact?

Here’s my script (simplified – mine actually fetches 4 images and writes to two ftp accounts):

#!/bin/bash

# fetch images, store them locally
wget --user=#### --password=#### http://192.168.1.15/axis-cgi/jpg/image.cgi -O video2.jpg
# ... next images

# now push them to a webserver
ftp -in <<EOF
  open my.domain.co.uk
  user ##### #####
  bin
  put video2.jpg
# ... put other images
  close
  bye
EOF

Ideas I’ve had but not been able to realise yet …

  1. Scour the wget manpage for some option to only overwrite the output file if successful
  2. Get wget to output to a temp file, wrap in a script testing filesize to overwrite the ‘real’ file if filesize > 0 bytes
  3. Get wget to output to a temp file, wrap in a script testing wget’s response to overwrite the ‘real’ file if the response contains ‘saved’
  4. Somehow put logic in the ftp script to only upload files > 0 bytes
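Idea 2 seems the most direct, so here’s a rough sketch of what that wrapper might look like (untested against my actual setup – the URL and credentials are the placeholders from the script above, and `video2.jpg.tmp` is just a name I picked for the temp file). Bash’s `[ -s file ]` tests “exists and is bigger than 0 bytes”, which is exactly the check needed:

```shell
#!/bin/bash
# Fetch to a temp file; only replace the real snapshot if the
# download produced a non-empty result.

TMP=video2.jpg.tmp

wget --user=#### --password=#### \
  http://192.168.1.15/axis-cgi/jpg/image.cgi -O "$TMP"

if [ -s "$TMP" ]; then
    # Non-empty download: atomically replace the old snapshot
    mv "$TMP" video2.jpg
else
    # Failed/empty download: throw it away, keep the old snapshot
    rm -f "$TMP"
fi
```

As a bonus, wget also exits non-zero when the fetch fails, so `wget ... -O "$TMP" && [ -s "$TMP" ] && mv "$TMP" video2.jpg` would belt-and-braces it in one line – though with `-O`, wget may still have created the empty temp file, hence the `rm -f` cleanup either way.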