Wget Voodoo

Filed under: tinkering,webcam — jaydublu @ 1:09 pm

I’m stumped by a supposedly simple problem with using wget to regularly fetch a snapshot from a webcam over a not-too-reliable network connection and then push it to a website. If the connection fails, wget overwrites a good file with a 0-byte one – how can I get it to leave the original intact?

Here’s my script (simplified – mine actually fetches 4 images and writes to two ftp accounts):

#!/bin/bash

# fetch images, store them locally
wget --user=#### --password=#### http://192.168.1.15/axis-cgi/jpg/image.cgi -O video2.jpg
# ... next images

# now push them to a webserver
ftp -in <<EOF
  open my.domain.co.uk
  user ##### #####
  bin
  put video2.jpg
# ... put other images
  close
  bye
EOF

Ideas I’ve had but not been able to realise yet …

  1. Scour the wget manpage for some option to only overwrite the output file if successful
  2. Get wget to output to a temp file, wrap it in a script testing filesize, and overwrite the ‘real’ file if filesize > 0 bytes (see the sketch below)
  3. Get wget to output to a temp file, wrap it in a script testing the wget response, and overwrite the ‘real’ file if the response contains ‘saved’
  4. Somehow put logic in the ftp script to only upload files > 0 bytes
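
For the record, here’s roughly what I have in mind for option 2 (a sketch only, untested as yet – URL and credentials as above):

#!/bin/bash

# fetch to a temporary file so a failed transfer can't clobber the good image
if wget --user=#### --password=#### \
    http://192.168.1.15/axis-cgi/jpg/image.cgi -O video2.tmp \
    && [ -s video2.tmp ]; then
  # wget succeeded and the file is non-empty: replace the 'real' image
  mv video2.tmp video2.jpg
else
  # leave the last good video2.jpg alone and bin the failed download
  rm -f video2.tmp
fi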

Proxying with Apache2 on Ubuntu

Filed under: tinkering,ubuntu,webcam — jaydublu @ 4:01 pm

Further to earlier problems with using Apache2 on Ubuntu to proxy web requests to devices inside my local network, I think I’ve now sussed it.

Specifically, I’m trying to get Apache to enable external access to a webcam inside my network, where for some reason I can’t enable access to it directly using my router.

I’m now relatively confident that the appropriate way to do it is to enable mod_proxy and mod_proxy_http with ‘sudo a2enmod proxy_http’; this then allows use of the ProxyPass directive within a vhost, for example:

ProxyPass /webcam http://192.168.1.15/
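
For reference, the relevant bit of the vhost ends up looking something like this (a sketch – the domain is a placeholder; ProxyPassReverse keeps any redirects the camera issues pointing back at the proxied path):

<VirtualHost *:80>
  ServerName my.domain.co.uk

  # forward /webcam requests to the camera on the internal network
  ProxyPass /webcam http://192.168.1.15/
  # rewrite redirect headers from the camera so clients stay on /webcam
  ProxyPassReverse /webcam http://192.168.1.15/
</VirtualHost>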

Server monitoring

Filed under: review,tinkering — jaydublu @ 5:39 pm

[Munin’s mysql_queries-week graph]

I think I’ve finally found an almost perfect suite of tools to monitor webserver performance and availability – it’s only taken five years!

The most recent discovery that has me all excited is Munin – I’d heard of it before but can’t think why I’ve never given it a go. It’s a fantastic tool for recording all sorts of useful metrics in rrdtool stylee graphs – far too much info in fact as it’s bringing out my hypochondriac tendencies.

I’ve been using Nagios for years – although Ubuntu’s packages make it easier to set up, it’s still a bit of hard work, but once it’s running it’s great. I’m using nrpe plugins to remotely monitor many of the same metrics as Munin is recording on a suite of servers, but Nagios is set to generate alerts if they go out of tolerance. Once you get the thresholds right it can warn you of impending trouble before a site actually fails – a theory that proved itself a few weeks back, when alerts for page response time and processor load allowed me to take evasive action before a site actually crashed.

I’ve got a utility script or two, such as one that monitors MySQL replication; Nagios polls it regularly and triggers an alert if a certain string isn’t found. I’m sure there is a plugin or other cunning way to get Nagios to do this without a script, but this was easy, and it works!
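
It’s roughly along these lines (simplified, and the credentials are placeholders) – Nagios just checks the output for ‘OK’:

#!/bin/bash
# crude check that MySQL replication is alive on this box
STATUS=$(mysql -u monitor -p#### -e 'SHOW SLAVE STATUS\G')

if echo "$STATUS" | grep -q 'Slave_IO_Running: Yes' \
    && echo "$STATUS" | grep -q 'Slave_SQL_Running: Yes'; then
  echo "OK - replication running"
else
  echo "PROBLEM - replication stopped"
fi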

Finally for in-house tools, good old AWStats for logfile analysis gives me an idea of raw traffic served.

For remote tools, I use an email-to-SMS gateway so Nagios can alert me to critical problems when I’m not at my machine. For a second opinion, and as a safeguard, I also subscribe to a remote monitoring service – of the many I’ve tried I favour Alertra, but also use Pingdom occasionally. Finally, Google Analytics allows traffic analysis within the site, and that’s about it.

But as the BBC says, other services are also available.

When to rebuild?

Filed under: life,tinkering — jaydublu @ 5:05 pm

This post can relate to so many things – stylesheets, php code, glass fibre moulds – at some point, when you’ve gone through a few iterations of an agile-like process, you start thinking “If I’d known I was going to end up doing this, I would have started differently”.

When you’re so close to completing, yet you know deep down it’s getting messy, overcomplex and perhaps not as ‘nice’ as it could be, the decision of whether to leave it as-is because it works, or to take the opportunity to rebuild everything before you go any further, becomes almost unbearable.

But then I suppose isn’t that what the perpetual beta is all about, and eventually you get the opportunity to go for v2.0 (or 1.0 even)? In the meantime, if it’s not doing any harm, leave it alone – you know it will only open up another can of worms if you start from scratch again!

Ubuntu’s Subversion

Filed under: tinkering,ubuntu — jaydublu @ 5:41 pm

I’ve a minor gripe about Ubuntu – only ‘cos it’s caught me out a couple of times.

My local dev server is running Ubuntu Gutsy, and I do run apt-get upgrade etc. every now and then to keep things current.

I tend to keep most of the sites I’m working on checked out of the repository somewhere that Apache can get to them so I can see the rendered output easily, and to make life easier I also access the server’s webroot over an SMB share from my laptop.

Life was great until I upgraded my laptop’s TortoiseSVN to 1.5.0-something-or-other, as it keeps nagging me to do. If I’m careless enough to do an update on a remote working copy using Tortoise, it upgrades it to the new 1.5 format, which means it can’t be used by Subversion on the box’s own command line, as the Gutsy Subversion package is not up to 1.5 yet.

Twice now I’ve had to check out a fresh working copy to overcome this problem, and to save any future accidents, I’m downgrading my Tortoise to a pre-1.5 version – I looked at trying to get an ‘experimental’ Debian package installed but it looked far too risky.
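
A tip if you’re not sure what has touched a working copy: the first line of its .svn/entries file is the working copy format number – 8 for Subversion 1.4, 9 for 1.5, at least on the versions that have bitten me – so a quick check before pointing the command line at it is:

# working copy format: 8 = Subversion 1.4, 9 = Subversion 1.5
head -1 .svn/entries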

I hate computers!

Filed under: rants,tinkering,webcam — jaydublu @ 6:25 pm

Probably unfortunate given what I do for a living, but there you go. Actually, I’ve always wondered if it’s a useful trait – I’m not usually a proponent of technology for technology’s sake – if I can solve a problem without getting high-tech, that’s my preferred option.

I may just be feeling paranoid, but it seems that the bytes have been ganging up on me recently – I have a growing number of niggling problems that refuse to go away. The odd challenge can be quite enjoyable, as long as the implication of not fixing it is not too severe – but you soon start to realise how much we have grown to depend on email / google / multimap / skype and all the other trimmings when you can’t get them.

And worse, it almost makes me feel physically ill when I start to feel I’m not able to keep it all running – when things go wrong faster than I fix them; when the silicon is ruling me rather than vice versa. Is judgement day coming?

Having decided to give Vista a chance, how is it repaying me? My machine has started bluescreening two or three times a day at the most infuriating times (like when you’re blogging about it). I’m going to live with it for a bit longer rather than doing anything rash – ongoing.

The lighthouse webcam crashed earlier in the week – usually gets sorted by turning the power off then on again, but from inside the lantern there was no sign of life outside – meaning a second trip with more keys, tools etc. Turns out it was a blown fuse so nothing major – fixed.

I’ve replaced the firewall in my modem / router with a mini-ITX box running IPCop – the idea was to secure my network a bit better while allowing certain individuals more access to the growing number of devices I’m hosting – but it absolutely refuses to let OpenVPN work as advertised, and I’ve loved that application when I’ve used it in the past – ongoing. While fiddling I did muck something up with the blue network, meaning it wouldn’t grant dhcp leases to wireless devices – fixed (phew). Postscript – OpenVPN is now working perfectly – it turned out to be a problem within the network and not with IPCop – Zerina is a great plugin that makes managing OpenVPN a doddle.

My Acer easyStore NAS is pretty much up and running now, but I still have a niggle where every time I restart my laptop the backup application can’t then see the drive – you have to remove protection then re-protect to run a backup, meaning you have to do it manually every time – that wasn’t the idea but I haven’t been bored enough to try and get some more support after the last time – ongoing.

Do things like dodgy starter solenoids on my truck count as computer problems? Still adding to my irritation though – hopefully this week I’ll crack more issues than arise and get back to a tolerable level of ‘silicon rage’.

Of course I’m my own worst enemy – I will not leave well alone, and have to keep fiddling or trying to improve things. But the moral of this story (if there is one) – don’t let the machines grind you down!

A little knowledge is a dangerous thing

Filed under: tinkering,trundle,ubuntu — jaydublu @ 1:32 pm

I like to think I know a little bit about most things surrounding the Internet, and whilst I’d not claim to be an expert, I’m at least competent in most things I turn my hand to.

But every now and then I get caught out, and reminded how dangerous it can be to tinker with things you don’t fully understand – there are some people out there with far too much time on their hands.

As part of my Trundle project, I attempted to make a webserver running on the beast’s eventual operating system available to the public Internet – not for public consumption mind, but so I can see it when I’m out and about. Now I didn’t want to put the whole thing on a public IP address, just a little bit of it – and apart from anything else I’ve already got an externally available webserver on my Internet connection.

So my idea was to use mod_rewrite to proxy a set of urls to the internal server’s private IP address. I’m sure it’s something I’ve done before in other Apache instances, and it sounded feasible, but for once Ubuntu fought back a bit. Still, I felt I’d prevailed.

Now it turns out I’d opened up a vulnerability to someone, somewhere, to do something with my network. It was cunningly disguised in that the traffic wasn’t enough to be hugely obvious, but I was playing with awstats and got curious about some odd traffic.

It turns out I’d unintentionally configured my webserver to allow anyone to use it to proxy requests to anywhere else. Short of cloaking the eventual source (or destination?) of the traffic I can’t see what was gained – the requests seem mostly to have been for banners or clickthrus in flash game sites. I wasn’t hosting the files so nothing was gained in terms of bandwidth, and it doesn’t seem like a ddos attack.

Anyway, I’ve disabled the proxying functionality now, and checking the logs, although I’m still getting the requests they now get a 403 response. I hope they’ll die out eventually – or will I have to get my fixed IP address changed, do you think?
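
To keep an eye on them, something like this (assuming the standard combined log format in the usual Ubuntu location) shows which IPs are still hammering away:

# count 403 responses per client IP in the Apache access log
awk '$9 == 403 {print $1}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head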

Gradwell VoIP

Filed under: review,tinkering — jaydublu @ 11:17 am

To open up another can of technical worms, I’ve been investigating voice over IP – I’m about to get into some remote collaborative development so am setting up my tools. I also need a new phone number so I can keep work separate from home, but being a skinflint I want to do it as cost-effectively as I can.

So I read up on VoIP, and investigate Skype and the like, but feel I need something a bit more commercial. Gradwell keeps cropping up as a name, and I’ve had dealings with them in the past and know them to be a reputable company on the bleeding edge of technology.

[Grandstream GXP2000]

So I sign up for a trial, and to do it properly buy a proper phone off them (the Grandstream GXP2000 for reference), but while waiting have a quick play with softphones – in this case X-Lite.

Now I immediately run into problems: registration with the SIP server is flaky, and although when registered you can ring out or in, it’s not often you can get voice through. It sounds very much like the problems I had when trying to set up videoconferencing, which turned out to be firewall issues and were only really resolved by putting the conferencing units on dedicated public IP addresses.

When the Grandstream turned up from Gradwell though, it was preconfigured for my account – and I have to say that I just plugged it in and it worked … almost. There was still something going on in my network that caused connectivity to drop occasionally, which was bad if it happened in the middle of a conversation, but I did have a one-hour call with little or no sign that I wasn’t using a ‘proper’ phone. So I’ll put that down to my network and not the service. I’ve since swapped out my router, changed an old HP network hub for a shiny Netgear gigabit switch, and reviewed which boxes are plugged in and chattering away all the time. I’ve also unplugged a redundant wifi access point. Things now seem a lot more stable.

Following on from that long call I decided to extend the trial, and upgraded from a single-user account to Gradwell’s ‘Centrex’ package – it basically allows multiple ‘extensions’ to connect as if on a virtual PBX. You can configure all sorts of neat stuff, like how external numbers ring through to extensions – IP or external, hunt groups, voice menus, forwarding … and the individual extensions can be anywhere. Very cool.

When I get an opportunity I’m also keen to try using mobile phones that have SIP functionality – my N95 should be one of them but unfortunately I’m with Vodafone who in their infinite wisdom decided that I should be protected from IP telephony and removed those bits of the firmware. Time to swap provider perhaps?

I do have reservations though; although the bandwidth of individual calls isn’t great (less than 100kbps per conversation) the experience is severely compromised if your network isn’t as tight as a gnat’s chuff – anything blocking connectivity will play hell with things. VoIP isn’t the only thing that will suffer, but if the service is business critical it could become a big issue. And that leads me to general concerns about networks – I don’t know if I’m still missing a trick but they are a bugger to diagnose when stuff starts misbehaving. At the moment I think VoIP isn’t for the faint hearted!

It’s still early days and the system hasn’t really had an opportunity to be used in anger, but I’m sure once the initial teething troubles are sorted it will be a fantastic system, with some very useful features and very economical to operate.

Ftp over ssh tunnel

Filed under: tinkering — jaydublu @ 6:20 pm

This isn’t radical, but it tripped me up for half an hour…

SSH tunneling is a great, easy way of remotely accessing a network if you’ve access to an SSH account within it – using something like PuTTY you set up whichever ports you want access to and they magically appear locally.

But ftp is a bit trickier – the protocol needs two connections: the control connection on port 21, plus a separate data connection on another port. I couldn’t figure out how to configure tunnels in PuTTY to let that second port through, even using PASV connections.

But then I twigged what all the references to SCP and SFTP were about – they hadn’t seemed relevant, as I wanted to reach a different remote machine to the one I was SSH’d into. The trick: tunnel port 22 to the machine you want to reach using PuTTY, and then, with the tunnel open, use something like PSFTP to SFTP to the remote machine.
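
In command-line terms (using PuTTY’s plink here – OpenSSH’s ‘ssh -L’ and ‘sftp -P’ do the same job) it goes something like this; the hostnames are made up:

# forward local port 2222 through the SSH account to port 22 on the target
plink -L 2222:target.internal:22 user@gateway.example.com

# with the tunnel open, SFTP to the target via the local end of the tunnel
psftp -P 2222 user@localhost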

Make sense?

Ubuntu Apache2 mod_rewrite proxy rules

Filed under: tinkering — jaydublu @ 5:45 pm

I had a big problem getting an Ubuntu Feisty Fawn Apache2 instance to use proxying rewrite rules.

Firstly, mod_rewrite is not enabled by default, which is probably no bad thing. So ‘sudo a2enmod rewrite’ fixes that.

Now I can use a rule to allow my main server to proxy requests through to a smaller server: ‘RewriteRule ^(.*)$ http://192.168.1.16$1 [P]’ – but no, I’m getting a ‘403 Forbidden’ error. Checking the Apache error log I find ‘attempt to make remote request from mod_rewrite without proxy enabled’.

So proxying needs enabling. If I do ‘sudo a2enmod proxy’ the error changes to ‘client denied by server configuration’, so I try changing ProxyRequests to On in /etc/apache2/mods-available/proxy.conf, along with the very insecure ‘Allow from all’ in the proxy block.

Now I’m getting a warning in the error log: ‘proxy: No protocol handler was valid for the URL /. If you are using a DSO version of mod_proxy, make sure the proxy submodules are included in the configuration using LoadModule.’ So I try ‘sudo ln -s ../mods-available/proxy_http.load /etc/apache2/mods-enabled/proxy_http.load’ to manually add the http sub-module, and bingo!
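
With hindsight, a2enmod would have created exactly that symlink for me, which is rather tidier:

sudo a2enmod proxy
sudo a2enmod proxy_http
sudo /etc/init.d/apache2 force-reload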

Now to tidy up the mess – other than the manually created symbolic link, all I’ve done is tweak /etc/apache2/mods-available/proxy.conf thusly:

<IfModule mod_proxy.c>
#turning ProxyRequests on and allowing proxying from all may allow
#spammers to use your proxy to send email.

ProxyRequests On

# <Proxy *>
# AddDefaultCharset off
# Order deny,allow
# Deny from all
# #Allow from 192.168.1.
# </Proxy>

<Proxy http://mysite.com/*>
Order deny,allow
Allow from all
</Proxy>

# Enable/disable the handling of HTTP/1.1 "Via:" headers.
# ("Full" adds the server version; "Block" removes all outgoing Via: headers)
# Set to one of: Off | On | Full | Block

ProxyVia On
</IfModule>

That seems to be working, although I’m sure there must have been a tidier way.

Postscript – WARNING: this has just had unintended consequences – I seem to have enabled some grebs to use my network to proxy requests. Other than cloaking the original destination of the traffic (and it seems to be mostly banner ads and clickthru redirects, from a few IP addresses) I don’t see what has been gained, and if I hadn’t been closely examining logs and traffic recently it could have slipped past my attention.

So with hindsight – think very carefully before enabling the proxy modules like this – I don’t know the exact details of what went on, but I now regret doing it!
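
For anyone following along: the [P] flag and ProxyPass don’t actually need ProxyRequests On – that directive enables a forward proxy, which is exactly what was being exploited. Something like this (a sketch, not yet battle-tested here) keeps the reverse proxying working without leaving the door open:

<IfModule mod_proxy.c>
  # forward proxying stays off - RewriteRule [P] and ProxyPass work without it
  ProxyRequests Off

  # reverse-proxied requests still pass through <Proxy>, so allow them;
  # with ProxyRequests off nobody can use this box as an open proxy
  <Proxy *>
    Order deny,allow
    Allow from all
  </Proxy>
</IfModule>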
