To update or not to update

Filed under: life — jaydublu @ 4:11 pm

There’s a sweet contentment knowing that a system is fully up to date and patched, but how uncomfortable it soon becomes when you find out there’s a new update or patch – leading to the nagging question of if / when to go through the grief of updating.

On the one hand, you know you should apply updates as soon as you can to get any potential benefit of new features, bugfixes or even improved security and performance – but anyone who has been around long enough will know that even the simplest patch can upset an otherwise stable system and be a royal pain to resolve.

That leads to the temptation to procrastinate and put off updating systems until it can’t be avoided any longer – but the longer updates are left, the more they build up and the worse the process of updating becomes. Again, you may hear the sigh of bitter experience?

I’m not just talking about applications like WordPress and Drupal, or the dreaded Windows Update – all sorts of systems send constant reminders to ‘update now?’ And I’m talking about minor updates rather than the biggies like Vista -> Windows 7 or Drupal 6 -> Drupal 7 – with those you have a choice, but with minor updates you really don’t; it’s more a matter of ‘when’ than ‘if’.

Here is a summary of the guidelines I try to stick to:

Have a regular ‘maintenance period’ – Accept that maintenance is a fact of life, and schedule an appropriate amount of time at least monthly to review logs, apply updates and do other housekeeping tasks to maintain a healthy system. Of course this implies that the more systems you look after, the more time will be spent in maintenance, but that cannot be avoided – without TLC, systems tend to ask for your attention when it’s not so convenient!

Don’t do updates on a Friday afternoon – As tempting as it may be to slip this task in at the end of the week if your ‘proper work’ allows, it’s tempting fate and if you have problems with an update you might not have enough time to resolve it, or you might shortcut testing and not spot an issue until Monday. Make sure you have set enough time aside to do it properly, and you’re in the right frame of mind.

Design your system to be maintained – when carrying out initial installation and configuration, think ahead to how you are going to apply and test updates and patches, and migrate changes between environments. Good decisions at the start make life so much easier six months down the line. Documentation and good naming conventions are key. And of course as part of your deployment there was a test plan that can be used to validate updates?

Prepare and maintain a documented maintenance procedure – It makes the task much less daunting to know that you have a ‘script’ worked out rather than having to keep re-inventing the wheel. Wikis are ideal for the purpose as they’re easy to create and maintain. Anything you have to spend time figuring out, write down, so that next time you don’t have to struggle to remember it or work it out again. If you find information that’s wrong or missing, fix it, and keep documentation updated as your system evolves.

Read the release notes – try to get an understanding of what the update or patch you’re about to apply will change so you can analyse potential risks and identify any mitigating measures you might take, and to also identify what testing you can do to satisfy yourself all’s OK afterwards.

Once you start – don’t stop until you finish – it’s probably worse to half-apply updates than not to apply them at all, and it makes the next round of maintenance even harder. So no distractions, and remember the ‘not on Friday afternoon’ rule.

Pat yourself on the back afterwards – it’s a necessary job, so make sure you get your due reward. If you’re doing this for someone else (a client) make sure they appreciate the effort it takes, and don’t be apologetic about taking the time to do it properly. Few people notice a well-running system, but you soon get shouted at when it goes titsup, so sleep well in the knowledge that all is right with the world again and you have done a proper job!

Limiting access to Drupal content types

Filed under: drupal,web development — jaydublu @ 1:43 pm

There are contributed modules that extend Drupal’s innate desire that all content should be publicly visible – notably Node Privacy byRole and Content Access. If you’ve defined your own node type in a module, you can also implement hook_access() to set access rights to that node type. But what if it’s a core or CCK content type and you want to keep things simple?

Well here’s a ‘light’ way to restrict access to a given content type (‘mytype’ in this instance) to the owner of the node and anyone with a given permission (‘administer nodes’ in this instance, although it could be any permission). Put this anywhere convenient and off you go.

/**
 * Implementation of hook_nodeapi()
 *
 * Limit viewing of mytype nodes to owner and admins.
 */
function mymodule_nodeapi(&$node, $op, $a3 = NULL, $a4 = NULL) {
  global $user;
  if ($node->type == 'mytype' && $op == 'view') {
    if (!user_access('administer nodes', $user) && $user->uid != $node->uid) {
      drupal_access_denied();
    }
  }
}

Storing dates in Drupal Schema API

Filed under: drupal,web development — jaydublu @ 6:25 pm

This one caught me out for a good while – I’ve got my own data stored in a database using Drupal’s Schema API, and one field I want is a date, so I used the ‘datetime’ type. But whenever I came to use a value I couldn’t get any of Drupal’s date formatting functions to work – they were expecting a unix timestamp (makes sense), but on MySQL the Schema API maps ‘datetime’ to a MySQL date column, so I was getting a MySQL date string back.

The answer was not to use ‘datetime’ as the schema type but ‘int’, and to make sure you store a unix timestamp. format_date() and views_handler_field_date etc. will then work as expected.
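For illustration, a minimal sketch of such a schema definition – the module, table and field names here are hypothetical:

```php
/**
 * Implementation of hook_schema().
 *
 * Store the date as a unix timestamp in an 'int' column rather than
 * 'datetime', so Drupal's date formatting functions work on it directly.
 */
function mymodule_schema() {
  $schema['mymodule_events'] = array(
    'description' => 'Example table with a date stored as a timestamp.',
    'fields' => array(
      'id' => array('type' => 'serial', 'not null' => TRUE),
      // 'int', not 'datetime' - pass time() when inserting rows.
      'created' => array('type' => 'int', 'not null' => TRUE, 'default' => 0),
    ),
    'primary key' => array('id'),
  );
  return $schema;
}
```

Values pulled back out of the ‘created’ column can then go straight into format_date().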

But it makes you wonder who left this mantrap lying around for foolhardy developers like me to fall into?

Drupal Views bulk export

Filed under: drupal,web development — jaydublu @ 5:06 pm

I’ve been using Drupal in earnest since the New Year, and I have to say that I wish I’d discovered it sooner. It’s by no means perfect, but I think perfection is impossible with CMS systems and it’s doing everything I need it to.

Initially I was sceptical about using the Views module – I was thinking that Views are just for people who can’t be bothered to do proper PHP, but then I got into them and discovered the power and simplicity, and I have to say my first approach to most content display challenges in Drupal now tends to use Views.

But this leads on to one of my biggest gripes about Drupal – how much of the site configuration is stashed in the database where there is little accountability or control – I much prefer having files in the filesystem where you can use Version Control.

Then I discovered bulk export which does all of that – and it makes a big difference in safe use of Views. This is how I’ve done things several times now quite successfully:

  • Carry out rapid development using the Views UI as normal.
  • Enable the Views Exporter module – this turns on the ‘bulk export’ tab under ‘tools’ in the Views admin
  • In Site Building > Views > Tools > Bulk Export, select the views that you want to export, and enter the name of the module you want to store the views code in and hit ‘export’
  • Follow the instructions on the next page to add snippets to the .info and .module files, and to create the .views_default.inc file
  • In Site Building > Views > Tools > Basic, click ‘Clear Views Cache’ button
  • The right-most link against the views you exported in the admin list should now have changed from ‘delete’ to ‘revert’ – go ahead and ‘revert’ all the views you exported
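The exact snippets are shown on the export page itself, but for reference the .module addition looks roughly like this (‘mymodule’ is a hypothetical module name):

```php
/**
 * Implementation of hook_views_api().
 *
 * Tells Views that this module provides default views; with no 'path'
 * key, Views looks for mymodule.views_default.inc in the module directory.
 */
function mymodule_views_api() {
  return array('api' => 2);
}
```

The exported hook_views_default_views() code from the bulk export page is then pasted into mymodule.views_default.inc.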

That’s it – simples!

If you need to make any changes to exported Views, there are two approaches:

  1. Modify the exported php directly in the .views_default.inc file – these changes should be immediately reflected on the site, but you might need to hit ‘Clear Views Cache’ just to be sure.
  2. Use the Views UI, and re-export the views again, save the updated .views_default.inc file, then hit ‘revert’ on the list to use the file version

I’ve found it very useful when two or more environments are being used for development (e.g. dev and live) to be able to spot that a local change has been made to an exported view by the presence of the ‘revert’ link. Using diff on exports gives you a clue to which setting changed – a bit fiddly, but not as bad as going through every setting in the UI one at a time to spot the difference. It’s also very reassuring to have Subversion (or your version control system of choice) keep all your various environments in sync.

The final trick to pass on that has made my life much simpler is proper use of names and tags in views – if all the views for your module have the same prefix to their name, and the same tag, it makes it much easier to spot them all in lists, and to select them all together when you come to re-export them – it’s very embarrassing if you miss one and a whole section disappears from the site! If you need to rename or add tags – it’s easy to spot in the exported code where they can be changed, and that’s something you can’t do through the UI.

Now if only I could get more of Drupal’s config in the filesystem this way …

Collections

Filed under: life — jaydublu @ 7:22 pm

So here’s the thing – I heard some Lou Reed on TV last night and realised I hadn’t heard Transformer for far too long, so this afternoon I decided to finally get around to digging out all the various bits of my good-old-fashioned analogue hi-fi and plugging them together again.

Having got over the fact that I seem to have mislaid Lou Reed, the first album that caught my eye from my cherished vinyl collection as I scanned for something to put on was Fairground Attraction “The first of a million kisses”.

So I breathe a deep contented breath, close my eyes, lean back in the chair and wonder why the hell it is that I haven’t done this for … probably a good 10 years!!! My mind wanders further, and I realise that I’ve had this album for almost two decades, and although a few tracks on side one are like old friends I don’t know if I played the whole album more than a couple of times, but it’s great. So what other gems are in the modest collection of 200 odd LPs that I once put so much thought into? Given that most of the music is probably more than 20 years old – what have I missed in the mean time? Yikes – there can’t be enough hours in the day to find out what I’ve missed or am missing!!!!

But then I realise I have the same thought when I browse through my boxes of old novels looking for the next book to read again, and discover an old favourite, or something I once bought and never got around to reading, and also when I look for a film to watch out of the DVD collection (at least I finally threw all the videos away after they went mouldy or I’d have them to worry about too).

Halfway through side 2 (oh how I love vinyl – so tactile!) and I’ve decided that I don’t care. In all the various collections I have squirrelled about in cupboards or sheds, on shelves and in boxes (and don’t get me started on tools, or bits of electronics) I’ve got more than enough to keep me fully occupied and amused for the rest of my life even if I worked at it full time – what does it matter if I’m missing other stuff – what I’ve got already is more than I will ever need and it’s all just fantastic even if it is often more than 20 years old!

As I mellow with time, it seems I’m becoming content with what I have – the grass is pretty green over here so why worry about the other side of the bridge?

I love vinyl!

Server monitoring

Filed under: review,tinkering — jaydublu @ 5:39 pm

[Munin graph: mysql_queries-week]

I think I’ve finally found an almost perfect suite of tools to monitor webserver performance and availability – it’s only taken five years!

The most recent discovery that has me all excited is Munin – I’d heard of it before but can’t think why I’ve never given it a go. It’s a fantastic tool for recording all sorts of useful metrics in rrdtool stylee graphs – far too much info in fact as it’s bringing out my hypochondriac tendencies.

I’ve been using Nagios for years – although Ubuntu distros make it easier to set up it is still a bit like hard work, but once it’s set up it’s great. I’m using nrpe plugins to remotely monitor many of the same metrics as Munin is recording on a suite of servers, but Nagios is set to generate alerts if they go out of tolerance. Once you get the thresholds right it can warn you of impending trouble before a site actually fails – a theory which actually worked a few weeks back when alerts for page response time and processor load allowed me to take evasive action before a site actually crashed.

I’ve got a utility script or two, such as one which monitors MySQL replication, which is regularly polled by Nagios which triggers an alert if a certain string isn’t found. I’m sure there is a plugin or other cunning way to get Nagios to do this without a script, but this was easy, and it works!
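As a sketch of the idea (this is not the actual script – the function name and output strings are hypothetical), the decision logic boils down to checking the Slave_IO_Running and Slave_SQL_Running columns of SHOW SLAVE STATUS and printing a fixed string for Nagios to look for:

```php
/**
 * Decide whether replication looks healthy from a SHOW SLAVE STATUS row.
 * Returns a fixed string that Nagios can be configured to look for.
 */
function replication_status(array $row) {
  if ($row['Slave_IO_Running'] == 'Yes' && $row['Slave_SQL_Running'] == 'Yes') {
    return 'REPLICATION OK';
  }
  return 'REPLICATION BROKEN: ' . $row['Last_Error'];
}

// In the real script, $row would come from a "SHOW SLAVE STATUS" query;
// Nagios then alerts if 'REPLICATION OK' is absent from the output.
```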

Finally for in-house tools, good old AWStats for logfile analysis gives me an idea of raw traffic served.

For remote tools, I use an email-to-SMS gateway to allow Nagios to alert me of critical problems if I’m not at my machine. For a second opinion, and as a safeguard, I also subscribe to a remote monitoring service – of the many I’ve tried I favour Alertra, but also use Pingdom occasionally. Finally, Google Analytics allows traffic analysis within the site, and that’s about it.

But as the BBC says, other services are also available.

This guy loves statistics!

Filed under: life — jaydublu @ 5:49 pm

I’ve seen some of these charting tools in things like Google Analytics and thought they were just a gimmick, but not only is this stuff entertaining, it makes the data informative and, what’s more, adds weight to the conclusions.

Thanks to Robbie for this.

When to rebuild?

Filed under: life,tinkering — jaydublu @ 5:05 pm

This post could relate to so many things – stylesheets, PHP code, glass fibre moulds – at some point, when you’ve gone through a few iterations of an agile-like process, you start thinking “If I’d known I was going to end up doing this, I would have started differently”.

When you’re so close to completing, yet you know deep down it’s getting messy, overcomplex and perhaps not as ‘nice’ as it could be, the decision of whether to leave it as-is because it works, or to take the opportunity to rebuild everything before you go any further, becomes almost unbearable.

But then I suppose that’s what the perpetual beta is all about, and eventually you get the opportunity to go for v2.0 (or even 1.0)? In the mean time, if it’s not doing any harm, leave it alone – you know starting from scratch will only open up another can of worms!

Can’t log in to Drupal site in IE7, but can in Firefox?

Filed under: web development — jaydublu @ 12:20 pm

Not unique to Drupal, but common to any system that uses cookies to maintain login session information. I’ve met it before, but I just lost an hour and have a brick-shaped imprint in my forehead again, so I thought I’d drum it into my head … don’t use underscores in domain names!

When using subdomains to set up dev sites etc., it’s easy to use site_dev.domain.com or similar. But underscores are not valid characters in domain names.

Firefox is forgiving, and behaves as you would hope, but IE7, Opera etc. won’t properly set cookies on domains with underscores.

Solution – use hyphens instead e.g. site-dev.domain.com

Learning to take pictures (again)

Filed under: photography — jaydublu @ 12:58 pm

I’ve always had an interest in photography, and ten or twelve years ago I got quite into the technicalities of black and white film – the zone system etc. I had a couple of Olympus OM-2 bodies, and a crude but workable darkroom. Much fun was had.

Moving house several times I never got around to setting up the darkroom again and it all still sits on shelves (about to go on eBay).

In the mean time, digital photography became more mainstream, and four or five years ago I got myself a Fujifilm F700 which is a compact but capable little number. We’ve had some good times together and I’ve taken some images that I’ve been very happy with, but it still doesn’t encourage the full exploration of photographic techniques.

This Christmas, I took advantage of an offer to get a ‘proper’ camera – a Fujifilm S5 Pro – and I’m now starting to invest the time to learn how to use it and express myself. It’s a stunning camera, and I’m so glad I made the decision to get it against other options such as a Fujifilm S100FS bridge camera.

The amazing thing I’m finding is how ‘basic’ the camera feels – it certainly doesn’t make it easy to take pictures without thinking about what you’re doing – but I feel that’s the whole point, and I love it.

[Image: View from Happisburgh Coastwatch – processed from RAW file]

I’m still feeling my way transferring philosophies and techniques learnt using mono negative film to digital technologies.

Certain things are the same and surprisingly familiar – focussing, metering, composition, depth of field etc. – but the fun starts when you move away from ‘average’ pictures, or when you can’t rely on automatic settings.

I did once know my way around the zone system, and could customise the way I processed B&W film and printed the final image to get a variety of tones and tonal ranges to suit the subject. But how does that translate to digital?

Well the first thing I’m just now starting to get to grips with is that film processing and other darkroom techniques are roughly equivalent to using RAW format and manipulation in something like Photoshop. The image shown here is a bland shot from inside Happisburgh Coastwatch, but it took some fiddling to get it so you could see detail from both inside and outside.

So I’ve an expensive camera – necessary to get a good sensor – and image processing software; Photoshop doesn’t come cheap. In the digital world, getting to grips with the core technicalities of the zone system, for instance, is quite an expensive undertaking.

But, when I started experimenting ten or so years ago it was much more affordable – you didn’t need much of a camera if you used good film, and it was surprising what good results you could get in a darkroom with some pretty basic kit.

I wonder what the digital equivalent of the pinhole camera is?
