PHP ob_gzhandler “Content Encoding Error”

Filed under: web development — jaydublu @ 3:56 pm

You know sometimes how the simplest little issue holds you up for days? Well I’ve just had a doozy!

I’m maintaining an inherited CMS application, and I still have to trust that some of its inner workings just ‘work’, because it’s all a bit involved and I’m not being paid to rip it all apart for the sake of it.

We’re deploying the app onto new servers – something I’ve done a few times before, so I wasn’t scared – but this time I just couldn’t get it to work. Calling up pages, I was getting “Content Encoding Error” messages from Firefox, and generally unhelpful responses from other browsers.

I knew the app was using ob_gzhandler to GZIP-compress output where browsers support it, and if I commented out the gzip bits it worked – but that wasn’t something I wanted to do. I was determined to find out why the same codebase wasn’t working in this environment when its configuration was almost identical to others where it did work.

To cut a long story short – there’s a configuration include called at the start of each page that’s not under version control (for obvious reasons), and on this server it had somehow acquired a couple of line breaks at the end, after the closing PHP tag, so it started the HTML output early. Nothing to do with double encoding or any of the other potential issues I found when Googling.
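For anyone hitting the same thing, here’s a minimal sketch of the failure mode – one plausible reconstruction, with illustrative file names rather than the actual app’s:

    <?php
    // page.php – simplified; the real app is rather more involved.
    require 'config.php';        // this include had blank lines after its closing PHP tag,
                                 // so PHP emitted them before output buffering started
    ob_start('ob_gzhandler');    // gzip the rest of the output if the browser accepts it
    echo '<html><body>Hello</body></html>';
    ob_end_flush();
    // The stray bytes end up in front of the compressed stream, so the browser
    // receives a Content-Encoding: gzip response that isn't valid gzip and
    // reports a "Content Encoding Error".

The simplest guard is to leave the closing PHP tag out of pure-PHP includes altogether, so trailing whitespace can never leak into the output.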

Using Amazon S3 to deliver flv content

Filed under: web development — jaydublu @ 4:59 pm

I’ve used Amazon S3 for some time now on and off – it’s a great, fast, cheap service, but it does have its own quirks.

Developing a site that hosts a Flash movie displaying a series of FLV-encoded videos, we decided to host the FLV content on S3 to save on bandwidth costs in case we got massive amounts of traffic.

The first problem we encountered was performance – with a bucket created in the default location, the videos were loading more slowly than they were playing, but since switching to a bucket located in Europe, speed has no longer been a problem.

The last issue, which really took some head scratching, was reported by some testers: the videos weren’t loading for them – and they tended to be testing from behind corporate firewalls.

It would seem that when we used the S3Fox Firefox extension to upload the files, it didn’t know what FLVs were, so it didn’t set a Content-Type. S3’s default is then not to send a Content-Type header at all, which it would appear these obtuse firewalls didn’t like, so they blocked the response.

The solution was to knock up a PHP-based upload script using the Amazon S3 PHP class written by Donovan Schonknecht, specifying ‘video/x-flv’ as the Content-Type – works a treat.
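Something along these lines did the trick – a sketch rather than the exact script, since the bucket name and file paths here are made up and the putObject() signature varies a little between versions of the class:

    <?php
    // Upload an FLV to S3 with an explicit Content-Type so S3 returns it on every GET.
    require_once 'S3.php';   // Donovan Schonknecht's Amazon S3 PHP class

    $s3     = new S3('ACCESS_KEY', 'SECRET_KEY');   // your AWS credentials
    $file   = 'videos/intro.flv';                   // illustrative path and bucket name
    $bucket = 'my-eu-video-bucket';

    $ok = $s3->putObject(
        S3::inputFile($file),
        $bucket,
        basename($file),
        S3::ACL_PUBLIC_READ,
        array(),                                    // metadata headers
        array('Content-Type' => 'video/x-flv')      // request headers – the important bit
    );

    echo $ok ? "Uploaded\n" : "Upload failed\n";

With the Content-Type set on the object itself, the firewalls in question stopped objecting.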

Using Subversion to version control websites

Filed under: web development — jaydublu @ 11:34 am

I can remember when I first started developing oh-so-many years ago, instinct and common sense told me I should implement some sort of version control, even though at the time I was writing simple MS-DOS Basic applications, and working by myself. The plan was that at strategic intervals, if I remembered, I would take a copy of my source code, rename it and put it somewhere ‘safe’. In reality I rarely did, but I knew I should, so that if all else failed I had something to go back to.

As my applications started getting more complicated, as the business risk of getting it wrong increased, and especially when I started working collaboratively with other developers, the need for a ‘better way’ grew. I was aware of systems like CVS, but they didn’t seem entirely suited to our web development environments and the challenges we faced. We needed something which didn’t unduly get in the way of our work, and helped with as many problems and risks as possible:

  • Audit trail – who changed what, and when, to get the code into its current state?
  • Comparison – what changed between two versions
  • Collaboration – assisting multiple developers working on the same files
  • Branching and merging – the ability to create parallel threads of development, and combine them back together if required
  • Disaster recovery – assistance reconstructing sites in a hurry
  • Code migration – keeping environments in sync, and moving changes between them

The break came, as with most things, with the opportunity to spend a bit of time ‘playing’ with possible solutions. The first thing was giving Subversion a go – although similar in many ways to CVS, I found it much more suitable for my needs: it was more usable, a ‘version’ represents the entire codebase rather than a specific file, and commits are atomic, meaning that if something fails the entire transaction is cancelled rather than leaving a working copy ‘broken’ or inconsistent.

Now that I had a tool I liked, how would I apply it to our needs? My primary concern was moving sets of amends between environments – I wanted to get this automated, and as safe and foolproof as feasible. Yet it had to be an easy process that anyone (within reason) could do.

So the files on the webserver, or in any other environment which needs to be covered by version control, should be a working copy – that is, checked out from a repository using a Subversion client. The new method of getting changes to the live site is to make them in a dev environment, commit them to the repository (with a suitable log message, of course), then ‘svn up’ the live site.

One stumbling block was that different environments tended to need different configuration – whether it was the filesystem path, the URL of the web root, or database connection details. Like all well-structured applications, this configuration was already in a single global include file – but this was a very risky file to have under version control. It was too easy to have a locally modified file overwritten, or even worse to accidentally commit a dev server’s config and update (and break) a live site.

The solution – don’t version environment configuration!

Put all your environment-specific code in a file named, for instance, .config.local (the dot prefix prevents Apache from serving requests for this file, which would otherwise be a bit of a security loophole), placed in the site’s DOCUMENT_ROOT, and set a global ignore on *.local in every svn client’s config. Now in every script you can include($_SERVER['DOCUMENT_ROOT'] . '/.config.local'); and you’re away, safe in the knowledge that you don’t have to keep an eye out for this dangerous file.
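As an illustration, the .config.local file might look something like this – the settings here are made up, so use whatever your application actually expects:

    <?php
    // .config.local – environment-specific settings, never committed to Subversion.
    // Values below are illustrative only.
    define('DB_HOST', 'localhost');
    define('DB_NAME', 'mysite');
    define('DB_USER', 'mysite_user');
    define('DB_PASS', 'secret');
    define('BASE_URL', 'http://dev.example.com/');

Each environment gets its own copy of this file, every script pulls it in with the include above, and the global ignore is a one-line addition (global-ignores = *.local) in the [miscellany] section of each client’s ~/.subversion/config.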

Of course that’s a simplification of what safe management of sites involves, but it demonstrates the point. Nor does it take into account all the potential issues surrounding managing websites – changes to the unprotected config file, filesystem permissions and database schemas, to name but a few; but with a bit of thought and some good working practices Subversion can usually help out to some degree, if only to keep manual documentation of such things in the same place as the code.

But that’s another post someday … what is for certain is that properly used and suitably configured, Subversion is most definitely worth the effort, and it could save your life one day – or at least your job.

Old Skool Rulez

Filed under: life,opinion,web development — jaydublu @ 4:36 pm

As posted earlier, I’m going back to my roots after 6 years in the relatively comfortable life of an employee of a large agency, and specifically as someone who has had a sabbatical from mainstream development whilst managing a team of developers.

Now I’m back on my own again I’m reviewing my skills and experience, the current state of the industry and best practice, and sorting my tools and techniques out ready to get busy (hopefully).

It’s amazing how much has changed in the intervening years, and yet how much is the same. Packing up my desk and hunting down the books I took to Soup, many are well thumbed from regular use, and despite their age they are still relevant. I’ve also been compiling a wishlist on Amazon of titles bought for the company library that I will want to get copies of, but to be honest there aren’t many essentials – good references for PHP, MySQL, CSS, Apache, JavaScript etc. But even those can be found online – it’s much easier to type http://uk.php.net/explode to remember what order to pass parameters to explode(), for instance, than to find the book – but I digress.

I’ve been rebuilding a few sites I first built oh-so-many-years-ago – one was even still using Dreamweaver Templates <hangs head in shame> – but they are still doing the business for the owners and all they want is a quick design refresh and a bit of new content. “Oh, and while you’re at it could you just add a news section we can update ourselves?” So the quandary begins – how much do you reuse and how much do you rebuild, and what technologies do you use?

Perhaps unlike many working by themselves on ‘smaller’ sites, my recent background has exposed me to all shapes and sizes of web content delivery technologies: full-on enterprise-level Content Management Systems such as Vignette and Stellent, other commercial ones like RedDot, open source ones like Drupal or Joomla!, in-between options like Expression Engine, and then of course all the custom applications that get written for specific purposes, or the reusable frameworks and libraries that can give advantages in rapid development and deployment.

And then there’s the platform to build on – once it was a choice of flat HTML (with some help from Dreamweaver perhaps), Perl (or the new kid PHP) or ASP. Now there are all the Java-based technologies, Python-based ones (I still reckon Zope should have become more mainstream), ASP.NET, Ruby on Rails, and of course my old favourite PHP is going from strength to strength. And it doesn’t stop server-side: with the advent of AJAX and frameworks such as jQuery, so much more can be done in the browser.

I understand and buy into Standards Compliance, Accessibility, Search Engine Optimisation, Usability and all the other buzzwords. I’m able to gather requirements, write specifications, manage projects and carry out quality reviews. I’ve been involved in projects that have been great successes, and others that have spectacularly failed, enough to know how to avoid the pitfalls.

But does all that knowledge and experience help in my current situation and perhaps give me an advantage over someone only just starting in the industry? It’s a mixed blessing because although I rarely have to say “I have no idea how to do that or what’s involved”, the opposite is also a problem because I know of perhaps too many possibilities and alternatives, and how things could be done ‘properly’.

As a little aside, I’ve often observed that any sort of development or design or construction or problem solving that is done in a constrained environment is likely to have a much more creative and pleasing outcome than if done with the luxury of infinite possibilities – it makes you focus and think and consider relative merits of alternatives with a clearer vision of the ultimate goal rather than being dazzled or distracted by niceties.

So how have I tackled these rebuilds? Well, the Dreamweaver Templates site is hopefully a textbook example of how to use PHP to make a relatively simple brochureware site more maintainable. The templating features of Dreamweaver (ah yes, I remember them well!) have been replaced with PHP includes – so common elements like the HTML <head>, top-level page layout, navigation etc. are all shared. A ‘page’ is represented by a PHP file which sets variables such as the page title and navigation state, calls the relevant includes for the top of the page, has the body content hard-coded, then calls the footer includes. The one dynamic page is the news section, which pulls content from a MySQL database, with a little utility script allowing the client to manage news stories.
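A minimal sketch of what one of those ‘page’ files looks like – the file and variable names are illustrative, not the actual site’s:

    <?php
    // about.php – one 'page' of the brochureware site.
    $pageTitle  = 'About Us';
    $currentNav = 'about';                 // lets the navigation include highlight this section

    include 'includes/header.php';         // shared HTML <head> and top-level layout
    include 'includes/navigation.php';     // navigation, driven by $currentNav
    ?>

    <h1>About Us</h1>
    <p>The body content for this page is hard-coded right here.</p>

    <?php
    include 'includes/footer.php';         // shared page footer

With everything else shared, adding or restyling a page means touching one small file, which is most of the maintainability win.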

Why didn’t I use my first project as a freelancer again to flex my muscles and show off all my skills? Because the requirements didn’t need it. This application is beautifully simple, very easy to host and maintain, blisteringly fast, and hopefully will go another five or six years before its next rebuild. Any half competent developer could look at the source and figure out how to make any changes within minutes of getting access to the source. And it only took a couple of days to complete – including me trying to come up with some sort of new visual design and I’m no designer!

This approach has been used by me and my team for many years with great success, from small six-page sites to large corporate sites for FTSE 100 companies. If it meets all the key requirements, what’s the advantage of making life more complicated? Admittedly the finer details of how to implement it have changed – using CSS for layout rather than tables for instance, good clean semantic markup, security against XSS and SQL injection (hopefully), obscuring email addresses from spambots, a sitemap.xml and Google Analytics tagging, and it’s all under version control …

I’m starting to plan a much bigger site that needs more content management, and have reviewed several frameworks and applications to make life easier, but I’ve still settled on the approach above with one small change – using Smarty to separate logic from presentation, and moving more of the content into the database. But it’s still clean, simple, fast and reliable.

KISS – if it’s getting too complex it’s probably wrong.

Handling large file uploads in PHP

Filed under: review,tinkering — jaydublu @ 5:40 pm

I’ve been a bit lax with techy posts recently, so I thought I’d jot down some things I’ve been working on of late.

Playing with Flash Media Server and video encoding, but nothing conclusive to write about yet so watch this space.

What I have had some success with, though, is handling large file uploads in PHP. It’s something I’ve come across in the past: when building web apps that allow uploads of files running into the megabytes, you soon hit upload_max_filesize or max_execution_time, and then start down the slippery slope of increasing either or both php.ini settings in response to client requests, against your better judgement.
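For reference, these are standard php.ini directives, and you can check what a given server allows at runtime:

    <?php
    // Report the php.ini limits that large uploads typically run into.
    foreach (array('upload_max_filesize', 'post_max_size',
                   'max_execution_time', 'memory_limit') as $directive) {
        echo $directive . ' = ' . ini_get($directive) . "\n";
    }

post_max_size is worth checking too, as it caps the whole request body regardless of upload_max_filesize.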

In the past I used to set up FTP accounts and say ‘use a proper protocol to transfer files – duh!’, but that’s not exactly user-friendly.

So last weekend I went on a hunt for a better solution, and in particular for any client-side file upload tools – I came across a few contenders, and finally settled on JUpload as an interim solution, saving me having to code something from scratch.

I’ve previously blogged about my love/hate relationship with Open Source software – here’s one that goes down in my book as a goodie.

It took a bit of fiddling to get the demo code working – it helps if you check out the SVN head, as the release package wasn’t quite working. It also took a bit of hunting to find the JUpload PHP class in the wiki, which saved me a lot of time writing my own (and reinventing the wheel). It wasn’t handed to me on a plate how to get it working, but it also wasn’t too hard to figure out.

So I’ve got it breaking everything down into 1MB chunks, and uploads are a whole lot more reliable than just using one big fat HTTP file upload. Whether the applet will survive as is or be modified, or if I will develop something bespoke based perhaps on its core concepts only time will tell, but for now I’ve got bigger fish to tickle.
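For anyone curious about the server side of chunked uploads, the general idea is straightforward. This is a generic sketch of reassembling numbered chunks, not the actual JUpload PHP class – the field and file names are made up:

    <?php
    // Generic sketch: append each uploaded chunk to a single target file.
    // Assumes the client posts one chunk at a time, with the original file name
    // and a flag on the final chunk – these field names are illustrative.
    $uploadDir = '/var/uploads/';
    $name      = basename($_POST['fileName']);        // target file name
    $isFinal   = !empty($_POST['finalChunk']);        // set by the client on the last part

    $target = fopen($uploadDir . $name . '.part', 'ab');   // append mode
    $chunk  = fopen($_FILES['chunk']['tmp_name'], 'rb');
    stream_copy_to_stream($chunk, $target);
    fclose($chunk);
    fclose($target);

    if ($isFinal) {
        // All the 1MB chunks have arrived – move the assembled file into place.
        rename($uploadDir . $name . '.part', $uploadDir . $name);
    }
    echo 'OK';

Because each request only carries a 1MB part, upload_max_filesize and max_execution_time apply per chunk rather than to the whole file, which is why the uploads become so much more reliable.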

Flickr Groups

Filed under: Happisburgh,photography,tinkering — jaydublu @ 11:09 am

I’ve been slowly building a Flickr group for Happisburgh, and it’s coming along very nicely.

It was originally a bit of R&D for work, as a client wanted to use a group for some promotional work, so I needed to check how groups work. Initially I put a load of my own pictures up, but then thought I’d have a go at running it properly, so I searched for likely members and sent them FlickrMail invites – and most responded positively. Currently the group has 25 members and 215 pictures.

So I do a bit of administration, improve the group description a bit, do a bit of promotion on other groups and invite some new people to join. Also I make a couple of discussion postings.

Do I dare start another? Why not – I’ll not push it too hard yet, but if I set it up right it might take off by itself – it will certainly be an interesting experiment. So I start the Happisburgh Lighthouse group, put some of my pictures up, and link to it from the Lighthouse website.

That starts another line of enquiry going though – it would be cool to have a Flickr badge on there with a random selection of images, but the site is powered by Joomla! and the way TinyMCE is set up, it doesn’t like Flickr’s badge code. So I’m now playing with the flickr4j Joomla! extension.

Of course I’ll report back how I get on.
