Eclectic Dreams

A Web Design and Development Blog

The web will always be a moving target

June 10th, 2015

The future is already here — it’s just not very evenly distributed. – William Gibson

The web moves fast. Faster every year, what with evergreen browsers across the board. It’s certainly a far cry from the bad old days, when we went five years between Internet Explorer updates. It would be convenient to think that, because we live in a world where people’s browsers are regularly updating, we also live in a world where the web is in a reliable state.

Oh yes, a quick check of your web stats may show that IE8 is the new IE6, and even that is on its way out. We’re nearly at a stage where there’s a baseline of CSS 3 available, which, for those of us who remember trying to get CSS working at all in Netscape 4, is a huge shift.

But the only constant is change. Yesterday’s cutting edge is today’s common baseline. The web moves on, with new browser APIs, new CSS and new HTML elements. HTML 5 becomes HTML 5.1, CSS gets its CSS 4 selectors and ECMAScript reaches version 7.

This stuff doesn’t arrive all at once though.

It arrives in dribs and drabs, with different browser development teams focusing on differing priorities. Chrome and Firefox have WebRTC already, but it’s still under development for IE, and who knows when it’ll hit Safari on mobile. Want to use it for a project? Go take a look at the browser support tables and you’ll be hit by the most common conundrum in web development:

How do I make this work where I need it to?

This isn’t new; it’s been going on since there were multiple browsers, from the days of trying to make DHTML work with IE’s and Netscape’s different layer models, to the days of having Promises in some versions of the Android browser but not in Safari on iOS 7. The future is like the past, only there’s more of it.

Which brings me to the point. The web is a continually moving target. It probably changed in the time it took me to write this. If you work with web stuff you need to embrace this fact. It will be the only constant in your career. When I’m old and grey and building hypermedia virtual experiences in HTML 10, it’ll be no different, except for maybe some silver space-age jumpsuit and a dodgy prostate.

On the web, progressive enhancement is, and will always be, the methodology of choice. It makes your site robust to the shifting sands of the web front end. You don’t control your audience’s choice of browser, operating system, connection speed, device, ability to interact with technology or understanding thereof. You don’t control the flow of new features to those browsers, or the priorities of their developers and organisations. You don’t get to decide whether a feature will be implemented well, buggily or only partially.

All you can do is pick a good baseline, and enhance for those who have the shiny.
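To make that concrete, here’s a minimal sketch of feature detection driving that enhance-from-a-baseline decision, using the WebRTC example from earlier. The enableVideoChat and showDialInDetails functions are hypothetical stand-ins for an app’s enhanced and baseline paths:

    // Detect the feature, not the browser. The vendor-prefixed names
    // are where WebRTC first shipped in Chrome and Firefox.
    var PeerConnection = window.RTCPeerConnection ||
                         window.webkitRTCPeerConnection ||
                         window.mozRTCPeerConnection;

    if (PeerConnection) {
      enableVideoChat();     // the shiny, for browsers that have it
    } else {
      showDialInDetails();   // a baseline that works everywhere
    }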

You do get to work on the most globally available, unpredictable, diversely interacted with communication platform in the world. Enjoy that.

Fun times with Appcache

August 20th, 2014

When we started work on Soprano, our responsive web app for managing your library service, we had planned offline support from day one. However, we waited to get the basic product launched before sorting out the offline side, as even two years ago it was known to be a difficult beast. We’d actually abandoned a previously attempted offline version of our catalogue due to the complexities of retrofitting appcache onto a nice RESTful site. So we allocated a majority of my time at the tail end of ’13 to the offline-ification.

Offline webapps are… let’s be honest, a great idea marred by a particularly bizarre implementation, poor documentation and more gotchas than I can count. If you’re looking for a solid intro to doing it right, see the great article series from the Financial Times. If you want to learn about some of the gotchas, most have been nicely documented at AppcacheFacts and A List Apart. Hopefully near-future tech like Service Workers will make the process less of a pain, but they are a while off…

This blog post is mostly a grab bag of stuff I haven’t seen documented elsewhere, and may help others beating the technology with a stick until it works.

Let’s start with the fun of SSL. Imagine you want to build and test your app locally over the same kind of secure connection it uses in the real world, so you create a self-signed certificate. Well, for the love of all that is good, make sure you properly install that certificate. Recent versions of Chrome and Firefox refuse to store an appcache delivered over an invalid SSL cert, and that includes certs that have expired, aren’t trusted, have mismatched domains, or do basically anything else that might trigger a security warning. And no, you can’t just click that nice Proceed Anyway button on the SSL warning screen; that won’t work. Firefox will die silently and give you no debug info as to why the appcache download failed, while Chrome and Safari will give you a little more in the dev console. You need to properly install the certificate: on a Mac, add it to your keychain for Chrome and Safari; Firefox wants to use its own store, hidden away in the nest of preferences panes.
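When it does fail silently, about the only visibility you can claw back is by listening to the applicationCache events yourself. A minimal logging sketch:

    // Log every stage of the appcache lifecycle; the error event is the
    // only hint you'll get that, say, a certificate was quietly rejected.
    ['checking', 'downloading', 'progress', 'cached', 'noupdate',
     'updateready', 'obsolete', 'error'].forEach(function (name) {
      window.applicationCache.addEventListener(name, function (event) {
        console.log('appcache: ' + name, event);
      }, false);
    });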

Next, be careful of putting expires headers on, well, anything. You know how all the articles on the cache manifest say you should never, ever let the manifest file itself be cached, or the whole offline version gets permanently stuck? Well, it turns out you have to be a bit careful with Firefox: set an expiry of right now or just in the past, and the cache instantly expires and won’t run the full initial update/download. But be careful with Chrome and Safari too, as they do need expires headers or the manifest may get stuck. Yeah, I swallowed my pride and did some browser sniffing for that one. Oh, and make sure none of the files referenced in your manifest have past or very short expiries either, as Firefox respects those, and downloading the manifest can fail because of them.
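For what it’s worth, here’s a rough sketch of the non-sniffing parts of that setup, assuming Apache with mod_headers enabled; the file name is made up:

    # Serve the manifest with the right MIME type.
    AddType text/cache-manifest .appcache

    # Give the manifest an explicit, short-but-not-zero lifetime. An
    # expiry of "now" or the past breaks Firefox's initial download,
    # while no cache headers at all can leave Chrome/Safari stuck.
    <Files "offline.appcache">
      Header set Cache-Control "max-age=300, must-revalidate"
    </Files>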

If you are using appcache, the best approach is to declare the manifest on a separate page and pull that page in via an iframe. The main advantage is that this ringfences the cache. Normally, when you add an appcache directly to a page, every (matching) GET request that page makes via AJAX gets added to the cache; leave a one-page app open for a while and watch those GETs rack up and fill it. Worse, appcache has a built-in DDoS mode: when you expire a manifest, every file in it gets re-downloaded right then, so the fuller the cache, the bigger the stampede. Using an iframe means you get far more control over what gets cached.
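A bare-bones sketch of that iframe arrangement, with illustrative file names:

    <!-- offline.html: the only document that declares a manifest, so it
         alone becomes a master entry and sweeps requests into the cache. -->
    <!DOCTYPE html>
    <html manifest="offline.appcache">
      <head><title>cache loader</title></head>
      <body></body>
    </html>

    <!-- In the main app page: load the caching page out of sight, so the
         app's own AJAX GETs stay out of the appcache. -->
    <iframe src="offline.html" style="display: none"></iframe>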

That said, if you want IE10 on Windows 7 to work with it, iframes might not be the way to go: that particular combo seems to have more trouble actually detecting when it’s gone offline. IE11 seems to handle it fine.
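For context, “detecting going offline” here means navigator.onLine and the window online/offline events, which is presumably the machinery misbehaving in that combination. A minimal sketch:

    // Reflect connectivity in a class on <body> so styles and script can
    // adapt. How reliably these events fire is exactly what varies here.
    function updateOnlineStatus() {
      document.body.className = navigator.onLine ? 'online' : 'offline';
    }
    window.addEventListener('online', updateOnlineStatus, false);
    window.addEventListener('offline', updateOnlineStatus, false);
    updateOnlineStatus();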

One fun thing I haven’t seen much mention of is the concept of foreign files in appcache. At one point in development we accidentally ended up with two manifest files in different iframes embedded on the same page (thanks to a mis-built bit of JS). That causes all kinds of fun as you try to work out which version of a file you have between the two appcaches, and which cache is actually serving the page you’re viewing. So if you see a file flagged as foreign in Chrome’s debug tools, that’s likely what’s happening.

A note on dev tooling. Chrome and Safari tell you loads about what’s going on with an appcache: what’s been downloaded, where it failed, and so on. Their dev tools also let you see when a resource came from the cache rather than a real request. Chrome’s chrome://appcache-internals/ page also makes flushing bad or stale caches easy. You’ll need to do that a lot, since once you add an appcache every HTTP error sends you to the fallback page, which is fun when you’re debugging issues outside the front-end…

Firefox gives the most unusable debug tooling for appcache, hidden somewhere in the depths of a sub-dev-tools command line. Seriously, its appcache validate command gives really obtuse errors or just fails. It does let you quickly flush an appcache, which can be handy, but that’s about all it’s good for. I’m seriously hoping this gets improved before the whole thing is superseded by the new shiny Service Worker caches.

Anyway, enough negativity; here’s the positive: we have a live system using appcache for offline functionality, which is pretty nifty for a responsive back-office application often used where connectivity is unreliable. Lots of dev headaches, but a really useful thing for people doing a job on the floor.

Who says enterprise software can’t do HTML5?

In Praise of GOV.UK

November 1st, 2012

Something extraordinary happened a few weeks ago and I felt like I should mark the occasion: The government launched a website that didn’t suck.

No, not only did it not suck, this website has a wonderful user experience. Really. Not only that, but it came in without falling foul of the cost creep endemic in government digital projects. It replaced and rationalised content. It made things clearer and used plain English. It focused on what users wanted to find, not what government departments wanted to say. The team share their code on GitHub and accept pull requests. They set metrics for pages and continually revised and improved. They ran detailed user and accessibility tests.

I can’t quite remember when I first heard of the project, back in its alpha days. Possibly from one of the many folks on the web scene who seemed to get sucked into it. That in itself was the first sign that those behind the new government website knew what they were doing: they got professionals with industry respect to work on the prototype. Over the last year or so I’ve watched it move from alpha to beta, slowly iterating and optimising. They’ve done this in public, often sharing their detailed research and testing experiences on their blog (with some interesting results; see their notes on auto-completing search). The web community in the UK has been cheering from the sidelines the whole way, because it’s really made a nice change to be enthusiastic about a government website. It’s weird when the hottest startup in the UK is a government website…

If you ever wanted a case study for a large scale user-centric redesign, this is it.

This is truly a watershed moment, and one whose lessons I hope will cascade down from central government to regional government (please pay attention, Birmingham City Council) and other public sector areas. The Government Digital Service is a model for how this kind of project should be delivered. They put the web in its own category and built something amazing.

Given I have pretty much hated the major policies of the current coalition, it seems very strange to be congratulating them, but for their support of this one project they really deserve it. They’ve given us a UK website to be proud of.