Wednesday is Link Day!

Business / Working Life

Data is not an asset, it’s a liability:

Strategic procrastination:

Putting on the shipping goggles: (if you’re not a regular reader of Signal vs Noise then you should be)

Reduce the distance between the people who make decisions about the product and the people who build the product:

Everything is broken:

Fluid coupling: When exactly did enterprises become late adopters of technology?

Preparing organisations to become design-infused:

New, responsive design reduces bounce rate by 37%:


Little Big Details: taking inspiration from the little details that make designs great:

Predictive personas: quote: ‘…the question they should be asking themselves isn’t, “If I interviewed a user, would this describe her?” The question should be, “If I found a person like this, would she become a user?”’

Forget about the mobile Internet:

The style guide: (other examples available on

Style guide from Salesforce:

Improving the checkout experience with animations: (but read the article below…)

Design safer animation for motion sensitivity:

Design patterns:

Front-end principles for designers:

The language of modular design:

How modern web design works:

Progressive enhancement / Performance

Bruce Lawson’s talk “Ensuring a performant web for the next billion people”: (Opera Mini is a popular browser for those with low-powered and low-bandwidth devices)

Aaron Gustafson with a timely reminder that we don’t really control our web pages:

Jeremy Keith’s presentation on progressive enhancement from May: (video and full transcript)

Preloading, prefetching, prebrowsing:

Embracing the network: modern techniques for building resilient front ends: (deck, no video published yet)

A beginner’s guide to website speed optimisation:

User experience / Usability / Accessibility

The psychological speed of mobile interfaces: (this is much the same as “perceived performance” which I bang on about)

The device context continuum – where does the common device context continuum start and end? (hint: it doesn’t)

Hello, my name is <Error>:

Living with bull:

How to write an error message:

Visual ARIA bookmarklet:


Fantastic introductory article about JavaScript promises:

Learning JavaScript in 2015 (from scratch):

Learn JavaScript essentials:

Really interesting look at why SoundCloud started using microservices, by their director of engineering:

5 questions every unit test must answer:

Package built on PhantomJS to generate screenshots at different sizes:

6 tips for Chrome devtools:

Client-side MVC is not a silver bullet:


Free e-book from Smashing Magazine:

Fill Murray: placeholder images of Bill Murray:

And finally…

Old maps:

Interactive cubic Bezier curve editor (more fun than it sounds):

Big list of naughty strings:

The tough truth of reality

I make no secret of the fact I’m a huge progressive enhancement believer. The fundamental reason why I believe the vast majority of web sites (yes, and web apps) should be written using progressive enhancement principles is that we just don’t know how they will be accessed.

We live in a time where many of us have powerful multi-core computers in our pockets, ridiculously fast Internet connections, high-resolution screens, and advanced web browsers supporting the latest web technologies. But even if we are one of those lucky few, things go wrong.

Networks drop packets, servers go down, DNS fails. Resources on pages (most notably JavaScript) can fail to load or fail to run because of all manner of errors – both within and outside our control. Why would you not want to build your site in such a way that makes it as robust as possible?

This is how many developers believe the modern world looks:


The vertical axis is the capabilities of networks, devices and browsers. The horizontal axis is time. As time goes on, networks, devices and browsers get better, faster and more fully featured. That’s true, these things do make progress.

But this belief gives developers a false sense of security, thinking they don’t need to worry about the same things they worried about in the past: crappy network connections, devices with hardly any RAM, browsers that don’t support modern JavaScript APIs etc. They put their trust in The Blessed Void Of Obsolescence.

But the harsh reality is more like this:


We should think about the massive spectrum of circumstances under which a user could interact with our sites.

Because even with a modern device things still go wrong. A user with the latest version of Google Chrome will get frustrated if your site doesn’t work because a CDN fails to deliver a crucial JavaScript file. All they will know is that your site doesn’t work.
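
As a rough illustration of one way to guard against exactly that (a sketch, not something I’m claiming any particular site uses), a crucial script loaded from a CDN can fall back to a locally hosted copy if it never arrives. The URLs and the window.jQuery check here are assumptions for the example:

<script src="https://cdn.example.com/jquery.min.js"></script>
<script>
  // If the CDN request was blocked, dropped or corrupted, the library's global
  // won't exist, so write in a copy served from our own domain instead.
  if (!window.jQuery) {
    document.write('<script src="/js/jquery.min.js"><\/script>');
  }
</script>

It isn’t pretty, but it means a flaky CDN degrades to a slightly slower page rather than a broken one.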

Progressive enhancement is not about “what if users turn JavaScript off” but “what happens when the page is loaded in sub-optimal circumstances”.

Wednesday is Link Day

A super bumper jumbo crop for you :0)


Interface writing – code for humans:

The best interface is no interface:

Making companies competitive by expanding design’s role: (more UIE goodness)

Style guides best practices, a presentation by Brad Frost:

Wonderful presentation by Jared Spool on building delightful UX:

Performance and progressive enhancement

The Guardian reports on advertising affecting web page performance (if you can find the article amongst all the ads…):

And CNN Money is also talking about web performance:

Progressive enhancement, by the government:

Don’t add the clever thing:

10 ways to minimise reflows:

Designing with progressive enhancement (talk, slides):

Cache efficiency study by Facebook:

Offline first – the final frontier?:

The web’s cruft problem:

There was a lot of discussion about progressive enhancement following a couple of conferences in June; here are the best articles I saw about it:

Assumptions (by Remy Sharp):

Baseline (by my man-crush Jeremy Keith):

Thriving in unpredictability (by Tim Kadlec):

Availability (by Stuart Langridge): (also see


WAI-ARIA screen reader compatibility tables:

The great and good of the accessibility world are putting together an Album for Accessibility:

Styling forms accessibly:

The business case for (accessible) issue prevention:

The accessibility cheatsheet:

Tools and resources

Control and manage real smartphones from your browser:

Awesome geek podcasts! Awesome!

Lightweight, standalone JavaScript input masking:

Get started with CSS (a free course by Russ Weakley, CSS guru):

New W3C mobile checker tool:

Free book on JavaScript:

And another one:

Know your HTTP (posters to print):

Performance tools, a good list by CSS Tricks:

Accessibility testing plugin for Chrome:

Automated accessibility testing:

Accessibility visualisation toolkit:

New performance tools in Firefox:


Everyone knows about, so here’s

.NET Framework 4.6 is coming, with lots of goodies:

Yet Another Weekly Email:

Developer or user convenience, who should pay? Good stuff from Aaron Gustafson:

A website for code reviews:

The boring front-end developer:

Layers and legacies: a warning about old code:

Comparisons between software and medicine:

The whole of JavaScript in one picture:

.NET extensions galore:

Useful JavaScript debugging tips you didn’t know:

No good can come of bad code:

The role of a senior developer:

And finally…

You know (and hopefully love), so check out

An old-skool synth in JavaScript:

Finally, a solution to providing comments without feeding the trolls:

3D maps of every London Underground station:

Stories about the internet (more interesting than it sounds):

The untold story of the invention of the game cartridge:

For the pedants among you:

Progressive enhancement matters

I’m a bit of a progressive enhancement nut. Some people think it doesn’t matter. It does. A lot. Here’s a very recent example.

I just tried to pay for a quick weekend break next year, booked through TripAdvisor. The “pay now” button didn’t work.

TripAdvisor fail

Why did it fail? Because it’s not a submit button:

<input class="ftlPaymentButtonInner sprite-btn-yellow-43-active"
       value="Continue to PayPal"
       type="button"
       name="submit"
       onclick="clearAllGhostText(); ftl.sendActionRecord('VR_CLICK_SUBMIT_PMT_PP_REDIRECT'); return payment.submitPayPalPayment(this);"
       onmouseout="$(this).addClass('sprite-btn-yellow-43-active'); $(this).removeClass('sprite-btn-yellow-43-hover');"
       onmouseover="$(this).addClass('sprite-btn-yellow-43-hover'); $(this).removeClass('sprite-btn-yellow-43-active');">

And how could this be fixed? Probably as simply as this:

<input class="ftlPaymentButtonInner sprite-btn-yellow-43-active"
       value="Continue to PayPal"
       type="submit"
       name="submit"
       onclick="clearAllGhostText(); ftl.sendActionRecord('VR_CLICK_SUBMIT_PMT_PP_REDIRECT'); return payment.submitPayPalPayment(this);"
       onmouseout="$(this).addClass('sprite-btn-yellow-43-active'); $(this).removeClass('sprite-btn-yellow-43-hover');"
       onmouseover="$(this).addClass('sprite-btn-yellow-43-hover'); $(this).removeClass('sprite-btn-yellow-43-active');">

Hardly advanced web development.
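
To go one step further, a progressively enhanced version of that button would sit inside a form that posts to the server on its own, with the scripted PayPal flow layered on top only when JavaScript is available. This is just a hypothetical sketch (the form id, action URL and handler body are invented; only the classes and payment.submitPayPalPayment come from the snippet above), not TripAdvisor’s actual code:

<form id="paymentForm" action="/checkout/paypal" method="post">
  <input class="ftlPaymentButtonInner sprite-btn-yellow-43-active" type="submit" name="paymentMethod" value="Continue to PayPal">
</form>
<script>
  // Enhancement layer: if this script loads and runs, intercept the submit and
  // hand over to the scripted flow. If it doesn't, the plain form post above
  // still gets the user to PayPal.
  var form = document.getElementById('paymentForm');
  if (form && form.addEventListener) {
    form.addEventListener('submit', function (event) {
      event.preventDefault();
      // hypothetical scripted flow, e.g. payment.submitPayPalPayment(form);
    });
  }
</script>

Either way the user can pay; the JavaScript only makes the journey smoother.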

By the way this happened on a Windows 8 laptop with 12GB RAM, using the latest version of Chrome.

For most people this failure would have completely put them off paying, and it certainly didn’t impress me. However, I was determined to pay, so I switched to my phone and paid on there. Job done, but it certainly wasn’t a good experience.

So, you still don’t think progressive enhancement matters?

Making the ShopTalk Show theme tune

I’m a regular listener to the ShopTalk Show podcast, a really great show about web design and development hosted by Chris Coyier and Dave Rupert. A couple of years ago, not long after the show started, I offered Chris and Dave a very rough piece of music as a theme tune, and they’ve been using it ever since.

But the time has come to refresh the theme tune, and the guys asked me to record a new version. Specifically they wanted it “rootsier”, and around 20 seconds in length. Here is the new version:

If you’re interested in how I recorded it then read on!

Chris and Dave wanted me to incorporate a recording of a crowd shouting the show motto “Just Build Websites”. I heard in my head exactly how that sample could be used at the end of the tune, so I knew what I was aiming for.

I busked a quick bit of music in the style they wanted. I grabbed my guitar, started strumming an A chord, and very quickly had a chord progression I was happy with, including the ending that could include the “Just Build Websites” sample. I wrote the music down on the back of an envelope (as that’s a traditional thing to do) and added some ideas for an arrangement:


Now I could start recording. I used the fantastic free Audacity software, as I wouldn’t need any fancy effects. It handles simple multitrack recording, and is really easy to use.

Click track

First I laid down a click track, to ensure I was keeping in time with myself. I recorded enough bars for a lengthy click intro before I started playing, and of course enough bars to go right to the end of the recording. If you’re playing along to a click track like this it’s a good idea to have at least 4 bars before you start playing. It gives your hands time to get from the keyboard to the instrument, and lets you settle into the tempo.


Acoustic 1

Then I grabbed my trusty Line6 Variax 500 and chose the Martin D-28 6-string acoustic guitar sample sound. That’s the main guitar you hear at the beginning.


The Variax is great as it gives a really good impersonation of a real guitar. Yes, anyone with a good ear can hear it’s not a real Martin D-28, but then it didn’t cost ten grand. Plus it comes with a massive range of other stringed instrument samples, a couple of which I talk about below.

The first acoustic took a few takes to get right, but I wasn’t happy with the sound, so I decided to add a second acoustic guitar.

Acoustic 2

This time I chose a Gibson J-200 and double-tracked the first acoustic, but using my fingertip instead of a pick. I panned the second acoustic to the left and pulled the levels down. This gave me a nice stereo spread of sound.


I knew that for the main body of the piece I wanted a ringing guitar sound, probably double-stringed and higher than the guitars. I don’t have a bouzouki (although I think that will be my next instrument purchase; Jeremy will be pleased), but by choosing a Martin D 12-28 12-string guitar sample and putting a capo on (I think) the 5th fret I got a reasonable approximation.

This took quite a few takes to get right as my fingers didn’t want to do what I told them. Sometimes that’s just the way it goes. Here are the two acoustics and bouzouki tracks.



The time had come to lay down the Groove Machine. Ahem, I mean record the bass. I’ve got a beautiful 1978 Fender Precision bass, and a couple of takes later I had it done.


Bass almost always needs some compression to even out the levels. I applied some basic compression to the bass track:


I ended up with something that looked much fuller, although a couple of peaks were uncomfortably high:


Here’s a simple trick. If you see a peak which is really high, like the ones in the red circles above, it’s likely that there are just one or two waves which are high. Zoom right in and select the high waves:


Then pull the levels down by a notch:


This is before:


And this is after:


I’ve found that this doesn’t affect the sound in any noticeable way, but it helps to stop clipping.


To round out the “rootsy” nature of the recording I added a banjo part, using the Gibson Mastertone sample in the Variax. It’s pretty low in the mix, and it’s not complicated, as I ain’t a good finger picker. But it’s a nice little addition to the piece, particularly at the beginning.

When I say I’m not a good finger-picker I mean I’m really awful, so getting this banjo part OK was Hard Work. I lost count of how many takes it took, but it was well over 20.

Just Build Websites (JBW)

Now I was ready to add the JBW sample. The problem was the words are said faster than the tempo of the tune. When I was busking the chord progression I tried playing the music fast enough to fit the speed the words are said, but it sounded manic.

You can see here the peaks of the JBW track don’t match with the peaks of the bass track above.


What I needed to do was cut the words apart, so they are spoken at the same tempo as the music. I toyed with stretching the JBW sample, but it sounded awful.

Cutting the JBW sample into parts was easy, but then I got a horrible “clipped” effect after the first two words, because there was artificially-inserted silence. What to do? Add reverb? Nope, that sounded horrible and false. Instead, I copied parts of the sample and shifted it along to mask the gap. Here you can see how I lined the words up with the bass notes, then covered the gap with my fake echo:


I pulled the level of the fake echo down until it was reasonably unobtrusive, but still masked the gap:


Next I used my Alesis DM5 digital drum kit to lay down a drum line. It’s mostly pretty simple, but I did do a sweet triplet fill before the last two notes:

The drum sound was a bit dry, so I added some small room reverb:



As the levels had already been set as I added each track, there wasn’t much to do in terms of mixing. I did find the bass got lost, particularly in small headphones. By the way, it’s a good idea to check your mix not just on studio monitors, but on headphones too; I used Grado SR60 headphones and RHA MA350 (I think) in-ear headphones. A bit of EQ on the bass soon made it pop out again:


After exporting my mix to 2 tracks I “topped and tailed” the resulting file to remove empty space at the beginning and fade out nicely at the end. Then I encoded to MP3 and emailed it to Chris and Dave. Job done.


The second acoustic was slightly out of tune on one or two notes, so I re-recorded that. I wasn’t happy with two of the notes on the bass either, so I re-recorded that as well. Dave also made a great suggestion of adding a little banjo intro. My amazing (not) finger-picking skills once again saved the day:

As this banjo was stuck out on its own I added a touch of reverb to beef it up.

Now, listening to the individual parts you may think “hmm, these are a bit raw”. You’re right, they are. As part of that “rootsy” feel I didn’t want the overall piece to feel too polished – I wanted it to sound like a bunch of guys messing around with some recording gear late one night after a few beers. Hopefully I’ve achieved that, and hopefully listeners to the ShopTalk Show will like it. I certainly enjoyed recording it.