I Am The Very Model Of A Modern Web Developer

I am the very model of a modern web developer
I build my sites in Ember or in React or in Angular
The download size is massive but development is easier
My grunt and gulp and NPM all prove that I am geekier

I am the very model of a modern web developer
My animations cause offence in anyone vestibular
A carousel with massive pictures should be seen as de rigueur
And light grey text with background white will make my content sexier

Some people say my websites should all be enhanced progressively
But my developmental choices have been made deliberately
A 12 meg payload is a guarantee of exclusivity
I don’t want Luddite users who refuse to upgrade from 2G

I have a brand, you could say I’m an Internet celebrity
But try to load my site on anything but super fast 4G
You’ll just get empty pages on your Android or your iPhone 3
Upgrade your phone, you pauper, or you’ll never get a byte from me

With apologies to Gilbert and Sullivan.

HTML Matters

Rant time. No-one can deny that web development tooling has improved in leaps and bounds over the last few years. I’ll sound like a moaning old man if I talk about how primitive things were in the old days, so I won’t.

But despite this wealth of tools, loads of good-quality information online, and access to resources and training, why do I still regularly see HTML like this in new web projects:

<div class="footer">
<span><img src="twitter.gif" /></span>
<span><img src="facebook.gif" /></span>
<span><img src="instagram.gif" /></span>
</div>

It appears modern web developers have within their grasp a panoply of development and build tools – npm, Bower, Gulp, Grunt and so on – but don’t have access to HTML elements which have been implemented in browsers for years. HTML matters!
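The snippet above could use elements browsers have supported for years: a footer element, a list of links, and alt text on the images. A sketch of a more semantic version (the link targets and class names here are illustrative, not from the original):

```html
<footer class="footer">
  <ul class="social-links">
    <li><a href="https://twitter.com/example"><img src="twitter.gif" alt="Twitter" /></a></li>
    <li><a href="https://facebook.com/example"><img src="facebook.gif" alt="Facebook" /></a></li>
    <li><a href="https://instagram.com/example"><img src="instagram.gif" alt="Instagram" /></a></li>
  </ul>
</footer>
```

The icons become actual links, screen readers can announce where each one goes, and the list structure tells assistive technology how many items there are.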

It matters because structure matters. Meaning matters. Semantics matter (but don’t go overboard). Accessibility matters. For many projects, SEO matters.

A web page is, at its core, a structured document. Pile on all the fancy-pants JavaScript frameworks you want, but you’re still delivering HTML to a rendering engine built into a browser. If you’re making no effort to use appropriate HTML elements to mark up your content then you need to sharpen up your skills.

Unit testing in WordPress

One of the things I really appreciate about developing on the .NET stack is the fantastic unit test support. Using mocking libraries like Moq to handle my dependencies, and leaning on the power of NUnit, means I can write unit tests that truly do test just the unit under test. True unit tests are valuable for three very important reasons. They verify:

  1. That the code is doing what it should
  2. That the code handles unexpected inputs correctly
  3. That after refactoring the code continues to do what it did before

A robust, extensive suite of tests – both unit and integration tests – is crucial for good-quality software and, fortunately, such suites have been pretty common in the .NET development world for a long while.

When developing for WordPress, however, it hasn’t always been that way. I remember that, not so many years ago, testing of any kind wasn’t something often talked about in the WordPress community. I guess we were focussed on getting code shipped.

Things have changed, and automated testing is now a recognised part of the WordPress development workflow. One of the problems with the WordPress Unit Test Suite, as pointed out by Greg Boone, is that it’s not actually a suite of unit tests – it has dependencies like a MySQL database, so it would more correctly be called a suite of integration tests. Pippin also calls these kinds of tests “unit” tests, but they are definitely integration tests.

I’m at risk of over-egging this point, so please read this good description of the difference between unit and integration tests.

To ensure the large WordPress plugin I’m currently building is as good as it can be I want to build a suite of (true) unit tests. That means I need a way of mocking WordPress functions (such as do_action, apply_filters and wp_insert_post) and globals such as $current_user and – crucially – $wpdb. It turns out there are a few options, which I’ve briefly investigated. I’ll be using WP_Mock and the PHPUnit test double features.

The well-known WP_Mock from the clever guys at 10up is the backbone of mocking WordPress. It allows you to mock any WordPress function with some simple syntax:

\WP_Mock::wpFunction( 'get_permalink', array(
    'args'   => 42,
    'times'  => 1,
    'return' => 'http://example.com/foo',
) );

This will mock the get_permalink function when its only argument is the integer 42, ensure it is called exactly once, and return the string ‘http://example.com/foo’. Clever stuff.

There are other static methods in the WP_Mock class which allow you to:

  • Mock a function which returns its argument unchanged (a pass-through function)
  • Mock the calling of filters and actions
  • Mock the setting of actions and filters
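For illustration, those helpers look roughly like this – a sketch based on WP_Mock’s documented API, where the hook and callback names are made up for the example:

```php
// Pass-through: sanitize_title() will simply return whatever it is given.
\WP_Mock::passthruFunction( 'sanitize_title' );

// Mock the calling of a filter and an action by the code under test.
\WP_Mock::onFilter( 'the_title' )
    ->with( 'raw title' )
    ->reply( 'Filtered Title' );
\WP_Mock::expectAction( 'my_plugin_loaded' );

// Expect the code under test to register its own hooks.
\WP_Mock::expectActionAdded( 'init', 'my_plugin_init' );
\WP_Mock::expectFilterAdded( 'the_content', 'my_plugin_filter_content' );
```

If an expectation isn’t met by the end of the test, WP_Mock fails the test for you.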

Mocking $wpdb turns out to be pretty simple, as I can use the built-in test double functionality in PHPUnit. Sample code in the MockPress project wiki shows I can do this:

// in my test setUp method:
global $wpdb;
unset($wpdb);

// whenever I want to mock a $wpdb function I set up the method to mock:
$wpdb = $this->getMock('wpdb', array('get_var'));
// and set the value I want to be returned from my mock method:
$wpdb->expects($this->once())->method('get_var')->will($this->returnValue(1));

// now I can check the mock returns what I want:
$result = $wpdb->get_var("select anything from anywhere");
$this->assertEquals(1, $result);

I now just have to ensure my code is written in such a way as to make unit testing easy! I can highly recommend The Art of Unit Testing by Roy Osherove if you want to get into this deeply.

Crash Test Dummies

[Image: a crash test dummy reading “Crash Testing for Dummies”]

No, this isn’t a post about the band. It’s about real crash testing, also known as progressive enhancement testing.

Of course, this had to be Another Progressive Enhancement Post, didn’t it!

Ever thought about why car manufacturers test their cars under crash conditions? Is it because people deliberately drive their cars into walls or ditches? No; not usually, anyway. They test the safety of their cars because we live in an unpredictable world where things go wrong, all the time. Exceptional circumstances surround us every single day. Often we experience near misses – sometimes we’re not so lucky.

In fact, things go wrong on the roads so often that we’ve created thousands of laws and guidelines that try to minimise the possibility of these exceptional circumstances occurring. We have speed limits and training before anyone can get behind the wheel of a car. We have street lighting and pedestrian crossings, kerbstones and crash barriers.

Yet things still go wrong on our roads. Sometimes through carelessness and stupidity, sometimes through negligence. Sometimes the blame can’t really be applied to anyone in particular.

Car manufacturers invest in making their cars safe, so that when the unexpected happens – which, at some point, it will – the occupants and other road users are kept as safe as possible. We expect nothing less, and safety features are rightly promoted to help sell cars. That’s good; we should strive to create a safer world where possible.

Yet on the web it’s a different story. No-one can seriously believe that things never go wrong online; in my experience there’s rarely a web browsing session where something doesn’t break. Images fail to load, sites respond so slowly they appear to be down, JavaScript throws an error because two scripts from different third parties can’t co-exist, web fonts don’t load and so text is invisible. The list of what could – and often does – go wrong when loading websites goes on, and on, and on.

What’s happening here? Do we as web developers, designers, business owners not realise the inherent unpredictability of the Internet? Do we not understand that the web was designed to be like this – to keep going even if all that is delivered to the browser is the HTML? No, many of us understand but sweep this reality under the carpet.

We are dummies.

We’re dummies because we chase after the latest JavaScript framework-du-jour without considering if it supports the core principles of the web. We overload our pages with unoptimised images and gargantuan CSS files generated by a pre-processor. We fail to deliver first and foremost what our users fundamentally require – the content.

We’re dummies because we leave the crash testing to our users – the very people we should be protecting from those exceptional circumstances! And then we have the gall to complain that they aren’t using the latest browser or operating system, or that their device is underpowered. Here’s the reality for you: even when browsing conditions are optimal, things still often go wrong.

So, in my opinion, are JavaScript frameworks bad? Do I detest CSS pre-processors? Do I have an allergy to beautiful imagery online? No, of course not. It’s our use of these tools which I rail against. Enhance your pages as much as you want, but start from the beginning: semantic HTML content and forms.

Don’t be a dummy.

The tough truth of reality

I make no secret of the fact I’m a huge progressive enhancement believer. The fundamental reason why I believe the vast majority of web sites (yes, and web apps) should be written using progressive enhancement principles is that we just don’t know how they will be accessed.

We live in a time where many of us have powerful multi-core computers in our pockets, ridiculously fast Internet connections, high-resolution screens, and advanced web browsers supporting the latest web technologies. But even if we are one of those lucky few, things go wrong.

Networks drop packets, servers go down, DNS fails. Resources on pages (most notably JavaScript) can fail to load or fail to run because of all manner of errors – both within and outside our control. Why would you not want to build your site in such a way that makes it as robust as possible?

This is how many developers believe the modern world looks:

[Image: “spectrum-lies” – a chart of capability rising steadily over time]

The vertical axis is the capabilities of networks, devices and browsers. The horizontal axis is time. As time goes on, networks, devices and browsers get better, faster and more fully-featured. That’s true; these things do make progress.

But this belief gives developers a false sense of security, thinking they don’t need to worry about the same things they worried about in the past: crappy network connections, devices with hardly any RAM, browsers that don’t support modern JavaScript APIs etc. They put their trust in The Blessed Void Of Obsolescence.

But the harsh reality is more like this:

[Image: “spectrum-truth” – a chart showing a wide spectrum of capability at every point in time]

We should think about the massive possible spectrum of circumstances under which a user could interact with our sites.

Because even with a modern device things still go wrong. A user with the latest version of Google Chrome will get frustrated if your site doesn’t work because a CDN fails to deliver a crucial JavaScript file. All they will know is that your site doesn’t work.

Progressive enhancement is not about “what if users turn JavaScript off” but “what happens when the page is loaded in sub-optimal circumstances”.