Unit testing in WordPress

One of the things I really appreciate about developing in the .NET stack is the fantastic unit test support. Using mocking libraries like Moq to stand in for my dependencies, and leaning on the power of NUnit, I can write unit tests that truly test just the unit under test. True unit tests are valuable because they verify three very important things:

  1. That the code is doing what it should
  2. That the code handles unexpected inputs correctly
  3. That after refactoring the code continues to do what it did before

A robust, extensive suite of tests – both unit and integration tests – is crucial for good-quality software and, fortunately, such suites have been pretty common in the .NET development world for a long while.

When developing for WordPress, however, it hasn’t always been that way. I remember that not so many years ago testing of any kind wasn’t something often talked about in the WordPress community. I guess we were focussed on getting code shipped.

Things have changed, and automated testing is now a recognised part of the WordPress development workflow. One of the problems with the WordPress Unit Test Suite, as pointed out by Greg Boone, is that it’s not actually a suite of unit tests – it has dependencies like a MySQL database, so it would be more correctly called a suite of integration tests. Pippin also calls these kinds of tests “unit” tests, but they are definitely integration tests.

I’m at risk of over-egging this point, so please read this good description of the difference between unit and integration tests.

To ensure the large WordPress plugin I’m currently building is as good as it can be, I want to build a suite of (true) unit tests. That means I need a way of mocking WordPress functions (such as do_action, apply_filters and wp_insert_post) and globals such as $current_user and – crucially – $wpdb. It turns out there are a few options, which I’ve briefly investigated. I’ll be using WP_Mock and the PHPUnit test double features.

The well-known WP_Mock from the clever guys at 10up is the backbone of mocking WordPress. It allows you to mock any WordPress function with some simple syntax:

\WP_Mock::wpFunction( 'get_permalink', array(
    'args'   => 42,
    'times'  => 1,
    'return' => 'http://example.com/foo'
) );

This will mock the get_permalink function when the only argument is the integer 42, ensure it is called exactly once, and return the string ‘http://example.com/foo’. Clever stuff.
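Once mocked like this, the function actually exists for the duration of the test, so the code under test (or the test itself) can call it and get the canned value back – something like this sketch:

// get_permalink() is now defined by WP_Mock for this test, so this
// assertion should pass (and satisfies the 'times' => 1 expectation):
$this->assertEquals( 'http://example.com/foo', get_permalink( 42 ) );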

There are other static methods in the WP_Mock class which allow you to do the following (see the sketch after the list):

  • Mock a function which simply returns its first argument (a pass-through function)
  • Mock the calling of filters and actions (apply_filters and do_action)
  • Expect actions and filters to be added (add_action and add_filter)
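
Pulling those together, a test case using these helpers might look something like the sketch below. The WP_Mock method names (wpPassthruFunction, expectAction, expectActionAdded, expectFilterAdded, onFilter) are taken from its documentation for the version current at the time – check your installed version, as the API has evolved – and the my_plugin_* functions are hypothetical stand-ins for your own code.

class My_Plugin_Test extends \PHPUnit_Framework_TestCase {

    public function setUp() {
        \WP_Mock::setUp();
    }

    public function tearDown() {
        \WP_Mock::tearDown();
    }

    public function test_bootstrap_registers_hooks() {
        // Pass-through: esc_html() will simply return its first argument.
        \WP_Mock::wpPassthruFunction( 'esc_html' );

        // Expect do_action( 'my_plugin_loaded' ) to be fired.
        \WP_Mock::expectAction( 'my_plugin_loaded' );

        // Expect add_action() and add_filter() to be called with these arguments.
        \WP_Mock::expectActionAdded( 'init', 'my_plugin_init' );
        \WP_Mock::expectFilterAdded( 'the_content', 'my_plugin_filter_content' );

        // Give apply_filters( 'the_title', 'raw title' ) a canned response.
        \WP_Mock::onFilter( 'the_title' )
                ->with( 'raw title' )
                ->reply( 'Filtered title' );

        // Run the (hypothetical) code under test; the expectations above
        // are verified when WP_Mock::tearDown() runs.
        my_plugin_bootstrap();
    }
}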

Mocking $wpdb turns out to be pretty simple, as I can use the built-in test double functionality in PHPUnit. Sample code in the MockPress project wiki shows I can do this:

// in my test setUp method:
global $wpdb;
unset($wpdb);

// whenever I want to mock a $wpdb function I set up the method to mock:
$wpdb = $this->getMock('wpdb', array('get_var'));
// and set the value I want to be returned from my mock method:
$wpdb->expects($this->once())->method('get_var')->will($this->returnValue(1));

// now I can check the mock returns what I want:
$result = $wpdb->get_var("select anything from anywhere");
$this->assertEquals(1, $result);

I now just have to ensure my code is written in such a way as to make unit testing easy! I can highly recommend The Art of Unit Testing by Roy Osherove if you want to get into this deeply.
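
As a final thought on testability: one simple pattern is to hand dependencies like $wpdb to your classes rather than reaching for the global deep inside a method, so a test can pass in the PHPUnit mock built above. A minimal sketch (the Post_Counter class is just an illustration):

class Post_Counter {

    private $db;

    // Takes the real $wpdb in production, or a PHPUnit mock in tests.
    public function __construct( $db ) {
        $this->db = $db;
    }

    public function count_published() {
        return (int) $this->db->get_var(
            "SELECT COUNT(*) FROM wp_posts WHERE post_status = 'publish'"
        );
    }
}

// In production code:
//   global $wpdb;
//   $counter = new Post_Counter( $wpdb );
// In a test, pass the mock instead and assert on the result.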

Crash Test Dummies

[Image: a crash test dummy reading “Crash Testing for Dummies”]

No, this isn’t a post about the band. It’s about real crash testing, also known as progressive enhancement testing.

Of course, this had to be Another Progressive Enhancement Post, didn’t it!

Ever thought about why car manufacturers test their cars under crash conditions? Is it because people deliberately drive their cars into walls or ditches? No; not usually, anyway. They test the safety of their cars because we live in an unpredictable world where things go wrong, all the time. Exceptional circumstances surround us every single day. Often we experience near misses – sometimes we’re not so lucky.

In fact, things go wrong on the roads so often that we’ve created thousands of laws and guidelines that try to minimise the possibility of these exceptional circumstances occurring. We have speed limits, and training before anyone can get behind the wheel of a car. We have street lighting and pedestrian crossings, kerbstones and crash barriers.

Yet things still go wrong on our roads. Sometimes through carelessness and stupidity, sometimes through negligence. Sometimes the blame can’t really be applied to anyone in particular.

Car manufacturers invest in making their cars safe, so that when the unexpected happens – which, at some point, it will – the occupants and other road users are kept as safe as possible. We expect nothing less, and safety features are rightly promoted to help sell cars. That’s good; we should strive to create a safer world where possible.

Yet on the web it’s a different story. No-one believes that things never go wrong online. In fact, in my experience there’s rarely a web browsing session where something doesn’t break. Images fail to load, sites respond so slowly they appear to be down, JavaScript throws an error because two scripts from different third parties can’t co-exist, web fonts don’t load and so text is invisible. The list of what could – and often does – go wrong when loading websites goes on, and on, and on.

What’s happening here? Do we as web developers, designers, business owners not realise the inherent unpredictability of the Internet? Do we not understand that the web was designed to be like this – to keep going even if all that is delivered to the browser is the HTML? No, many of us understand but sweep this reality under the carpet.

We are dummies.

We’re dummies because we chase after the latest JavaScript framework-du-jour without considering if it supports the core principles of the web. We overload our pages with unoptimised images and gargantuan CSS files generated by a pre-processor. We fail to deliver first and foremost what our users fundamentally require – the content.

We’re dummies because we leave the crash testing to our users – the very people we should be protecting from those exceptional circumstances! And then we have the gall to complain that they aren’t using the latest browser or operating system, or that their device is underpowered. Here’s the reality for you: even when browsing conditions are optimal, things still often go wrong.

So, do I think JavaScript frameworks are bad? Do I detest CSS pre-processors? Do I have an allergy to beautiful imagery online? No, of course not. It’s our use of these tools that I rail against. Enhance your pages as much as you want, but start from the beginning: semantic HTML content and forms.

Don’t be a dummy.

The tough truth of reality

I make no secret of the fact I’m a huge progressive enhancement believer. The fundamental reason why I believe the vast majority of web sites (yes, and web apps) should be written using progressive enhancement principles is that we just don’t know how they will be accessed.

We live in a time when many of us have powerful multi-core computers in our pockets, ridiculously fast Internet connections, high-resolution screens and advanced web browsers supporting the latest web technologies. But even if we are one of those lucky few, things go wrong.

Networks drop packets, servers go down, DNS fails. Resources on pages (most notably JavaScript) can fail to load or fail to run because of all manner of errors – both within and outside our control. Why would you not want to build your site in such a way that makes it as robust as possible?

This is how many developers believe the modern world looks:

[Image: spectrum-lies]

The vertical axis is the capabilities of networks, devices and browsers. The horizontal axis is time. As time goes on, networks, devices and browsers get better, faster and more fully-featured. That’s true; these things do make progress.

But this belief gives developers a false sense of security, thinking they don’t need to worry about the same things they worried about in the past: crappy network connections, devices with hardly any RAM, browsers that don’t support modern JavaScript APIs etc. They put their trust in The Blessed Void Of Obsolescence.

But the harsh reality is more like this:

[Image: spectrum-truth]

We should think about the massive possible spectrum of circumstances under which a user could interact with our sites.

Because even with a modern device things still go wrong. A user with the latest version of Google Chrome will get frustrated if your site doesn’t work because a CDN fails to deliver a crucial JavaScript file. All they will know is that your site doesn’t work.

Progressive enhancement is not about “what if users turn JavaScript off” but “what happens when the page is loaded in sub-optimal circumstances”.

Progressive enhancement matters

I’m a bit of a progressive enhancement nut. Some people think it doesn’t matter. It does. A lot. Here’s a very recent example.

I just tried to pay for a quick weekend break next year, booked through TripAdvisor. The “pay now” button didn’t work.

[Image: TripAdvisor fail]

Why did it fail? Because it’s not a submit button:

<input class="ftlPaymentButtonInner sprite-btn-yellow-43-active" value="Continue to PayPal" type="button" name="submit" onclick=" clearAllGhostText();   ftl.sendActionRecord('VR_CLICK_SUBMIT_PMT_PP_REDIRECT');  return payment.submitPayPalPayment(this);" onmouseout="$(this).addClass('sprite-btn-yellow-43-active'); $(this).removeClass('sprite-btn-yellow-43-hover');" onmouseover="$(this).addClass('sprite-btn-yellow-43-hover'); $(this).removeClass('sprite-btn-yellow-43-active');" >

And how could this be fixed? Probably as simply as this:

<input class="ftlPaymentButtonInner sprite-btn-yellow-43-active" value="Continue to PayPal" type="submit" name="submit" onclick=" clearAllGhostText();   ftl.sendActionRecord('VR_CLICK_SUBMIT_PMT_PP_REDIRECT');  return payment.submitPayPalPayment(this);" onmouseout="$(this).addClass('sprite-btn-yellow-43-active'); $(this).removeClass('sprite-btn-yellow-43-hover');" onmouseover="$(this).addClass('sprite-btn-yellow-43-hover'); $(this).removeClass('sprite-btn-yellow-43-active');" >

Hardly advanced web development.

By the way, this happened on a Windows 8 laptop with 12GB of RAM, using the latest version of Chrome.

For most people this failure would have completely put them off paying, and it certainly didn’t impress me. However, I was determined to pay, so I switched to my phone and paid there. Job done, but it certainly wasn’t a good experience.

So, you still don’t think progressive enhancement matters?

Spotlight: jQuery plugin

Recently I worked on a website help system, the main feature of which was to highlight particular elements or areas of the page. You know the kind of thing: ‘Click the highlighted search button to search your data’. The designs I was given showed the web page covered by a semi-transparent grey overlay, except for the areas that needed highlighting.

Here’s the problem: the shapes of the highlighted cut-outs weren’t just rectangles; they were circles. So my immediate idea of using a bunch of absolutely-positioned <div> elements with opacity:0.6 wasn’t going to cut it.

I decided to use the <canvas> element, and after some digging found this excellent page on the Mozilla developer docs site that explains the different modes available for compositing multiple shapes in a single canvas element. This was the answer.
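
The key is the ‘destination-out’ compositing mode: once the grey overlay has been drawn, anything drawn next erases pixels instead of adding them. A minimal sketch of the idea (the element ID and coordinates are made up):

var canvas = document.getElementById('overlay'); // hypothetical full-page canvas
var ctx = canvas.getContext('2d');

// 1. Cover the page with the semi-transparent grey dimming layer.
ctx.fillStyle = 'rgba(0, 0, 0, 0.6)';
ctx.fillRect(0, 0, canvas.width, canvas.height);

// 2. Switch modes: subsequent shapes now remove pixels from the canvas.
ctx.globalCompositeOperation = 'destination-out';

// 3. Punch a circular, fully transparent hole over the element to highlight.
ctx.beginPath();
ctx.arc(200, 150, 60, 0, Math.PI * 2); // centre x, centre y, radius
ctx.fill();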

<canvas> is supported by IE9 and above, which was acceptable for the project I was working on. If you need support for older IEs then this looks like a good solution.

Anyway, I thought this kind of approach might be useful for others so I’ve written a small jQuery plugin called Spotlight which allows you to put a spotlight on elements on your page. See a quick demo here.

See the plugin on GitHub.