14 December 2011
You can’t institutionalize innovation. If you could, everyone would do it.
12 December 2011
The skill of a pilot is in bringing vertical speed to zero just at the landing point.
The skill of a bell ringer is in bringing rotational speed to zero just at the balance point.
The skill of managing change is in making the rate of change as close to zero as possible at the point of change.
Successful pilots, bell ringers and change managers do this apparently effortlessly by preparing their trajectory well ahead of time and being continually alert to changes in the environment which might require a correction – but those corrections will tend to be small in relation to the goal.
Unsuccessful pilots and bell ringers crash.
9 December 2011
I went to two events yesterday.
The first was the launch of the Government Digital Service, or rather a housewarming party for their shiny new offices. In fine agile tradition, they put on a slick show and tell with short sharp presentations about their work and achievements topped and tailed by Francis Maude, Mike Bracken (who has blogged his account of the day), Martha Lane Fox (likewise) and Ian Watmore. There was an enthusiastic crowd of supporters twittering furiously and other blog posts are starting to appear [added 10/12 - and GDS has now posted the presentation material and links to press coverage]. The dress code was smart casual, with a lot more emphasis on the casual than the smart. There was a buzz, a sense of creativity and spontaneity, of energy and talent unleashed, of an approach which felt a million miles away from both the stereotype and the reality of government projects.
Frustratingly, I had to leave early to get to the second event.
That was a much more sombre affair, closed and closed in, in an anonymous Treasury meeting room. The programme I work on was being reviewed, to check that we are managing effectively and are on track to deliver. There was little that was casual, in dress or anything else. There were plans, business cases, critical paths, migration strategies, decommissioning strategies, privacy impact assessments and a pile of other stuff besides. There was pointed questioning on risks, affordability and resilience. I make no complaint about that. We are spending public money – rather a lot of public money – and we should be challenged and tested on whether the spending is wise and the results assured. The track record of large government projects is not so great that there is room for complacency. But it felt a very long way from the world of GDS.
That matters, because actually the two are very closely linked. They represent, in effect, different ways of thinking about the same problem, and have roots in some of the same people and ideas. And in recognising that, I suddenly realised that I had rediscovered a thought I had first had at a seminar I went to almost four years ago, where both approaches were represented, each largely talking past the other. Tom Steinberg, who spoke at the point of inflection between the two, memorably started by saying that he completely disagreed with everything which had been said in the first half, and that the solution to the problem of big blundering IT projects was to have small, fleet-of-foot projects, not to find a cure for blundering. I reflected on the apparent tension then as I reflected today:
And then the penny dropped. The apparent gulf between the two parts of the seminar is itself the challenge.
We need to apply two different sets of disciplines (in both senses), in two separate domains:
- An approach to the customer experience – both offline and online elements – which is flexible and responsive and which maximises its exposure to customer intelligence in order to do that
- An approach to the supporting processes which is robust, consistent and correctly applies the full set of rules
The collective culture and skills of government are much more geared to the second than the first – and the risk is not just that we don’t do the first as well, but also that we can all too easily fail to spot the need to do it all. The first is where there is the greatest need for change, flexibility and responsiveness – and where tools and approaches are available to deliver that responsiveness. The second requires the hard grind of implementing big robust systems which do the transactional heavy lifting as invisibly as possible.
Of course the distinction isn’t an absolute one, and of course each domain needs to incorporate the key strengths of the other. But if we confuse them, we are at risk of getting the worst of both worlds.
My view has changed in the four years since then. I no longer think they are two different domains, they are aspects of what should be intrinsic in any approach (though scale and purpose will drive balance and relative importance). But perhaps there is a risk that big projects are still too much trying to learn the lessons of the last decade and too little trying to anticipate the needs of the next. It is no longer enough for systems to work (though they do, of course, absolutely have to work); they must work well, and work well specifically for the people who will use them. Or, as Helen Milner reported Mike Bracken as saying at another event yesterday:
That makes a lot of sense to me, though only if it is understood that in this context function is an integral part of beauty (as Brian Hoadley rightly challenged).
Conversely, it is not enough to make beautiful things, though, perhaps less obviously but no less necessarily, they do need to be beautiful. It is essential that they work, and work well, too.
Looked at one way, the core mission of GDS (and not just GDS) is to make beautiful things which work well. That means some of the values so apparent in the GDS event need to be more obvious in many other aspects of the work of government. We will have made great progress when discussions about projects in anonymous Treasury meeting rooms are more like the world of GDS. But as function increasingly begins to underpin beauty, it may also mean that the palest shadow of the Treasury meeting room also needs to fall across the sunny loft which is GDS.
One of the key tests of the success of GDS will be that when their turn comes to give an account of themselves in that room in Treasury, their approach is recognised and valued – and the work of every other project is being tested against it. And another key test may be that that room will be a bit less anonymous, with its own wall of post-its and whiteboards.
5 December 2011
We can never be a normal user of our own services. We can temper that by being self-conscious in reflecting on our experiences as users of other people’s. But even that tacitly assumes that we are like normal users, other than in our expertise as providers of a particular service.
But that assumption may be badly wrong if in fact we are unlike typical users in ways which introduce the risk of systematic skews in our perceptions. And it’s a safe starting point to assert that if you are reading this, you are sufficiently abnormal that you should worry about that distortion (and it follows that I am so much more abnormal for writing it that I am almost certainly beyond hope).
The first step is to recognise that you can only have a personal appreciation of the usability of a service by using the service.
I spent a great day on Friday visiting a local authority and talking about ways in which local and central government could work more closely together around service delivery, particularly for people we know will have to deal with both (or several) of us. They took me round their one stop shop, and showed me their plans for a newer and shinier one. It was truly impressive stuff. But there was a small voice in my head reminding me of my own experience a few weeks ago at a one stop shop as a resident in my own local authority which, as I reflected then, wasn’t bad, but wasn’t great either. It’s not that the voice was telling me it might not be as good as it looked. It was telling me that I would never be able to tell by looking.
In the early days of the web, the DTI (I am pretty sure it was) won an award for having the best government website. I understood why: it had a level of visual and structural clarity which was well ahead of the standard of the competition. Asked which government site deserved the prize, I would have awarded it to them too. Looked at without any specific purpose in mind, it was superb. But as I discovered when I had to look for something I knew was in there somewhere, it was much less good at meeting actual needs.
Even those who might be supposed to have an experience sufficiently close to the end user to have a reliable understanding of their experience can’t be assumed actually to do so. I wrote a few months ago about the difference between bus drivers and bus conductors which is one illustration of that point, but there are many more to be found wherever you look for them.
The second step is to recognise that whatever your experience as a user, you should not assume that your reactions are normal.
Designers and builders of services tend to assume that we are just like the people who will use those services, except that we have some specialist inside knowledge which gives us a slightly altered perspective. Having acknowledged that, we may – we should – go on to recognise that there are needs some users of the service may have which we don’t share. So we will think about accessibility and plainness of English (to say nothing of plainness of Welsh). But still we are tacitly assuming that while people may be at different points along a spectrum, there is only one spectrum.
I was helped to recognise the danger of that assumption a few years ago by hearing an account of a small qualitative research study into channel preferences, particularly the relative attractiveness of doing things online and by phone. It turned out that what everybody wanted was to be sure that the information they had provided had been correctly recorded and confident that action would be taken as a result. No surprise in that, that’s pretty much what I want too. For me, the conclusion is obvious: given that objective, an online transaction is clearly to be preferred. I can be completely clear about what data is being captured, avoid having to say, ‘no, that’s B for Bertie’ in increasingly exasperated tones and can be pretty sure that whatever system the organisation concerned has for doing whatever needs to be done next, it knows it has that thing to do. But for a lot of people, it turned out, the answer is equally obvious, and is precisely the opposite. Talking to a person gives you the confidence that the organisation has asked the questions it needs to ask, and as a result knows what it needs to know. Critically, it is felt to mean that responsibility for accuracy and completeness has been accepted by the organisation, whereas self-service data entry leaves an unwanted sense of responsibility somehow sticking to the user. And a human having accepted that the transaction is complete is more reassuring than any form of electronic confirmation.
Bruce Tognazzini has just published an essay on the apparently esoteric topic of whether the navigation of an iPhone contacts list is better done by scrolling or searching. If that’s an important issue for you, it’s worth reading. If it isn’t (as it isn’t for me, since I don’t have an iPhone), it’s worth reading anyway, as the core of his argument is much more general. It is in essence that some patterns of thinking are over-represented among those who design and build services, with a real risk that services so designed are optimised for people who think like them, not for a potentially much larger group whose mental model and heuristic preferences may be very different.
For all our bluster about how special we in high tech are, we really tend to think of ourselves as average—average intelligence, average likes and dislikes, average knowledge. We are none of the above. In fact, only one person in the entire world is average, and we don’t know who that person is.
Engineers (including programmers), he argues, are typically much more logical, much more abstract and better at rote memory than the rest of us. Unconstrained, that can have interesting results. Tognazzini takes as an example Steve Wozniak, one of the brains behind the Apple II:
[He] later developed the CL 9, the first programmable universal remote control. It featured the keys 0 through F, labelled with the standard Hexadecimal notation so familiar to everyone born with 16 fingers. It enabled you to capture and command 256 different codes spread across 16 invisible “pages.” You just had to memorize the page and position of all 256 of those codes and you could control everything! Woz and about three other people were able to make excellent use of the resulting product. Engineering, even genius engineering (and Woz was and is second to none), must be balanced with equally talented design.
So we need designers too. But that (shades of Officer Krupke) isn’t the whole answer either:
Graphic designers, left unchecked and unschooled, are likely to aim for maximum visual simplicity at the expense of both learnability and usability. Such interfaces require users to discover new capabilities by clicking around and seeing what happens. Users don’t do that. In the most extreme cases, functionality desperately needed by the majority of users may actually be removed from products in the effort to generate visual simplicity.
So it turns out that we also need human-computer interface (HCI) experts, of which, of course, Tognazzini happens to be one.
The three professions, working together, with a healthy tension among them, produce good software and good products. That balance of power is critical to success.
But even that, of course, is not enough: we still haven’t got to the people who will actually use the service. So it doesn’t really matter whether you agree that a constructive tension between the three disciplines Tognazzini discusses is the optimal approach; the question is still whether the people who end up designing and building services are systematically dissimilar to the people who end up using them.
That’s not an argument that those professional disciplines are wrong or irrelevant: on the contrary, they are essential. Nor is it an argument about superiority: this is not a demand to dissolve the people and elect another. It is instead a recognition of the need to correct for skewed representation of different mental models. It is an argument that what seems most obvious may be most dangerous, because it may not be at all obvious to others.
So it’s not just that we aren’t the users, but that we may be too unlike them to understand the gap.
Small footnote: I know that ‘user’ is a controversial and imperfect word, but in the context of what is being discussed here it isn’t easy to find another one. I have argued elsewhere that ‘customer’ is generally the least worst word, but then and now I am not persuaded that it is a particularly productive debate.
2 December 2011
The burial of human remains at sea requires a marine licence.
That must be one of the more arresting first lines of any government web page. Its combination of human tragedy and bureaucratic process packs a lot into eleven words.
You won’t find that line, or anything else on the subject, at Directgov. That’s neither surprising nor perhaps unreasonable. Very few bodies are buried at sea – exact numbers are hard to come by, but estimates are in the tens each year, a tiny proportion of the half million or so deaths each year in the UK.
The line instead comes from the website of an organisation little known, I suspect, to non-specialists, the Marine Management Organisation, the core purpose of which has little to do with the disposal of corpses. But getting a licence for burial at sea is without doubt a government service directed at individuals, so in principle it should be found where other such services are to be found, which in the not too distant future means the single government domain. I have no imminent expectation of finding it there (and make no criticism that it won’t be). But it is worth asking why that should be and what it tells us about government more generally.
Back in the early days of e-government, there was a target to get all government services online. Increasing the numerator would help achieve the target, but then so would decreasing the denominator. Creating a definitive list of relevant services was the only way of preventing a percentage score from drifting about uncontrollably. Burial at sea was often the example used in the largely pointless debates which ensued. It was a good example, because it brought together two separate issues: was this a service which anybody was ever likely to want to do online; and were there enough of them to justify putting it online at all?
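The gaming opportunity here is just arithmetic. A toy illustration (the counts are invented purely to make the point):

```python
online_services = 240

# With a generous definition of 'service', the target looks distant:
score_broad = online_services / 600
print(f"Broad definition: {score_broad:.0%}")   # 40%

# Quietly shrink the denominator – delist some obscure services –
# and the score improves with no new work at all:
score_narrow = online_services / 480
print(f"Narrow definition: {score_narrow:.0%}")  # 50%
```

Which is exactly why a definitive list of services, fixed in advance, was the only defence against the percentage drifting about.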
Entirely expectedly, government information and services follow a Zipf distribution, made famous by Chris Anderson in The Long Tail (but applied to websites at least as early as 1997): there is a small number of things which get an enormous amount of attention, and there is an enormous number of things which get a small – sometimes a vanishingly small – amount of attention. Two lessons are often drawn from that: one good and one potentially very bad.
The good one is that there is great value in identifying the things which most people want to do most of the time, and ensuring that they can do them easily and efficiently. The potentially bad one is to assume that the rest doesn’t matter and either ignore it or delete it.
In the physical world, it is more or less essential to cut off the distribution. A good bookshop won’t just rely on best sellers, but equally there will be a limit to the number of titles it can stock which only sell one or two copies a year. Amazon, with warehouse fulfilment, can do much better than that, and it has been estimated that 37% of their revenue in 2008 came from sales of books ranked below 100,000.* It would be supreme folly for Amazon to announce one day that they were rebuilding their web presence and would henceforward only cover the top 100,000 titles.
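The shape of the distribution is easy to play with in a toy model. This sketch uses the classic rank⁻¹ weighting over a million hypothetical items; nothing here is real Amazon data, and the exponent is an assumption (real catalogues have steeper or shallower curves, which is precisely why the tail share varies):

```python
# Toy Zipf (power-law) distribution: the item at rank r gets
# weight proportional to 1 / r**exponent.
def zipf_weights(n_items, exponent=1.0):
    return [1.0 / (rank ** exponent) for rank in range(1, n_items + 1)]

def tail_share(weights, head_size):
    """Fraction of total demand falling outside the top `head_size` items."""
    total = sum(weights)
    return sum(weights[head_size:]) / total

weights = zipf_weights(1_000_000)
print(f"Demand beyond the top 100,000 items: {tail_share(weights, 100_000):.0%}")
# → roughly 16% under these assumptions
```

The point is not the particular number but that the tail’s share is substantial under any plausible exponent, and never falls to zero, which is what makes the decision to cut it off a real decision rather than a rounding error.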
Government is not Amazon. Web pages are not books. Analogies are flawed. And yet.
The question of how the government’s web presence should be culled and curated is not a new one. It has been around in various forms since the earliest days of e-government, documented perhaps most clearly and consistently by Alan Mather. At least as far back as 2003 (and actually well before then) he had a strategy which looked uncannily like that of the single government domain:
- Fewer websites not more. Kill 50 websites for every new domain name.
- Less content not more. Delete five (or fifty, or five hundred) pages for every page you write.
- Solve the top 50 questions that citizens ask … and structure your content around those first. Then do the next 50 and the next. The people who know these questions are the ones that answer the phone in your call centres, the ones that write in to your agency and the ones that visit your offices for help; likewise, they visit accountants, advice bureau, charities and so on.
- Test search engines to see how your site ranks – both from a mindshare side and for individual queries.
- Impose rigorous discipline on use of “words” – plain speak.
- Impose even more rigorous discipline on the structure of the content, including metadata so that it’s easy to read – by people and by search engines.
Or in other words, start at the top of the Zipf distribution, and work systematically along until you stop. Tom Loosemore has a pithier version which means much the same:
Taken as expressed, it’s hard to disagree with the approach Tom and his team are taking. But a great deal hangs on the word ‘superfluous’. In this context, I think it is being used to mean two quite distinct things, but risks treating them as one. The first is rot, decay and duplication. Too much money is being spent very inefficiently to maintain – or all too often to fail to maintain – information which is poorly organised, hard to find, badly maintained and structured round what organisations do, not what people need. The second is obscure specialisation: there is a vast amount of information which most people don’t want or need and won’t ever want or need, and its existence makes it harder for the important stuff to shine through.
Focusing on an ‘irreducible core’ is a very good way of tackling the first problem, but risks overlooking the second. Whether that is a bad thing is a contingent question which is not inherently an easy one to answer, and which potentially raises some awkward questions about the singularity of the single government domain. There are three basic options:
- Everything goes into the single pan-government site
- Popular and important stuff goes into the single pan-government site and the rest goes somewhere else
- Popular and important stuff goes into the single pan-government site and the rest doesn’t go anywhere
To an extent this is (or can be made to be) a matter of timing – pursuing Alan’s idea of tackling the problem in fifty-question chunks. But even with that approach, sooner or later we get to the question of whether enough is enough. In order to know that, we need to understand two things. The first is the value to users of the long tail – government’s version of Amazon’s 37%. If it is high, or to the extent that it is high, the choice is between options 1 and 2. Neither is entirely attractive: option 1 risks compromising the quality and clarity of the much smaller set of key services; option 2 creates a messy boundary and breaks the principle that there is one place to go. If though the value to users of the long tail, or some furthest reach of it, is relatively low, the choice is between options 1 or 2 and 3. And if option 3 is even to be considered for some subset of information that might otherwise have been included, that raises a very big question.
Luckily, GDS is full of exceptionally smart people (and now even fuller) and better still, they have invented the needotron. That’s the right systematic approach – but I will be fascinated to see whether they find a way of creating the right long tail, and of stopping the tail being so unwieldy that it trips up the dog.
*These numbers are hard to make intuitive sense of. Amazon are currently claiming to have ‘over 750,000’ books available for the kindle, which sounds like more than enough for anyone – yet I regularly find that the books I actually want to buy are not among them.
2 December 2011
This blog has not had a substantive post for quite a while. There’s no particular reason for that, other than that I find that the longer I haven’t written something here, the harder it feels to write anything, so the longer the gap keeps growing. So this is to break the cycle and blow the dust off, with some substance to follow soon. There’s a post coming up which was 95% written two months ago, after that I will discover if I can remember how to string two sentences together.
5 October 2011
We are not the customers of our own services. And even if we think we are, we are still not: we know too much, we cannot stop thinking as provider or designer.
Sometimes we are the customers of other people’s services and that holds up a mirror – sometimes a very distorting mirror – to our own. Even then, of course, the perspective of someone who is then tempted to blog about the experience is not wholly to be relied on.
A couple of days ago, I went to buy some parking permits from my local council. I am not going to give a blow by blow account of the experience. It wasn’t bad, but it wasn’t great either. There were lots of small ways in which it could have been better: here are three where service design could be improved.
The first is a point of measurement. There is a ticket based queuing system – take a ticket and wait for your number to be called. Screens show average waiting time (with the average while I was there going up by about a minute for every minute of elapsed time, but that’s another story). But there is also a queue to talk to a receptionist to get a ticket in the first place. So the queuing time being measured is a process management view, not a customer experience view.
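The gap between the two measures is easy to state precisely. A minimal sketch, with the timestamps invented purely for illustration:

```python
from datetime import datetime, timedelta

# Three moments in a single visit (invented times):
joined_reception = datetime(2011, 10, 3, 10, 0)   # customer joins the queue for the receptionist
ticket_issued    = datetime(2011, 10, 3, 10, 12)  # receptionist finally issues a ticket
number_called    = datetime(2011, 10, 3, 10, 25)  # ticket number is called

process_view  = number_called - ticket_issued     # what the screens report
customer_view = number_called - joined_reception  # what the customer actually experienced

print(process_view, customer_view)  # the screens undercount by the whole reception queue
```

The system is measuring honestly against its own start line; it has just drawn the start line in the wrong place.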
Having got a ticket, the next thing is to wait. That turned out to be very confusing – numbers were called apparently randomly, so both giving no indication of progress up the virtual queue and making it impossible to know whether your number had been called or not. I assume that there was a process going on of assigning cases to appropriately skilled staff members, and so in practice several queues running not just one. That’s perfectly sensible, but would be a lot less confusing for customers if the queues had distinct number ranges. To make matters worse, one of the display screens showed ticket numbers in the queue – but only some of them. If the number one higher than mine has been called, and if my number doesn’t appear on the screen, should I start worrying that I have missed my turn? As it turns out, no, I didn’t need to have worried, but it was hard to be sanguine at the time.
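One way of implementing the distinct-ranges suggestion is trivially simple, which is rather the point. This is a hypothetical sketch, not how any real system is built; the queue names and starting numbers are invented:

```python
from itertools import count

class TicketQueue:
    """One virtual queue with its own number range."""
    def __init__(self, name, start):
        self.name = name
        self._counter = count(start)  # tickets issued consecutively from `start`

    def issue(self):
        return next(self._counter)

# Give each skill group a visibly different range:
permits  = TicketQueue("parking permits", 100)
payments = TicketQueue("payments", 500)

print(permits.issue(), permits.issue())  # consecutive numbers, so progress is visible
print(payments.issue())                  # clearly belongs to a different queue
```

With ranges like these, a customer holding ticket 104 who hears 103 called knows exactly where they stand, and hearing 502 called tells them nothing alarming at all.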
The final point is that ineffective innovation can be a long term burden. The time came to pay for my parking permits. There was nothing so obvious and straightforward as a normal card reader. Instead, I had to be led to a payment machine, the like of which I had never seen before and which had clearly been intended to operate on a self-service basis.
In theory you put in a reference number, a postcode and, of course, the card details, and got in exchange a receipt to be exchanged for whatever it was you had paid for. In practice that had clearly proved to be too difficult, so members of staff now walk the length of the building and enter everything except the card details, then stand around waiting for the payment to go through. The net effect is the worst of both worlds, with more staff time used than necessary to deliver a more disjointed service. That neatly illustrates two important points: attempted self service which fails is more complicated and expensive than not attempting the self service in the first place; and, to adapt Jakob Nielsen, users spend most of their time making other payments, so prefer your payments to work the same way as all the others they already know.
I got what I went for, I didn’t have to wait inordinately long, the service was friendly and effective. But it could – and I think should – have been just a bit better still.
28 September 2011
Things which caught my eye elsewhere on the web
- Betagov blues.. « Digital by Default Outside of Hercules House ‘digital by default’ seems a long, long way away and requires making compromises in order just to get some momentum. Small wins are achievable (and you can bet we celebrate each one!) but getting anything larger out of the door requires considerable patience and fortitude.
- How to be good at work | Stephen Hale Personally, I am much better at my job because of social tools. I’m better informed, often helped by others, better connected, more grateful, and more ready to share my own thoughts than I would be without tools like Yammer, Twitter and blogs.
- It’s the end of the web as we know it « Adrian Short The promise of the open web looks increasingly uncertain. The technology will continue to exist and improve. It looks like you’ll be able to run your own web server on your own domain for the foreseeable future. But all the things that matter will be controlled and owned by a very small number of Big Web companies. Your identity will be your accounts at Facebook, Google and Twitter, not the domain name you own. You don’t pay Big Web a single penny so it can take away your identity and all your data at any time. The things you can say and do that are likely to be seen and used by any significant number of people will be the things that Facebook, Google and Twitter are happy for you to say and do. You can do what you like on your own website but you’ll probably be shouting into the void.
- Nik Cubrilovic Blog – Logging out of Facebook is not enough Privacy today feels like what security did 10-15 years ago – there is an awareness of the issues steadily building and blog posts from prominent technologists is helping to steamroll public consciousness. The risks around privacy today are just as serious as security leaks were then – except that there is an order of magnitude more users online and a lot more private data being shared on the web.
- Prototyping as an ethos | Brian Hoadley So if we take our responsibility seriously, why don’t our clients? Why do they so often try to cut corners, cut out research and prototyping, shudder at the idea of iteration (which will equal cost now but provide potential benefit later), and railroad us down an agile path that promises iteration, but so often delivers linear, scaled-back development with no opportunity to evolve already built functionality? Prototyping and testing gives you a real opportunity to test, iterate and re-test. It allows teams to incorporate learnings (other than their own) so that the end results more closely resemble the type of result that users might actually find useful.
- Schneier on Security: Complex Electronic Banking Fraud in Malaysia The criminals use a fake card to get a new cell phone SIM, which they then use to authenticate a fraudulent bank transfer made with stolen credentials.
- Schneier on Security: Complex Electronic Banking Fraud in Malaysia [comment] One problem with multi-channel authentication is that the owners/maintainers of the individual channels may be unaware of the consequences to the end-user of their security weaknesses.
- Should you launch at a conference? – Joel on Software We probably could have brought it to market after three months. That would have been ever so lean. There was a strong temptation just to dump it on the world super-early and spend the next year iterating and improving. We didn’t do that. We worked for nine months, and then launched. I couldn’t stop thinking that you never have a second chance to make a first impression. We got 131,000 eyeballs on 9-month-old Trello when we launched, and it was AWESOME, so 22% of them signed up. If we had launched 3-month-old Trello, it would have been NOT SO AWESOME. Maybe even MEH. I don’t want 131,000 eyeballs on MEH.
- A Cohesive & Unified Identity for British Government — Paul Robert Lloyd If we want to talk about reducing bureaucracy, and simplifying government, then surely we need to think about how it can be represented with one single common identity rather than a multitude of different logos.
- Older freemium app users fork over cash, younger users spend time — Tech News and Analysis Younger users spend more time in freemium apps but don’t plunk down as much money while older users are the opposite, less free with their time but more likely to open up their wallets.