The burial of human remains at sea requires a marine licence.
That must be one of the more arresting first lines of any government web page. Its combination of human tragedy and bureaucratic process packs a lot into eleven words.
You won’t find that line, or anything else on the subject, at Directgov. That’s neither surprising nor perhaps unreasonable. Very few bodies are buried at sea – exact numbers are hard to come by, but estimates are in the tens per year, a tiny proportion of the half million or so deaths each year in the UK.
The line instead comes from the website of an organisation little known, I suspect, to non-specialists, the Marine Management Organisation, the core purpose of which has little to do with the disposal of corpses. But getting a licence for burial at sea is without doubt a government service directed at individuals, so in principle it should be found where other such services are to be found, which in the not too distant future means the single government domain. I have no imminent expectation of finding it there (and make no criticism that it won’t be). But it is worth asking why that should be and what it tells us about government more generally.
Back in the early days of e-government, there was a target to get all government services online. Increasing the numerator would help achieve the target, but then so would decreasing the denominator. Creating a definitive list of relevant services was the only way of preventing a percentage score from drifting about uncontrollably. Burial at sea was often the example used in the largely pointless debates which ensued. It was a good example, because it brought together two separate issues: was this a service which anybody was ever likely to want to do online; and were there enough of them to justify putting it online at all?
Entirely expectedly, government information and services follow a Zipf distribution, made famous by Chris Anderson in The Long Tail (but applied to websites at least as early as 1997): there is a small number of things which get an enormous amount of attention, and there is an enormous number of things which get a small – sometimes a vanishingly small – amount of attention. Two lessons are often drawn from that: one good and one potentially very bad.
The good one is that there is great value in identifying the things which most people want to do most of the time, and ensuring that they can do them easily and efficiently. The potentially bad one is to assume that the rest doesn’t matter and either ignore it or delete it.
In the physical world, it is more or less essential to cut off the distribution. A good bookshop won’t just rely on best sellers, but equally there will be a limit to the number of titles it can stock which only sell one or two copies a year. Amazon, with warehouse fulfilment, can do much better than that, and it has been estimated that 37% of their revenue in 2008 came from sales of books ranked below 100,000.* It would be supreme folly for Amazon to announce one day that they were rebuilding their web presence and would henceforward only cover the top 100,000 titles.
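The arithmetic behind that cut-off question can be sketched numerically. The figures below are illustrative assumptions – a pure Zipf law with exponent 1, a hypothetical catalogue of three million titles and a head cut at rank 100,000 – not Amazon’s actual demand curve:

```python
# Illustrative sketch only: a pure Zipf law with assumed parameters,
# not Amazon's (or government's) actual distribution of demand.

def zipf_weights(n_items, s=1.0):
    """Unnormalised Zipf weights: the item at rank r gets 1 / r**s."""
    return [1.0 / (rank ** s) for rank in range(1, n_items + 1)]

def tail_share(n_items, cutoff, s=1.0):
    """Fraction of total demand coming from items ranked below the cutoff."""
    weights = zipf_weights(n_items, s)
    tail = sum(weights[cutoff:])  # ranks cutoff+1 .. n_items
    return tail / sum(weights)

if __name__ == "__main__":
    # Hypothetical: 3 million titles, head cut at rank 100,000.
    share = tail_share(3_000_000, 100_000, s=1.0)
    print(f"Tail share beyond rank 100,000: {share:.0%}")
```

With these made-up parameters the tail beyond rank 100,000 accounts for roughly a fifth of total demand – the same order of magnitude as Amazon’s reported 37%, and a reminder that under a power law the tail never becomes negligible, however far along it you cut.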
Government is not Amazon. Web pages are not books. Analogies are flawed. And yet.
The question of how the government’s web presence should be culled and curated is not a new one. It has been around in various forms since the earliest days of e-government, documented perhaps most clearly and consistently by Alan Mather. At least as far back as 2003 (and actually well before then) he had a strategy which looked uncannily like that of the single government domain:
- Fewer websites not more. Kill 50 websites for every new domain name.
- Less content not more. Delete five (or fifty, or five hundred) pages for every page you write.
- Solve the top 50 questions that citizens ask … and structure your content around those first. Then do the next 50 and the next. The people who know these questions are the ones that answer the phone in your call centres, the ones that write in to your agency and the ones that visit your offices for help; likewise, they visit accountants, advice bureaux, charities and so on.
- Test search engines to see how your site ranks – both from a mindshare side and for individual queries.
- Impose rigorous discipline on use of “words” – plain speak.
- Impose even more rigorous discipline on the structure of the content, including metadata so that it’s easy to read – by people and by search engines.
Or in other words, start at the top of the Zipf distribution, and work systematically along until you stop. Tom Loosemore has a pithier version which means much the same:
Taken as expressed, it’s hard to disagree with the approach Tom and his team are taking. But a great deal hangs on the word ‘superfluous’. In this context, I think it is being used to mean two quite distinct things, but risks treating them as one. The first is rot, decay and duplication. Too much money is being spent very inefficiently to maintain – or all too often to fail to maintain – information which is poorly organised, hard to find, badly maintained and structured round what organisations do, not what people need. The second is obscure specialisation: there is a vast amount of information which most people don’t want or need and won’t ever want or need, and its existence makes it harder for the important stuff to shine through.
Focusing on an ‘irreducible core’ is a very good way of tackling the first problem, but risks overlooking the second. Whether that is a bad thing is a contingent question which is not inherently an easy one to answer, and which potentially raises some awkward questions about the singularity of the single government domain. There are three basic options:
- Everything goes into the single pan-government site
- Popular and important stuff goes into the single pan-government site and the rest goes somewhere else
- Popular and important stuff goes into the single pan-government site and the rest doesn’t go anywhere
To an extent this is (or can be made to be) a matter of timing – pursuing Alan’s idea of tackling the problem in fifty-question chunks. But even with that approach, sooner or later we get to the question of whether enough is enough. In order to know that, we need to understand two things. The first is the value to users of the long tail – government’s version of Amazon’s 37%. If it is high, or to the extent that it is high, the choice is between options 1 and 2. Neither is entirely attractive: option 1 risks compromising the quality and clarity of the much smaller set of key services; option 2 creates a messy boundary and breaks the principle that there is one place to go. If, though, the value to users of the long tail, or some furthest reach of it, is relatively low, the choice is between option 3 on the one hand and options 1 or 2 on the other. And if option 3 is even to be considered for some subset of information that might otherwise have been included, that raises a very big question.
Luckily, GDS is full of exceptionally smart people (and now even fuller) and better still, they have invented the needotron. That’s the right systematic approach – but I will be fascinated to see whether they find a way of creating the right long tail, and of stopping the tail being so unwieldy that it trips up the dog.
*These numbers are hard to make intuitive sense of. Amazon are currently claiming to have ‘over 750,000’ books available for the Kindle, which sounds like more than enough for anyone – yet I regularly find that the books I actually want to buy are not among them.