Net present value

The train now approaching platform 2 has been delayed for 155 years.

A few weeks ago, I found myself on the Cornish Riviera, the fastest train from Cornwall to London, stopping only at Plymouth, Exeter and Reading between the Tamar and Paddington, and timetabled to go from Penzance to London in exactly five hours.

Splendid. Except that the distance is a shade over 300 miles, so the average speed is a less than dizzy 61mph. And even that average covers some interesting variation. Looked at section by section, the speeds tell a powerful story.

                       Distance (miles)   Time     Speed (mph)
Penzance to Plymouth   79½                2h       40
Plymouth to Exeter     52                 57m      55
Exeter to Reading      137¾               1h 38m   81
Reading to London      36                 25m      86
Penzance to London     305¼               5h       61

Some of that variation is explained by the fact that the train stops eight times between Penzance and Plymouth and not at all between the other pairs of stations, but on any basis, a supposed express taking two hours to cover eighty miles is hardly impressive.

This isn’t about the trains. They may be forty years old, but they are still capable of reaching 125mph and they certainly don’t get close to that anywhere west of Exeter.  So it comes down to the track and, more specifically, to a set of design decisions made in the 1840s.  The terrain was not easy, with lots of steep valleys perpendicular to the line of route, money was short, and construction costs were minimised. Apart from the curves, other obvious signs of those constraints remain today – the splendid Royal Albert bridge across the Tamar is an engineering masterpiece, but it is only single track, as was the entire line in Cornwall when first built.

In making those design decisions, did the promoters of the line give any thought to the fact that they were investing for the next two centuries or more? Of course they didn’t – what, after all, had posterity ever done for them? With the benefit of those two centuries of hindsight, would it have been rational to have invested more at the outset in order to secure a long stream of higher benefits? Almost certainly, and if the capital markets were not capable of doing that, we can now see that it would have made good sense for the government of the time to have borrowed to make more effective investment possible. Leaving aside the fact that that’s not what governments in the 1840s saw their job as being (we are already well into the realm of fantasy here), the very long-range value of some kinds of investment decisions, combined with the very long-term constraints some of those decisions bring, means that there is a long-term skew towards sub-optimal levels of public investment.
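
To see why the discounting matters so much here, a minimal sketch may help – illustrative Python with entirely made-up numbers, not an attempt to model the actual railway: a standard net present value calculation sets a larger upfront cost against a small extra benefit every year for two centuries, and the verdict flips depending on the discount rate chosen.

# Minimal net present value sketch with illustrative, made-up numbers:
# a railway built to a higher standard costs more now but yields a small
# extra benefit every year for two centuries.

def npv(rate, cashflows):
    """Discount a list of yearly cashflows (year 0 first) back to the present."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cashflows))

extra_cost_now = -1_000_000          # hypothetical extra construction cost
extra_benefit_per_year = 60_000      # hypothetical extra annual benefit
years = 200

cashflows = [extra_cost_now] + [extra_benefit_per_year] * years

for rate in (0.01, 0.035, 0.07):
    print(f"discount rate {rate:.1%}: NPV = {npv(rate, cashflows):,.0f}")

# At 1% the two centuries of benefits comfortably outweigh the extra cost;
# at 7% almost everything beyond the first few decades is discounted away
# and the very same investment looks like a poor deal.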

Meanwhile, Daniel Davies has written a beautiful essay, summarising Switzerland in 18 vignettes. The whole thing is a delight and well worth reading, but a couple of sentences prompted me to think again about my Cornish experience:

The SBB, great though it is, is not the real miracle of Switzerland compared to the dozens of little cantonal and sub-regional railways that serve even the smallest little towns on rails carved into the roads or running alongside them. This sort of infrastructure asset doesn’t depreciate if maintained properly, and it keeps providing the services for which it was intended in all economic climates. It’s a classic illustration of a point that John Quiggin has regularly made – that classic “risk-adjusted” discounted cash flow analysis will always overstate the risks of government spending and result in underprovision of infrastructure.

Devon and Cornwall provide a different example of that point with the closure of the line between Exeter and Newton Abbot earlier this year when a stretch of track was washed away. There was once some redundancy in the system which could have routed round the damage – but that has long since been removed, leaving rail access to Plymouth and beyond vulnerable to a single point of failure. Again, we can only speculate about the long term return there would have been on keeping the line across Dartmoor open.

This isn’t a post mourning a pre-Beeching world, where trains puffed across sunny rural landscapes, pausing from time to time to pick up a few milk churns. Seeing how the past might be valued differently is not the same as valuing everything that is past. Trying to understand the long-term value of infrastructure is not the same as saying that all infrastructure has long-term value.

Indeed, this isn’t really a post about the past at all. The railway line across Cornwall, curves, gradients and all, is what it is, and nobody is going to build another one. So this is a post about the future: can we find ways of being smarter about the long term value (and costs and risks) of our investment choices?

Cucumber legislation

We are approaching the traditional time of the silly season in UK news and politics, the quiet period when, in the absence of real news, the frivolous and the dotty get more column inches than they otherwise would.1 In Poland, and indeed much of the rest of Europe, that period is known as the cucumber season.2

cucumber cross section

With that slightly unlikely introduction (for reasons which will become apparent), let us return to the question of whether law is code and, to the extent that it is useful to talk about it that way, what ways of producing better code might tell us about making better law. Quite clearly, law is not actually code and it is arguable – and indeed argued – that it is wrong and unhelpful to think of it that way. Just recently, Evgeny Morozov has written about ‘algorithmic regulation’ as a threat to the democratic process. But even to the extent that he is right (which in my view is not very great), it’s a different question to the one I am interested in here.

Law, like code, is a complex system of components, where defined inputs should lead to determined outputs. A critical question in both worlds is therefore whether the black box between the two successfully translates the first into the second. Every approach to software development there has ever been – and there have been many – has been an attempt to solve that problem for code. Approaches to the development of law have been less structured and less eclectic, but again, the purpose of drafting legislation is to give effect to intentions.

In each case, it is valuable to test whether the inputs do in fact generate the intended outputs. For law, that can be quite tricky. One reason for that is that it may take a long time (and a complex process) to work out what the inputs are, never mind what the output is. One reason we have judges is to run just such tests: given a set of facts, and given the application of the law to those facts, what outputs does the system generate? In more complex cases, it can take several sets of judges several years to reach a final answer to that question. To add further complexity, the judicially correct assessment of the meaning of law can change over time, even where the law itself does not.3

Computer code, to put it mildly, is not like that. Because it is not like that, the techniques for testing and validating it are very different. They can in principle be more structured and more systematic – indeed they can in principle and occasional practice produce error free code. But even – or perhaps especially – outside such rarified exceptions, ensuring that code matches intent is a difficult and imperfect process, as it is for law.

And so back to cucumbers and to an intriguing post from Richard Pope about using software test techniques to identify whether regulations are being enforced, by analysing data about activity and matching it with regulatory activity.

There are tools for doing this using a syntax called Cucumber, which as Richard explains

is designed to both be readable, and to be written by, a non-technical person, but can be run automatically by a machine.
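
Cucumber’s plain-text scenario language (usually called Gherkin) is built around Given/When/Then steps. Purely by way of illustration – a toy Python sketch, not the real Cucumber tooling, with an invented rule and invented data – the underlying idea looks something like this: a scenario written in readable prose, and a few lines of code acting as step definitions that check it against published data about regulatory activity.

# A toy sketch of the Cucumber idea, not the real tooling: the scenario text
# below uses the Given/When/Then style, and a few lines of Python act as
# hypothetical "step definitions" that check it against some sample data.

import re

SCENARIO = """
Given a register of licensed food premises
When a premises sells food without a licence
Then an enforcement notice should be recorded within 30 days
"""

# Invented sample data standing in for published regulatory activity.
premises = [
    {"name": "Cafe A", "licensed": True,  "enforcement_notice_days": None},
    {"name": "Stall B", "licensed": False, "enforcement_notice_days": 12},
    {"name": "Van C",  "licensed": False, "enforcement_notice_days": None},
]

def check_scenario(scenario, records):
    """Pull the deadline out of the Then step and test every unlicensed seller."""
    deadline = int(re.search(r"within (\d+) days", scenario).group(1))
    failures = [r["name"] for r in records
                if not r["licensed"]
                and (r["enforcement_notice_days"] is None
                     or r["enforcement_notice_days"] > deadline)]
    return failures

print(check_scenario(SCENARIO, premises))   # -> ['Van C']: the rule is not being enforced here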

But if it is possible to use such an approach to test whether regulations are being applied, why not use the same approach to generate the regulations in the first place?

There are fairly well established tools for turning legislation into decision rules (though their adoption for their core purpose does not seem to be terribly widespread).4 Turning decision rules into legislation is a rather different question, but conceptually is not so very different from the kind of behaviour-driven development which Cucumber supports (though ‘conceptually’, of course, can be a very long way from ‘practically’).
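
To make ‘conceptually not so very different’ slightly more concrete, here is a speculative sketch (again Python, with an invented rule and invented thresholds): the same structured decision rule can be evaluated against a set of facts and rendered as a rough legislative-style sentence. Real drafting is, of course, vastly harder – which is exactly where ‘conceptually’ and ‘practically’ part company.

# A speculative sketch of the reverse direction: a decision rule held as
# structured data, which can be both evaluated like code and rendered as
# draft legislative text. The rule and thresholds are invented for illustration.

RULE = {
    "subject": "a person",
    "conditions": [
        ("age in years", ">=", 16),
        ("hours worked per week", ">=", 35),
    ],
    "consequence": "is entitled to the standard allowance",
}

OPS = {">=": lambda a, b: a >= b, "<": lambda a, b: a < b}

def evaluate(rule, facts):
    """Apply the rule to a dict of facts, e.g. {'age in years': 17, ...}."""
    return all(OPS[op](facts[field], value) for field, op, value in rule["conditions"])

def render(rule):
    """Render the same rule as a (very rough) legislative-style sentence."""
    clauses = [f"{field} is {'at least' if op == '>=' else 'less than'} {value}"
               for field, op, value in rule["conditions"]]
    return f"{rule['subject'].capitalize()} {rule['consequence']} if " + " and ".join(clauses) + "."

print(evaluate(RULE, {"age in years": 17, "hours worked per week": 40}))  # -> True
print(render(RULE))
# -> "A person is entitled to the standard allowance if age in years is
#     at least 16 and hours worked per week is at least 35."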

All of that takes us back to a question first raised by John Sheridan: if law has some similarities to code, are there tools and techniques which can be adapted from the development of software to the development of law?

The short, but not very helpful, answer is almost certainly that there are. Any longer answer needs to take account of some profound differences – the architecture and structure of legacy legislation compared with legacy code, to take just one example. That might mean that tools and processes need to be quite distinct, but it doesn’t stop the concepts and disciplines having common application.

So Cucumber-driven legislation might or might not be the next big thing – but in either case the idea prompts some important questions and points to useful areas for more detailed exploration. And this is, after all, the cucumber season, a time for speculative fancy with little requirement for strong foundations.

Cucumber picture by viZZZual.com, licensed under Creative Commons

  1. Column inches are not what they once were, of course, but I use the phrase deliberately since the idea of a silly season has itself rather wilted under the pressure of the constant news cycle, and this is not a week where the news feels particularly silly.
  2. That seems both horticulturally slightly inaccurate and a bit of a stretch of association, but let’s not worry about that.
  3. The interpretation of the US constitution by its supreme court provides clear examples.
  4. The market leader has existed in various guises for at least 15 years, and is now known as Oracle Policy Automation.  It is probably most suited to very rule-based processes such as tax and benefits,  but even there it has never really taken off.

Govcamp is useless

It’s the Monday1 after the Govcamp before, the day Dave Briggs once described as the most depressing day of all, as the exhilaration of the event crashes into the realities of working lives. I like Govcamp for a long list of reasons I wrote about last year - I won’t repeat them here, but they are implicit in what follows, so might be worth a quick look before going on.2

We were all at the same event, but it’s a safe assumption that few if any of us were at the same event. Ten parallel sessions four times over means that there were 10,000 routes through, even before taking account of the application of the rule of two feet. So these are some thoughts on the event I was at, which may or may not resonate with anybody else’s.

Why Govcamp is useless

But before diving into that, an aside on the general uselessness of Govcamp. That uselessness is not a weakness, it is the very essence of what Govcamp is and how it works.3 Govcamp as an entity reaches no conclusions, sets no actions. It has no opinions and no manifesto. It does just one thing, and does it very effectively: it brings people together in a way which facilitates conversations between them on the topics they most want to talk about.

View from City Hall through a window with the word 'talk' stencilled on the glass

It’s worth saying that, because from time to time people (including me a few years back) get frustrated, and argue that if only Govcamp were different, it would be different, and that without concrete actions resulting from it, it is somehow a waste of time. There are two ways of responding to thoughts of that kind.

The first follows from the way Govcamp works now. What gets talked about is what people want to talk about. So the way to change what gets talked about, is to encourage different people to come – and what gets talked about in any case will change over time as both people and issues come and go.

The second is to put Govcamp into context. Not every kind of event needs to attempt every kind of task. There are events designed to make things happen, get things built, train people in specific skills, collaborate across organisational boundaries, and a whole host more. There is room in all that for an event where conversations happen, for as long as people find value in the conversation – and the way that Govcamp tickets disappeared suggests pretty strongly that the appetite for that has not gone away.

How Govcamp was useful

So with that out of the way, a few reflections on what did happen.

The setting, in City Hall, was without doubt the most spectacular yet, a building of spirals many of us found temptingly photogenic.

City Hall - Assembly Chamber

It was interesting, though, how even very small changes to the layout and to the flow of people can make what seem to be disproportionate differences. So the combination of the long walk from the assembly chamber to the other meeting rooms, the absence of on-site coffee, and having ten choices at each of four sessions rather than eight choices at each of five, resulted in fewer serendipitous conversations and an even stronger sense than usual that too many other interesting things were happening somewhere else.

I started at the session on votecamp, which was exploring ways of encouraging more young people to vote. It’s an important topic, but I didn’t feel I had much to contribute, so I moved on – though not before hearing Ade Adewumni make the simple and very powerful point that we can’t hope to understand why some people don’t vote without understanding why other people do (especially given the basic irrationality of voting at all). I suspect that thought has much wider application: we tend to focus much more on why people don’t comply and don’t think enough about why people do.4

I ended that session in a group asking ‘what do you want from your agile supplier?’, though that felt like a bit of a euphemism for ‘how do you cope with your decidedly non-agile customer?’.

Then to a session I had proposed on how work works, which was partly written up as we went along, and which I won’t add to here.

After lunch, there was a compelling option: John Sheridan had dangled the prospect of combining legislative structures and JS Bach, which made his session irresistible.

I hoped for the emergence of a new Gödel, Escher, Bach, but alas JSB went unmentioned, with only the pale consolation of copies of the Interpretation Act in his place.5

John Sheridan and the Interpretation Act

But despite that initial disappointment, the discussion was both fascinating in its own right and a great example of how a Govcamp audience could pick up on a theme and bring a distinctive perspective to it. Prompted by John’s work, I have written before both on a concrete example of where a more code-based approach might lead to clearer law and, more abstractly, on how law, code and architecture fit together. I left the session with three one-liners rattling round my head:

  • There is no architectural thinking in how the structure of the statute book is managed.
  • We shouldn’t let the lawyers get away with it any more
  • How do we, the geeks, share what we know with our lawyerly friends?

I spent the final session of the day in a debate on the question of whether all was well with digital in central government. The proposition was that there is nothing to worry about, but it was pretty clear that it was a question intended to evoke a short sharp answer to the contrary – which it successfully did. There is a slightly simplistic view that if the rest of the world were more like GDS, it would be a better place. The problem with that is not that it’s necessarily wrong (if it were, it probably would be), but that it doesn’t take sufficient account of the cultural and organisational context within which all this is happening. I keep going on about (and kept going on about) the fact that a relatively thin joined up information layer cannot in itself be expected to drive deep organisational and service change: the solution requires a better government as well as a better web service. The apparent GDS focus prompted some back channel dissent:

which helped draw the conversation back to what I think is a better version of the question:

That’s all an issue which has been rumbling along ever since GDS started, and actually for years before that. There isn’t going to be a solution which is intrinsically right for evermore, but that makes it the more important that we don’t lose sight of the question.

Useless but very valuable

That was my day. Or rather that was a version of my day, but in the telling, it’s lost a large part of what was best about it all. There were conversations, some brief, some longer, some connections made, which shed new light on old problems and identified whole sets of new challenges. The most frustrating thing, as ever, is that there was a whole bunch of people I would really like to have spent some time with who I didn’t talk to at all, and in some cases barely set eyes on. An event with 10,000 routes through it cannot be otherwise.

So, as ever, Govcamp was useless. But it is a very special and rather compelling form of uselessness. If we could be this useless more consistently, who knows what could be achieved.

Pictures by me, W N Bishop, and Alex Jackson

  1. No, it’s not actually Monday, but half this post self-destructed and had to be recreated, thus adding delay and reducing dramatic effect.
  2. Especially if you are not familiar with the barcamp/unconference model – in which case the description of the process at the beginning of this post might be useful.
  3. Or, indeed, any unconference or barcamp – this is a point about the method, not about the specific event.
  4. Which in turn reminds me of a great post from a few years back by Will Davies on the illusory reality of government.
  5. And I can never wholly escape the thought that policy on interpretation should rest with the Circumlocution Office.

Public and strategic

JET test mock up

Most of the time, the hottest place in the solar system is the core of the sun. Some of the time, the hottest place in the solar system is tucked away in an obscure building on an anonymous industrial estate on a former airfield in rural Oxfordshire.1 There they use extreme heat and power to smash atoms together to release energy from their fusion.

This is JET, the joint European torus. It was planned forty years ago, has been operating for thirty years and, if all goes well, in another thirty years its successor’s successor will be producing electricity with virtually no fuel and virtually no waste.

It may not happen, of course. There is the old and pointed joke that nuclear fusion has been thirty years in the future for at least the last thirty years. But at Culham, the home of JET, they are adamant that the physics and maths of the problem have essentially been solved. All that is left is engineering and scaling up. As Brian Cox put it a few years ago:

What frustrates me is that we know how to do it as physicists, how it works. It is an engineering solution that is within our grasp. I don’t understand why we don’t seem to want it enough at the moment.

If – or when – it does work at commercial scale, the potential impact is enormous. For all practical purposes, the fuel needed is unlimited and the waste produced insignificant.

This then is strategy of the grandest kind. And it is a profoundly public strategy. Compared with the normal range of a blog which claims to be about public strategy, this is heady stuff. I don’t understand the physics and engineering here much beyond the level needed to have my mind thoroughly boggled, but I like to think I know a bit about public strategies. I spent three fascinating hours touring JET and MAST2 at Culham last week – anybody can go, though the tickets are hotter than Glastonbury, so you have to book months in advance.

So here are a public strategist’s reflections on fusion, prompted by my visit.

1. Ambition measured in decades

It is a frequently heard criticism of governments that they are incorrigibly short term in their thinking, driven by electoral cycles and by ministers’ knowing that they are unlikely to be in post to see the consequences of their decisions. There are plenty of examples people can point to of that, and they often do. You can mount an argument that the public sector is very bad at deciding and pursuing long term strategic goals, but it’s hard to deny that there are some kinds of long term goals which only governments pursue at all. The symptom of that is often commercial viability, but that’s a measure of uncertainty more than anything else.3

Neutraliser

What only governments can take on are long-term challenges where the goal may be clear, but the method for approaching it is not. The most famous example of that is the space programme. In a post mainly prompted by one of the most extraordinary pictures to come from that whole endeavour, I quoted Bruce Baugh making exactly that point.

This is a statist venture from beginning to end, and demonstrates the ability of the modern regulatory state to undertake and complete large useful scientific endeavors. There is, so far, simply nothing comparable in the corporate sector, and it’s worth keeping in mind that when people talk about doing away with any but the most minimal state, this is one of the things that’d be done away with along with someone favorite caricature of the pork barrel.

It’s not that it’s logically or conceptually impossible that a mission like this could happen any other way. It’s just that three hundred years in to the industrial revolution and nobody’s yet done this kind of thing any other way. And that’s worth noting along with sheer wonder of the achievement itself.

It isn’t yet clear that the bet on fusion power will pay off, though the odds are looking considerably better than they once did. But it is clear that without a public strategy, the bet would never have been placed and, whatever the potential, we would never have known whether it could be realised.

2. Sustaining public support

The fact that only governments can make long expensive bets on the distant future doesn’t mean that they can do it readily or easily. In public policy terms there are two fundamental requirements. The first is sufficient wealth to make the investment at all. The second is sufficient public support to allow it to be made.

Lone working

In the case of fusion power the first requirement is managed by spreading the cost. The ‘E’ in JET is for ‘European’. Its replacement, being built in France, will effectively be global. The second requirement is in some ways more interesting. At a time when governments around the world are making painful choices about expenditure, continuing to pay the stake on a thirty year bet may look indulgent. So why do it? I suspect that there is no single tidy answer to that. Some of it could be about decision making devolved to bodies such as the EPSRC and the availability of supra-national funding through EURATOM. Some of it may be an element of double or quits – that having spent so much time and money getting this far, it would be foolish to throw in the towel just when it might be getting somewhere. Some of it may be simple obscurity. Some of it, though, is clearly about trying to influence people to look favourably on what they do – which I appreciate, since it is presumably why they offer visits in the first place.

3. Simplicity of purpose

Perhaps the most famous mission goal in history was set by President Kennedy in 1961:

I believe that this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the Moon and returning him safely to the Earth.

A year later he further boosted the political weight behind Apollo, and positioned it as a defining moment for the national psyche.

We choose to go to the moon. We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard, because that goal will serve to organize and measure the best of our energies and skills, because that challenge is one that we are willing to accept, one we are unwilling to postpone, and one which we intend to win, and the others, too.

Power indicator

Nobody has pulled off anything quite like that before or since. The strength of Kennedy’s position was not that he was executing a well defined plan. There was no plan, and there couldn’t be, because nobody knew how to set about the task. But the goal was impossibly aspirational, absolutely precise – and very simple.4

Fusion research shares some of those characteristics, though without the advantage of inspirational speeches from US presidents. The goal is clear and its achievement will be unambiguous: either a fusion reactor produces sustainable net power or it doesn’t. The principle of how it should work was clear as soon as the audacious idea of mimicking the sun was formed. But what actually needed to be done to make it work in practice cannot have been at all clear at the outset. Having a clear plan is invaluable. But a strategy based on simplicity of purpose is critical.

4. Pragmatism of delivery

Grand visions of simple strategic purposes may be essential, but they don’t deliver anything, as any inventor of a perpetual motion machine will eventually tell you. Lewis Carroll nailed the basic principle of strategic delivery well over a hundred years ago:

Begin at the beginning and go on till you come to the end: then stop.

Vessel Earth

Delivery is relentlessly pragmatic and prosaic. There is no glamour at Culham, and there is nothing futuristic.5 A lot of the ancillary equipment looks dated and a bit shabby – though mostly reassuringly solid. This is down-to-earth, steady progress built on foundations of robust engineering.6

Strategic fusion

What, if anything, can we learn from all of that for public strategy?

On the face of it, an obvious conclusion might be that programmes such as this are so exceptional in their length and complexity that there are few if any lessons to apply to the more mundane tasks of government. If those were the strategically distinctive characteristics of the fusion programme, there might be little more to be said.

But while those are the most obvious characteristics, I am not sure that they are really the most distinctive. What really stands out is the combination of precision of purpose and painstaking pragmatism in achieving that purpose. There are very good reasons why political goals and decision making rarely reach that level of simple clarity – but all the more reason to keep that as a standard to aspire to.

All pictures were taken on the visit – more can be found here.

 

  1. On some accounts, it is the hottest place in the universe. But there is no need to quibble about details.
  2. MAST is the Mega Amp Spherical Tokamak. Tokamak is a contraction of тороидальная камера с магнитными катушками (toroidal’naya kamera s magnitnymi katushkami — toroidal chamber with magnetic coils), abbreviated in the Russian style by syllables rather than initial letters. So MAST is a Russian contraction nested in an English acronym. It probably reveals too much to say that I am entertained by that.
  3. For a powerful recent example, read this essay by Tim Heffernan about the future of copper mining: “When working out the cost-benefit analyses of new developments, mining firms look ahead a quarter of a century and more. Their potential investments measure in the tens of billions of dollars; their calculations, understandably, err on the side of caution.” That sounds like – and is – an enormous undertaking. But it is a bounded one: the challenge is not that the problem of getting copper out of the ground is not understood, it is the massive and sustained programme management needed to do it in a way which is profitable in the long term.
  4. Of course it was also deeply rooted in the cold war. Kennedy wanted to conquer space in no small part because he feared that the Russians would control space and thus earth as well.
  5. On the contrary, there are orange carpet tiles.
  6. Turning JET on briefly draws 2% of total national grid capacity (so it is carefully timed not to coincide with the end of Coronation Street) and, because even that is not enough, the site has two enormous flywheels acting as mechanical batteries. That sort of power engineering is just one example of the infrastructure needed before the fancy stuff can even begin.

The new normal is already old

Somehow we used to manage without knowing when the next bus was coming, and somehow life still went on.  In London, that distant past is less than two years ago. All that time ago, bus arrival information was new, exciting and empowering.

Now, of course, it has just vanished into the background. It is how things are and how they should be.  Successful inventions disappear from our awareness. Until, that is, they go missing or stop working.

And the next bus could be anywhere.

Two worlds, not quite yet colliding

I went to two events yesterday.

The first was the launch of the Government Digital Service, or rather a housewarming party for their shiny new offices. In fine agile tradition, they put on a slick show and tell with short sharp presentations about their work and achievements topped and tailed by Francis Maude, Mike Bracken (who has blogged his account of the day), Martha Lane Fox (likewise) and Ian Watmore. There was an enthusiastic crowd of supporters twittering furiously and other blog posts are starting to appear [added 10/12 – and GDS has now posted the presentation material and links to press coverage]. The dress code was smart casual, with a lot more emphasis on the casual than the smart. There was a buzz, a sense of creativity and spontaneity, of energy and talent unleashed, of an approach which felt a million miles away from both the stereotype and the reality of government projects.

Frustratingly, I had to leave early to get to the second event.

That was a much more sombre affair, closed and closed in, in an anonymous Treasury meeting room. The programme I work on was being reviewed, to check that we are managing effectively and are on track to deliver. There was little that was casual, in dress or anything else. There were plans, business cases, critical paths, migration strategies, decommissioning strategies, privacy impact assessments and a pile of other stuff besides. There was pointed questioning on risks, affordability and resilience. I make no complaint about that. We are spending public money – rather a lot of public money – and we should be challenged and tested on whether the spending is wise and the results assured. The track record of large government projects is not so great that there is room for complacency. But it felt a very long way from the world of GDS.

That matters, because actually the two are very closely linked. They represent, in effect, different ways of thinking about the same problem, and have roots in some of the same people and ideas. And in recognising that, I suddenly realised that I had rediscovered a thought I had first had at a seminar I went to almost four years ago, where both approaches were represented, each largely talking past the other. Tom Steinberg, who spoke at the point of inflection between the two, memorably started by saying that he completely disagreed with everything which had been said in the first half, and that the solution to the problem of big blundering IT projects was to have small fleet of foot projects, not to find a cure for blundering. I reflected on the apparent tension then as I reflected today:

And then the penny dropped. The apparent gulf between the two parts of the seminar is itself the challenge.

We need to apply two different sets of disciplines (in both senses), in two separate domains:

  • An approach to the customer experience – both offline and online elements – which is flexible and responsive and which maximises its exposure to customer intelligence in order to do that
  • An approach to the supporting processes which is robust, consistent and correctly applies the full set of rules

The collective culture and skills of government are much more geared to the second than the first – and the risk is not just that we don’t do the first as well, but also that we can all too easily fail to spot the need to do it at all. The first is where there is the greatest need for change, flexibility and responsiveness – and where tools and approaches are available to deliver that responsiveness. The second requires the hard grind of implementing big robust systems which do the transactional heavy lifting as invisibly as possible.

Of course the distinction isn’t an absolute one, and of course each domain needs to incorporate the key strengths of the other.  But if we confuse them, we are at risk of getting the worst of both worlds.

My view has changed in the four years since then. I no longer think they are two different domains, they are aspects of what should be intrinsic in any approach (though scale and purpose will drive balance and relative importance). But perhaps there is a risk that big projects are still too much trying to learn the lessons of the last decade and too little trying to anticipate the needs of the next. It is no longer enough for systems to work (though they do, of course, absolutely have to work); they must work well, and work well specifically for the people who will use them. Or, as Helen Milner reported Mike Bracken as saying at another event yesterday:

[Embedded tweet: https://twitter.com/#!/helenmilner/status/144713550638755840]

That makes a lot of sense to me, though only if it is understood that in this context function is an integral part of beauty (as Brian Hoadley rightly challenged).

Conversely, it is not enough to make beautiful things – though, perhaps less obviously but no less necessarily, they do need to be beautiful. It is essential that they work, and work well, too.

Looked at one way, the core mission of GDS (and not just GDS) is to make beautiful things which work well. That means some of the values so apparent in the GDS event need to be more obvious in many other aspects of the work of government. We will have made great progress when discussions about projects in anonymous Treasury meeting rooms are more like the world of GDS. But as function increasingly begins to underpin beauty, it may also mean that the palest shadow of the Treasury meeting room needs to fall across the sunny loft which is GDS.

One of the key tests of the success of GDS will be that when their turn comes to give an account of themselves in that room in Treasury, their approach is recognised and valued – and the work of every other project is being tested against it.  And another key test may be that that room will be a bit less anonymous, with its own wall of post-its and whiteboards.

Pictures by Paul Clarke

Alpha gorilla

Everybody who has had much to do with the development of government web services knows that there have been failures of imagination, failures of bravery, failures of technique and failures to seize opportunities – as well as successes in the teeth of opposition and incomprehension. Few have had the opportunity to start from scratch (though those who have have often made good use of that opportunity). So there are inevitably people who will look with envy at what the alpha gov team has achieved and, just as importantly, what it was given licence to achieve. Relly Annett-Baker caught that sense in her recent post:

The frustrating part is plenty of people before Alphagov could see the problems and probably a good few of the solutions too. They were not able to act on them (and many have privately told us of their struggles). And they probably feel like, well, like how everyone feels when the consultants waltz in and say exactly what you’ve been saying for the last however many months. We have been given the utopian blank slate that others have only dreamed was possible. To those people, I can only say this: we aren’t wasting the opportunity.

But everybody who has had much to do with government web services also knows the complexity of forces which bear down on creativity and design choices, sometimes from undue caution but at least as often from the fact that genuinely contradictory pressures have to get reconciled.

That’s where it starts to get interesting, because Alpha gov is beginning to find itself in this territory. It has come under criticism for the choices it has made about accessibility, for compromising on its approach to UX and even for the amount of white space frivolously scattered about the site. To my mind much more interestingly, questions are also being raised about its scalability and extensibility. One commenter on alpha gov’s about page puts it this way:

It looks good. Vast improvement on Directgov. Alpha seems like a great way to test and design the public face of e-govt and I’m sure a lot of the comments you get will praise the big leap forward in usability on show here. I hesitate to say this, but that’s the easy bit. Does your remit with Alpha go as far as testing the other side of this – i.e the other end of the transactional processes, within the Departments? It’s just as important that that alpha provides Departments with the flexibility, functionality and autonomy they need to adapt and develop their products and online services quickly, as it is to make sure the public interface works well. I suspect this will be hard though – the barriers will be more cultural and political than technical.

Alpha gov is a proof of concept. But what concept has it proved? That there are more arresting and more user friendly ways of building a government navigation site? Definitely. That starting with what users actually want to do, and then helping them do it is a good and (in this context) radical approach? Assuredly. That this could replace Directgov or become the heart of the single government domain (whatever that is)? Well, no. Not because it is clear that it couldn’t do those things, but because that is not what it has been built to test.

So what, then, is this alpha? Is it an alpha gorilla, asserting dominance and superiority? Or is it alpha software, tentatively tiptoeing into the daylight for a short and critical life before being cast aside?

The name is supposed to connote the second. But because of all the doubts, uncertainties and insecurities described above, some will inevitably hear the first. Tom Loosemore is horrified by that possibility. I don’t have a scintilla of doubt in his good faith but objectively, as the marxists used to say, I think he may be wrong. The purpose of alpha gov is to challenge, to point fingers at the past and so, by implication, at those who have played parts in creating that past. The position it is aspiring to occupy is not some marginal piece of unimportant communication to a group nobody cares about, it is to be the new paradigm for the way the whole of government interacts with its citizens. It is to be the alpha gorilla, even if its chosen weapon is the alpha site. Aspirant alpha gorillas have to fight to establish their position. Some succeed, and dominate the pack (at least until the next aspirant comes along). Some fail, and are ejected. What we are seeing is the beginning of that fight.

I don’t think Tom and the alpha gov team need feel apologetic about that. But equally, I don’t think that most of those involved in creating the set of things alpha gov is there to challenge need to feel guilty or apologetic either. That’s because alpha gov is, in one important sense, a sleight of hand. It is proposing a technical solution to a supposedly technical problem. That’s good, but technology is not, fundamentally, the reason why the government’s web presence is as it is. The real problem is not technology but sociology. To the extent that the structure of government has been designed at all, it has been designed to be delivered in ways which can be managed. Government is not fragmented as an accident but as a way – for a long time the only possible way – of getting things done. One result of that, as I have argued before, is that there is no such thing as the government. The question then becomes how, in a world of rich and complicated public services, detailed legal frameworks (often highly specific to the service they regulate), every conceivable combination of personal characteristics and needs, and long tribal histories, we can nevertheless make things better by deploying the new and more powerful tools we now have available.

From that perspective, the primary power of alpha gov is not as a solution, but as a catalyst. It does less to provide answers than those who built it might have hoped or thought. But it does very starkly pose a question and demand an answer. Who chooses to pick up that question and answer it may show who is the real alpha in the pack.