Things which caught my eye elsewhere on the web

Mobile, smartphones and hindsight — Benedict Evans
It’s always fun to laugh at the people who said the future would never happen. But it’s more useful to look at the people who got it almost right, but not quite enough. That’s what happened in mobile. As we look now at new emerging industries, such as VR and AR or autonomous cars, we can see many of the same issues. The big picture 20 years out is actually the easy part, but the details are the difference between Nokia and DoCoMo ruling the world and the world as it actually happened. There’s going to be a bunch of stuff that’ll happen by 2025 that we’d find just as weird.

Finding the natural motivation for change
The concept of finding the ‘natural motivation’ of players involved is a key component when I’m planning any type of systemic change. This isn’t a particularly new idea, but I am constantly surprised by how rarely I see it adopted in practice, and how often things fail by not taking it into consideration. It is critical if you want to take a new idea from the domain of evangelists into ‘business as usual’, because if you can’t embed something into the normal way people act and think, then whatever you are trying to do will be done reluctantly and, at best, tacked on to normal processes as an afterthought.

In recent years I’ve been doing a lot of work to try to change systems, thinking and culture around open government, technology in government and open data, with some success. This is in part because I purposefully take an approach that tries to identify and tap into the natural motivation of all players involved.

Social Media is the new black | Catherine Howe
The need for public servants to become active online reflects a belief that this is the most effective way of starting to experience and therefore understand what our transition to a network society might mean. Without that understanding we are powerless to try and shape it for the future. Social media can be seen as the leading edge of a range of digital and networked technologies which will disrupt the way in which we operate. Best therefore to try and get a grip of this before quantified self or augmented reality really blows your mind.

Setting standards that stick — Public Innovators’ Network — Medium
The responsibility for those with influence is to find the people who want to do the right thing, and remove everything that gets in their way.

Big organisations run on inertia. That doesn’t mean they stand still. They resist change in their direction and speed. If you’re trying to set standards for what good looks like, to have a real impact, you have to deliver something that disrupts inertia. So to have real bite, you need standards attached to powers. Look forward to becoming unpopular.

The NHS’s future is digital – but not if we simply replicate poor paper processes | Candace Imison | Society | The Guardian
In some cases where technological interventions have failed, new systems have simply been layered on top of existing structures and work patterns, creating additional workload for healthcare professionals. The technologies that have produced the greatest immediate benefits have been carefully designed to make people’s jobs easier, with considerable investment in the design process. People we interviewed for our research talked time and again about the importance of using technology to reimagine current work processes.

Put down all behaviour hurtful to informality! – Matt Edgar writes here
I believe productive informality is more than nice to have: it forms a virtuous circle that we can turn to our advantage:

Service productivity builds trust
Trust engenders informality
Informality is the route to richer, faster learning
Continual learning is essential for any service to be productive

Why historians should care about web archiving | Webstory: Peter Webster’s blog
So, in twenty or thirty years’ time, historians of the very late twentieth century will have reason to regret that no-one thought to keep their primary sources safe for them. But there is another problem. It is a brave historian who writes on the very recent past; I myself wrote an article in 2004 that extended up to 1990, and not without some unease about the hostages to scholarly fortune it gave. And so most of the historians who have the greatest personal stake in archiving the web right now haven’t yet entered the profession. I would argue that historians are uniquely well-placed to view the present in relation to the past, and thus to anticipate those aspects of the present for which there is most need for a record. But it would take a significant change in culture for historians working now to start taking a hand in preserving sources for our successors.

A little while ago I was at UKGovCamp and it was fantastic. — Medium
The sad truth of it is we are all scared, confused and in doubt. We have a general idea that we don’t want to bring shame to the departments or companies we work with and we all know we don’t get everything right all of the time. Something to do with being fallible humans. How pesky.
My impression is that this is in part caused by the attention the press rightly pays to the work that we do in government. When we fail we tend to fail big, and the public have a right to know why. The sad truth of that, of course, is that yet again no one wants to say anything, so trying to find out why we fail, when we do, is like getting blood from a stone for the people who have to do this job.

The 7 Deadly Sins of User Research — Medium
My definition of a successful user research study is one that gives us actionable and testable insights into users’ needs. It’s no good asking people what they like or dislike, asking them to predict what they would do in the future, or asking them to tell us what other people might do.
The best way of gaining actionable and testable insights is not to ask, but to observe. Your aim is to observe for long enough so that you can make a decent guess about what’s going on. Asking direct questions will encourage confabulation, not tell you what is actually going on.

Sometimes you need a speedboat, sometimes a container-ship. — Medium
We thought for a long while about how to describe the difference between things which needed to be slow and sure and precise (the CMS) and things which needed to be fast. One had a huge risk of breaking mission-critical things if you got it wrong; the other had the huge risk of being built so slowly that a breaking-news opportunity was missed, or of the cost being too high for the duration the application was needed. Eventually we called the slow one a container ship and the fast one a speedboat. Both are useful, both have their place. The things which enabled speedboats were cheap, disposable compute resources, APIs, and code to consume those APIs quickly and reliably.
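To make the “speedboat” idea concrete: the point is a small, disposable piece of code that consumes an existing API and can be thrown away when the story ends. Here is a minimal sketch in Python; the payload shape and field names are hypothetical, purely for illustration, not anything from the article:

```python
import json

# A "speedboat": a tiny, disposable consumer of an existing API.
# The response shape below (a JSON object with an "items" list of
# {"headline": ...} entries) is hypothetical; a real breaking-news
# app would swap in its own API client and schema.

def extract_headlines(raw_json: str) -> list[str]:
    """Pull headline strings out of a hypothetical news-API payload."""
    payload = json.loads(raw_json)
    return [item["headline"] for item in payload.get("items", [])]

# Example payload such an API might return.
sample = '{"items": [{"headline": "Election called"}, {"headline": "Markets react"}]}'
print(extract_headlines(sample))  # → ['Election called', 'Markets react']
```

The design choice mirrors the quote: no framework, no persistence, no deployment ceremony — just enough code to get value from the API for as long as the story lasts, cheap enough to delete afterwards.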

The Waze Effect: AI & The Public Commons — NewCo — Medium
Absent a more robust dialog addressing these issues, we run a real risk of creating a new kind of regulatory capture — not in the classic sense, where corrupt public officials preference one company over another, but rather a more private kind, where a for-profit corporation literally becomes the regulatory framework itself — not through malicious intent or greed, but simply by offering a better way.

Time to change Twitter, or #RIPTwitter? | Paul Bernal’s Blog
The idea of using algorithms is very attractive, but it’s underpinned by an illusion that algorithms are somehow ‘neutral’ or ‘fair’. This is what brings about the idea that Google is a neutral indexer of the internet and a guardian of free speech, but it really is an illusion. Algorithms are human creations, and they embed ideas and biases that those who create them may well not even be aware of. They can make existing power imbalances worse, as the assumptions that underpin those imbalances are built into the very thought processes that create the algorithms. Yes, people can compensate, but even that act of compensation can bring about further biases. Where the essence of the idea behind an algorithm is to make Twitter more money, that bias itself will interfere with the process, consciously, subconsciously or otherwise.

The Secret UX Issues That Will Make (Or Break) Self-Driving Cars
We don’t ditch what we have. We constantly update our metaphors, trying to find familiar handholds that quietly explain how a technology works. In digesting new technologies, as we climb a ladder of metaphors, each rung might follow the one before. Over time, we find ourselves further and further from the rungs we started with, so that we eventually leave them behind, like so many tiller-inspired steering wheels.

The deeper lesson in all this is that people naturally get frustrated when something doesn’t do exactly what they imagined; they get lost when things don’t work as assumed.

Five steps to making a product users love — Medium
This is the culmination of a few years’ worth of discovery, practice and thinking from working with early-stage digital products. This is my playbook that describes what I do, and a little of how I do it.