It’s four in the morning. I’ve woken up in the middle of the night, can’t get back to sleep, so I’ve got the NBA play-offs on. No idea why I can’t get back off. Maybe it’s because later today, during normal daylight hours, we’re due the verdict from the assessment of the service I work on with others.
In these quiet hours maybe it’d be interesting to reflect on the assessments, see if that gets it out of my head.
If you’re not in the know: at the moment, designing and making a government service with a high volume of transactions, like tax credits, means that service must go through three service assessments to get from “discovery” to “live”.
A pass in each assessment paves the way for the creation of the service to continue, to progress towards being “live” – that is, available to the public. The assessments measure the development of a service against the 18 points of the digital service standard.
Discovery is about, um, discovering the needs of users. (Read more on discovery here.) There is no assessment after this phase, other than deciding whether there is a need to continue – or not.
The first assessment is on the alpha work. A pass sets you up for the beta phase, where we need to build an end-to-end service. (Read more on beta here.)
I’ll stop there. Because this is where we are. We are in the beta phase. And recently our work in beta has been through an assessment. This means that if the service passes it will be available to the public for the first time in its renewed form. While the beta form of the service is available for public use we will continue to learn from the users on the service – and continue to do more work on it. There is much, much more to do. The “launch” is just the start. We will always be in beta.
And on tax credits that is because what we want to release to the wider public isn’t all of tax credits.
At the moment the tax credits “system” or “process”, as the public knows it, is mainly served by information provided back to HMRC by posted forms and/or over the telephone. There is already a service in place. There is already a way for you, as the public, to interact with government. But that service has evolved over years of responding to the actions of its “customers” and reacting to the needs of policy. That’s a lot of moving targets.
For the user, tax credits should be as simple a journey as possible. Unfortunately, making something that meets everyone’s requirements is a lot of work. We design and make to be inclusive. But getting it made takes time.
So we iterate. We find a mass to concentrate on, something solid, some journeys that mean something, to get a core service together. And gradually we will add to that, add around that, and amend that. This way we can get things made so they work and are usable, more quickly. In theory we can get something to the users, to the public, faster. On a programme of work that takes a lot of time, this means a gradual release of features. In theory we can regularly get stuff out. Over time we will get there.
So where do the assessments and the standard fit into this?
Straight up, I am a big fan of the standard. At the least you can see it as a great tool for change. If it were voluntary, do you think there’d be the same change happening? I doubt it. But think of it as more than a tool. It’s a way of working, a mindset. And that’s where we come in.
Over the past few months we’ve been having some interesting conversations around the tax credits service I am working on: what if we get the “core” out there, get that “core” through a beta assessment so we can get some people using a web-based service, and take a gradual “doing but not yet done” approach? The question we have been asking since the turn of the year: how do we make sure a continually evolving digital service stays on track with the 18 points of the digital service standard? That is one of our needs.
I also don’t see the digital service standard as “hoops to jump through” or a checklist. It’s more a state of mind, some guidance to keep the making of government services on the right track. I’m also not apprehensive about the assessment. The assessment is almost an exam to check you’ve been going about things the right way. If you haven’t? Well, fail, innit.
But that’s me. Others do not think like that. It can be seen as “hoops to jump through” or a checklist. And whichever side people fall on there, there are many who see the assessment as an exam. And at the end your service passes or fails off the back of that.
And that is a great thing. It ensures users are getting services that are usable, services made for them. Users may not know a service has been through this process, but government has made sure it’s alright. This is continual government improvement. One day – maybe, hopefully – we’ll reach a moment where it will be normal to use a digital service from government and not feel it is rubbish, frustrating, and so on. You will go on, do your thing, job done.
So, on the idea of continual government improvement, over the past months we’ve been having conversations, within the team and more widely across government, about how we can make the journey of designing and making something more coursework than exam. How can we make the way we make things more joined up, of one government, of departments collaborating, of the “adjudicator” becoming more of a mentor?
The Australian DTO recently published a blog post on their move towards “in flight assessments”. This was timely given our needs, and is something we’re keeping an eye on.
And something we’d love to try on tax credits.