The process of building new digital services involves being assessed by Government Digital Service (GDS) to make sure you are meeting the Digital by Default Service Standard.
What was the assessment like?
Within our team we did not want the assessment to be seen as something to be feared! We saw it as an opportunity to show off what we had built to date and have our peers within GDS critique our progress (a peer review).
We took the process seriously and prepared as well as we could by reviewing our progress weekly against the 18 Service Standards we would be assessed against. We also collected evidence to share with the assessors and sat two internal mock assessments.
The two mock assessments were run by Rob Viant and Jan Bytheway (thank you both), and they grilled us for hours on all the possible questions GDS could ask us – imagine your worst ever job interview, lasting a minimum of four hours!
With our well thought out preparation and posters ready to stick up on the walls in GDS, a small number of the team – the Service Manager, Product Manager, Scrum Master, User Researcher and Technical Architect – travelled to Aviation House in Holborn for the assessment.
The assessment ran for three hours and gave the team time to outline the service, explain the needs it was meeting and demonstrate the live service. The remaining time was taken up with answering questions relating to the 18 Service Standards.
Our initial impression was that the assessment was informal, open and honest, and not nearly as hard as Rob Viant’s mock assessment (Rob, you are a harsh assessor!). We felt confident we would pass, although the fact that users were not yet able to create an account within the new service did worry us.
How did we get on?
The result of the assessment was sent to us three days later: we had passed 17 of the 18 Standards. As we had suspected, we had not passed point 12 because the new “service currently requires users to login using the legacy service”.
Understandably we were disappointed, but after some reflection we began to see the many positives in the result:
- we had passed 17 out of 18 points (still a huge achievement for the team!);
- we had been reviewed by our peers and had answered every question and query confidently;
- the assessment panel said how impressed they were with the work the service team had done to understand user needs through thorough and varied research and their plans for continued research to iterate the service;
- the panel were particularly impressed with how the service team had challenged existing practice to improve the clarity of the information they provide to users;
- the panel thought the team spoke knowledgeably and confidently about our technology decisions.
The initial result of the assessment led us to believe we would not (yet!) be able to move the service onto GOV.UK and increase the number of users using it. However, the result contained a nice surprise:
“the service has been given permission to launch as a private beta. The Policy and Engagement team is keen to explore options for linking to the (private beta) service on GOV.UK as the service assessment has provided a strong user case for doing so”
This was great news, as it meant we could still release the service on GOV.UK.
We are now in discussions with GDS about what this means for the service and when we can release it – we will blog soon with more information.