I’m going to share my experience of being an assessor at a service assessment for the Skills Funding Agency (SFA). I see the assessor role as a great opportunity to gain an insight into how other organisations are tackling the digital agenda. It’s also a way to make sure our own new digital services are developed to meet the criteria, so they can pass the service assessment and appear on GOV.UK.
SFA’s Provider Digital Service
The SFA are (or were at the time of assessment) in an alpha phase. They’re building a service for colleges and training organisations to manage their contracts with the agency. The aim is for a single place where these organisations can manage their online funding relationship with the SFA.
The assessment was led by Andy Graham (Land Registry), supported by Jo Jones (Companies House) and Paul Turner from the Department for Business, Innovation and Skills (BIS). I was responsible for asking about the ‘technical’ service standard points, 6 to 10. It was also the first time the other assessors had performed a review using the more concise 18-point guide rather than the 26-point one.
Assessments are performed to ensure services are developed in line with the Digital by Default Service Standard. Assessments by the Government Digital Service (GDS) are normally reserved for, but not limited to, services that have (or are likely to have) more than 100,000 transactions per year, like ours. For services below that threshold, assessments are arranged within the service’s responsible department. For the BIS family, these are performed by the BIS Digital and Technology Transformation Team (DigiTT), which was the case with the SFA assessment.
What did I find out from the assessment?
It was encouraging to see the amount of user research that had occurred. Personas had been developed, existing training providers had been engaged with and actual users of the existing system had been questioned. Understanding user needs is a key part of the assessment and getting this right stands you in good stead!
Moving on to the line of questioning I led, it was interesting to hear that some of the challenges faced by the SFA are not dissimilar to those faced within the Land Registry. When building new services, you are rarely building completely ‘greenfield’, i.e. with no existing systems that you have a dependency on. Government as a Platform was another theme we discussed. The SFA considered using the GOV.UK Verify platform, but it’s not appropriate for them at present as there is no citizen identity required; the relationship is between the SFA and training providers.
We also discussed the options for transitioning from an existing service to a new one. The SFA have a guaranteed audience who will use their service regardless. They could migrate using a big bang approach by decommissioning the current service, or run both and see which service the users prefer. I like the latter. A conscious choice by customers to use the new service would be bona fide evidence it had improved the experience and met their needs. This fits in with the headline from the Government Service Design Manual to ‘build services so good people prefer to use them’.
Outcomes from assessments are posted on the GDS data blog once the service manager has been informed. There are two possible outcomes: pass and not pass. The alpha assessment for the SFA was a pass.
What lessons can we bring to the Land Registry?
As I mentioned earlier, seeing another service helps Land Registry learn from others’ experiences. The important themes I took away from the assessment are:
- It confirmed the value of performing thorough user research from an early stage
- It reiterated that it’s okay not to have solved all of the problems in alpha; it’s about prototyping so users can understand the service and give feedback
I’m hoping I get invited to more assessments as I really enjoyed getting on the road and seeing the digital work occurring across government; it gives me confidence we’re not alone and helps me assess our own services against the standard.