Technical Report on the ORUK Validator

Hi all,

As I’ve mentioned, I’ve been investigating the ORUK Validator to see how much work it would take to adapt it to general use with “vanilla” HSDS or other Profiles.

To that end, I’ve produced a short Technical Report outlining my findings and recommendations.

The report covers my understanding of the shape of the validator and its capabilities, as well as my recommendation.

In summary:

  • You can deploy/run the back-end validation service independently of the dashboard and use it to validate an API feed. The feed needs to be open; otherwise you have to do some network-fu to host the validator somewhere it can reach the URLs it needs to validate without any authentication.
  • Schemas need to be manually loaded in, so you can load copies of the HSDS Schemas in place of the current ORUK schemas to “trick” it into validating vanilla HSDS feeds, or your own profile.
  • It looks like it covers 11 out of 22 use cases from the use case spreadsheet, with some caveats.
  • Purely out of pragmatism, I recommend that if we want a general-purpose HSDS validator, we should reimplement the validation logic as a reusable library; the ORUK Validator is designed as a monolithic “application” rather than a reusable tool (not a value judgement!). This also side-steps some of the licensing issues with the ORUK Validator, and it gives the community a choice of tech stack based on its own needs (not that there’s anything explicitly wrong with the tech stack used in the ORUK Validator). There’s a rough sketch of what such a library might look like just after this list.
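
To make the “reusable library” recommendation a little more concrete, here’s a minimal sketch. Python is chosen purely for illustration, and the function and class names are my own invention; none of this is taken from the ORUK Validator or any existing HSDS tooling. The idea is simply: hand it an object and a schema, get structured results back, and leave hosting, dashboards, and data-fetching entirely to the caller.

```python
# Illustrative sketch only: names are hypothetical and not taken from the
# ORUK Validator. Requires the `jsonschema` package.
import json
from dataclasses import dataclass

from jsonschema import Draft7Validator


@dataclass
class ValidationIssue:
    path: str     # JSON path to the offending field, e.g. "services/0/name"
    message: str  # human-readable description of the failure


def validate_object(instance: dict, schema: dict) -> list[ValidationIssue]:
    """Validate a single HSDS object against a schema, returning all issues.

    The caller decides where the instance came from (open API, closed feed,
    local file) and which schema to use (vanilla HSDS, ORUK, or another
    profile); the library itself stays agnostic.
    """
    validator = Draft7Validator(schema)
    return [
        ValidationIssue(
            path="/".join(str(part) for part in error.absolute_path),
            message=error.message,
        )
        for error in validator.iter_errors(instance)
    ]


if __name__ == "__main__":
    # Hypothetical file names, standing in for "load whichever schema you like".
    with open("service.json") as f, open("hsds_service_schema.json") as g:
        for issue in validate_object(json.load(f), json.load(g)):
            print(f"{issue.path}: {issue.message}")
```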

I aim to present some headlines from this at the next Technical Committee meeting, alongside the list of use cases not explicitly covered by the tool, so that the committee can work to prioritise the remaining ones for any future validation tool development.

Cheers,
Matt

Wow. This is a great comprehensive report. I look forward to discussing it at the next Standing Technical Meetup, to which the current ORUK developer has been invited as an observer. He is new to ORUK work so he is still learning and will appreciate some of the comments about needing better documentation.

I doubt much thinking went into the choice of licence and expect the ORUK people would be very receptive to a suggestion to switch to a different licence.

My reading of the report is that we should modularise (decouple) aspects of functionality so they can be used independently. This certainly applies to use cases that are deliberately not addressed by the current UK work. We also need to soft-code choice of version and profile.

I personally (and this is not an official view from the UK team working for MHCLG) would like us to migrate to one code set with responsibility for different parts allotted to different people/teams - specifically to: OR International (currently supported by ODSC) and ORUK (currently supported by TPXimpact).

There are things (such as closed feeds) which the UK team is not keen to do, but others (such as logging a history of validation results, or gathering metrics on the last_assessed field) which it might be keen on. It would be great if we could combine experience and expertise for one validator that supports all versions and profiles of HSDA from now on.

Thanks Mike! I really hope that it came across as a relatively neutral take, as I know a lot of work went into the validator and it is well implemented and very suited for the UK context for which it was designed :slight_smile:

He is new to ORUK work so he is still learning and will appreciate some of the comments about needing better documentation.

Definitely happy to pass on my reckons about documentation, although I’ll try to make it clear that I think the urgency is lower than may have come across in the report. Jeff was kind enough to respond to my questions very quickly, so I’m keen to make sure he doesn’t panic and think there’s a desperate need for thorough docs just yet.

I doubt much thinking went into the choice of licence and expect the ORUK people would be very receptive to a suggestion to switch to a different licence.

Good to hear! As part of writing this report, I ended up opening a GitHub issue to this effect (rather than being the guy who moans about things and doesn’t offer solutions…).

My reading of the report is that we should modularise (decouple) aspects of functionality so they can be used independently. This certainly applies to use cases that are deliberately not addressed by the current UK work. We also need to soft-code choice of version and profile.

I agree. This would go a very long way towards making this a core part of the community’s tooling ecosystem.

Soft-coding the choice of version and Profile should certainly be a goal, but there remain open questions around how a validator should find/fetch schemas for versions and profiles. I think this is something for wider community infrastructure, rather than something the ORUK Validator has to tackle independently.
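
To illustrate what I mean by soft-coding, here’s a tiny sketch. The registry and URLs are entirely made up (there is no such Open Referral schema registry today), but it shows the shape of the lookup that I think belongs in shared community infrastructure rather than inside any single validator:

```python
# Purely illustrative: the registry below and its URLs are invented for this
# sketch and do not point at any real Open Referral infrastructure.
import requests

# Hypothetical mapping of (version, profile) to a schema location.
SCHEMA_REGISTRY = {
    ("3.0", "hsds"): "https://example.org/schemas/hsds/3.0/service.json",
    ("3.0", "oruk"): "https://example.org/schemas/oruk/3.0/service.json",
}


def fetch_schema(version: str, profile: str) -> dict:
    """Resolve and fetch the schema for a given HSDS version and profile."""
    url = SCHEMA_REGISTRY[(version, profile)]
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()
```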

I personally (and this is not an official view from the UK team working for MHCLG) would like us to migrate to one code set with responsibility for different parts allotted to different people/teams - specifically to: OR International (currently supported by ODSC) and ORUK (currently supported by TPXimpact).

I think I understand and agree with this. Just to check, are you saying that e.g. we’d all be working within the same GitHub organisation to create things which are all labelled/“owned” by “Open Referral”, with responsibility for different things then devolved to teams such as OR/ODSC and ORUK/TPXimpact? Or have I misunderstood?

There are things (such as closed feeds) which the UK team is not keen to do, but others (such as logging a history of validation results, or gathering metrics on the last_assessed field) which it might be keen on.

I think the beauty of having reusable and flexible components/modules is that the UK context can build applications which just ignore the notion of closed feeds, whereas other contexts can use the same validation components to validate local data or data behind closed APIs.
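
As a rough illustration (again hypothetical, and reusing the validate_object sketch from my earlier post), the data-fetching side can live entirely outside the validation component, so each context wires up only the loaders it cares about:

```python
# Illustrative only: shows that fetching data is separate from validating it.
# Any of these loaders could feed the hypothetical validate_object() above.
import json

import requests


def load_local_file(path: str) -> dict:
    with open(path) as f:
        return json.load(f)


def load_open_feed(url: str) -> dict:
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()


def load_closed_feed(url: str, token: str) -> dict:
    # A UK-focused application could simply never wire this loader in;
    # other contexts can use it for feeds behind authentication.
    response = requests.get(
        url, headers={"Authorization": f"Bearer {token}"}, timeout=30
    )
    response.raise_for_status()
    return response.json()
```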

It would be great if we could combine experience and expertise for one validator that supports all versions and profiles of HSDA from now on.

I’m keen to avoid us thinking of a validator as a single product; rather I want us to drill down into what “validation work” entails and build tools to address those needs. These can be recombined in multiple ways to support different communities. But yes, I envision a future where one of those combinations is a hosted validator website which works across all versions and profiles of HSDS/HSDA.

Well, it’s partly about who funds the work and partly about where the code goes. I could see UK work starting in a separate repository and then being pushed to the main OR repository for acceptance there. I’m not the best person to say how the mechanics should work.