Doteveryone has issued a call for suggestions following Brexit under the banner “never let a crisis go to waste”.
That applies to data usage and transparency too – the changes ahead will be either good or bad for the transparency of data flows, but they will not be neutral.
Do you know what happened to your medical records? What would happen if you could find out?
The BBC tech journalist Rory Cellan-Jones was surprised last week – and had a very human reaction.
As startups in Shoreditch seek to do more with data, will there be a higher standard than the GDPR for bulk personal datasets (large amounts of data about citizens)? It should not be enough to use data simply because you can. It should be necessary to be seen to keep your promises, not merely to say that you will keep them.
British institutions are ideally placed to move this forward. For example, the NHS is an integrated health system, capable of talking to a patient over time, with information, resources and institutions bigger than any one company or short-term interest. Verify has the potential to do the same for the rest of the public sector.
In short: Show, don’t tell.
The NHS is in a position to do this with health data – given the political will, it could show patients how 70+% of their data was copied, starting tomorrow, with a reporting route to patients via existing digital means.
The private sector is nowhere – its data models are based on selling data, with commercial confidentiality used as an excuse for deep creepery. Yet this is a great opportunity for companies willing to abandon the race to the bottom on invasions of privacy, and to profit by respecting people rather than selling them out. Will companies using data about you give a complete accounting of what they used it for, and why?
The DeepMind messes, including earlier ones, remind us that data may be used legally and yet not be publicly acceptable or in the interests of data subjects – which is also not the same as not being a jarring surprise. Ignoring public acceptability and public views is what caused the concern in the first place.
DeepMind are running as fast as they possibly can towards the NHS because, by the time the public understand the full potential of AI, DeepMind hope to have NHS examples to reassure the public and make it too hard politically to shut them down. The blockchain businesses are running towards the banks for similar reasons around cryptography.
While one data agenda undermines public confidence in institutions, others need those institutions to survive long enough to be in a position to undermine them.
There can and should be greater accountability from reputable organisations – the best indicator of how an organisation will use data about you next month is how it used that data last month, and how fully it informs you when asking for consent.
There is an alternative approach – sustaining public confidence – but it will require care and design. We no longer have to rely solely on a one-off signal – a tick box on a single-use form – a one-way channel of communication designed for a world of paper.
This is also a problem of institutions. Private entities will claim transparency, and there will be no mechanism for public accountability. The criticism of Google DeepMind’s recent Royal Free project was disappointment in the NHS, not surprise at Google being creepy and making statements seen to be unsubstantiated. Companies keep doing what is in their own commercial interests. There has to be a systematic institutional response to the scope for predatory behaviour – a response that commercial actors will try to undermine, because the dodgiest companies cannot continue business as normal in a data-transparent world.
For such actors, it is a business imperative to manufacture public crises that bring institutions capable of transparency down to the secrecy of the creepiest unregulated business.