The first story was about Microsoft filing a patent to record and score meetings based on body language. https://www.bbc.com/news/technology-55133141 The patent application contains a drawing depicting a bunch of sensors in a meeting room to capture and analyse factors including facial expressions and body language. Whether this patent results in a commercial development remains to be seen, but there must be some motive behind it.
The other story was about Amazon’s Panorama box: a device that can be fitted to existing cameras, allowing them to draw on off-the-shelf apps to read and interpret the images being processed and to take some form of action through automated workflow rules and business logic. https://www.bbc.com/news/technology-55158319 AWS Panorama is available now and is used for functions like detecting vehicles being driven in places they are not supposed to be, tracking customer queues in shops and the like. Fender currently uses it to track how long it takes an employee to complete each task in the assembly of a guitar. Follow this link if you want to know more about the product https://aws.amazon.com/panorama/.
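To make the "analyse, then act" pattern concrete, here is a deliberately simplified sketch of the kind of business logic such a system might apply to camera detections. This is purely illustrative: the class and function names are invented for this example and bear no relation to the actual AWS Panorama SDK.

```python
# Illustrative sketch only: a toy version of the "detect then act" pattern.
# None of these names (Detection, apply_rules, the zone labels) come from
# the real AWS Panorama SDK; they are hypothetical stand-ins.

from dataclasses import dataclass


@dataclass
class Detection:
    label: str          # what the vision model saw, e.g. "vehicle", "person"
    zone: str           # which camera zone the object appeared in
    confidence: float   # model confidence, 0.0 to 1.0


def apply_rules(detection: Detection) -> str:
    """Business-logic layer: map a single detection to an action."""
    if detection.confidence < 0.8:
        return "ignore"                    # too uncertain to act on
    if detection.label == "vehicle" and detection.zone == "pedestrian_area":
        return "alert_security"            # vehicle where it shouldn't be
    if detection.label == "person" and detection.zone == "checkout_queue":
        return "increment_queue_count"     # feed a queue-length tracker
    return "log_only"                      # record, but take no action


print(apply_rules(Detection("vehicle", "pedestrian_area", 0.95)))
```

The point of the sketch is that the privacy-relevant decisions sit in exactly this kind of rule layer: a handful of thresholds and zone labels determine when footage of a person becomes an automated intervention.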
The big question debated by the team at DPP was the extent to which the regulation of the use of “personal data” is up to the job of effectively controlling the use of these technologies. The TUC recently published a report on technology-based monitoring in the workplace https://www.tuc.org.uk/research-analysis/reports/technology-managing-people-worker-experience which throws up some interesting concerns and remedies, but this kind of technology is not confined to the workplace – if it is in use in shops to monitor queues and in public places to detect traffic violations, it inevitably has the potential to affect ordinary citizens. In addition, AWS Panorama is an augmentation for internet protocol (IP) cameras, which this morning were selling for as little as 99p on eBay, so the cost of the technology is unlikely to be a barrier to widespread adoption.
As this is new technology, or in the case of AWS Panorama existing technology deployed in innovative ways, using it to process personal data is subject to a data protection impact assessment (DPIA). But the Outsourced DPO has mixed experience of DPIAs: some are highly detailed critical analyses, while others are superficial tick-box exercises obviously lacking understanding. One concern raised in the TUC report is that the interests of workers are often overlooked when AI is rolled out at work. This should not happen in an effective DPIA process. The report also cites a recent Ipsos survey conducted for the European Commission which apparently says that a whopping 42% of enterprises currently use at least one AI technology. Wow! If that is an accurate reflection of reality, it represents an awful lot of DPIAs that ought to have been carried out.
The DPIA process is in itself a great tool and, coupled with the requirement to consult the supervisory authority before rolling out processing which remains high risk even after DPIA mitigation, it has the potential to act as a brake, or at least an effective check, on badly conceived implementations. The ICO’s regulatory sandbox is another great tool, allowing enterprises to discuss and explore concepts with the regulator. But the necessary vagueness of the language used in the law could be its weakness. One person’s high risk is another’s medium risk. A DPO with a technical background may conduct a better DPIA than one without.
What we really need is for society and culture to change. The DPIA is only as good as the people undertaking it, and they are only going to work within the social constraints and norms they recognise. We need data protection and privacy to be given the same high regard that health and safety and other disciplines enjoy. In time, DPIAs will be an integral part of decision making and front-loaded into projects and initiatives. They won’t be the preserve of privacy practitioners – everyone will be able to do one at some level. But at the moment it seems there is a rush to try out these new technologies, to push the boundaries and find out what they can do. Sadly, their privacy implications appear to be of less concern.