Is there a right and a wrong way to use your customers' biometric data?

It stands to reason that organisations invest time and money in seeking out the best and most effective technologies for managing customer contact and, in particular, identification and authentication. HMRC, for example - not known for an advanced or innovative approach to customer contact - adopted a voice authentication system, Voice ID, which records a customer's voice for use in place of a more ‘traditional’ password. Following complaints, however, the ICO found that callers had not been given sufficient information about the service and had not been advised that they did not have to sign up to it. There was no clear alternative option, and the ICO decided that HMRC did not have adequate consent from its customers. As a result, the ICO has issued an enforcement notice ordering HMRC to delete any data it continues to hold without explicit consent.

The Information Commissioner said that ‘HMRC appears to have given little or no consideration to the data protection principles when rolling out the Voice ID service’.

She went on to highlight the scale of the data collection – seven million voice records – and the fact that HMRC ‘collected it in circumstances where there was a significant imbalance of power between the organisation and its customers’.

HMRC neither explained to customers how they could decline to participate in the Voice ID system, nor made clear that declining would have no detrimental impact on them. Finally, there was no data protection impact assessment (DPIA) in place, although one should have been carried out given the risks associated with processing biometric data.

 

So, what do you need to know about using biometric data?

The ICO highlighted some key points for any organisation to consider before using new and innovative technologies involving personal data (including biometric data):

  1. Under the GDPR, controllers are required to complete a DPIA where their processing is ‘likely to result in a high risk to the rights and freedoms of natural persons’, such as the large-scale use of biometric data. A DPIA is a process that should also ensure responsible controllers incorporate ‘data protection by design and by default’ principles into their projects. Data protection by design and default is a key concept at the heart of GDPR compliance.

  2. Once you have completed your DPIA, act on the risks it identifies and be able to demonstrate that you have taken its findings into account. Use it to inform your work.

  3. Accountability is one of the GDPR’s data protection principles - it makes you responsible for complying with the GDPR and requires you to be able to demonstrate that compliance by putting appropriate technical and organisational measures in place.

  4. If you are planning to rely on consent as a legal basis, then remember that biometric data is classed as special category data under the GDPR and any consent obtained must be explicit. The benefits of the technology cannot override the need to meet this legal obligation (see the sketch after this list for the kind of evidence an explicit consent record might need to capture).
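To make that last point concrete, here is a minimal, hypothetical sketch in Python of the kind of evidence an explicit consent record for biometric enrolment might capture. The class and field names (BiometricConsentRecord, alternative_offered, record_opt_in and so on) are illustrative assumptions, not part of the ICO guidance or of any particular system; the fields simply mirror the failings identified in the HMRC case - unclear information, no alternative route, and no easy way to decline or withdraw.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical sketch only: field names and structure are assumptions for
# illustration, not an ICO specification or a description of HMRC's system.

@dataclass
class BiometricConsentRecord:
    customer_id: str
    purpose: str                # what the voice print will be used for
    information_shown: str      # the privacy information actually presented
    alternative_offered: bool   # a non-biometric route must be available
    opted_in: bool = False      # explicit, affirmative action - never a default
    consented_at: Optional[datetime] = None
    withdrawn_at: Optional[datetime] = None

    def record_opt_in(self) -> None:
        """Record an explicit opt-in; refuse if no alternative was offered."""
        if not self.alternative_offered:
            raise ValueError("Consent is not freely given without a clear alternative")
        self.opted_in = True
        self.consented_at = datetime.now(timezone.utc)

    def withdraw(self) -> None:
        """Withdrawing should be as easy as consenting and should trigger deletion."""
        self.opted_in = False
        self.withdrawn_at = datetime.now(timezone.utc)


# Example: a caller who was shown the privacy information, offered a keypad
# alternative, and actively chose to enrol.
record = BiometricConsentRecord(
    customer_id="example-123",
    purpose="Voice authentication on the helpline",
    information_shown="Voice ID privacy notice (version shown to the caller)",
    alternative_offered=True,
)
record.record_opt_in()
```

The point of the sketch is the shape of the evidence rather than the implementation: an accountable controller should be able to show, for each customer, what they were told, what alternative they had, and when they opted in or withdrew.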

SOURCE: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2019/05/blog-using-biometric-data-in-a-fair-transparent-and-accountable-manner/

 

The ICO’s guidance on consent provides further advice. The ICO has also said that it is developing specific guidance on biometric data, and we will share that on our blog as soon as it is released.