Robert Matthews, a regular writer of the QED column in the Sunday Telegraph, looks at the “wonders of technology” that David Blunkett, the Home Secretary, decided it was time we should all benefit from.
Mr Blunkett appears to have fallen under the spell of biometric methods, which use characteristics ranging from fingerprints to handwriting to verify the identity of people. He seems to favour a particularly sophisticated version of the technology, which uses the unique iris patterns of the eye to check ID.
…there is still a stunning lack of awareness of a basic mathematical result that shows why we should all be very wary of any type of screening, biometric or otherwise.
In the case of screening – whether for breast cancer or membership of al-Qaeda – the [Bayes’s] theorem shows that the technology does not do what everyone from doctors to Home Secretaries seems to think it does.
To take a concrete example, suppose a biometric screening method is 99.9 per cent accurate: that is, it spots 99.9 per cent of imposters, and incorrectly accuses one in 1,000 bona fide people (in reality, these are very optimistic figures). Now suppose that every year a horde of 1,000 terrorists passes through Heathrow airport. When the alarm sounds at passport control, what are the chances that the system has actually caught one of them?
The obvious answer is 99.9 per cent. But, in fact, Bayes’s Theorem shows that the correct answer is about two per cent. That is, when the alarms go off and the armed response team turns up at passport control, it is 98 per cent likely to be a false alarm.
Why? Because not even the amazing accuracy of the biometric test can cope with the very low prior probability that any one of the 60 million passengers using Heathrow each year is a terrorist. Sure, it boosts the weight of evidence in favour of guilt 1,000-fold, but that is still not enough to overcome the initially very low probability of guilt.
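The calculation above can be sketched in a few lines of Python. This is a minimal illustration using the article's own (optimistic) figures; the function name and variable names are mine, not the article's:

```python
def posterior_probability(prior, sensitivity, false_positive_rate):
    """Bayes's Theorem: P(terrorist | alarm)."""
    true_alarms = sensitivity * prior                  # terrorists correctly flagged
    false_alarms = false_positive_rate * (1 - prior)   # bona fide travellers wrongly flagged
    return true_alarms / (true_alarms + false_alarms)

prior = 1_000 / 60_000_000    # 1,000 terrorists among 60 million Heathrow passengers
sensitivity = 0.999           # spots 99.9 per cent of imposters
false_positive_rate = 0.001   # accuses one in 1,000 bona fide people

p = posterior_probability(prior, sensitivity, false_positive_rate)
print(f"P(terrorist | alarm) = {p:.1%}")
```

The result is roughly two per cent, as the article says: the 1,000-fold likelihood boost (0.999 / 0.001) is swamped by the tiny prior of 1 in 60,000.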
So there you have it. You just need to work out your probability of being one of the incorrectly accused one-in-1,000 bona fide passengers and book your ticket accordingly.