A British businessman who sold the Iraqi security forces more than 6,000 fake bomb detectors, based on $20 golf ball finders bought from the United States, was convicted of fraud on Tuesday in Britain’s central criminal court. The businessman, James McCormick, 56, was described in court as having made more than $75 million selling the fake devices in Iraq and Georgia, among other places, claiming they could detect bombs, drugs, currency and ivory, and track objects up to 3,280 feet below ground.
The devices were neither faulty nor defective; they were completely useless, containing no working electronics that could detect bombs or anything else. Each device had only one moving part: an antenna-like piece of metal that swiveled freely, supposedly detecting explosives and other materials. McCormick sold the devices for up to $40,000 each. At least 800 of the detectors were purchased by the Iraqi government and used at checkpoints throughout the country, as well as in Mexico, Syria, Lebanon and Niger. McCormick faces up to eight years in prison.
According to the Times, “Iraqi officials reacted with fury to the news, noting a series of horrific bombings in the past six months despite the widespread use of the bomb detectors at hundreds of checkpoints in the capital. ‘This company not only caused grave and massive losses of funds, but it has caused grave and massive losses of the lives of innocent Iraqi civilians, by the hundreds and thousands, from attacks that we thought we were immune to because we have this device,’ said Ammar Tuma, a member of the Iraqi Parliament’s Security and Defense Committee.”
The Logic and Psychology of Bogus Detection
You might think that if the bomb-detecting devices were completely worthless, that would soon become clear to those people whose lives depended on them.
And you’d be wrong. The detectors were widely used in the field for years before questions were raised about their validity, and McCormick himself claimed to truly believe that his gadget worked.
How is it possible that the bogus detectors went undetected for so long? Part of the answer lies in human psychology, and part in how poorly we intuit random chance.
In security and law enforcement situations where people and devices are trying to detect things such as bombs or drugs, the task is separating the threats from the non-threats (the signal from the noise). Bombings are relatively rare events compared to the number of non-bombings: the vast majority of people, vehicles, and items going through a security checkpoint will be innocent and harmless. The number of real threats will be a small fraction of the potential threats, perhaps 1 percent (and likely far less).
If one of the bogus bomb detectors falsely detects a bomb or threat, then the suspicious package or person is searched and (assuming nothing is found) allowed to pass — an event that routinely happens at airport security checkpoints around the world every day. When that happens (with any device, working or not), the assumption is not that the device doesn’t work, but instead that it gave a false alarm — as real, working devices often do. Thus the bogus detector’s false positives are ignored (as long as they do not happen continuously, of course). And because bombings are relatively rare events, most of the time a non-working bomb detector will appear to be working correctly, since it didn’t detect bombs that were not there. As they say, even a broken clock is right twice a day.
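The base-rate effect described above can be made concrete with a short simulation (a minimal sketch; the 1 percent threat rate and 5 percent random-alarm rate are illustrative assumptions, not figures from the case). A detector whose alarms are pure chance still "agrees" with reality the vast majority of the time, simply because most screenings involve no threat at all:

```python
import random

random.seed(42)

THREAT_RATE = 0.01   # assumption: ~1% of screenings involve a real threat
ALARM_RATE = 0.05    # assumption: the useless device alarms ~5% of the time, at random
N = 100_000          # simulated checkpoint screenings

def useless_detector():
    # The "detector" ignores reality entirely: it alarms at random.
    return random.random() < ALARM_RATE

agreements = 0  # screenings where the verdict happens to match reality
for _ in range(N):
    threat = random.random() < THREAT_RATE
    alarm = useless_detector()
    if alarm == threat:
        agreements += 1

print(f"Apparent accuracy of a useless detector: {agreements / N:.1%}")
```

With these numbers the device "agrees" with reality roughly 94 percent of the time (0.99 × 0.95 + 0.01 × 0.05 ≈ 0.94), despite detecting nothing — which is why day-to-day use at a checkpoint reveals almost nothing about whether a detector actually works.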
What about when the device “failed”? Even after a real bomb was successfully smuggled through the checkpoint (and not detected by the non-working bomb detector) and detonated, the fact that the detector failed to find the bomb can simply be rationalized away by security officials: maybe the bomb was expertly hidden in some material that prevented the detector from working that time, or the person using the detector didn’t operate it correctly, or maybe the device simply malfunctioned, as all electronics do now and then. After all, everyone knows that no system, person, or device is 100 percent accurate or perfect all the time: Operators make errors, devices are not correctly calibrated and give off false alarms, drug-sniffing dogs make mistakes, and so on. Because there were alternative, plausible explanations for the device’s clear failures, few suspected that they did not (and could not) work at all.
There are other powerful reasons to give the bogus devices the benefit of the doubt, including the investment of money and trust in the devices. The gadgets look scientific and were claimed to have been proven effective by the manufacturer; soldiers and security personnel using them would have neither the scientific expertise nor the authority to question them. Surely no one could imagine that such an important piece of security equipment used by high-ranking officials would be completely fraudulent.
For these reasons there is no way to know exactly how many innocent lives McCormick’s bogus bomb detectors cost. Cases like this are cautionary tales about not only the importance of using good science (especially in life-and-death matters), but also how easily we can fool ourselves into thinking something works when it doesn’t.
Photo: Metal detectors are frequently used to scan for bombs in Iraq. Credit: ALI ABU SHISH/Reuters/Corbis