How Humans Differ From Computers
Data has three essential qualities: accuracy, precision, and veracity. Accuracy tells us whether we are close to the target (in the ballpark). Precision tells us how exactly an answer is measured (how many decimal places). Veracity tells us whether the data corresponds to the facts (the real world).
The difference between accuracy and precision is easy to illustrate with target shooting. A five-shot group is precise if it fits under a postage stamp. But such a grouping is not accurate unless it’s in the center of the target. If it’s not, it needs to be corrected for windage.
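The target-shooting distinction can be put in numbers: a group’s accuracy is how far its center lies from the bullseye, while its precision is how tightly the shots cluster around their own center. Here is a minimal sketch, assuming a hypothetical five-shot group recorded as (x, y) offsets in centimeters from the bullseye.

```python
import math

# Hypothetical five-shot group: (x, y) offsets in cm from the bullseye (0, 0).
shots = [(4.1, 3.9), (4.3, 4.0), (4.0, 4.2), (4.2, 3.8), (4.1, 4.1)]

# Center of the group.
cx = sum(x for x, _ in shots) / len(shots)
cy = sum(y for _, y in shots) / len(shots)

# Accuracy: distance from the group's center to the bullseye.
bias = math.hypot(cx, cy)

# Precision: largest distance from any shot to the group's own center.
spread = max(math.hypot(x - cx, y - cy) for x, y in shots)

print(f"bias from bullseye: {bias:.2f} cm")  # large bias -> inaccurate (needs windage)
print(f"group spread: {spread:.2f} cm")      # small spread -> precise (postage-stamp group)
```

This group is precise but not accurate: the shots fit under a postage stamp yet land well off center, exactly the case the passage describes.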
Errors of veracity are the history of warfare. The great US military power stumbled in Vietnam, whose most effective military technology was the bicycle. Despite a long historical record, the US also underestimated people defending their homeland.
Accuracy refines veracity. A parent says the stovetop is too hot to touch. Three-year-olds quickly learn when it is too hot and when it is not (detecting heat at a safe distance). No conceivable computer can duplicate this simple feat, i.e., discovering the accuracy of the facts.
Most living creatures perform comparable feats. They do so with brains that are tiny and slow compared to the so-called brains of computers. Living brains evolved primarily for veracity, and secondarily for accuracy. Precision is a product of civilization.
Humans interact with the world to learn veracity first and then accuracy. The computer cannot know the world (veracity) beyond facts we provide. It cannot know if its answers are close (accuracy) without our evaluation. Only then can it employ its one great skill: precision.
Recall that early computers were just high-powered calculators. Today they do much, much more. Yet most of what they do is still calculation—larger, faster, and far more precise. They have no inherent ability for veracity or accuracy. That’s what we do.