Review: James Halperin, The Truth Machine ("speculative novel")


mendacity and faint praise


"The Falsehood that exalts we cherish more
Than meaner truths that are a thousand strong."
--Aleksander Sergeevich Pushkin
in William Barton & Michael Capobianco, Iris

Although I was tempted at the time to respond to some of the sound and fury that erupted when James Halperin's The Truth Machine was first announced on the Usenet newsgroup rec.arts.sf.written ("r.a.sf.w"), I refrained from doing so until I was able to pick up the book at the library and read it myself. (I was influenced enough by the negative opinions not to want to search it out on my own dime.)

I was somewhat surprised to find that it was neither as bad, nor as good, as I'd expected.

The Truth Machine is credible as a first novel - the prose is tendentious and a little heavy, and the characterization weak, but not more so than many successful thrillers with less plausible plots. It's not great work, by any stretch of the imagination - certainly not the kind of thing that would ordinarily have sparked fervor either pro or con among the worlds-weary residents of r.a.sf.w (That - I say, that's a joke, son... a Foghorn Leghorn reference, for those who have read the book) - but I found it readable. My standards are loose, though, and I read voraciously. More discriminating readers may want to take warning from Robert G. Buice's review in the online Commonplace Review.

* * *

So if The Truth Machine is no great shakes as literature, why all the hooraw?

Well, the book started out as a self-published and self-promoted venture, before being picked up by Random House; as I recall, some of the early posts on r.a.sf.w were so enthusiastic that they seemed a lot like ads, which provoked some controversy.

Plus, the book wears its political sentiments boldly on its metaphorical sleeves, in the "news headlines" that begin each chapter. Some people don't like that. I had a hard time getting a specific reading on what Halperin wanted to happen politically, though, perhaps because Halperin was trying to make his utopia seem "realistic" by including elements with which he personally doesn't agree. He's definitely not a liberal, whatever that word means anymore, but his utopia seems to include (for instance) whopping doses of corporation-worshipping libertarianism, many rather simplistic reactionary solutions, and a belief in the continued necessity for/existence of a strong federal government which seems rather at odds with both.

But I think one of the biggest problems people had with The Truth Machine was the more-than-sneaking feeling that the author intended this particular bit of throwaway fiction to be prescriptive - that the development of a Truth Machine is not only an achievable but also a desirable goal in real life. The jacket copy for the Random House edition fosters this view, descending into hyperbole by calling the book "groundbreaking," "prophetic," and "nothing less than a history of the future." The author himself may well believe his own hype, although he does make some small attempt to be circumspect, saying only "the technological and political predictions dated after 1995, although based on extensive research, are fictional." So I'll spend most of my time attempting to debunk this idea, straw man though it may be.

A Truth Machine, as described in the novel, is a combination of scanning hardware and sophisticated software that is capable of
1) deciding with 100% accuracy whether or not a particular human's veracity can be judged using the device, and
2) in those humans it decides it can read accurately, detecting deceit, again with 100% accuracy.

Not "near 100% accuracy" - it's stated explicitly in the book that even 98% is not good enough to achieve the goals set for the device. 100%.

Right there I have a problem; I don't think that any system can tell with 100% accuracy when it can be used accurately. I would be tempted to say that it's mathematically impossible for any nontrivial system to be entirely immune from false positives - from thinking it's okay when it's not. Some cybernetic equivalent to Gödel's Incompleteness Theorem, perhaps.
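
One way to make that Gödel-flavored intuition concrete (this is my own toy construction, not anything Halperin offers) is the classic "liar" move: if a fixed procedure claimed to decide, for every subject, whether its own reading of that subject is reliable, a subject could simply consult that verdict and act against it.

    # A toy diagonal/"liar" sketch of the self-judging problem. The names
    # and the scenario are hypothetical; the point is only that any fixed,
    # total verdict() can be contradicted by a subject that consults it.

    def verdict(subject):
        """A stand-in for "can I read this subject with perfect accuracy?"
        Any concrete implementation has to commit to some definite answer."""
        return True

    def contrary_subject():
        """A subject that looks up the verdict about itself, then behaves
        so as to falsify it."""
        return not verdict(contrary_subject)

    # Whatever verdict() answers about contrary_subject, the subject's
    # actual behavior is the opposite.
    print(verdict(contrary_subject), contrary_subject())   # -> True False

This isn't a proof about brains, of course; it's just the shape of the worry. A device that certifies its own applicability invites exactly this kind of self-reference.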

Detecting deceit would be trivial in comparison, yet I have a hard time believing in that one either, given the wide variability in how brains seem to do their business. I suspect the footprint of the nebulous concept "deceit" in one brain has no guaranteed common factors with that footprint in another. An empirically-trained neural net could catch most cases of deceit, that I do not doubt, and in fact great strides have apparently been made in that area recently... but the accuracy Halperin demands of the device is not, in my opinion, physically possible. (I may have to retract this view, though, in light of research I encountered in June 2001 on the "P300" component of EEGs; this particular trace appears to provide a reliable and automatic way of detecting a brain's guilty recognition of something, and hence may lead to a real-world "truth" machine.)
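
For what "catch most cases" might look like in practice, here is a minimal sketch (mine, on entirely synthetic data; the features and numbers are assumptions, not anything from the novel or from the P300 work): a classifier trained on labeled examples of hypothetical scan features whose "truthful" and "deceptive" populations overlap, the way the footprints of deceit presumably overlap across different brains. Held-out accuracy plateaus well short of 100%.

    # Minimal, hypothetical sketch: an empirically trained classifier on
    # synthetic, deliberately overlapping "scan feature" data.
    # Requires numpy and scikit-learn.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5000

    # Two overlapping clusters stand in for truthful vs. deceptive readings.
    truthful = rng.normal(loc=0.0, scale=1.0, size=(n, 8))
    deceptive = rng.normal(loc=0.7, scale=1.2, size=(n, 8))

    X = np.vstack([truthful, deceptive])
    y = np.array([0] * n + [1] * n)        # 1 = deceit

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(f"held-out accuracy: {clf.score(X_test, y_test):.3f}")
    # Lands somewhere around 0.8 on data like this: most cases, never all of them.

And "most" is not 100%, which is the whole problem.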

That is even assuming that the "Truth Machine" would be used only as a force for good (assuming that truth = goodness). There's a reason why "poor and honest" is a cliché; it seems far more likely to me that the machine would be used to keep the poor honest, by the rich and powerful who would never have to submit to its effects. Halperin's attempts to work around this problem seem to me to be unbelievable.

Worse, though, is the fact that the so-called "Truth Machine" in the book doesn't actually measure truth at all. It measures only deception, making no distinction between a true belief and a false belief honestly held. (Halperin actually acknowledges this in the novel once, early on, but the rest of it blithely ignores both this fact and its implications.)

Unlike some, I am not always willing to impute conscious deceit to those who disagree with me. I believe that many, if not most, truly believe the positions they're taking, however untenable they may be to my own more enlightened outlook. They're not deceiving me, and the Truth Machine wouldn't have any problem passing them for whatever test they're taking. Yet this doesn't prevent them from doing, sincerely, [what I think is] the wrong thing.

In a world continually wracked by the convictions of believers, does Halperin really think that the elimination of conscious deceit would fix as much as he says it does?

* * *


Last updated June 11, 2001.

©1997, 2001 Alan P. Scott. All rights reserved.

Contact me:

ascott@pacifier.com