Amazon used to include a so-called concordance that listed words, phrases, and other information deconstructed out of a novel on the book’s sales page. What were Dan Brown’s favorite words? What were Tom Clancy’s favorite phrases?
When I saw those concordances, my first thought was that they sounded very close to author Italo Calvino’s parody of literary deconstruction in his novel If on a winter’s night a traveler. The gist of the parody was that one could enjoy an entire novel simply by reading lists of words and phrases, along with other tips uncovered through computer analysis.
As we have seen, computers have been used to analyze texts to determine whether they fall within an author’s style or were written by somebody else. I can see the value in that far beyond the anti-plagiarism software used by some universities. That is, what’s the likelihood that a newly discovered book was written by a great master?
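The kind of stylometric comparison I mean can be sketched in a few lines. This is only a toy illustration, not any particular researcher’s method: it compares the relative frequencies of common function words (a standard stylometric signal, since authors use such words unconsciously) between a known sample and a disputed one. The sample texts and the word list here are invented for the example.

```python
from collections import Counter
import math

# A handful of common function words; real stylometric studies use
# hundreds. Authors use these unconsciously, so their relative
# frequencies are hard to fake.
FUNCTION_WORDS = ["the", "of", "and", "to", "a", "in", "that", "it", "was", "but"]

def profile(text):
    """Relative frequency of each function word in the text."""
    words = text.lower().split()
    counts = Counter(words)
    total = len(words) or 1
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine_similarity(p, q):
    """Cosine similarity between two frequency profiles (1.0 = identical)."""
    dot = sum(a * b for a, b in zip(p, q))
    norm = math.sqrt(sum(a * a for a in p)) * math.sqrt(sum(b * b for b in q))
    return dot / norm if norm else 0.0

# Hypothetical samples: a text of known authorship and a disputed one.
known = "the rain fell and the wind rose but it was quiet in the house"
disputed = "it was dark and the river ran to the sea in the night"
print(cosine_similarity(profile(known), profile(disputed)))
```

A score near 1.0 suggests similar function-word habits; real attribution work would use far larger samples and more careful statistics.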
I read Calvino’s book long before Amazon was a gleam in anyone’s eye. So, when I first saw those Amazon concordances, I immediately thought of the parody in the novel. We’re almost there, I thought. We can almost read the concordance and get the same amount of enjoyment out of the book we would have found had we bothered to spend many hours reading it. Maybe this is why Amazon removed the feature: it reduced sales.
This all came to mind this morning when I read “From ‘alibi’ to ‘mauve’: what famous writers’ most used words say about them” in The Guardian. We learn here that Bradbury’s favorite word was “cinnamon,” that Rowling likes the phrase “dead of night,” that Dan Brown uses “full circle,” and that Nabokov used the word “mauve” forty-four times.
Computers will tell us amazing things. I don’t really want to know them unless I’m writing satire. (I once proposed using the Amazon concordance to The Da Vinci Code to write bestseller novels with the right stuff in them to get big reviews, loads of money, and movie deals.) I will confess that when I find myself using a word or a pet phrase too many times in a story, I do a search for the suspected word or phrase to see how often it appears. If I don’t like what I see, I get rid of it.
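That pet-phrase check is nothing fancy; any word processor’s search does it, and a few lines of Python do the same. A sketch, with a made-up scrap of draft text:

```python
import re

def phrase_count(text, phrase):
    """Count case-insensitive, whole-word occurrences of a phrase."""
    pattern = r"\b" + re.escape(phrase) + r"\b"
    return len(re.findall(pattern, text, flags=re.IGNORECASE))

# Invented sample draft for illustration.
draft = ("The truth was, he knew the truth. In the dead of night, "
         "in the Dead of Night, the truth came full circle.")

print(phrase_count(draft, "the truth"))      # → 3
print(phrase_count(draft, "dead of night"))  # → 2
```

If the count is higher than you’d like, you know where to start cutting.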
I don’t think I want to know how often Nabokov used the word “mauve,” much less what a computer or an expert in literary analysis thinks that fact means. I don’t even care if James Patterson uses 160 clichés per 100,000 words, nor do I consider it a plot spoiler to hear that Donna Tartt uses “too good to be true” more often than somebody in an ivory tower deems appropriate.
When computers and their deconstructionist slaves finish with a novel, the story, I think, gets lost in the shuffle, rather like learning that you love your spouse because of sequences of binary reactions in your brain rather than because they listen to what you say, care about you, and support you even at your worst.
The Amazon concordance did have one amusing feature: it told us the number of words each book gave us per dollar and per ounce. The value of that can’t possibly be underestimated.
Too much information, and to what end?