the senselessness of radical intentionality
There is a meme floating about the blogosphere that illustrates the stupidity of Jeff Goldstein, AKA Protein Wisdom (Thersites also joins the fray).
Now the bulk of what Goldstein talks about is, indeed, the antithesis of radical deconstructionism, but what marks him out as a fool is that he states that the Mona Lisa was painted by Michelangelo. Hmmm.
Perhaps what we have here is a case of mixing up the Teenage Mutant Ninja Turtles. I mean, without their color-coded masks, they are basically identical. Now, I’m not sure how you could exactly mix up Michelangelo with Leonardo, since Michelangelo wears orange and wields nunchaku while Leonardo wears blue and favors the katana, but perhaps Mr. Goldstein has a peculiar form of color-blindness. Myself, I tend to confuse Leonardo with Donatello because I am red-green color-blind and sometimes have difficulty telling blue from purple. The only thing that saves me is again the weaponry, since Donatello uses a bo staff.
But, again, the meat of his argument is the supremacy of radical intentionality: the meaning of the text is supposedly exclusively dependent on what the author meant, regardless of how the text can be interpreted.
This is dissonant with the (now) more favored method of deconstruction. In the aftermath of Derrida (whom I have yet to read), this has become the de facto standard for approaching a text. While critics are wont to demonize the strawman of radical deconstructionism, this isn’t what most sane people actually practice. Radical deconstructionism leads you to eisegesis, really. Since the premise is that nothing has any inherent meaning, you can twist every word to mean whatever you want it to mean, and every text can be used to support whatever bizarre agenda you espouse. Now, eisegesis has long been discredited as a form of literary criticism, so it is ridiculous that the critics of deconstructionism are intent on resurrecting it so that they have something they can kick around.
The more sane version of deconstructionism, which is usual practice when writing critical analysis, involves the (then) moderately revolutionary idea that texts do not exist in a vacuum. As is clear to any student of English when comparing Beowulf to The Canterbury Tales to The Waste Land to Neuromancer, the meanings of words change. This is the nature of living languages—meanings evolve, definitions split off, some words get radically modified, other words become obsolete. So from the purely historical linguistic point-of-view, it is crucial to contextualize a text.
Contextualization can be taken further, though. You can choose to get into the nitty-gritty of culture and subculture. This is the tack that post-colonialism basically takes. Through this lens, the sense of the text is only truly accessible when you decode the (frequently) racist and ultra-nationalistic subtext along with the text itself, whether from the point-of-view of the conqueror or the conquered.
In modern subcultures, contextualization is sometimes the only sensible way to derive meaning. For a trite example, how can you tell what the word “cool” means? Or maybe “bad”? Context (meaning historical context and context within the text itself) is not enough to decode this. It depends on who the author is, but it also depends on who the intended audience is. (And, frequently, it also depends on who the unintended audience is.) If I am writing directly about one high school kid talking to another high school kid, it may mean one thing. If, on the other hand, I am a grown adult writing about high school kids talking to one another, it can unintentionally mean something different. If I am writing about a high school kid talking to an adult, it probably will mean something entirely different as well (since you’ve got to start factoring in irony, sarcasm, and plain old misdirection.) This kind of subanalysis is only possible by deconstructing not only the text itself, but the context—historical, situational, cultural.
Now I am probably starting to mix terms together. (I was never good at taxonomy.) What I mean by context (unqualified), or by textual context, is simply the linguistic structure of the text. For example, how is “bear” used in the sentence? Is it a noun or is it a verb? This is basic grammar, and illustrates an oddity of English and similar languages, which are very context-dependent. Most words tend to be extremely ambiguous without other words to back them up. The beauty of this is that, as an author, you don’t have to take the other contexts (historical, situational, cultural) into account—you can make it completely explicit when you need to. This is in stark contrast to something like Mandarin, in which words are not context-dependent, meaning that a single word can be readily decoded without any ambiguity without relying on its relationship to other elements in the text. Unfortunately, the downside is that you end up being heavily dependent on the other contexts in order to obtain unambiguous meaning (and sometimes this is the only way to obtain any meaning), and this is the main reason that people born outside of Chinese culture tend to get Chinese texts all fucked up.
So why is deconstructionism antithetical to intentionality? Intentionality presumes that the author’s intent is the only message worth analyzing, the only version of the text that has any legitimacy. I find this point of view nonsensical when you consider what language is actually used for: communication.
Few people with any background in information technology would gainsay the fact that when considering language (or, perhaps to make the metaphor more exact, when considering communication protocols), intentionality has zero usefulness except perhaps in debugging. When you are considering a computer, you are effectively trying to communicate with something that is completely unable to derive context and decode meaning from the incoming text. There is only one mode of interpretation: literal decoding.
Every programmer knows that if you don’t spell every single damned thing out to the computer, the computer will not do what you intend it to do. By this simple fact, it is obvious that radical intentionalists would make terrible computer programmers.
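To belabor the point with a sketch (Python is my choice of language here, and the intent-versus-text framing is mine, not any intentionalist’s): the machine decodes what is written, never what was meant.

```python
# Intent: the numbers 1 through 10.
# The computer does not decode intent; it decodes the text literally.
meant = list(range(10))        # I "meant" 1..10, but wrote range(10)
print(meant)                   # prints 0 through 9, not what I intended

# The only remedy is to spell every single damned thing out:
explicit = list(range(1, 11))  # literally 1 through 10
print(explicit)
```

No amount of authorial fiat will make the first version mean the second.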
What programmers and network specialists rely on is the fact that each fragment of “text” (if you deign to call a computer program a text) means roughly the same thing to every computer you send it to. There is a meaning to each word that exists independent of its context (although, realistically, each word is useless outside of its context—just see what a computer, or a person for that matter, makes of the keyword for without any indices or counters or commands to run). There is, in essence, a universal glossary, and when you tell any sane computer that understands a sane computer language to “print,” it will print. (Now this is not without pitfalls, because you do have to specify context: do you mean the console display, or the printer, or the network? But we won’t get into technical details here.) But even with computers there are variances. For one thing, there are dialects of computer languages. For another, there are different architectures, and while the higher-level language defines the logic, some very arcane bugs occur at the compiler level, bugs that will never be caught if you don’t accept the fact that there are variances in architecture. And if the language is high-level enough, you stop caring about the actual implementation. The example of this lies in the famed buzzword of the moment, AJAX. You can write your program the way you want it, but the actual implementation is tied to the receiving end. The client does all the interpretation, and good programmers need to work in graceful degradation so that all the multifarious platforms out there can get most of the desired functionality.
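Both halves of that claim can actually be demonstrated. Here is a sketch in Python, my stand-in for “any sane computer language”: print has its universal-glossary entry and works anywhere, while a bare for, stripped of its indices and commands, is not even parseable text.

```python
import ast

# "print" has a universal glossary entry: any sane interpreter will print.
print("hello")

# "for" is useless outside of its context: a lone keyword with no
# indices, counters, or commands to run is not even a valid text.
try:
    ast.parse("for")
except SyntaxError:
    print("a bare 'for' means nothing to the interpreter")
```

The interpreter rejects the lone keyword with a SyntaxError before any question of meaning even arises.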
Considering a computer is perhaps an exercise in reductio ad absurdum, but it illustrates an important point: deconstructionism takes into account the human equivalents of processor architecture and client platforms: context. And not just context as applies to the author, but context as applies to the reader.
The key thing to remember is that language is primarily for communication. Sure, we use language to create art, and we play with language without necessarily intending actual communication, but we must realize that these ends are subversions of the original intent of language. Just because you don’t intend something with the text you are creating doesn’t mean that it doesn’t have a communicative aspect to it, in the same way that writing an actual poem in COBOL doesn’t mean that you can’t feed that poem into a compiler and actually have the computer do something with it (even if that something is to crash and burn horrendously). Intent is easily separable from the text itself. Think of a deconstructionist as a computer that interprets a program. The big difference is that computers have no access to the author’s intention.
And given that the purpose of language is communication, it follows that all texts have both an author and a reader. Given the existence of an author and of a reader, it is clear that we end up dealing with multiple contexts. When I read Spenser’s The Faerie Queene, I can only contextualize it in terms of my 21st century mind. Clearly, Spenser could not have similarly written the text with a 21st century context in mind, and clearly, he did not write it for a 21st century context, since there is no way he could have conceived of such a thing. I think the passage of time in and of itself makes the idea that intentionality is pre-eminent moot.
Now clearly there is a political agenda motivating the appeal to radical intentionality, in the same way that deconstructionism and post-modernism were readily channeled into post-colonialism. Intentionality reasserts fiat by pure authority. I say it’s this way, and that’s the only meaning worth talking about. This is a great tool for despots and tyrants. In contrast, deconstructionism democratizes text. Each reader takes his own particular set of contexts into the text, and is forced to be cognizant of the way his particular contexts interact with the text. In this way, nothing is taken for granted. In an ideal piece of literary criticism, all assumptions need to be laid bare.
As someone else noted in the rapidly proliferating comments on the sites cited above, deconstructionism does not destroy intentionality, it merely demotes it to simply another one of the assumptions that we must inspect.
Interestingly, a similar evolution is occurring in the IT world. We are moving from binary-only, single-platform applications (Windows-only programs) to open-source, platform-independent applications (any open source project, but more spectacularly, the so-called Web 2.0). We have moved from a context-deprived, intention-only computing universe (bearing in mind that the only reason why context was irrelevant is that almost everyone ran Windows, so the application was the only thing that mattered) to a computing universe where the client platform does all the heavy lifting. What this means in practical terms is that the things a programmer can take for granted have diminished.
What the demise of intentionality means is that authors need to examine their texts through the lens of the reader. Interestingly, the very same person can sometimes experience the very same text completely differently after doing this exercise.
With the demise of intentionality, you cannot reasonably excuse saying something that can be construed as offensive by claiming that that’s not what you meant. What this simply reveals is that you did not (or you refused to) take into account the context in which such a comment would be received. It is context that makes the word “nigger” acceptable when black people say it to each other, but which makes it odious when a white person says it to a black person. I am tempted to say that this is common sense, but as has been well illustrated, common sense ain’t exactly common.
I am suddenly reminded of Borges’ story “Pierre Menard, Author of the Quixote,” where he brilliantly examines what radical intentionality would mean. In his story, Pierre Menard is a character who writes Don Quixote exactly as Cervantes had previously written it, yet Borges asserts that Menard’s work is completely different from Cervantes’ work.
And for my final feat of deconstructionism, I can’t help but examine Mr. Goldstein’s nickname “protein wisdom.” Now, radical intentionalist that he is, he doubtless has some private, and likely very inane, reason for this pseudonym. But deconstructionist that I am, I can’t help but observe that this is very homoerotic. (Note that I don’t have the sense of decorum exhibited by my fellow commenters who have only obliquely alluded to this. I personally feel it necessary to make this explicit.) I am at once besieged by the imagery of Mr. Goldstein performing fellatio on some unspecified person or persons and drinking up their semen. And perhaps feeling enlightened after doing so. Hence, protein wisdom. So what I’m talking about here is basically bukkake. (Man, this is going to give me nightmares.)