Friday, January 6, 2017


“Writing is saying to no one and to everyone the things it is not possible to say to someone.”—Rebecca Solnit

Writing should be that, at least. The idea I find in Solnit’s statement is the one that has been the driving force behind all the journal writing I have ever done. It’s saying to no one and, I would say, “anyone” things not meant for any particular someone. Not that I’m writing suppressed secrets or anything like that. For me, at the start, it was a case of needing writing to say anything at all. Most conversations aren’t aimed at much purpose apart from exercising the vocal cords or simply making time with some particular person. Argument is generally an airing of grievances rather than of views. Conversation has its place, but rare are its occasions, in my experience. Chat is much more prevalent and, in my youth, I had a knack for that only in very limited contexts. And I wasn’t particularly skilled at introducing a topic and developing it. That came much later, with teaching. In the days when I first began keeping a journal—at 19—I wrote because no one was listening and, even if they were, I didn’t have much to say, aloud.

It’s that “not possible” that we might spend some time discussing. What makes saying something “possible” or “not possible”? Some might think: censors, internal or external. But censors insist that something is forbidden to be said or maybe, in a sense, unthinkable and thus unsayable. But “not possible to say to someone” is the full phrase. The key idea it seems to me is that there is no “someone” poised to receive these intelligences. It’s “not possible” to think of a single individual. No valued listener or friend. More, perhaps: what one wants to write, needs to write, doesn’t necessarily need to be heard. It must be read, or forget it. This is what I took Solnit to mean because it addresses my own quandary about writing to be read. I have no problem with writing something I’m expected to write—the terms are, as it were, provided by the occasion. But writing what no one asked one to write, writing that isn’t simply—as in a notebook—for one’s eyes only or primarily, such writing demands a reader who is not oneself, and yet who could that person be? All “someones” in one’s life are foreclosed by that phrase “not possible to say to someone.” If there were someone one could address, one would write a personal letter, pick up the phone, send an email or text.

Perhaps tweets function in the way Solnit means. They certainly have a gang’s-all-here quality that means they’re for everyone, whoever, wherever. And they seem to be, rhetorically, a gesture more than anything, something that, if addressed to only one someone, would have personal meaning, but when flung into the internet become bits of observable phenomena, to be made of as who so will. But I don’t have much to say about that. Though, arguably, a blog post, and I’ve written a few of those, is just an overly long tweet. If so, then, yes, let’s just say that in this online format one is speaking to no one and anyone all the time. But is that what one is always doing in writing anyway? Perhaps, but to me the difference between online writing and a journal is that “anyone” factor. In time, should a notebook survive, anyone might come across it and read it, true, but it wasn’t written for that eventuality. A post already presumes an environment in which, potentially, anyone’s eyes might fall upon something, for reasons which remain obscure. So, while I feel that journal writing is for “no one” (except me), blog posts are for “anyone,” deliberately.

Then again, I think of Nietzsche’s subtitle for Thus Spoke Zarathustra: “a book for everyone and no one.” If Solnit is correct, every book could bear that subtitle, as any act of writing could. But Nietzsche meant it in a particular way, as though the contents of the book, while there for everyone to glean, had no immediate audience. No one was quite ready to receive it or read it. And yet it was written for them, for us, all.

That aspect of Nietzsche’s writing appealed to me greatly in my teens. That sense that “no one,” perhaps, had ever quite gotten it, so that “everyone” was missing the point. I have that sensation a lot. Most things I read, however perspicacious they may be, usually suggest to me some aspect of the question that the writer is not addressing, is missing. It was Nietzsche who first exposed me, repeatedly, to how prevalent is the fact that, in making a point, one misses a point. It’s not simply that there are two sides to the point and one is stressing one and ignoring the other, no, it’s more dialectical than that. It’s the fact that, in saying something, one creates a shadowy negative of what one is saying in the reader’s mind. A reader well-informed on the topic will have other facts and points already raised, mentally. But even someone just reading along will see the gaps in the logic and, sometimes fatally, the rhetorical sleights that create a sense of authority where there is only opinion or, worse, received opinion. We all drop the ball in writing and even more so in speaking. In fact, a lot of writing seems to exist for no other purpose than to sound the horn, saying “look out, I’m speaking here.” Some people have so much to say.

In our Trumped-up times, speech, as any kind of measured rhetoric, has taken a big hit. Public discourse may not survive the blow. Already it was weak in the knees. Obama, who speaks with a judicious weighing easy to parody, was a true anomaly in U.S. politics. It’s all banter, bluster and balderdash now, and one tweets to everyone and anyone what may not be reasonable to say to “someone.”

Which, I suppose, might be a way of saying—to anyone!—that one reason to keep writing, much as I hate to say it, is to stop one’s ears to all the worthless verbiage. If I’m writing I can’t be listening, or reading. And there’s only so much of the latter two acts I feel willing to engage in, at this time. Sure, it’s always possible to read writing from some other time, to engage the mind with more knowledge that, while not strictly useful, helps to offset the sense of wallowing in the worst excesses of the American public so far endured. To the extent that we Americans are all some portion of the body politic, we are all now numbered among the Unfortunate Stooges of America, played for patsies by a Clown Prince of Crime, à la The Joker.

During the election, I happened to see episodes of the old TV series Batman, starring Adam West, in which The Penguin runs for mayor and he’s kicking the incumbent’s ass, so they ask Batman to run, and he does, much in the measured tones of our outgoing Prez, which gets him nowhere in the climate of the Penguin’s sideshow razzle-dazzle. The resemblance of The Penguin’s rhetoric to Trump’s empty promises is uncanny, or would be except that the blueprint for how to say nothing and mean it has long been engraved into the national psyche, so much so that Trump on the stump was always the bad Reality TV version of what a scripted bullshit-slinger would sound like, trumpeting the message that the only people stupider than his listeners are the people who have been elected or hired to do the jobs they do. I’ve heard this “everyone’s an idiot but me” line my entire life, and it usually comes from someone who hates the higher-ups but who doesn’t want their tasks. Wants to snipe, not lead. Trumpy, however, sniped his way into a job. It’s a job he doesn’t really want—in the sense of its job description—unless he can do it his way. He was elected president, but he ran for monarch. So I guess we’ll see how that plays out. What else is there to watch?

Meanwhile, it’s a new year. I’d like to say it’s time, for me, for a return to writing, the kind of writing I don’t engage in often enough, to go back to whatever it was that got me interested in doing it and to find out if it’s possible to say what I wanted to say. I never had enough faith in big abstract things like “the American people” or “God” or “my fellow man” to be bitterly disappointed by the crap that comes down. In the Batman episode, ultimately, the people don’t elect The Penguin, showing that there was still an electorate capable of distinguishing between a snake-oil salesman and a person with at least a few commitments to something other than himself and his own will to power. But that was in 1968, which is when everything got broken and pretty much stayed that way. When I was in high school, in the mid-Seventies, I read this passage by Kurt Vonnegut, Jr. It seemed to say it all then.

I remembered The Fourteenth Book of Bokonon, which I had read in its entirety the night before. The Fourteenth Book is entitled, "What Can a Thoughtful Man Hope for Mankind on Earth, Given the Experience of the Past Million Years?"
It doesn't take long to read The Fourteenth Book. It consists of one word and a period.
This is it:

"Nothing."

At the time, that suited my view, as a teen without much connection to my times or my contemporaries. Later, when I was much better educated, I would try to qualify that passage. The “past million years” is too sweeping a generality. I still think so, but I would apply the formula to “the past 50 years,” easily. So it goes.

Monday, July 4, 2016


A Curated Self (7/2/16)
It’s finally the effort to articulate the self that matters to me, but that would be a self in formation, through the experiences that, from the vantage viewed, seem most prevalent. It’s a curated self, then. As such, the experiences form a sort of syllabus or gallery, a selection that suits as objective correlative of something to which they can only attest. The writing then is the attestation, the signifying of what is otherwise mute experience. Finding the voice for this correlation, then, is the task, shaped by all one needs it to mean, to stand for (and against). Because finally what is at stake is the selection of—militia-wise—one’s “colors.” So all the list-making is only an effort to nudge one’s current self—drowsing in its indifference—back to the moments when it mattered, this self-formation, this education by one’s best lights. And in the scope of those lights everything else—what one does and becomes or fails to do or become—shines or, at least, becomes visible, vocable.

The position as “growth of the critic’s mind”—rather than poet’s or artist’s—structures the tendencies. This is not a discussion of how one becomes a writer, but how one becomes a consciousness that discerns values, that argues worth and meaning. Resisting this in the name of the worth of my own imagination—as to be shown in invented characters, situations, or verse forms/voices—has set me on and off. The point of criticism becomes simply the clarity of seeing and saying. The philosophical benefit is simply enhancement of being, which is to say consciousness, for what else is there? The chosen objects express and establish the speaking subject.

Professing and poeticizing (7/3/16)
To qualify an earlier statement, re: “professor.” One is a professor in the sense of professing certain values, in the experience and in its presentation. What one makes more of—than one might in a lecture or in a position paper—is the fact of the representation. The performance bears remarking on as it is by no means certain what its orientation should be. It’s certainly not only determined by the effort to pass on knowledge or to clarify or instruct. The intention behind those activities is more selfless, whatever their requirements and varieties may be. Who stands before an audience to articulate the self? The poet, perhaps we think, and in that sense I suspect my position to be lyrical, as in based on that intangible state of self-communing that promotes the composition of verse, or may, but the intention is again different. The composition is not a rendering at that level—or at least not at the level I pretend to when writing in verse. The critical element necessitates a different status. I suppose one could write a verse essay and achieve something like it, but in such instances I would be more likely struck by artfulness or its lack, though that’s only supposition. I could only know for certain by doing.

In any case, the purpose is not to assert by means of lyrical intensity primarily but by some element more discursive and arrived at via a process of reflection that, for me anyway, is much more mysterious in verse. The emphasis here requires fidelity to some quality other than lyricism, or music, or the value of putting into speech for the sake of speech. The quality has to be a testimonial, a statement or argument worth entertaining about what the given object means or has meant. The rendering is of the “figure”; the means is the “fiction,” and I’m only able to approximate, in advance, what such fictions might be composed of. Certainly, as in lyric, a definite element may be the autobiographical, that sense that the speaker must let his own ideas play upon his pulses—to use Keats’ phrase—and certainly the autobiographical impulse—in me—will mean an allusiveness to whatever has left the deepest impressions in that part of life lived, as one once said without irony, inwardly. The irony is all on the side, here, of making clear what might be best left obscure. Why should it be necessary to maintain in language, outwardly, what is best apprehended inwardly? No other reason than death. Speak now, or forever hold your peace, indeed.

It’s not simply a matter of “strange things I have in head which will to hand,” but it is that, if by “strange” we accept “particular” or as one says in an older idiom, “peculiar” to the mind in which they originate. But it is also a matter of—and this is a key distinction from what I would expect from myself in either lecture or verse—following one’s thought where it leads, regardless of whether or not it prove original, poetic, or instructive. It should at least be illuminating because it illumines what otherwise is dark: the relations between distinct occasions and experiences and readings and rememberings. It is the making of the net to catch the sleek fish of one’s imaginary. Those shapes that signify by their motions and colors and rhythms.

A personal geometry (7/4/16)
“Really, universally, relations end nowhere, and the exquisite problem of the artist is eternally but to draw, by a geometry of his own, the circle within which they shall happily appear to do so.”—Henry James

So, to drop my “net” image and accept James’ figure of a personal geometry, the question is how to plot out “the relations” so as to end somewhere for, as he says, the relations don’t really end. And that—knowing that all too well—I could say has stymied me more than once when I felt “prepared” to embark on such criticism as I have in mind. One knows at the outset that much of the geometry is so personal and individual as to be a matter of indifference or unintelligibility to other minds. The eternal question I came back to again and again in writing about Finnegans Wake is: “What is the principle of selection and what is the principle of combination?” These could not be empirically determined and so analysis must ever fall short of a full account. Thus all readings are to a purpose, and that purpose shall be whatever—given “the scene”—the critic deems viable to an audience.

But in these efforts that seem more in the manner of a confession—the revealing of the “I” or the “self”—one’s audience is, as it were, the ages or the god of the ages. One’s accounting becomes a rhetorical performance, maybe even a ritual performance (as prayer and perhaps poetry are both). But also, if one allows it, playful. For the making of a personal geometry is—as I titled an early effort at a long poem—a matter of “trials and errors.” One is on trial for not trying harder and must acquit oneself as one may, mea culpa.  And one must own one’s errors or never learn from them. And how often does such become the matter of “the lecture”? Still, all that rhetoric that derives from one’s “position” must be at the service of whatever one is able to draw with one’s peculiar geometry: that circle—or as I said constellation—that holds the relations together in some kind of composition, as even an abstract painting is composed.

What excites me about this prospect is the sense that I shall discover many relations as I go and that my “schema”—which I have been at pains to develop for about two decades (since grad school)—will alter as I go on. The schema, based on that tripartite division pulled from Mallarmé by way of Rancière, arrived in the winter of this year, but the elements to be impressed, to be related, have been swimming about gaily my entire lifetime with no schematic, geometrical net to ensnare them. Perhaps they’re better so. Why force these relations as reified things, as objects? We know well enough the incentive to “stand back and let it all be.” But, whether as artist or philosopher or critic or poet, one must impose this practiced geometry and make such shapes as one can. So far, in conception, I’ve been working toward an autobiographical ground, the “growth of the critic’s mind” idea, because it helps me distinguish experience from the historical chronology that comes to me not only from the form most biographies assume, but my way of ordering my books and my LPs. Remember the moment in the film High Fidelity when Rob blows Dick’s mind by telling him he’s arranging his LPs not alphabetically, not chronologically, but autobiographically. The autobiographical arrangement blows my professorial mind as well. The key tension in my “take” has been between the fidelity to history that my education expects of me—the history of art and literature and ideas—and the fidelity to my personal experience. The incentive now is to admit the degree to which one’s own lights and the happenstances of one’s own experiences inflect not only the reception of art and ideas but their very meaning. The peculiar logic—or is it dream?—by which we get “from here to there, eventually.”

One thing, I think, that makes this project arrive now is something I referred to in Towards Criticism, 1, when speaking of The Ambassadors: that sense of renouncement and preemption so clear in Strether’s ultimate position. Since I’ve never been the age I am now before, I’ve never had this particular vantage—but having it now means renouncing some claims, at least privately, for the sake of what one would make of the past. From a very young age, I was provoked by Dickens’ indelible opening to David Copperfield: “If I am to emerge as the hero of my own life or if that position is to be attained by someone else, these pages must show” (or words to that effect). To be “the hero of one’s own life” is at once presumptuous but is also essential, if one finds one’s life worth recounting. But what heroism is there in simply choosing to look and read and reflect? James at least understands that our hero might be a hero because of the pains taken in developing a peculiar geometry. Thus any biography of an artist is always a rather flat affair since we never get an account of those two elements I isolated in my question about FW. We never know why and how the artist did what s/he did. We, at best, get a sense of “the scene” in which what was done was achieved, and get maybe some elements of personality to add to “the figure” that we could not get from “the fiction,” or the work. But the defining choices and the decisions made in the act of creation forever elude such scrutiny. And yet, I suppose, the hero has made those choices, from dictates of consciousness.

Since I’ve never written a—to me—convincing poem that wasn’t primarily concerned with delineating, by some peculiar geometry, the relations of the contents of my state of being at that precise instant (all my poems being very much a here and now affair), I’m reluctant to try to understand the process. Reading myself—what has been written in that manner—lets me have glimpses of how my mind works, but I find that some part of me—the lyrical “Ich,” why not?—is leading some other part of me—the critic, the reader—on a merry chase. “Hide, fox, and all after!” has been a slogan of mine for quite some time on that front. And certainly one does not spend years reading the Wake if one is not easily intrigued by all a fox might hide. Along that line of thought, I think, one easily sees the strength of my distinction between verse and essay, as I would practice each. I want to keep to the decisions made when fixed—one’s attention anyway—on an object at hand. The object shall not be, as in poems, the puzzle of my own state of mind, or emotional state, or what have you. Which is not to say that the prose won’t “blow hot and cold.” It’s still a matter of breath, after all.

Sunday, July 3, 2016


Reading is different as one grows older. Young, I used to read “for all time,” my brain alive to the words as though a photographic plate, letting the lines be burnt upon my imagination so that the words and the images they create would never fade. First readings of certain books remained largely intact. I could leaf through the books in my mind, “see them” again, if not read them again. The capacity that allowed me to recall the actual words used, to see them before me in my mind’s eye, dispersed probably around age 30. Thereafter, it became harder to call to mind all the particulars, but that wasn’t simply due to aging memory, it was also due to educational dispatch: in college, which I attended between ages 26 and 30, one read for short-term memory, for the sake of “the course.” Much of what one studied was given definite temporal parameters. After all, old knowledge will always be replaced by new. Nothing is for all time.

But those earlier readings, in my teens and twenties, were for the sake of my own mind and development, not for the sake of the curriculum. Those readings were between me and the world of letters, and I belonged there by virtue of my curiosity, my hungry search for writing that would matter to me, that would shape my mind and imagination and word-usage, my way of thinking as though in a book. My way of reading my own mind.

In college my mind was enlarged by many more avenues of study, of discipline and major, than I had sought out on my own, and yet that enlargement created a darkening of the photographic plate. There were simply too many facts, dates, names, places, works, events, readings. While I still preferred re-reading those in my pantheon of greats, that pantheon was no longer a world unto itself; it was a collection, a constellation, created by my initial inquiries.

Now, mid-fifties, I can look back on my educated reading as well, so that arranging my books on my bookshelves after what may be the last big move of my adult life, I relive, or reanimate imaginatively, the process by which I filled in the gaps so that literature becomes not simply my circle of heroes, but a continuum in which one tendency replaces another “forever”—or, until it simply becomes, for the moment, contemporary. The mental walk through literary history (my books are arranged chronologically by year) furnishes me with many thoughts and memories, many ideas never realized. Time was, these ruminations would have been considered material for lectures about developments in art and literature, lectures never delivered nor written because, in the manner of the student who prepared but didn’t get “called on,” all my study did not lead to a university position. So part of what I reflect on is what that preparation amounts to, in and of itself. The ability to pass exams, to read for and pass “generals,” to have a body of knowledge that isn’t “useless” so much as “unused.” But knowledge—knowing who wrote what when, or who said what about which work, or the plot of various novels and plays—isn’t the point of such study, it’s a by-product, not an outcome.

The point is argument. And perhaps it’s enough to reflect that the argument against reading as education is inherent in the failure to find the job offer one expected. But that’s not the argument that motivates me, that too is an element of happenstance—of opportunities missed or abilities lacking. The argument furnished by my originary reading relied upon the necessity of reading as a developmental decision; reading as an auto-didact would be improved by subsequent education, certainly, but that doesn’t overshadow the initial impetus: the belief in that early reading was belief in myself within the world of letters, not belief in myself as an academic, as a professor. The fact that I did attain to a knowledge base that enabled me to lecture authoritatively in my field means nothing beyond that. That knowledge makes one a teacher. The Ph.D., arguably, makes one a scholar, or gives one the wherewithal to research and argue at a professional level. But neither of those capacities makes one “a writer,” as I understood the term in my youth. Though even that term is a dodge. I had no interest in “writers,” per se. My heroes were not “writers,” they were artists, with all the romantic—and perhaps subversive—associations that might conjure, because what they created were works of art. And much of my study at the doctoral level was aimed to understand that distinction. What does it mean to consider some product of writing—a novel, play, essay, poem, book, text—a work of art?

I had entered the university at the age of twenty-six because I felt I’d gone as far as I could on my own. I wanted to learn to read French and German; I wanted to study the history of art and of literature so that I could understand how my latter day heroes—Joyce and Thomas Pynchon—fit in. But also art. I’d spent a few years working as security at an art academy/museum and the sheer diversity of forms considered art in the ‘80s burdened me with an irksome incomprehension. If there was a history, if there were principles, then one might devise an argument for some works and against others. I suspected that such was not really necessary—neither for artists nor professors nor critics—but I wanted to insist upon it for myself. I was already possessed—I believed—of a discerning critical sensibility, but what was it discerning? What was the basis of my own affinities? What was the purpose of convictions other than simply having them?

All along I was convinced that I read not to become an informed critic, or a good student or teacher, but to become an artist in my own right, and that meant—to myself in my mid-twenties—getting a handle on what my heroes had achieved in the grand scheme of things. Which meant coming to terms with the fact that there is something like a “scheme” in the world of letters. Even though many of my heroes were iconoclasts and rebels within any such scheme. And that worried me. Trying to create an ad hoc “scheme” into which the writing—the art—that mattered to me might fit was a disservice to some affinity that mattered to me more than keeping company with what Nietzsche—one of my earliest heroes—liked to call “schoolmen.” In fact, apropos of Nietzsche’s idiosyncratic autobiography Ecce Homo, I once remarked to one of my dissertation advisers that I’d like to entitle my thesis “Why I Like the Books I Like.”

Not content with only one aspect of the portrait of my affinities—the literary—I have expanded the list to include films and rock music and poetry, perhaps even fine art. As an undergrad, I was a student of both art history and comparative literature and my list-making from that combined study followed more closely historical and aesthetic tendencies not decreed solely or even mostly by my own preferences. But to create a personally relevant list was always the more compelling aim. There are philosophical grounds for this, having to do with the idea that experience occurs in a time for all time, but that the time when something happened and the time in which it is recalled are never one and the same, much less the time when something first appeared and the time when it was first appreciated in a critical fashion.

Thus my stress on these different experiential points: the time of emergence, when something first comes to light, call it (the following terms derive from Mallarmé) “the scene”; then the time when something is experienced/appreciated by the critic—in this case, me—as “the figure”; then the time when something is recounted, placed into thought and writing—as “the fiction.” The idea that exercised my mind in my time at Princeton was that of the “supreme fiction,” an idea taken from Wallace Stevens, who formulated, as did Mallarmé, a tripartite division: “It Must Be Abstract,” “It Must Change,” “It Must Give Pleasure.” The “abstraction” is “the scene,” that idea of a time and place that remains—as idea or conception—after the time and place has passed away, in the manner that one speaks of “the Sixties” or “the fin-de-siècle,” or “the Renaissance.” Zooming in, one is concerned with the blooms that are flourishing at a particular moment, whose having been alters the air we breathe thereafter. Change is part of “the figure,” the being or bloom who instantiates the idea in its moment, but then continues to alter, for nothing gold can stay.

The “pleasure” corresponds to “the fiction,” in which experience makes a claim to endure. There are many easily forgotten or undifferentiated experiences, but some others have a definite thrust or power or vibrancy that makes them lucid, knowable, preferred. The factor of pleasure—hedonistic as it may seem—has to do with beauty as the high desideratum of aesthetic experience. We may wish to marry the simple contemplation of beauty with something more active, with “the beautiful gesture,” or the acte gratuit, or with the selfless sacrifice of a Christ or a political dissident, but I would argue that, transposed to art, such acts still must be contemplated, and so we remain within a realm of the pleasure of our own experience, inseparable from the pleasure of being alive, or of being itself.

For me, the fiction that seems to matter more than any other—as the basis, I suppose, of all my pleasure in whatever I take pleasure in—is “the growth of the critic’s mind,” where the critic is both author and reader, and the “growth” is the interdependent reading of one by the other, of a changing “figure” read against an abstract “scene,” for the sake of that enduring fiction: consciousness.