Tuesday, August 15, 2017

The Future of a Delusion

There is no happy wrap-up here, so good luck figuring out this snarl; I don't solve the problem, merely pose it.

Camus, in “An Absurd Reasoning,” wrote that the question of first importance for philosophy is whether life is worth living. Why not suicide? What I intend to get at here is not that it is a delusion to want to continue – Camus thought suicide was a mistake. The delusion has to do with life; but, once suicide is out of the way, the question of first importance after it will come into view.

“Man never thinks so much as when he is suffering,” wrote someone who sounds wise to me. I am not “really” suffering; there will nearly always be someone who will or who will have, or who is suffering, more than any speaker; yet Alain de Botton tells us not to disparage ourselves for what suffering we do endure, nor to disparage our suffering itself as unimportant. Such is most of my suffering: mental, emotional, spiritual, rather than physical. Goethe wrote, “What are the pains of the flesh compared to the agonies of the spirit?”

My “super-ego” is typically an obnoxious drunk sitting next to me at a bar … this imaginary person spouts what passes for wisdom among the parroting thoughtless. This person always has a put-down ready. One has to be humble enough to admit one does suffer from one's problems. Suffering, to my super-ego barfly, is false: it insults their vanity. Unless one is suffering in some narrow way, and more than anyone else, the idea that one is suffering at all is ruled out in the ternary opposition of this subhumanizing fool. One is either “truly” suffering, which is ruled out; one is close to an unfeeling neutral (one is above having any feelings); or one is in the ecstasies of drugs or sex. Finer gradations simply don't occur or are unworthy of consideration.

My super-ego is an average fool. With a comfortable job, no commitments admitted, he or she inhabits an inane interstitial paradise of low-level intoxication, where dullness turns into supercilious know-it-all-ism. You've likely met, and detested, this person. This imaginary person, “based on actual events,” is suffering, in an insufferable way, from a delusion.

It's not as simple, here, as concluding that one's own self-importance alone is at issue. Perhaps it is a constellation of signs and symptoms, a syndrome with multiple dimensions. It is a complex in this sense. (While I have used the psychoanalytic term, “super-ego,” I do not intend “complex” in a precise psychoanalytic sense.)

The status quo, for our super-ego companion, is simultaneously unquestionable and held up as obviously inevitable. “Obviously” – the fool will berate me if I disagree, and will treat me as if I am stupid. Maybe the delusion has a significant eristic component: the fool will try to “win” the conversation at any cost to reasonable thought. It is as if it is a battle not of wits but for existence. This used to be clearer when I was ten years old. Now there are too many confounding factors, and I quickly become confused in areas like this one.

The companion, stalking through my mind's recesses – popping in to irk me – seems to think I am out to get them. Out to obliterate them. It's either them or me, in a struggle of life and death. It is a Sartrean encounter: the hateful look is used to take away my status as a fellow human being. It is an attempt to destroy my subjectivity, to nullify my existence: my super-ego is trying to kill me.

My experience of this onslaught is protection against the mob. Regardless of the metaphysical status of the mob's “mentality,” there is a characteristic lowering of thought in a towering chimeric giant formed out of individuals. The “shitstorms of the Net” have their basis in mobs of bodies. I have a super-sensitive self-censor, who functions to protect me from having my subjectivity nullified by fools, by playing the parts of them in my imagination. I am advised to stay indoors!

Where can such self-censorship lead? The “anti-psychiatrist,” R. D. Laing, wrote that we say we are pressured by “society,” but that it is we who apply the pressure upon ourselves. That's what seems to be going on here; but why apply it at all? Society. And we are society. We are part of the miasmic “society,” a popular enough villain, so often invoked whether for praise, or in dismay or outrage.

This sort of self-curbing has led to a society in which children no longer play outside unsupervised. It is better, says the chimeric super-ego, to keep them within reach and indoors. Connected to the neural-drip of the shitstorms of the Net. Safe in body and assailed in mind, in spirit. Turpitude as brain food!

Is the delusion in question the belief that one can escape the self-censor? Or is it that one cannot (and so the censor must be obeyed)? Is it both: to escape the censor we must obey it? I think Žižek has spoken of this paradoxical trap in some way.

I have no happy ending in mind; rather, my super-ego companions are rattling my cage with their insistence on banal conclusions. And you have already known them all!

Thursday, July 20, 2017

Nietzsche on the Edge

I have a stuffed Nietzsche vulnerable to being eaten by our dog, Taiko Waza. He surveys and surveils our bedroom. He's a kind of Ozymandian figure: he doesn't look much like Nietzsche, and this indicates the inevitable falling away from an original.

The effigy looks like just anyone. He can sit, but he can't sit up straight. He's bottom heavy, weighted in the rump, under soft plush. The plush of his coat is sickeningly slick, and the color is a vile gray. He is saccharine sweet. Even his mustache is worn more cleanly than the 19th-century thinker wore his. He is a softening, rounding off, and generification of one of the most outrageous and controversial dead White European males, bless his polyester heart.

There is a fantastic bust of Nietzsche; and there is also a bland one. The first looks like a hybrid of Cthulhu and Nietzsche, and the second looks like a hybrid of a body builder and Nietzsche. In one, the mustache hangs down, grotesquely, like tentacles; in the other, it is as regular and well trimmed as a toothbrush.

To read ten books by Nietzsche is to learn to respect the sculptor who shaped the Cthulhu-Nietzsche. Nietzsche lived “dangerously,” and the bust looks dangerous itself. Nietzsche's books are dangerous in that, in the wrong hands, they may be used for evil; Nietzsche's psychological observations often seem to me to be like those of a brilliant psychopath. Psychology for Machiavellian princes.

The stuffed Nietzsche gives no hint about The Antichrist, nor that its author referred to himself as “dynamite” to blast away old values and “idols,” making way for new ones – creative destruction. He is softer than the Nietzsche of his foremost translator, Walter Kaufmann.

Saki, the pseudonymous British author, wrote that, as he contemplated his becoming older, the last thing he wanted a reputation for being was “amiable.” This plush doll is an amiable firebrand. You'd think he wrote chatty trash like Terry Eagleton's Why Marx Was Right. He looks more like a businessman than like Superman's father. He is a commodity. His ass has a label sticking out of it!


To judge a book by its cover is wiser, no matter how mistaken, than to judge a philosopher by his/her popularizations, whether in comics, catch-phrases, or plush decorative dolls. Reading Nietzsche changed my life, which has nothing to do with his anodyne effigy sitting on the edge of our dresser, gazing into the abyss.

Wednesday, July 19, 2017

Indubious Musings on Dubitability


A doubt without an end is not even a doubt.
– Ludwig Wittgenstein, On Certainty

I read On Certainty two years ago, and I hardly remember any of it. There was a “language game” with slabs – though that one may be from the Philosophical Investigations – going on with no point, the two workmen engaged eternally in a Sisyphean labor. So I will take the line above – the one line from the book I found worth writing down, on the cover of a writing pad now filled – and take it out of context: “Line, I pluck thee out!”

Let's take doubt without an end. “Nothing is absolutely certain,” said a philosophy professor. I'll leave aside the obvious, trite rejoinder: “Well then, smartypants, are you sure?” My reply was, “Aren't some things absolutely certain, like 'I am sitting here'?” The smartypants had a wicked look in his eye as he said, “How do you know you aren't dreaming?” I waited for him to turn back to face the audience, after he'd turned his back on me with disdain.

“I would still be 'sitting here' in my dream,” I said. He didn't answer, and only paced before us, before beginning his lecture.

I say this was satisfactory. I was not a heckler. Perhaps nothing is absolutely certain; perhaps foundationalism is undone – the idea that we begin with something indubitable and reason from there, to less and less certain (perhaps) propositions.

Perhaps not. Even granting objections regarding the question of the nature of “I,” I am still – whatever I am or am not – sitting here (even now). Even if “I” is said not to exist, “sitting here” exists. That is what I would be: not what I'd be doing but what I am. “I am sitting here.”

The phenomenological method has great appeal to me. Granted, I know little: for example, I'm unfamiliar with what may be phenomenology's earliest and strongest competitor, pragmatism. Phenomenology takes my sitting here seriously, in a way I haven't found anywhere else – in an attractive way, with interesting language.

For example:
Even the forgetting of something, in which every relationship of Being towards what one formerly knew has seemingly been obliterated, must be conceived as a modification of the primordial Being-in; and this holds for every delusion and for every error. [Heidegger, Being and Time]
This is one way of considering forgetting, in direct contrast with representationalist theories of knowledge. When Heidegger writes – by hand; Being and Time was a (long) handwritten manuscript – of Being-in-the-world as a mode of Being of “Dasein,” he says things unlike those written in most other places, where Cartesian dualism, of subject-object binaries, is taken for granted. Dasein is not “inside” looking at a world “out there,” but is “out there,” “in” the world always already. This is not nonsense. It is a way of seeing human existence as existence.

In some sense, I say, “I am sitting here.” Folks like Heidegger, Husserl, Sartre, and Merleau-Ponty take as a starting point, before physics, biology, anthropology – before positivist-scientific theorizing and experimentation – our simple being “in” a world, “in” a body, “in” a relationship with other human beings. This is a radical change in viewing a person, from the “natural attitude,” heavily influenced, for many of us, by Cartesian dualism.

“Without a doubt,” something is certain. Our existence is absolutely certain. Not that I am feeling sad or lonely – I may be ill-educated in self-knowledge and mistaken about my “inner” perceptions – but that I am “in” a world, doing something that is my current mode of being in that world. Not that we are not in a dream or a computer simulation or a poem, but that we are at all. To doubt this is not even to doubt but to pretend to doubt.


Saturday, July 15, 2017

Wearing a Barrel

Where is the discovery of a means of following a “Moore's Law” of work-leisure ratios? Not “work-life balance” but life outweighing work by an order of magnitude. Members of the Frankfurt School decried stupefying leisure. Where is work, where is leisure, for human beings? For “informed citizens”? There is nearly enough “work” for informed consumers. Wallace Stevens, I've read, was vice president of an insurance company. T. S. Eliot was, James Joyce related, the most “bank-clerky” of the bank clerks on his shift – or was this before such desk work involved shift-work?

The barrel of a gun, and the grave. These are the “parameters” that describe our perimeters. Rand was right for all she was wrong: money is handled by the well armed. “Economics,” whose “laws” are not strong enough to enforce themselves, is enforced at gunpoint. Just try stealing or dodging taxes, and a barrel will be pointed at you by “the laws of economics.” There are no longer jails of economics as such – debtors' prisons. This is immaterial. (Debt is still enforced by a gun barrel.)

To look down the barrel of the gun that is the postmodern Western-American capitalist's relational database collection; this is now exhilarating, now terrifying. For lack of other words, we are lost; the men who run the country, and their wives who run them or are left by them – whether they are addicted to cocaine or social justice advocacy, dying of cancer or fad diets – will not ask directions. Who are offering directions? Charlatans and geniuses – and how are they to be told apart?

The distinguishing characteristic of fools is unwarranted epistemic confidence used as a crowbar or bludgeon. Who batter their way through crowds of each other, the rest of us interspersed, to collect their due, the brass ring, on their way into the earth; who are the cause of dreams of Heaven, an existence without assholes and evils. But what do I know? Who overcomes Dunning-Kruger; who but an expert can choose the experts; how are we even to recognize expertise? If D and K are experts.

The dream – an American Dream – is the rarest, or the most boring (because of its impossibility, its realization in an illusion), manifestation of the data in these relational databases. Numbers live; the numbers are created by us to co-create us; the computer boot(straps), a quasi-hermeneutic-circle without logical violation. Voilà! We have our freedom. That is the nineteen-thousandth table column in the database, connected to an index for faster querying. This number must(!) be minimized, and this is done, partly, by making much of it. Fireworks! Celebrations! Our “princess is in another castle” – and who will look for directions, even admit we are lost?

I did not ask to be born into the common era. I am required, in order to help to raise a child who will pick up my pick-ax when I have fallen over, dead; to know more about human life than does an empty cola can; in the interest of mercy; to keep others, likewise, working; I must, it seems, work for one third of my day, including time for basic biological, psychological, and spiritual necessities, during my “working life,” five days a week – or, as the late William F. Buckley, Jr., quipped when speaking of understanding “the hippie movement,” I must “die painfully.”

The most salient fact gleaned throughout K-12 education, understood perhaps not then but years or decades later, is that the people who make up the masses of humanity do not care to become educated at all but to be trained in ways conducive to “earning” money, status, etc., the trappings of princes and queens and kings, their homes their “castles,” their lawns protective moats: wealth without wisdom.

For all that is said of wisdom having nothing to do with wealth, imagining their combination is so delicious it must be taboo – to attempt to acquire them both. The existence of wise rich/rich wise people would drive many who were neither to burn, lynch, and pillage. Some of the super-rich “Masters of the Universe” say this themselves. I've read one of them, and where there's smoke there's fire; but wise, rich humans would be super-men, and assassinated whenever possible. It would be too much to bear, to have the opportunity to view oneself in such a degree of inferiority; they would be a living pantheon.

Do we have a choice? Is there “no clear alternative,” as Žižek has said there is not? In Heaven there would be no need for the firearms. In Heaven, “we're all dead.” And much better for it.

Tuesday, June 6, 2017

Do You Even Tune In?

It seems likely to me that a lot of the content on the Web is nearly identical: blog posts often say similar things to other blog posts on the same topic. These blog posts are written, often not by the blog's owner, to collect advertising revenue. What does that tell us about the content?

With so many blog posts being so similar, that is, saying the same things about the same things, we can imagine them as songs played on a radio. The Internet Surfer is fed or otherwise comes across these postings, and so “hears the song.” With a blog post, it's unlikely the person will reread the post the way so many people like to hear certain songs again and again. Blog posts are disposable; we consume the post and never read it again.

Memory for news, events, memes, blog posts, can be expected to decrease as exposure to these media items increases; being able to remember only so much, more text, sounds, and images leads to a lower absolute number of these items being remembered. Our Surfer may remember an exemplar of a type of media item. A particularly popular Star Wars meme, or a blog post that said something just so, may function as the memory handle on a whole slew of similar and possibly inferior items.

Each type of media item is like the song on the radio: you've seen one of the type, and you've seen them all. Seeing one of a type is hearing the song again. As we scan the Web, “turning the dial,” we hear the same song many times. Songs come into and fall out of favor rapidly, and we may forget we've even heard the song.

I use the radio analogy not only because it struck me first; the radio is entertainment used to collect advertising revenue. The Web is entirely entertainment: news, activism, shopping, Internet radio and streaming video – all are entertainment. “What about online education! What about MOOCs?” I am impatient at having to say it again: the entire Web is entertainment. What is not entertainment is not the Web; it is something else for which I have no name.

A final note: when thinking of blog posts as songs, we formed a schema of the song from the different, similar blog posts. Future radio may be used (to collect advertising revenue) to broadcast or to stream songs that have been altered by a computer, in such a way that no two plays are identical. In the first case, “out of many, one”; in the second, from one, many.
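That closing idea – a computer altering a song so that no two plays are identical – can be sketched in a few lines of Python. This is a hypothetical illustration, not a description of any real streaming system: the “song” here is just an invented list of (pitch, duration) pairs, and each play's ID seeds a small random perturbation, giving “from one, many.”

```python
import random

# A base "song": a list of (MIDI pitch, duration-in-beats) pairs.
# Purely illustrative data, not from any real catalog.
BASE_SONG = [(60, 1.0), (62, 0.5), (64, 0.5), (67, 2.0)]

def play_variation(play_id):
    """Return a unique variation of BASE_SONG for a given play.

    Seeding the generator with play_id makes each variation
    reproducible, while different plays yield different songs.
    """
    rng = random.Random(play_id)
    return [
        (pitch + rng.choice([-1, 0, 1]),                  # nudge pitch by up to a semitone
         round(duration * rng.uniform(0.9, 1.1), 3))      # stretch timing slightly
        for pitch, duration in BASE_SONG
    ]

# Different plays produce different songs, yet any single play
# is deterministic: replaying play 1 gives the same variation.
first = play_variation(1)
second = play_variation(2)
```

The design choice worth noting is the seed: with a per-play seed the broadcaster can regenerate any past variation on demand, whereas an unseeded generator would make every play unrecoverable.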

Thursday, June 1, 2017

One-Dimensional Places: An attempt at a phenomenology of the rest-stop affordances of the roadside attraction, generalizing from an example

Let's say I was sitting on some rocks beside the rock staircase leading to an overlook, reading Jean Baudrillard's Simulacra and Simulation, and I overheard a man speak; I'm not saying a man with his wife said the following, as they began their way up the stair:
“Why don't you just use the rest room, walk up the stairs, and leave, like everybody else? Bring your extra pillow, make yourself comfortable; reservations close at five o'clock.”
I am saying he might have, and so now we are in the realm of philosophy!

The empty picnic table, the well worn but unoccupied trails, the rocks beside the staircase, are decoration to most. “This is not a park”; this is another rest-stop on a drive across a state. Use of these amenities is, socially, implicitly forbidden. The disappointment of the one who sees the picnic table occupied by a single human form: “We don't actually want to stop here at the table, though you are preventing our possible use, and so you must go.”

The single human form is anathema to most, in a circumstance such as this one. “Truth begins with two,” Nietzsche wrote. Leaving aside that “I am legion,” I am carrying Baudrillard's book; I am not alone though I am a single (“embookened”) human form. The single human form: the image of the crazed serial killer. The book (and not this book only): subversion incarnate.

What sights, sounds, scents, coruscations of touch would this brave gentleman have forgone, to have been able to speak of a single human form reading a book in a somewhat natural setting, this way? In his mind, who knows what catastrophe he averted by making clear that he was unhappy at the prospect of sharing oxygen and vicinity with a “strange” human form? Fear; pacification of existence. One removes the other, and the other the one.

I got up and left; we two were not slated to be friends. What catastrophe I have averted in my imagination! I did not get into a conflict with another human being. I know that single human form better than he; surviving attempted murder was never anything I'd planned on, and it was that single human form I did not heed, that made the five scars in my body (“defensive wounds.”) It is that single human form that, as far as I know, as far as I can know, is still haunting entrances to parks.

Look at what has just happened: objective ambiguity. Marcuse, in One-Dimensional Man, gave a few examples of this. Assessing from two points of view neutralizes all objections. Yet this is still an irrational rationality. What do any of you dialectical thinkers have to say? 

I do not believe in “neutral” or “indifferent” worlds. Once something exists, it has an essential orientation, even if it exists only as possibility or impossibility. The pencil does not balance on its point. It was not intended to; that is not the design intent of a pencil. Perhaps some skilled pencil-balancer is out there; then, it would be that single human form's essential orientation, and so on. There is no zero-point, all is flux; or, zero is never reached without being left, once more, immediately or nearly so. The world is not digital, though this picnic table is a simulation.


I came back, circling around, to the picnic table, and returned to where I began, knowing it for the first time. And I've left out of fear of – the police and the madhouse!

Monday, May 22, 2017

Gatsby's Tragic Failure

Gatsby was “great,” and he was tragic. It isn't clear to me what made Gatsby “great.” He had money (lots and lots and lots of it); he was not a hypocrite like his antagonist, the man who had married Daisy, the woman Gatsby loved; he had class. What did Gatsby know besides how to get rich? He knew he loved Daisy, but he thought he knew more than he did: he thought he could overcome absolutely any obstacle with his love of Daisy, and hers of him.

He fought in WWI. Certainly the trenches were not where the life was free. It seems Gatsby had more than one thing in common with the Austrian-born philosopher, Ludwig Wittgenstein, who was a decorated war hero!

Wittgenstein was born into wealth; Gatsby achieved it. Wittgenstein gave his huge fortune away. Gatsby lost his only when he died. Wittgenstein wrote a key philosophical tract in the cataract of mustard gas and machine-gun barrages. Gatsby wrote one of the most famous love letters in US history under the same circumstances.

Wittgenstein died of prostate cancer; Gatsby died with a bullet through his heart, both literal and metaphorical. Cupid was not a sympathetic character in Ovid's “Metamorphoses.”

A life with Daisy was still a possibility at the moment of Gatsby's death, as far as we know; and Wittgenstein probably would have known the modal logic of it. Maybe Gatsby ran up against the laws of the world: the fates will intervene to prevent time travel, to prevent bringing the future back, into the past, and the past into the future, the way Gatsby wanted to do.

In the midst of avarice, hypocrisy, sin and riches, he remained a good man; but what made him great? It was not his hope; I can't see how hope is in and of itself great. His sources of wealth may have been questionable; the bootlegging certainly showed the hypocrisy of his nemesis, Buchanan, Daisy's husband and “the polo player.” Gatsby “went to Oxford” the way I “went to” two well-known universities; and neither of us claims to be “a Such-and-such man,” implying we are the products of a full education there. When pressed, he admitted this about himself. He was forced into honesty. He lost his temper, nearly punching Buchanan in the face; but would Buchanan have pushed Daisy out of the window, beside which she and Gatsby were standing, if she had said she'd never loved him?

What was the importance of her saying she never loved him, anyway? It was required, by Gatsby, so their five-year separation would not be tainted by time's passage. Nothing happened during those five years: that was something Gatsby needed to know to believe one could relive the past. Gatsby insisted it be done; everything must go according to his plan, or it would not be perfect, and so would be nothing. It was not enough for the present to be what it was: the past must have been the present and vice versa.

Gatsby's last word was: “Daisy!” Wittgenstein's were, “Tell them I've had a wonderful life.” Which had the greater achievement: Gatsby, who loved Daisy but perhaps perfection more; or Wittgenstein, who loved philosophy but perhaps perfection more? I see Wittgenstein's achievements through a glass darkly, and in the end it seems his life and love of philosophical perfection were enough for him. I see Gatsby's hope outrunning the possible as quixotic desperation, not altogether a bad thing.

Near the end of hope, and in the midst of catastrophe, with Daisy's automobiling encounter with Buchanan's mistress having just occurred, Gatsby flexes his plan, and is willing to run away with Daisy. If he had just run away with her when she told him she wished they could do it – but Gatsby's obsessiveness led to his rigidity, and if it wasn't perfect, it was not acceptable to him; he insisted on his own way, even to his and Daisy's ultimate detriment. It was his tragic flaw, his seeking of perfection and of the world's going according to his plan up to the last and smallest detail. At one point he makes a gesture with a finger, of a straight, diagonally ascending line in the air. “My life must be like this,” he says. He didn't know about “what [success] really looks like.”

The relentless drive and single-mindedness that allowed him to become “The Great” Gatsby was his undoing. He and Daisy could have run away a number of times. But love wasn't enough for Gatsby.