Sunday, June 21, 2015

"The Revolution will not be Televised"

Twice in the past week I’ve come across essays suggesting that resistance isn’t merely futile, but that it plays right into the hands of the enemy. (As an aspiring pacifist, it feels strange to write the word “enemy,” but note that the Prince of Peace said “Love your enemy,” not “Don’t have any enemies.”) In the first instance, found in Tim Parks’ marvelous collection of essays, Where I’m Reading From, which functions as something of a State of the Union address for contemporary literature, the brand of resistance in question takes the form of the written word. Since this blog mostly exists so that I can participate in the resistance via writing, my ears instantly perked up as Parks quoted Orwell’s beef with Dickens’ satirical resistance to a stultifying British bureaucracy:

“In its attitude towards Dickens the English public has always been a little like the elephant which feels a blow with a walking-stick as a delightful tickling…. Dickens seems to have succeeded in attacking everybody and antagonizing nobody. Naturally this makes one wonder whether after all there was something unreal in his attack upon society.”

Parks then proceeds to frame a question that he aims squarely at satire: “Orwell treats Dickens as if he were a special case, but the question he raises here is whether all satire isn’t to some extent in connivance with the object of its attacks.” This is disturbing enough, except that I am left wondering if all forms of resistance (as we know them) aren’t to some extent in connivance with the objects of their attacks.

Slavoj Zizek suggests as much in the second essay encountered, “Resistance is Surrender” (http://www.lrb.co.uk/v29/n22/slavoj-zizek/resistance-is-surrender), written in 2007 before anyone had begun to forget that Bush the Younger’s Iraq War may well have been the single worst foreign policy decision ever made by an American president. (The recent upward creep in Bush’s post-presidential approval ratings feels more like willful forgetting than forgiving, and like the worst kind of “I’m okay, you’re okay.”) Here is Zizek, in perhaps the definitive take on “We have met the enemy, and he is us”:

“The big demonstrations in London and Washington against the US attack on Iraq a few years ago offer an exemplary case of (the) strange symbiotic relationship between power and resistance. Their paradoxical outcome was that both sides were satisfied. The protesters saved their beautiful souls: they made it clear that they don’t agree with the government’s policy on Iraq. Those in power calmly accepted it, even profited from it: not only did the protests in no way prevent the already-made decision to attack Iraq; they also served to legitimise it. Thus George Bush’s reaction to mass demonstrations protesting his visit to London, in effect: ‘You see, this is what we are fighting for, so that what people are doing here – protesting against their government policy – will be possible also in Iraq!’”

It seems that peaceful demonstrations and resistance are either co-opted by their target (as per Zizek), or impotent and erasable. Case in point for the latter: my own Baltimore City, where a week of peaceful protests and marches in response to the circumstances of the death of Freddie Gray was a) largely ignored, and then b) completely forgotten once the “real” response of rioting and looting began. (And that’s “real” as in really newsworthy; actual events are over the second the cable news trucks roll up.)

Where does this leave us? As one reader responding to Zizek puts it: “‘Sit at home and watch the barbarity on television’ seems to be Slavoj Žižek’s new slogan for fighting capitalism.” But even if we take Zizek with a grain of salt, it seems nevertheless that both he and Parks in re Dickens have put their finger on something rather frightening, something that recalls the inevitable scene from every TV show in my youth when the hero lands in quicksand only to quickly discover that frantic efforts to get out only make one sink faster. So, is the only possible response to “sit at home and watch the barbarity on television” in the hopes that we might sink to our deaths a little slower? Zizek’s own answer is one of the most condescending non-answers ever proffered: “So what are we to do? Everything possible (and impossible), just with a proper dose of modesty, avoiding moralising self-satisfaction.” Since Zizek’s “everything possible” is an obvious disguise for his real answer, “I don’t know,” the second half of his answer actually contains everything he has to say about the way forward, which boils down to “Whatever you do, just don’t make an ass of yourself, have some dignity please.” Except I have a feeling that saving the world is going to require an awful lot of us making complete asses of ourselves until somehow, some way we get past our collective “I don’t know.”

We “don’t know” because although Zizek has no cure, his diagnosis of our larger dilemma is, if not an exact bull’s eye, exceedingly close to the mark. In short, we’re damned if we do resist, and we’re damned if we don’t. One is left wondering if this catch-22 isn’t just the human condition, as if the Woody Allen joke about two old women dining together in a restaurant, with the first exclaiming “Oh my, this food is horrible,” to which her friend adds “Yes, and the portions are so small,” is both the beginning and the end of the story.

But perhaps our dilemma isn’t the human condition at all (maybe there is no such thing?), and civilization has just painted itself into a corner. Does making this distinction even matter if the end result is that there’s no way out? I would argue that it does matter because it is the difference between there being well and truly no way out (i.e. this is the human condition, and maybe we should retire to our televisions after all) and the possibility that we just haven’t thought of a way out; think of a riddle in which we’re locked in a room with no exits, armed only with a can of paint, where the only way out is summoning our inner MacGyver. Note that, per Wikipedia, MacGyver “prefers non-violent resolutions and prefers not to handle a gun.” Plus, if we’re fated to have yet another white guy as our symbol for saving the world, at least this white guy has a mullet.

And since we’re talking about white guys trying to save the world, we would be remiss not to mention Pope Francis’ release this week of a breathtaking encyclical on climate change in which he “advocates for a radical transformation of human society.” (http://www.slate.com/blogs/future_tense/2015/06/15/pope_francis_in_leaked_climate_change_encyclical_we_re_on_a_path_to_destroy.html) Interestingly, the Pope cautions against my MacGyver imagery, warning against “a blind trust in technical solutions.” (ibid) Instead, among other proposals, the Pope argues for “the creation of a ‘true world political authority’ that would be tasked ‘to manage the global economy; to revive economies hit by the crisis, to prevent deterioration of the present and subsequent imbalances; to achieve integral and timely disarmament, food security and peace; to ensure environmental protection and pursuant to the regulations for migratory flows.’” (ibid) But whether one agrees with the Pope’s top-down approach or prefers a world of anarchic individual MacGyvers reassembling the earth even as they disassemble (dare I say deconstruct?) “true world political authority” altogether, one must admit that the Pope has outdone even Zizek with a timely, accurate diagnosis of what ails us, and a call to radical, seemingly unthinkable change.

Since I harangued Zizek for failing to propose any meaningful solutions, I will don MacGyver’s mullet and close with some of my own thoughts on how we might use the can of paint that got us here to get out of the locked room. Lately I have taken to going around referring to myself as a socialist. More so than identifying as some kind of latter-day Marxist, it is my way of announcing that I don’t in any way agree with this, this being what Pope Francis describes as “the spiral of self-destruction in which we are sinking.” (ibid) But if I am going to go around calling myself a socialist it is important that I acknowledge the many failures of really existing twentieth-century socialism. Stalin’s murder of millions of Russians, echoed by the further murder of millions more by Mao, Pol Pot, et al., is the obvious place to start. Strangely, this murderous madness was born as an attempt to undo the notion central to our own current (non-socialist) “spiral of self-destruction,” which is the simple notion that the world is neatly divided into two categories, winners and losers. I would submit that the origins of socialism’s murderous madness came in its effort to overcome the (equally mad) split between winners and losers by declaring “Everyone’s a winner!”, as if the world could be saved not by MacGyver but by a carnival barker.

Socialism got it wrong, and in the process went completely mad, by espousing something that was both happy and false. Actually: Everyone’s a loser! All of history co-signs this as a true fact. But we would make an even bigger mistake than the socialists if we decided that this was both true and sad. Because it is actually funny, which is the key to everything, because the revolution will only succeed if it makes us laugh.

Everyone’s a loser. Put another way, the meek shall inherit the earth, and we’re all meek, i.e. we’re all losers. So let’s grow our loser mullets and save our world, MacGyver-style. After all, becoming a television character to save the world is better than watching the end of the world on television.




Wednesday, May 13, 2015

Watch This

The New York Times reported last week that France, like clockwork, has responded to January’s terrorist attacks with its own legally authorized surveillance state, one that “would give the intelligence services the right to gather potentially unlimited electronic data.” This unlimited data gathering would include, mais oui, “almost no judicial oversight.” (http://www.nytimes.com/2015/05/06/world/europe/french-legislators-approve-sweeping-intelligence-bill.html?_r=0)

The only way for the Times to put the French legislation into perspective was by way of comparison with an American surveillance state that is, with a nod to Governor Tarkin, fully armed and operational: “Among the types of surveillance that the intelligence services would be able to carry out is bulk collection and analysis of metadata similar to that done by the United States’ National Security Agency.”

The Times also notes that “American lawmakers are reconsidering the broad surveillance powers assumed by the government after Sept. 11,” which legislative activity is like attending to the fact that one was once pregnant when one is long since an empty nester. The peculiar horse known as the American surveillance state has already left the proverbial barn. Or, since we are discussing France here, we might just call it a fait accompli.

Like anyone else attempting to understand a confusing and frightening world, I consulted the oracle for guidance on my life under surveillance. When I typed “surveillance state” into Google, one of the first few items in my search results was a 2013 article, “The Internet is a Surveillance State,” by Bruce Schneier. (http://www.cnn.com/2013/03/16/opinion/schneier-internet-surveillance/) Schneier cuts to the heart of the matter with a pithy summary of digital surveillance that sounds like hyperbole, except it isn’t: “All of us being watched, all the time, and that data being stored forever.” And if you think your data’s safe with your corporate friends at Facebook or Instagram, Schneier has this to say about a lot of mutual back scratching going on between Big Data and, e.g., Uncle Sam:

“Governments are happy to use the data corporations collect -- occasionally demanding that they collect more and save it longer -- to spy on us. And corporations are happy to buy data from governments.”

But Schneier’s conclusion, in which he is as exasperated with us folks as he is with our demonic overlords, raises at least one important question. Here is Schneier: “Welcome to an Internet without privacy, and we've ended up here with hardly a fight.” So: Why haven’t we fought back?

As the situation so very recently unraveled here in Baltimore I instinctively turned, all kidding about the pseudo-oracular aside, to my own version of a prophetic voice in the wilderness. We all have shit we turn to when, as Pema Chodron puts it, things fall apart. But while dear Pema has been much help in the past when my own personal things were falling apart, when the whole world is falling apart I prefer the late Jean Baudrillard, an arrogant French intellectual who earned at least some of that arrogance with self-described “theory-fictions” that somehow still meet the criteria of “Just the facts, Ma’am.” Especially when it is the very facts of events like those just past in Baltimore that are in question; if you can get down with this quote from The Intelligence of Evil, or the Lucidity Pact, then you can get down with Baudrillard: “Hence the dilemma posed by all the images we receive: uncertainty regarding the truth of the event as soon as the news media are involved.”

But back to our question: why haven’t we fought back against the surveillance state? Another passage in The Intelligence of Evil, or the Lucidity Pact, one that details our relationship with the digital, may hold some clues:

“The screen reflects nothing. It is as though you are behind a two-way mirror: you see the world, but it doesn’t see you. Now, you only see things if they are looking at you. The screen screens out any dual relation (any possibility of ‘response’).”

I would translate that by saying that the loneliest feeling in the world is checking your status update for likes or comments.

We haven’t fought back because our fear that no one is watching is so great that it leaves no room for the fear that Big Brother is watching. In fact, it is our certainty that no one is watching (“The screen reflects nothing”) that drives our desire for Big Brother’s gaze. We haven’t fought back against the surveillance state because, in the words of Princess Leia (who, we should note, was exactly the person to whom Governor Tarkin was explaining his fully operational battle station), it’s our only hope.







Wednesday, April 15, 2015

Ambivalence: the Pros and Cons

My favorite Groucho Marx line has always been “Whatever it is, I’m against it.” Basically, I want to be a contrarian when I grow up. In The Secret Language of Birthdays, the description of those born on November 4th (my birthday) is “The Provocateur.” It is safe to say that this very blog exists so that I can practice at contrarian provocations. But since the truest words ever spoken were “I’m not a doctor, but I play one on TV,” in real life the closest I get to Groucho is “Whatever it is, I am both for it and against it.”

Having constructed an entire life out of ambivalence, I would say that the most interesting thing about the condition is that, far from being undecided, ambivalence is the state of being doubly decided. Sometimes this means that my ambivalence masquerades as Groucho’s contrarianism. For example, when confronted with outspoken atheism I can only think how obnoxious the ill-founded certainty in materialism is in the face of the stupefying mystery that anything exists whatsoever. But when confronted with religious piety I find my thoughts turning towards a) natural selection and b) how much more I’d rather watch a college basketball game than go to church/synagogue/meditation hall etc. This presents a façade of “Whatever it is, I’m against it” consistency, but is in fact the thoroughly inconsistent condition of being both for and against religion at the same time that I am for and against scientific secularism.

It should come as no surprise that this ambivalent Episcopalian ended up with a devout Jew, which arrangement protects me in equal parts from 1) my religion, 2) her religion, and 3) the absence of religion, while exposing me to each. Along these same lines, in my professional life I work hand in glove with management, while remaining an active member of the line workers’ labor union. And while my personal beliefs about human behavior are grounded almost entirely in a Freudian psychodynamic model, in my professional practice I am a strict behaviorist. The topography of my everyday world becomes: the ayes had it, but when and for how long?

So, by way of a possible explanation of ambivalence, and given that I am off the clock and we are deep in the weeds of my personal beliefs, Freud. Note the precise word Freud uses in his landmark 1923 essay, The Ego and the Id, to describe the change that takes place in a boy once his Oedipus complex takes hold:

“… until the boy’s sexual wishes in regard to his mother become more intense and his father is perceived as an obstacle to them; from this the Oedipus complex originates. His identification with his father then takes on a hostile colouring and changes into a wish to get rid of his father in order to take his place with his mother. Henceforward his relation to his father is ambivalent.” (emphasis added)

But perhaps the die isn’t merely cast for a complicated relationship with mon père, as Freud goes on to explain that “The super-ego retains the character of the father.” It does so because the super-ego is essentially established in the father’s image:

“The child’s parents, and especially his father, were perceived as the obstacle to a realization of his Oedipus wishes; so his infantile ego fortified itself for the carrying out of the repression by erecting this same obstacle within itself. It borrowed strength to do this, so to speak, from the father, and this loan was an extraordinarily momentous act.”

So if, connecting the dots, “henceforward his relation to his father is ambivalent,” mustn’t we also recognize that henceforward his (my) relation to the dominant element of his (my) very own psyche, the super-ego, is also fundamentally ambivalent? And if religion, workplace hierarchy, and theory qua truth are the worldly elements which carry much of the super-ego’s water in re authority, then it should come as no surprise that, as noted above, my predominant stance to each of the three is one of ambivalence.

Freud consistently held that humanity’s foremost problem can be found in our inclinations and instincts towards aggression. If he was wrong about everything else he ever said, he was indubitably quite right about this. (We don’t have religion because he was wrong; we need religion because he was right. We need somebody to command us to love our neighbors as ourselves.) But let’s suppose that Freud was also right about the super-ego:

“It is remarkable that the more a man checks his aggressiveness towards the exterior the more severe- that is aggressive- he becomes in his ego ideal (super-ego)… the more a man controls his aggressiveness, the more intense becomes his ideal’s (super-ego’s) inclination to aggressiveness against his ego.”

I.e., if you love your neighbor as yourself, you become your own worst enemy.

Unless, that is, you slip out the back door as your super-ego comes gunning for you through the front. Under threat, human beings have two basic choices: fight or flight. And since you have as much of a chance in a brawl with your super-ego as you did as a toddler against your dad (“As the child was once under a compulsion to obey its parents, so the ego submits to the categorical imperative of its super-ego”), you are left with the options of submission to the aggression (“Thank you sir, may I have another” by way of an unflagging toeing of whichever party line is carrying the super-ego’s water), or flight. Ambivalence, I would suggest, is exactly the latter.

We all hear what we want to hear. And, it seems, the super-ego is no different. Ambivalence trades in the paradox of being simultaneously for and against the very same thing, whereas the super-ego’s common coin is that all too familiar binary oppositional pairing: “You’re either with us or against us.” So while I very well know that I am both for and against e.g. religion, there is no receptor on the super-ego’s brain for that particular agonist, if you will. Under such circumstances, like everyone else the super-ego hears the part that it wants to hear, that, in this case, I am for religion, and, satisfied of my obeisance, leaves me the heck alone.

I say my prayers every night, try, and fail, to love my neighbor as myself each day, because I really am for religion. This is, apparently, enough for my super-ego. But I also walked out of church eighteen years ago and (almost) never looked back; dutifully escorting my wife to shul on Shabbos or Passover is just a reenactment of walking out of church, not because I’m going to synagogue instead of church, but because it is, taking inspiration from my favorite Raymond Chandler novel, something of a long goodbye. So, I really do want nothing to do with religion (although, as the saying goes, goodbye is not always goodbye).

I am ambivalent. I am a paradox. I am free. Or as free as it gets on the lam.

Tuesday, March 10, 2015

The Dress

Human beings seem bound and determined, hard-wired even, to divide up the world into two kinds of people: male/female, white/people of color, rich/poor, believers/atheists, righty/lefty, northern/southern, straight/gay. The list goes on and on. What every single pair on this list has in common, save for the possible exception of believers/atheists, who seem to have settled into a determined stalemate (though each has held the upper hand at various points in history), is that each is an example of the binary code that informs reality in very much the same way that zero and one structure a computer program. Our master code: winners/losers.

Postmodernism seems more than anything a loosely organized but consistent effort to overcome the division of our world into two kinds of people, all the assorted versions of winner/loser, by shattering each of the pairs into a million little pieces. I can still remember the time my sister, Pailin, a student in one of postmodernism’s academic footholds, informed me that there were not two genders, but (and I’m making the number up because I can’t remember it exactly, but you’ll get the point) sixteen. Because of sibling rivalry I pretended that this was the silliest thing I’d ever heard, when in fact, I too being a product of postmodernism, it rang so true as to be painfully obvious. Obvious enough, anyway, that I was soon incorporating it into my worldview, e.g. the best part of every astrology book is when it tells you how good of a romantic pair various signs such as Scorpio and Pisces are, so, even if it failed to undo masculine domination, the profusion of genders at least held out the prospect of making dating a lot more interesting. Interesting, that is, if like me you find the advice on Chinese restaurant placemats for Tigers to avoid romantic entanglements with Monkeys simultaneously mysterious and authoritative.

And just last year I also pretended to balk at the news that A had been added to LGBTQI, making it LGBTQIA, in acknowledgment of asexuality as another healthy, normal niche on the sexuality spectrum. Never one to pass up the opportunity to play the role of intellectual contrarian (see this entire blog), I outwardly harrumphed that this was yet another case of what Terry Eagleton has described (and I paraphrase Eagleton through the fog of memory) as the American fantasy that there are no disabilities, just differences, which, more to the point, is just a way of pretending away suffering. I think I actually just said that asexuality was a disorder, which is less pretentious, but amounts to the same thing. But even as I outwardly sat in judgment of asexuality in the very same way that the DSM-III once coded homosexuality as a mental illness, i.e. even as I obscured prejudice with false authority (reading the history of the DSM is to understand that it was quite literally made up, and, what’s worse, made up by committees), every postmodern bone in my body inwardly rejoiced that now there were nearly half as many sexualities as there were genders, which, factoring in both the eastern and western zodiac signs in addition to the myriad genders and sexualities, would make for a dating manual even longer than the DSM-5.

Postmodernism is not without its victories. The LGBTQIA movement’s success in promoting same-sex marriage rights may be its signal accomplishment. But if we grant that the main point of postmodernism has been to smash up the master binary code of winner/loser into a million little pieces, then postmodernism has been, by and large, a failure. The profusion of letters in LGBTQIA notwithstanding, almost all of us, including those of us supposedly inoculated against the practice by postmodern academe, go around dividing the world up into binary opposites all the time, and inevitably these pairs end up following the master code. Perhaps it’s not even that postmodernism has failed in shattering the code, but that we have an uncanny knack, if you will, for putting Humpty Dumpty back together again.

In other words, it’s impossible for us not to see the world in pairs of opposites. You can shatter straight/gay into LGBTQIAS (the S standing for straight, if I can be permitted the liberty of tacking it on as just another of the smashed up pieces of the straight/gay pairing), but for each letter we will inevitably divide that up into a binary pair, e.g. top/bottom, butch/femme, etc. Binary pairs may just be the cost of doing business. Postmodernism’s failure, then, in as much as it has attempted to undo domination by way of fracturing binary pairs, was inevitable. With binary opposites as our given, the only remaining opening for transformation is by way of rewriting, or at least erasing, the master code.

If you were paying any attention to the internet last week, you know about The Dress. But just in case you missed it, The Dress is a really existing dress that appears to some to be blue and black, while to others it appears to be white and gold. It has nothing to do with camera angles or lighting; as verified by my wife, Jen, in a group of people looking at the dress together on the same computer screen, roughly half saw it as blue and black while the other half saw it as white and gold. This was going on all over the planet last week.

So, there really are two kinds of people in the world. The white/gold perceivers and the blue/blacks. What’s the difference between them? There is no difference (excepting the obscure neurological triggers that explain the differing perceptions; the dress, apparently, is blue/black in “real life”), which is the best difference of all. This particular meaningless difference is stupefying and absurd in how it undermines everything we think we know about reality. This makes it precisely the kind of meaningless difference that can radically destabilize the content of everything we think we know about reality, which is that we think it consists entirely of winners and losers. Because while it may just be too hard to accept the fact that, using myself as an example, I happen to be a straight, white, male, Christian, right handed northerner entirely due to an accident of birth (and since I landed in the catbird seat on the “winning” side of each of those halves of binary pairs it certainly is ego-reinforcing to think that they somehow mark the quality of my character, and therefore all the harder to recognize them for the accidents of birth that they truly are), only a complete nincompoop would argue that perceiving The Dress as white and gold marks him as superior to those who see it as blue and black. (If the blue/blacks attempt to persuade the white/golds that they (the blue/blacks) are superior because they are seeing the “real” color, the white/golds can engage in the late, great Dean Smith’s Four Corners offense and take the air out of the ball by asking the blue/blacks to adequately define what they mean by “reality,” which question three thousand years of western philosophy has failed to persuasively answer.)

The hope held out by The Dress, then, is twofold. First, in the meaningless difference found in The Dress we will come to recognize that binary pairs don’t necessarily contain winners and losers, and going one gigantic step further, may in fact never contain what we think of as winners and losers. There may, remembering Eagleton, be more suffering on one particular side of a binary pair, e.g. the disabled side of the able bodied/disabled pairing, but I would argue that the first rule of sanity holds that another person’s suffering is not, and can never be, a win for me. (If being a Christian has any meaning for me, it’s almost only exactly that rule. And that’s more than enough, a Grace of God that’s as dependable as the sunrise.) Second, we will begin to recognize how arbitrary our assignment to either side of a binary pair is, and find our attachment to that assignment ridiculous in a way that is funny enough to laugh the whole thing off. If I could just as easily have seen The Dress as blue/black, couldn’t I just as easily have been a southern, gay, Zoroastrian, left handed, female of color? It is only when I perceive the obvious humor in the absurdity of these arrangements that existence becomes, like the NBA in the 1980’s, Fantastic!

Now, I know as well as you do that it will take a lot more than The Dress to erase the Master Code. (In the final estimation, I side with erasing, rather than rewriting the Master Code, as the world has had quite enough already of its various masters.) But The Dress is important not because it will accomplish the end of domination in one fell swoop, but because it is itself a code that, once cracked, points us in exactly the right direction. Einstein famously said that “God doesn’t play dice with the universe.” In response, I would suggest that God is constantly flipping coins, randomly assigning us to one side or another of the binary pairs the entire universe seems to be made from. This is clearly a ridiculous way to run a universe, meaning either that we’ve been let in on the joke, or the joke’s on us. Only the latter creates a universe full of losers, and, ipso facto, winners. We could, then, do much worse than heed Han Solo’s advice when he says to Chewbacca, in a moment of pique, “Laugh it up, fuzzball.” Because, who knows, we could just as easily have been born Wookiees.

Thursday, February 26, 2015

I'm Getting Static

I’ve been wearing the same buzzed haircut now for about eight years, largely for the same reason that Jim Harbaugh wears the exact same pair of khakis every day. Harbaugh explains “It’s gotten to the point where I have so much time in the day knowing that I don’t have to stand in front of the closet, trying to decide what outfit to pick out.” (http://blogs.mercurynews.com/49ers/2015/01/06/jim-harbaugh-weighs-in-on-his-49ers-exit-kaepernick-khakis-twitter-and-michigan/) Exactly. I never, ever have to think about my hair. Or at least I don’t have to think about my hair any more than I do my toenails.

Like Harbaugh, I would happily wear the same clothes each day, at least to work, like my seventh grade Industrial Arts teacher Mr. Wilken, who had a grand total of three outfits that he rotated on a weekly basis, e.g. week 1- navy slacks, light blue short-sleeve button down, navy necktie; week 2- dark green pants, light green short-sleeve button down, dark green necktie; etc., and who was so humorless that he demanded absolute silence from us, his pupils, as we used t-squares to fill our papers with the desired combination of geometrical shapes. Humorless, that is, but for his one joke: “Why do they call it a Sears & Rowback motorboat engine? Because it always breaks down and you have to row back.” If anyone ever made a sound in Mr. Wilken’s classroom he would promptly exclaim “I’m getting static!” Nothing further was ever required.

I was well prepared for the rigors of Mr. Wilken’s shop class by Ms. Travers, a 4th grade teacher at my elementary school to whose classroom I was not assigned, but who took her regular turn monitoring the cafeteria throughout my 4th grade year. Ms. Travers had but one rule for the lunchroom: no talking. On days when absolute silence was kept, she was sure to pop onto the loudspeaker during the end-of-day announcements, genuinely thanking us for our beautiful behavior. Not once, ever, did one of us give Ms. Travers any static.

The great thing about silence is that, like buzz cuts and uniforms, you never have to think about it. It’s always the same, always beautiful. Many Buddhists love the movie Groundhog Day, finding in it the paradox that it is only in the repetition of the very same day over and over again that we find the opportunity for change by way of slowly, haltingly, but assuredly increasing our compassion. But I am left with the distinct impression that Ms. Travers’ and Mr. Wilken’s insistence on the repetition of silence had nothing to do with a paradoxical opportunity for growth and change. And I am only left wondering which is the more emphatic “yes!” to life: the Buddhists’ “Once more, from the top,” or Travers’ and Wilken’s “Let’s check the instant replay.”

Even more to the point, do I always wear the same haircut in order to free up time to improve on the past or in order to assure more of the same? Jim Harbaugh is lucky in that he can definitively answer that question based on the outcome of his last football game. Without wins and losses balancing the ledger, the question basically boils down to whether you would prefer to live forever or never die. Strangely, you can’t have both. I owe my own recognition of this distinction to the author of the Gospel according to Matthew and David Shields.

Perhaps wanting to live forever is the surest sign that you are making a mess of the effort to never die, a twist hinted at in Matthew 16:25’s “For those who want to save their life will lose it.” For a more contemporary take, Shields, in How Literature Saved My Life, writes the following about Raymond Kurzweil, the futurist who expects that nanobots will, in the next twenty years or so, cure all disease and reverse aging, an eventuality the 62-year-old is preparing for with a regimen that involves, per Shields, 150 daily supplements, weekend intravenous transfusions, and, for a worst-case scenario, plans to cryogenically freeze his body:

“He wants not so much to live as never to die. He seems to me the saddest person on the planet. I empathize with him completely.”

Kurzweil and, it seems, Shields both take their Woody Allen much more literally than their Matthew 16:25. I am, of course, referencing Allen’s famous “I don’t want to live on in the hearts of my countrymen; I want to live on in my apartment.”

Things will get very interesting if Kurzweil is right about the nanobots. I have a feeling I’ll be sitting in my apartment, with the exact same haircut, wondering if I am still alive.

Sunday, February 15, 2015

The Trap You Set for Yourself

Raymond Chandler’s masterpiece, The Long Goodbye, reminds us that fiction is the best, perhaps only, place to find truth. And if we grant that truth is stranger than fiction, then we should go one step further and stipulate that fiction feels less strange than real life because it has the ring of truth to it. On page four Philip Marlowe, Chandler’s protagonist and literature’s most sublime first person voice, having just happened upon a drunk named Terry Lennox, delivers the truth: “I guess it’s always a mistake to interfere with a drunk. Even if he knows and likes you he is always liable to haul off and poke you in the teeth.” Marlowe being Marlowe, he proceeds to interfere with the drunk, befriending Lennox, and, on page six, he gives us the whole truth: “Terry Lennox made me plenty of trouble. But after all that’s my line of work.” Then, on page eighty-six, having just received a letter from Lennox explaining how he (Lennox) is about to kill himself to avoid being murdered in a mountain town in Mexico, Marlowe tells us nothing but the truth: “There is no trap so deadly as the trap you set for yourself.”

Chandler’s equations:

Truth = Truth

The Whole Truth = Ignoring the Truth

Nothing but the Truth = Ignoring a problem, i.e. the truth, doesn’t make it go away.

Reading Marlowe, it is impossible not to think about the many traps we’ve set for ourselves, the truths ignored. And, rather ironically, we’ve set our biggest traps by pretending to the throne of true knowledge. In other words, the pithy diagnosis is that we are “often wrong, but never uncertain.” Speaking specifically about climate change while at the same time generalizing his observation, Raymond T. Pierrehumbert restates the diagnosis when he asks, “In the United States, can we actually have a reality-based, serious deliberative process about anything anymore?” (http://www.slate.com/articles/health_and_science/science/2015/02/nrc_geoengineering_report_climate_hacking_is_dangerous_and_barking_mad.2.html)

One of the things that has often worried me about the idea of an omniscient God is that if God knows everything why would God need to listen to me? This worry might be indigenous to the United States, where everyone knows everything and no one is listening. Or, as Marlowe puts it:

“It was the same old cocktail party, everybody talking too loud, nobody listening, everybody hanging on for dear life…”

What, exactly, are we hanging on to? At the same old cocktail party in 1953 it’s “a mug of the juice.” Today we’re hanging on to something equally intoxicating, and you can still call it juice if you are willing to dig back into pop culture circa 1992, when the movie Juice, starring the late Tupac Shakur, told the story of “4 inner-city teens who get caught up in the pursuit of power and happiness, which they refer to as ‘the juice.’” (http://www.imdb.com/title/tt0104573/mediaindex) So, to mash up Raymond Chandler with Juice auteur Ernest R. Dickerson, everybody is hanging on for dear life to power and happiness, to a mug of the juice.

It is this desperate clinging to power and happiness that makes a collective “reality-based, serious deliberative process,” a process you might simply call being adults, well nigh impossible. To understand exactly how this all happens we could do much worse than to turn to Lacan’s psychoanalytic theory of the subject supposed to know. According to Lacan, transference, that fundamental exchange between analysand and analyst, has nothing at all to do with the analyst’s actual fund of knowledge. Instead:

“It is the analysand's supposition of a subject who knows that initiates the analytic process rather than the knowledge actually possessed by the analyst… the analyst is often thought to know the secret meaning of the analysand's words, the significations of speech of which even the speaker is unaware.” (http://nosubject.com/index.php?title=Subject_supposed_to_know)

All of which leads to the analyst being “credited at some point with a certain infallibility.” (ibid) Infallibility. What could possibly lead to more power and happiness than that? But we are getting ahead of ourselves. Because, to do justice to psychoanalysis, we must note that “the analyst is aware” (at least in theory) “that there is a split between him and the knowledge attributed to him…. The analyst must realise that, of the knowledge attributed to him by the analysand, he knows nothing.” (ibid)

My gambit is that we walk around supposing that there are subjects who know all the time in everyday interpersonal relations, but without ever acknowledging the split between the subjects and the knowledge attributed to them. We know well and truly that we know nothing, but we keep that dirty little secret locked up tight because no one else seems to be owning up to this, which further leads us to suppose that quite possibly we are the only ones who know nothing. Everyone else, then, is supposed to know everything that I manifestly don’t. This makes it all too easy to fall under the spell of the subject supposed to know, e.g. at Jiffy Lube when they inevitably tell me I need a new air filter yet again.

But what’s sauce for the goose is sauce for the gander, and we’ve locked our dirty little secret up so tightly that we too can be supposed to know that, e.g., Philip Marlowe is “literature’s most sublime first person voice.” (Full disclosure: I was inspired to read Raymond Chandler by the praise heaped upon him by one of my own personal subjects supposed to know, Slavoj Zizek. Zizek is, famously, a Lacanian, so obviously I haven’t yet achieved the end of my “analysis” with Zizek by recognizing that he, too, knows nothing, i.e. “The end of analysis comes when the analysand de-supposes the analyst of knowledge.” (ibid))

To be clear, I am not suggesting that there is anything wrong with the pursuit of knowledge in and of itself. When I take my car in to Jiffy Lube it is important that they do know how to change my car’s oil, just as it is important that I know that air filters do not actually need to be changed every 3,000 miles. Our civilization’s pathology is not to be found in the pursuit of knowledge, but in the trumping up of provisional knowledge as absolute certainty in the service of power and happiness. We are all like bad Lacanian analysts, who occupy the position of the subject supposed to know without ever acknowledging that, in fact, we know nothing. And when I say that we know nothing, I am interpreting Socrates’ famous “I know that I know nothing” to mean I know that I know nothing for certain. If there’s one thing we human beings aren’t allowed, it’s certainty. The second we had certainty, all science and prayer would end. (The fact that science and prayer are both linked to uncertainty perhaps provides at least a glimmer of hope that the war between science and religion is, at bottom, unnecessary.)

More equations:

Knowledge = I know nothing

Knowledge = power

I know nothing = power

Edward Brown, in Tomato Blessings and Radish Teachings, quotes his teacher Suzuki Roshi explaining that “Zen is to feel your way along in the dark, not knowing what you will meet, not already knowing what to do.” Strangest thing in the world: we are at our most powerful when we are feeling our way along in the dark, together.

Sunday, January 25, 2015

Gaming the System

In a provocative but flawed Grantland column about the deflated balls controversy currently swirling around Bill Belichick, Tom Brady, and the New England Patriots, Charles P. Pierce explains that “as soon as the story broke about the possibility that the Patriots had been up to some shenanigans with the game balls while they were obliterating the Indianapolis Colts, 45-7, in the AFC Championship Game, the country proceeded to lose its freaking mind on the subject.” (http://grantland.com/the-triangle/brady-belichick-and-great-balls-of-fire-a-front-row-seat-for-the-foxborough-farce/) Pierce, here, is exactly right, i.e. there isn’t an ounce of hyperbole in his description of our collective response to “DeflateGate” (a generic and clichéd label for a scandal, yes, but still preferable to the other label being tossed around, “Ballghazi,” which is too uncomfortable in its implicit linkage of professional football, that proxy for war, with the Global War on Terror, that proxy for what Warren G. Harding once described as “normalcy.”) Case in point: yours truly, who despite moving house this week has devoted approximately 30,000 of my 50,000 thoughts per day to DeflateGate (with another 10,000 reserved for monitoring the chance for snow).

But while Pierce grasps the breadth of this latest scandale du jour, he has no sense of its depth. Ignoring its depth, Pierce writes “The whole thing is flatly hilarious.” (ibid) And while the dramatics are not without their comic elements, the affair’s tragic elements are at least as prominent as the obvious comic surrealism. Although I would note that Pierce’s explanation of what exactly is so funny about all of this is more than a little bit troubling: “The whole thing is flatly hilarious. The way you can be sure of this is that the ladies of The View pronounced themselves outraged by the perfidious Patriots on Thursday morning. Rosie O’Donnell wanted them booted from the Super Bowl. (Trolling or insane? Our lines are open.)” If you’ve been listening to sports talk radio all week, as I have, you’ve heard one sports pundit after another pronounce themselves outraged by the perfidious Patriots, some of whom have also suggested that the Patriots be disqualified from the Super Bowl. But what almost all of these sports radio personalities have in common is that they’re men. Yet given that “the ladies of The View” lack the requisite equipment between their legs, any opinions they might offer necessarily reduce the discussion to hilarity. Is it mere coincidence that O’Donnell, who famously has no interest in the equipment between men’s legs, is singled out by name? Does Pierce not even realize that a few paragraphs before he dismisses O’Donnell for the audacity of suggesting that the Patriots be disqualified from the Super Bowl, he writes that “Serious people in serious media venues have proposed disqualifying the Patriots from playing in the Super Bowl against the Seattle Seahawks”? Seriously, Charles P. Pierce? In short, Pierce is saying loud and clear that the scandal is reduced to mere “farce” (the exact word used in the title of his column) precisely because women feel entitled to express their own opinions about it.

Pierce goes on to say the following: “This is what I think: Once a scandal starts being discussed on The View, it stops being a scandal and becomes a sitcom. I think this should be a rule.” Memo to Charles P. Pierce: The notion that women are not to be taken seriously has been a rule for thousands of years. It’s called patriarchy.

Just as troubling is Pierce’s repeated assertion that the public reaction to the scandal has been so intense “because we are a nation of infantilized yahoos.” Infants, of course, can’t think for themselves. But I would suggest that the “the country proceeded to lose its freaking mind on the subject” because of a very well reasoned, and deeply felt, line of thought. Let me briefly sketch how the thinking goes. The Patriots have had the great good fortune over the last fifteen years, an eternity in pro football, to have arguably the greatest football coach of all time (Belichick) partnered with arguably the greatest quarterback of all time (Brady). And they got there by hiring a guy who got canned from his first gig as an NFL head coach and by drafting a guy in the sixth round, i.e. a quarterback prospect who was just as likely to get cut as he was to make the team. The Patriots essentially hit the football equivalent of the Powerball lottery. Because it turns out that Belichick is one of the maybe three or four authentic football geniuses who ever lived, and Brady is an assassin.

So, when you have the greatest coach of all time and the greatest quarterback of all time you already have every advantage a football team is ever going to need. And everybody knows this. The combination is not unprecedented, having occurred at least once before in the tandem of Bill Walsh and Joe Montana on the great San Francisco 49ers teams that won beaucoup Super Bowls in the 1980s. The interesting thing is that nobody hated the 49ers the way they hate the Patriots. Sure, you might have rooted against the Niners because you were tired of them winning all the time, the same way you might have rooted against Roger Federer when he was winning every time he stepped onto a tennis court, but no one truly hated the Niners or hated Federer, because you can only develop so much antipathy towards simple greatness. Greatness might breed a certain amount of envy but it also breeds respect, and there is no respect in the level of animosity we all feel towards the Patriots. Because, beginning with the Spygate scandal in 2007, when the Patriots were first caught cheating red-handed, the Patriots have come to represent something altogether different from greatness: gaming the system for an advantage you don’t need. The Patriots, who in the persons of Belichick and Brady have everything (Belichick’s brains, Brady’s looks, success, wealth, fame, and supermodel wife), are the 1%. The Patriots, like the 1%, embody the paradox of already having everything but still wanting more, and, moreover, trampling the rules that everyone else is expected to follow in order to extract that paradoxical surplus from those who have little to nothing precisely because they follow those very rules. How interesting that on the very same day that the DeflateGate story broke, Oxfam released its report detailing that the 80 richest people on the planet have as much wealth as the poorest 3.5 billion combined, and that by 2016 the richest 1% will have more wealth than all the rest of the 99% combined.

Pierce, then, is dead wrong in labeling us “infantilized yahoos” for “losing [our] freaking mind on the subject” of the New England Patriots’ deflated balls. A more telling critique is to press those, like me, who couldn’t stop thinking and talking about Belichick and Brady gaming the system for an advantage they manifestly don’t need, but who, like me, heard about the Oxfam report on the radio while my sports talk radio station was on commercial break, and quickly forgot about it as soon as ESPN radio’s Mike and Mike in the Morning came back from break to joke about Seinfeld’s “shrinkage” episode in re: deflated balls. If denial ain’t just a river in Egypt, then projection ain’t just how Belichick and Brady watch game film. The problem in projecting everything onto Brady and Belichick, of course, is that at least we get to watch those two squirm during their respective DeflateGate press conferences (which turned out to be little more than exercises in plausible deniability), while the 1% are as hard to find in their piles of money as a needle in a haystack. Note that we have yet to hear about DeflateGate from the real money behind the New England Patriots, team owner Bob Kraft. Perhaps the most important element of being rich these days is that money talks, so you don’t have to. And we only make it easier for them (the 1%) when all we talk about is football, even, as in the case of DeflateGate, when we are really talking about them. This doesn’t make us “infantilized yahoos,” but it does make us unwitting accomplices. Nevertheless, I hope they boot Brady and Belichick from the Super Bowl, i.e. I’m with Rosie O’Donnell.

Saturday, January 17, 2015

"Democracy" and its Discontents

I was walking in my local grocery store parking lot this week when I saw the following bumper sticker: “Re-elect no one!” Now, this being the parking lot of that progressive enclave known as Whole Foods, it might be tempting to dismiss this expression of political disenchantment as little more than liberal whinging. But pace the cherished caricature of whiny effete leftists which enables us to get on with the masculine business of getting our hands bloody, kvetching about the United States’ Congress spans the entire political spectrum. Gallup polls indicate that “In 2014, an average of 15% of Americans approved of Congress,” and, more importantly, that “The same percentage (15%) of Republicans and Democrats approved.” (http://www.gallup.com/poll/180113/2014-approval-congress-remains-near-time-low.aspx) Gallup does note that congressional approval percentages are lower when the chambers of Congress are split between the parties, as they were in 2014, but even when Congress was undivided, as it was most recently in 2009, approval still maxed out at a mere 30%. So, depending on the year, anywhere from 70 to 85% of Americans are kvetching en masse.

If Congress is getting consistent F’s from we the people, it is clear that we expect more from our representative body. The dynamic is one of “You work for us, but this isn’t working for us.” With this as our unstated or implicit consensus, it then becomes a matter of uncovering just why Congress isn’t getting the job done. So, for example, we might look at the gerrymandering of congressional districts: “If a substantial number of districts are designed to be polarized, then those districts' representation will also likely act in a heavily partisan manner, which can create and perpetuate partisan gridlock.” (http://en.wikipedia.org/wiki/Gerrymandering) But if gerrymandering dates back to at least 1812, when the word was first coined in honor of gerrymandering Massachusetts governor Elbridge Gerry, then how do we make sense of the fact that “over the past four years, Congress' approval ratings have been among the lowest Gallup has measured”? (http://www.gallup.com/poll/180113/2014-approval-congress-remains-near-time-low.aspx) I would suggest that things begin to make sense when we accept the fact that Congress is getting the job done, it’s just that they don’t work for us anymore.

Support for this suggestion comes from a recent research article authored by Princeton University’s Martin Gilens and Northwestern’s Benjamin I. Page. Gilens, in an interview with Sahil Kapur, describes their methodology as follows:

“What we did was to collect survey questions that asked whether respondents would favor or oppose some particular change in federal government policy. These were questions asked across the decades from 1981 to 2002. And so from each of those questions we know what citizens of average income level prefer and we know what people at the top of the income distribution say they want. For each of the 2,000 possible policy changes we determined whether in fact they've been adopted or not.” (http://talkingpointsmemo.com/dc/princeton-scholar-demise-of-democracy-america-tpm-interview)

Survey says!:

“Contrary to what decades of political science research might lead you to believe, ordinary citizens have virtually no influence over what their government does in the United States. And economic elites and interest groups, especially those representing business, have a substantial degree of influence. Government policy-making over the last few decades reflects the preferences of those groups -- of economic elites and of organized interests.” (ibid)

But a few factors turn the obvious handwriting on the wall into something more like one of those Snellen eye charts at the doctor’s office. In addition to the aforementioned “decades of political science research,” there is the intense emotional investment that Americans have in our small-d democratic self-image; asking an American whether they live in a real democracy has, since 1776 (or since the Civil Rights movement, if you don’t happen to be white), been the equivalent of asking if the Pope’s Catholic or if bears shit in the woods. Add to this the fact that the voting polls are still open, and that elections are still fiercely contested between two seemingly opposed political parties (only seemingly, given that they both dance to the same twin tunes “of economic elites and of organized interests”), and the idea that the United States is no longer a democracy begins to shrink down to the illegible font size at the bottom of the Snellen chart.

All that to say that the American public is likely to express its discontent with Congress by kicking Democrats and Republicans out of congressional majorities on a rotating basis for the foreseeable future. And it is almost certain that they will maintain record low approval ratings for Congress regardless of which party holds the majority. The former behavior is invested almost entirely in maintaining the illusion of democracy, i.e. it is a blatant form of that primitive defense mechanism, denial (“acting as if a painful event, thought or feeling did not exist” (http://psychcentral.com/lib/15-common-defense-mechanisms/0001251)), while the latter behavior ventilates the rage of our unacknowledged loss. In other words, having lost our democracy, we are stuck in the first two stages of grief, denial and anger.

Since we need our democracy back, lest civilization devolve into hell on earth over the next few decades (time is of the essence!), I would also suggest that we not proceed from anger and denial through the other three stages of grief: bargaining, depression, and acceptance. Instead, we need to cut the denial and, necessarily but at great risk, work with our anger. The risk is that our anger will burn out of control, taking the form of what is known in America as domestic terrorism; that the repression of this violence will be more violent still; and that our nascent police state will emerge from the other side of all this, enabled by previously unthinkable invasive technologies, in a position of total and implacable domination. Even given this risk, anger at the loss of our democracy is the only possible catalyst for resistance. But that anger must be yoked to the recognition that while war may very well be the continuation of politics by other means (politics having thus far been our saccharine substitute for democracy), peace is the actualization of an authentic democracy-yet-to-come. All of which makes me wonder if our late poet-prophet rock star, Kurt Cobain, was thinking of democracy when he wrote these lines:

“Come as you are, as you were,
As I want you to be
As a friend, as a friend,
As an old enemy.”