Sunday, December 28, 2014

Never Again

The horse has already left the barn, Pandora’s Box is already open, or the genie is already out of the bottle. Whichever cliché you prefer, each describes our new relationship to state-sponsored torture. Once beyond the pale, torture has crossed over as effortlessly as Taylor Swift making the switch from teenage country crooner to grown-up pop superstar. We now have a sitting Supreme Court justice, one Antonin Scalia, who, when asked about torture as a tool for interrogation, publicly opines that “We have never held that that’s contrary to the Constitution.” (http://www.salon.com/2014/12/22/7_worst_right_wing_moments_of_the_week_%E2%80%94%C2%A0rick_santorum_wants_you_to_know_hes_not_a_virgin_partner/)

For Scalia, torture is only ruled out by the Constitution as a “cruel and unusual punishment” for those already convicted of a crime. How convenient, then, the post-9/11 rolling back of habeas corpus rights, which enables the indefinite incarceration without trial of any individual the state identifies as an “enemy combatant.” Scalia’s reading of the Constitution legitimizes torture for the full duration of any such individual’s indefinite incarceration, i.e. the only way to stop the torture is to be convicted of acts of terrorism, except that as an “enemy combatant” one doesn’t have a right to stand trial for those very acts of alleged terror. Scalia’s brand of justice sounds rather like the old witch trials, in which the suspected witch was subjected to dunking and could either a) admit to being a witch or b) prove that she wasn’t a witch by drowning. The only difference now is that with water boarding you get to drown again and again; as an “enemy combatant” one doesn’t really quite exist, so one can’t actually die.

Even more troubling than this are the results of a recent ABC News-Washington Post poll conducted in the aftermath of the US Senate Intelligence Committee report on CIA torture. 59% of poll respondents think the CIA’s treatment (i.e. torture) of suspected terrorists was justified. William James once described religion as “a forced option,” reasoning that “We cannot escape the issue by remaining skeptical and waiting for more light, because, although we do avoid error in this way if religion be untrue, we lose the good, if it be true, just as certainly as if we positively chose to disbelieve.” Torture, I would argue, is a forced option in much the same way, such that the 9% of poll respondents who hazarded “no opinion” as to CIA torture have, in their indecision (or, worse still, apathy), definitively decided in favor of torture in the same way that agnostics have, per James, decided against religion. This, if my math is correct, brings the percentage of Americans approving state-sponsored torture up to 68%. Which, given the wiggle room of the polling margin of error, allows us to comfortably conclude that 7 out of 10 Americans are on board with water boarding. For context, note that this 68% is ten percentage points higher than the 58% of Americans who watched last February’s Super Bowl.
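To spell out the arithmetic (a quick check under James’s forced-option logic, in which “no opinion” counts as tacit approval; the roughly ±3.5-point margin of error is my own assumption for a national poll of this kind, not a figure reported above):

\[
\underbrace{59\%}_{\text{justified}} + \underbrace{9\%}_{\text{no opinion}} = 68\%, \qquad 68\% \approx 70\% \ (\text{within } \pm 3.5 \text{ points}) = 7 \text{ out of } 10.
\]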

I was eating lunch with a colleague at work the other day when Senator Dianne Feinstein appeared on the café TV screen explaining why the Senate Intelligence Committee’s report had been made public, an explanation that can be boiled down to the two most important words of her speech: “Never again.” But when Feinstein delivered these words my colleague replied without hesitation, “Oh, like that’s ever going to happen.” For my colleague, torture was already in the same category as death and taxes, leaving me to wonder how we got from the unthinkable to the inevitable in what seemed like the blink of an eye.

The answer, I would suggest, stems primarily from our deep-seated need to think of ourselves as fundamentally good people, a need that frequently manifests in the individual psyche, but which in this case also plays out in the realm of our collective identity. For many, if not most, Americans, one of the keystone reasons we believe ourselves to be good is that we are, indeed, Americans. Whatever our personal foibles, simply by virtue of living and working in the United States we contribute to and participate in what is perhaps best summarized in the opening credits for the old Superman TV show: “Truth, justice, and the American way.” So even if I’m just an average Joe (or Jane), I am a part of something larger than myself, and that something isn’t just good, it is, essentially, the Good in the Platonic sense of the term; just as Plato’s form of the Good “allows one to realize all the other forms” (http://en.wikipedia.org/wiki/Form_of_the_Good), the American Way allows one to realize truth, justice, and all of the other goods accruing to citizens of a land made, per Woody Guthrie, “for you and me.” Which Good isn’t all bad, at least when we kinda live up to the communitarian ethos embedded in Guthrie’s secular hymn. But what happens when the Good, like Marsellus Wallace in Pulp Fiction, announces its intent to “get medieval on yo’ ass”?

The “yo’” in question here, of course, is Islam, but it could just as easily be communism, China, or Mars, because all that really matters to the collective identity is that the Good is going to get medieval on somebody’s ass. The Good, you see, can’t get medieval on anybody’s ass. The Good can’t, to quote Marsellus Wallace again, “call a couple of hard, pipe-hittin’ (blokes) to go to work on the homes here with a pair of pliers and a blow torch.” Or at least the Good couldn’t. Past tense. But, per the 68 percent, now it would seem that the Good can. Get medieval. It’s not the American Way until suddenly it is.

Which brings us back to our need to think of ourselves as fundamentally good people, and the solution of subsuming ourselves in a greater (American) Good. The need doesn’t change, even as circumstances do, and even if those circumstances involve the Good perpetrating bald-faced evil. So when the Good gets medieval/evil, those of us who derive much of our sense of ourselves as basically good and decent people from the fact that we are Americans are subject to one of psychology’s most potent phenomena, cognitive dissonance, that “feeling of discomfort caused by performing an action that is discrepant from one’s self-concept.” (http://www.pearsonhighered.com/assets/hip/us/hip_us_pearsonhighered/samplechapter/0205796621.pdf) Or, in this case, the feeling of discomfort when the Good perpetrates evil in your name.

Dissonance theory maintains that there are two primary ways to respond to cognitive dissonance. The first involves “changing our behavior to bring it in line with the dissonant cognition.” (ibid) Senator Feinstein’s “Never again” is a perfect example of this approach. “Never again,” however, is extremely difficult and fraught with risk. Terrorism, as perpetrated on 9/11, is terrifying, indeed. Facing it down without resorting to ultra-violence of our own requires a courage that is excruciatingly difficult to muster. And it may fail. The attacks may come again.

Given the outsized difficulty and risk of “Never again,” it should come as no surprise that the 68% have managed their cognitive dissonance with the alternate approach of “attempting to justify our behavior through changing one of the dissonant cognitions,” or “by adding new cognitions.” (ibid) Put another way, “once we are committed to our views and beliefs, most of us distort new information in a way that confirms them.” (ibid) Making things even less surprising is the fact that the “closer people are to committing acts of cruelty, the greater their need to reduce the dissonance between ‘I am a good, kind person’ and ‘I am causing another human being to suffer.’ The easiest route is to blame the victim.” (ibid)

From here, it is all too easy to connect the dots:

1) America is Good (and by extension, so am I)

2) America engaged in the evil act of torture (and by extension, I am implicated, triggering cognitive dissonance)

3) Ipso facto, the torture of suspected terrorists was, far from evil, morally justified (and by extension, I am exonerated and cognitive dissonance is defused, despite the fact that torture was unequivocally evil right up until 9/11/2001)

No one, of course, is talking about any of this because no one is actually thinking about it: “the process of reducing dissonance is largely unconscious. Indeed, dissonance reduction works better that way.” (ibid) Perhaps this is yet another reason that Feinstein’s “Never again” fell so flat in the café at work last week and with 7 out of 10 Americans. You can’t get to “Never again” without stopping, facing your fear, and, crucially, thinking. And it is thinking that is sorely lacking in our unconscious resolution of cognitive dissonance, because even if torture weren’t unequivocally evil (which it is), one doesn’t have to think long and hard before realizing that the very best reason to abstain from torture is so that we, and especially those young men and women we send to the four corners of the earth to prosecute the war on terror, don’t become victims of torture ourselves. To our list of clichés we should add “what goes around comes around.” Pandora’s Box, indeed.

If we were to Monday morning quarterback the unforgivable decision to engage in state-sponsored torture we could modify another cliché and say that some rules are meant to never be broken. Taboos are taboos for a reason, and that reason, it turns out, is cognitive dissonance. Human weakness being what it is, it is all too likely that we will, as we have with torture, distort the truth in service of our sacred self-image. Kant’s categorical imperative might be helpful here: “Act only according to that maxim whereby you can, at the same time, will that it should become a universal law.” As impractical as the categorical imperative often is (we should indeed, contra Kant, lie in order to prevent a murder), we just need to know when to use it. And, it turns out, the categorical imperative is essential to the preservation of necessary taboos. As an intellectual/emotional exercise, ask yourself whether you would rather live in a world where Scalia’s “We have never held that that’s contrary to the Constitution” or Feinstein’s “Never again” becomes a universal law.

But the horse (torture) has already left the barn, and the truth (torture is evil) has already been distorted. In describing the future (“Never again”), Feinstein is talking to the past. But, if dissonance theory is correct and the only alternative to distorting the truth is to change our behavior to bring it in line with the dissonant cognition, then Feinstein and the rest of us in the 32% need to keep preaching to the empty choir stalls until they are fit to bust once more. And the only way we get there is if the truth matters a hell of a lot more to the 32% than the personal comfort that comes from feeling good about yourself does to the 68%. Knowing just how deeply I myself am committed to my own personal comfort and to my own self-image, I’d say that the odds, as they are if we choose to fight terror non-violently, are against us. We may fail. Torture may well continue. All we can do is fight to keep the barn door open, all along asking: what is the meaning, the truth, of the empty barn?

Sunday, December 14, 2014

Here Comes Trouble

Last week, within the space of twenty-four hours, my Uncle Bill snail-mailed me a well-intentioned copy of a Consumer Reports article putatively debunking the benefits of gluten free eating, and Slate.com published an account of the burgeoning gluten free backlash. (http://www.slate.com/articles/life/food/2014/12/gluten_free_fad_don_t_be_annoyed_says_celiac_disease_memoirist.html)

Note that I went gluten free voluntarily a little over three years ago, a few months before we found out that my daughter, Sammi, has celiac disease. For me, a gluten free diet has resulted in reduced fatigue and anxiety, results that haven’t flagged after 36 months, while for Sammi it has produced a return to basic good health. So I am your basic true believer; quitting gluten changed my life and, essentially, saved my daughter’s. Accordingly, I scanned Uncle Bill’s mailing so that I didn’t feel like the guy who only ever consumes one version of the news (be it Fox News or NPR) because he doesn’t have the humility and/or confidence to admit that his side might be wrong about a thing or three. It turns out that the best Consumer Reports could come up with is to assume that if you have given up gluten you must be eating gobs of rice, which rice, Consumer Reports is happy to report, might be high in arsenic. (The high-in-arsenic rice sounds a lot like the high-in-mercury tuna, which means that tuna might finally get some respect; just imagine how exhausting the whole “chicken of the sea” label must be for tuna, leaving tuna no choice but to constantly remind folks that the actual sequence of events had life evolving in the ocean first and then crawling out of the sea onto dry land, i.e. chicken should rightly be considered “tuna of the land,” although this label may now fall to toxic-in-high-quantity rice.) Other than considering giving up eating rice cakes lathered in sunflower butter and jelly for breakfast, which I consider a daily confirmation of security in my masculinity, I tossed the article into the recycling bin without a second thought. (The sheer femininity of rice cakes makes the English language’s lack of gendered nouns seem stifling; why can’t we have la rice cake and le Manwich, even if this would lead to the difficulty of deciding on a gender for more androgynous foods like the marshmallow, which takes nature’s most androgynous shape, the equally round and straight cylinder.)

The gluten free backlash described on Slate.com did, however, hold my interest. Mainly because of a New Yorker cartoon it quoted thusly: “I’ve only been gluten-free for a week, but I’m already really annoying.” Moi? Annoying? But perhaps that’s exactly why I’ve taken so well to a gluten free lifestyle. Nothing, you see, gives me more pleasure than annoying my wife Jen by, e.g., pronouncing words in a way that really annoys her. I have a whole repertoire:

• Catsup instead of ketchup

• Pronouncing falcon so that the first syllable rhymes with all rather than Cal

• Pronouncing karate like the original Japanese’s “car-ahh-tay” rather than the Americanized “ka-rah-tee” and with the emphasis spread equally across all three syllables as opposed to the standard American pronunciation’s emphasis on the second syllable

• Pronouncing the s on the end of Illinois

• Pronouncing vegan “vay-gan” instead of “vee-gan” (Since I enjoy being annoying, and not being an asshole, I don’t use this pronunciation in front of actual vegans. Although this doesn’t mean that I am not an asshole, or that being an asshole doesn’t give me pleasure that I am too self-deluding to acknowledge.)

But the strange thing is that unlike my relationship with Jen, in which I am annoying on purpose as a way of playing with her, in being gluten free I am unintentionally annoying; I am gluten free so that I can be anxiety- and fatigue-free, not so that I can turn down pieces of homemade cake offered to me in a spirit of kindness (which isn’t fun at all), and not even because I get to nonchalantly explain that it is gluten free bread when a friend calls me out for eating a sandwich (which is only fun if it does manage to annoy my friend; what more annoying response to “Gotcha” could there be than “No you didn’t”? As an example, see Pee-wee Herman’s preemptive “I meant to do that,” which is both funny and annoying).

This sheds light on one of the strangest dynamics in human interpersonal relations, which is that if I do something for myself that feels really good and that I’m really excited about, I naturally want to share it with others, if only by talking about it and how good it’s making me feel. But this sharing, defined as any outward expression of passion for my new undertaking, even including simply engaging in the behavior that makes me feel good, is automatically interpreted by any individual who doesn’t share my enthusiasm, or at least some form of sympathy towards it, as a judgment against them. In other words, “This is really cool, it changed my life and I’ve never felt better, and I highly recommend it,” gets automatically translated into and heard as “Excuse me, but the way you’ve been living your life is a big mistake.” And you can’t stop the translation by keeping your mouth shut; just going about your day and ordering a gluten free meal in a restaurant without otherwise making a peep is inevitably subversive.

In Tiger Writing, Gish Jen tells the story of a writing teacher who, in the midst of insulting Jen’s potential as a writer, explained that all good writing is subversive. The teacher was manifestly wrong about the exceptionally talented Jen, and I would suggest that he or she was also wrong about good writing. Writing needs to affirm as much as it subverts. But if we subtract one word from the lousy teacher’s formula, I think we are definitely on to something: All good is subversive. For whatever reason, we human beings seem to be hard-wired for a zero-sum game. If you have more access to the good, be it through gluten free eating, religious conversion, or simple luck, I necessarily have less. How else to account for Consumer Reports’ schadenfreude in their discovery of arsenic in the presumed staple of the gluten free diet?

My own faith tradition hints rather strongly at the subversive nature of the good: “Do not suppose that I have come to bring peace to the earth. I did not come to bring peace, but a sword.” Let us not be confused: Jesus is still very much the Prince of Peace; it’s just that establishing peace on earth requires rather a lot of conflict, even if this conflict takes the form of non-violent resistance. A pithy way of saying this in the Christian tradition might be to assert that there is nothing more subversive than spreading the good news. But whatever your faith tradition or lack thereof, the larger point in play here is that the good, however we experience it, be it via gluten free living or veganism, Christianity or Judaism, doesn’t make our life any easier. Instead, it does just the opposite, stirring up trouble for us wherever we go. That trouble could range from, on the low end, being the target of snarky New Yorker cartoons, all the way up to having your life threatened because of your religious beliefs.

In sum, the good leads to trouble, and the only way through that trouble is fidelity to the good, which guarantees more trouble. Annoying, isn’t it? But kind of funny, too.

Consumer Reports: “Looks like you’re eating arsenic for breakfast, buck-o.”

Me: “I meant to do that.”

Thursday, December 04, 2014

What Does a Man Really Want?

I happened to be driving in Delaware the other day when I passed a Harley Davidson dealership with a sign out front that read as follows: “Your wife called, and she says it’s okay.” This is quite possibly the most effective advertisement I have encountered since Miller Lite’s “Tastes Great! Less Filling!” debate (the genius of which hinged on reformulating an age-old question into its new, consumer-friendly form: “Is the glass half full or is it half full?”, which mutant question covers the range of sanctioned options available to anyone living and voting in a 21st-century western liberal democracy), and it works so well because it overtly winks at the fantasy on sale in the showroom. Which is the fantasy of male autonomy. The Harley’s signature exhaust blap is the trumpet fanfare announcing a man alone on his motorcycle, with the slight but noticeable edge of the outlaw, i.e. one who makes and lives by his own set of rules. So, with regard to the winking signage, what exactly does it mean to ask and receive permission to pretend to be that (autonomous) man?

Every married straight male knows that it is impossible to win an argument with his wife; if he doesn’t know it, he won’t be married long. As Camille Paglia puts it, “It is woman’s destiny to rule men.” (Us blokes do play our part, though, for, as Paglia also explains, “If civilization had been left in female hands, we would still be living in grass huts.”) So it turns out that for men it is better, indeed, to ask permission than forgiveness. In order to spare us this indignity, women have always known that the best way to get a man to do or agree to something is to make him think it’s his idea. Along these lines, perhaps the best way to get a man to feel autonomous is for a woman to let him pretend that he is.

The danger in this thinking, however, is in painting women as the source of male frustration, when, in fact, women have been, and always will be, the fountainhead of (straight) male desire. We must be careful to portray the woman who preemptively calls into the Harley dealership in order to grant her permission as a woman playing along with the fantasy, as opposed to understanding her as the demonic force undermining male autonomy, even, and especially, if that autonomy is an illusion perpetuated by female fiat. Because the former maintains the fragile male ego even as it sustains women in their place as masculinity’s legitimate holy grail, while the latter plants seeds of misogyny.

Understanding all of this requires getting at the root of male desire in order to see exactly why the illusion of male autonomy is enough, why men will (almost) always be satisfied by a game of make believe, by a Harley Davidson. Doing so requires asking two simple questions: 1) Is marriage a better deal for men or for women?; and 2) Which exactly is the weaker sex? Correctly answering both of these will provide the answer to a third elusive question, one that eluded Freud, who famously never asked “What does a man really want?”

Taking our questions one at a time, we begin with #1) Is marriage a better deal for men or for women? The data, quoted from Foxnews.com of all places (http://magazine.foxnews.com/food-wellness/love-better-mens-or-womens-health), are nearly unanimous, category by category:

• Longevity: “The link between marriage and longevity is much stronger among husbands than wives… Marriage is especially good at warding off fatal accidents, violence, and other semi-avoidable calamities, which are more common in younger people… But regardless of age, men's life spans appear to benefit more from marriage than women's.” (emphasis added)

• Heart disease: “While married men are three times less likely to die from heart disease than men who have never tied the knot, marriage only halves the risk of cardiac death for women.”

• Healthy choices: “Simply put, women may be a better influence on men than vice versa. Wives tend to be the more emotionally supportive partner and are more likely to encourage their husbands to refrain from drinking or smoking.”

• Stress: “Contrary to popular belief, men tend to get stressed out more easily than women. Lab experiments have shown that when given a stressful task, men exhibit greater spikes in the stress hormone cortisol than women. Fortunately for men, being in a romantic relationship — not just marriage — may curb their stress response. A 2010 experiment found that paired-off men had smaller spikes in cortisol levels than single men after taking part in a competitive game, whereas single and spoken-for ladies had comparable cortisol increases.”

• Sex: “where sex is concerned, marriage appears to be a better deal for men. In a landmark national sex survey conducted in the 1990s, 49 percent of married men said they were ‘extremely’ emotionally satisfied with their sex life, compared to just 33 percent of men who were unmarried or not living with a partner. By contrast, only 42 percent of married women were extremely satisfied with their sex lives, compared to 31 percent of women who didn't live with a partner.”

So, men get more longevity, better health, less stress, and better sex out of marriage than women. One can either wonder at the odds of a marriage proposal being accepted, 1 in 1.001 (it goes without saying that 95% of proposals come from men), or one can consider the possibility that men get more out of marriage because they need more. Which brings us to question #2) Which exactly is the weaker sex? If we take our Darwin seriously, the data regarding this question are just as definitive. Because women, it comes as no surprise, outlive men, i.e. they are basically fitter. But what just might surprise are the facts as reported by Robert Krulwich at NPR.org:

“Women, it turns out, don't just win in the end. It seems that women consistently outlive men in every age cohort. Fetal boys die more often than fetal girls. Baby boys die more often than baby girls. Little boys die more often than little girls. Teenage boys, 20-something boys, 30-something boys — in every age group, the rate of death for guys is higher than for women. The difference widens when we hit our 50s and 60s. Men gallop ahead, then the dying differential narrows, but death keeps favoring males right to the end.” (http://www.npr.org/blogs/krulwich/2013/06/17/192670490/why-men-die-younger-than-women-the-guys-are-fragile-thesis)

Of all the possible reasons Krulwich explores for this longevity gap, only one seems to stand up to the simple fact that the gap exists in every age cohort, even in utero, and it is also the one which dovetails nicely with the idea that men bring more needs into marriage. The culprit? Simply put: male weakness. Krulwich’s 1934 quotation from Mayo Clinic doc E.V. Allen is well worth repeating:

"For each explanation of the lack of inherent vitality of the male there are objections, but these do not influence the fact; the male is, by comparison with the female, a weakling at all periods of life from conception to death. Venery, alcoholism, exposure, overwork, and various other factors may influence the susceptibility to disease and the greater mortality of the adult male, but they are only straws placed on the greater burden of his sex-linked weakness. There seems to be no doubt that, speaking comparatively, the price of maleness is weakness."

I would argue that men intuitively know this about themselves, and that a man’s greatest wish is to transcend this weakness. To answer the question Freud never asked by way of borrowing a phrase from Spock, what a man really wants is to live long and prosper. Given what we know about the effects of marriage on male longevity and prosperity (in the holistic sense of the word), when we say that what a man really wants is to live long and prosper, what we are really saying is that what a (straight) man really wants is a good wife.

It would, however, be naïve to close without recognizing that this arrangement is not without its complications. Or else the odds of a married couple reaching their 25th anniversary would be higher than 1 in 6. (These odds, as well as the aforementioned odds of a marriage proposal being accepted are courtesy of Stewart O’Nan’s captivating novel cum meditation on marriage, The Odds: A Love Story.) I will leave it to the stronger sex to explain why men are so bloody difficult to cohabitate with, but will take a brief stab at explaining the undercurrent of resentment that men feel towards women, even and especially as women function as our salvation, a resentment perhaps best captured by the old saying “Women, can’t live with ‘em, can’t live without ‘em.” By way of explanation: Women, can’t live with ‘em (because, per Paglia, they rule over us), can’t live without ‘em (because without them we will quite literally die). The solution to this paradoxical masculine impasse: the fantasy of male autonomy.

So ladies, do let your man have his Harley Davidson, and do call up the dealership to let them know it’s okay. Let your husband spend quality time in what Mr. Rogers liked to call the Neighborhood of Make Believe, and he will, again quite literally, have more years to spend right here on earth with you, his good wife, which is all he really wants.

Sunday, November 23, 2014

These are not the Droids You're Looking For

One of my favorites among the many unforgettable scenes in Star Wars comes when Obi-Wan Kenobi is trying to slip through the Empire’s tentacles in the Mos Eisley spaceport with Luke, R2-D2 and C-3PO in tow. The latter two, of course, were wanted by the Empire for stealing away with the plans to the Death Star after being jettisoned from Princess Leia’s consular ship, the Tantive IV, and landing on Tatooine. With Mos Eisley crawling with Storm Troopers, Kenobi’s party eventually comes to an Imperial checkpoint. With the two droids in plain sight, the jig appears to be up. But, with a wave of his hand, Obi-Wan simply says “These are not the droids you’re looking for.” Like a perfectly compliant husband, the Storm Trooper immediately repeats back the required thought: “These are not the droids we’re looking for.” Moments later, Kenobi, Luke and the wanted droids breeze through the checkpoint, scot-free.

One of the interesting things about the Jedi Mind Trick, the Force power utilized by Obi-Wan at the Mos Eisley checkpoint, is that you’ll actually hear people here on earth talking about using it themselves. This, in my own experience, is unique to the Jedi Mind Trick among all of the Force powers; people generally don’t go around, e.g., talking about how they levitate their car keys to themselves from across the room. But they do go around talking about how they used the Jedi Mind Trick to, e.g., get out of a speeding ticket. Until recently, whenever I heard a claim like this I took it as a cute way of saying you had talked your way out of said speeding ticket. But a new understanding of how the mind works suggests that the two methods of getting out of the ticket are distinct, a distinction which (accurately) presumes the reality of the Jedi Mind Trick.

The really existing Jedi Mind Trick is made possible by mirror neurons: “A mirror neuron is a neuron that fires both when an animal acts and when the animal observes the same action performed by another. Thus, the neuron ‘mirrors’ the behavior of the other, as though the observer were itself acting.” (http://en.wikipedia.org/wiki/Mirror_neuron) In other words, if I’m sitting in a La-Z-Boy recliner and you’re standing in front of me doing jumping jacks, the same neurons that are firing in your brain as you jump are firing in my brain as I laze. This has all kinds of implications, and potential implications:

“mirror neurons may be important for understanding the actions of other people, and for learning new skills by imitation. Some researchers also speculate that mirror systems may simulate observed actions, and thus contribute to theory of mind skills, while others relate mirror neurons to language abilities… In addition, Iacoboni has argued that mirror neurons are the neural basis of the human capacity for emotions such as empathy. It has also been proposed that problems with the mirror neuron system may underlie cognitive disorders, particularly autism.” (ibid)

A close reading of this laundry list of potential implications reveals that research on mirror neurons to date (they were only discovered by researchers in the 1990s) has focused on the impact of mirror neurons on the receiving end; mirror neurons have been conceptualized in what I would call a passive voice. Because of my mirror neurons I empathize, I imitate, I relate, even and especially verbally. But a far more interesting (to me) way of understanding the possible implications of mirror neurons comes when we think of them in the active voice. Because what does it mean that I can fire whichever neurons in your brain I wish to simply by firing them in my brain in your sight? It means, of course, that the Jedi Mind Trick is real, and that instead of spinning a convincing tale of exactly why I should not be held accountable for doing 55 in a 35, e.g., I am hurrying home in order to get dressed to go to church (a real excuse shared with me by an old friend when we were teenagers, and one that worked), one simply needs to exhibit the mannerisms and tone of voice of someone who should, at worst, be let off with a warning. Of course, the genius is in knowing exactly what those mannerisms and tones consist of, which is why it is the Jedi Mind Trick, i.e. this is advanced stuff. But I would hazard a guess that more than practicing 7 particular habits, the world’s highly effective people, those who get things done, are all people who have mastered the Jedi Mind Trick.

As everyone knows, the Force can be used for good or evil, and the Jedi Mind Trick is no exception. “Because strait is the gate, and narrow is the way, which leadeth unto life, and few there be that find it.” Among the many potential abuses one thinks of seduction, and the adoption of the mannerisms and tone of someone who truly, deeply cares when nothing, in fact, could be further from the truth. Caveat emptor, indeed. But, if we are going to get on with saving the planet, mirror neurons and the Jedi Mind Trick just may supply the means to do so. Because the radical truth found in mirror neurons is that the social world is essentially a mirror. The most powerful act in the world is simply to quite literally gaze peacefully into that mirror. And never blink. Or, more realistically, try again tomorrow when you inevitably blink. Here on earth, Yoda’s “Do or do not. There is no try.” must always be translated into Beckett’s “Try again. Fail again. Fail better.” Look longer.


Sunday, November 09, 2014

Get Thee to a Primary School

Every now and again House of Pain’s bombastic 1992 single, “Jump Around,” still gets played on the radio. Naturally, I turn it up full blast and wait for the opening of verse two, at which point I rap along with Everlast to my favorite single line of lyrics in record industry history: “I’ll serve your ass like John McEnroe”

Unpacking the lyrics’ impact first requires tearing off the sport of tennis’ country club veneer to reveal the truth at its core, captured precisely by David Foster Wallace when he described tennis as a hybrid of chess and boxing. In other words, tennis is the ultimate combination of physical and mental combat. And yet the force of the lyrics is almost entirely embodied by McEnroe, if not the sport’s greatest player, certainly its singular genius, as perhaps best described by Dick Enberg’s remark that “everyone else plays tennis, McEnroe plays music.” So, McEnroe was artist and, per D.F. Wallace’s definition of tennis, warrior. But, just as importantly, he was also quite infamously the anti-hero. And because we all tacitly acknowledge that almost every hero is on some important level a fraud, leaving anti-heroes as, to borrow a phrase from Princess Leia, “our only hope,” we loved McEnroe not in spite of, but precisely because of all the Sturm und Drang. Artist, warrior, anti-hero. In evoking John McEnroe, Everlast has captured the three primary elements of contemporary masculinity. “Jump Around,” then, becomes the response, twenty years later, to Helen Reddy’s 1972 smash hit and the unforgettable lyrics, “I am woman, hear me roar.”

Would that we could stop there, declare “Jump Around” the yang to “I am Woman’s” yin, and celebrate our enlightened post-feminism. But the potpourri of art, war, and anti-heroism that makes up contemporary masculinity has a certain stench to it. And the odor wafts right out of “Jump Around,” in particular those opening lines of the second verse; “I’ll serve your ass like John McEnroe” is but the first half of a couplet. Completing the rhyme is this: “If your girl steps up, I’m smacking the ho”

That “Jump Around” pairs the definitive image of really existing masculinity with violence against women is, I would suggest, no coincidence; there are any number of words that end with a long-o sound. The couplet could just as easily have been, e.g., “I’ll serve your ass like John McEnroe, and my girl drops elbows like she fights for G.L.O.W.,” (G.L.O.W., of course, standing for the quite real Gorgeous Ladies of Wrestling), or “I’ll serve your ass like John McEnroe, if you step to me in battle you’ll get doused by my flow.” That’s battle as in freestyle rap battle, folks, which, based on the quality of my imaginary substitute “Jump Around” lyrics, is a battle I am clearly not yet ready to wage. But you get my drift, which is that when Everlast penned the lyrics that would come to define modern man, he had no choice but to complete the definition with the second half of the couplet. I.e., language was speaking through Everlast, as it so often does with all of us, when the picture of today’s man was paired with the image of male violence against women. Language knows what so many of us pretend ignorance to. Language, the Symbolic Order that Jacques Lacan (rightly) suggests is the cost of doing any human business, knows that “I’ll serve your ass like John McEnroe” is the dictionary definition of the word man, and “if your girl steps up I’m smacking the ho” is the picture in the dictionary next to that definition.

How exactly have we arrived here, here being the time and place where manhood is not just inextricably linked to violence against women, but in fact emerges out of violence against women? Because if art, war, and anti-heroism dominate the foreground of masculinity, that foreground is thrown into sharp relief by the backdrop to it all, a background scene of symbolic and actual violence against women perfectly captured by Everlast’s “smacking the ho.” (See “smacking” for the actual physical violence, and see “ho” for the perhaps more dangerous and inevitably more potent symbolic violence.) Author Judy Y. Chu’s new book, When Boys Become Boys: Development, Relationships, and Masculinity, offers some new insight into just how we’ve taken this extraordinarily wrong turn. (Full disclosure: when I went to pick up When Boys Become Boys from the local bibliotheque, it was already checked out. So we are relying here on the informative review posted at http://metapsychology.mentalhelp.net/poc/view_doc.php?type=book&id=7231. ) Chu’s research-based thesis is that boys become boys “in opposition to femininity,” i.e. a boy is that which is not a girl, but, more specifically, not a girl by virtue of aggression, and, ultimately, aggression towards the feminine.

Just so, Chu (as quoted in the mentalhelp.net review) writes that "…boys' socialization towards cultural constructions of masculinity that are defined in opposition to femininity seems mainly to force a split between what boys know (e.g., about themselves, their relationships, and their world) and what boys show. In the process of becoming ‘boys,’ these boys essentially were learning to disassociate their outward behavior from their innermost thoughts, feelings, and desires.” Moreover, Chu’s research supports the disturbing reality that the split between what boys (and men) show and what they know plays out as violence against girls (and women), as per the mentalhelp.net review:

“Another significant find in her study was the fact that one of the high status boys in the class came up with an all boy group named ‘The Mean Team.’ The Mean Team targeted girls in the sense that they teased them or were mean to them. Being a successful Mean Team participant ensured a higher status and participation in a group and became another way for the boys to establish hierarchy and segregate themselves based on gender.”

Being one of the boys, which is precisely how one becomes a man in this culture that has no truck with rites of passage, requires joining The Mean Team. That is to say that the harsh reality is that men are The Mean Team. And the even harsher reality is that this includes just about every last one of us, which sounds confusing given the relative abundance of nice guys, until one realizes that nice guys are the good cops to the overtly mean guys’ bad cops. While bad cops busy themselves “smacking the ho,” and while good cops go about the business of making nice to only those women the world deems “good girls,” rest assured that they are both policing female bodies and the symbols that represent those bodies. (As an example of this policing, see every 13-year-old good girl who wears skirts and shorts that would make even Daisy Duke blush in the desperate hope of earning the attention of the nice guys in 7th grade.) If the only way forward is to first admit that you have a problem, then the unvarnished problem is this: like the doctrine of the privation of good, which holds that evil is the absence or lack of good, masculinity as we know it, and as understood in the work of Chu, is the absence of femininity. To connect the dots, masculinity as we know it is, to the degree that it is constituted in the absence of the feminine and then organized in aggression towards that very constitutional lack, evil.

The monumental task before us is to reinvent masculinity so that, returning to an image from the third paragraph of this essay, masculinity at long last takes its rightful place as the yang to the feminine’s yin. Building on Chu’s work, masculinity must be reinvented such that it is no longer born out of opposition to the feminine, but instead, building on the concepts inherent to the yin and the yang, blooms in complement to the feminine (with, to be clear, both masculine and feminine paths open wide to people with either set of genitalia, although I would maintain that it is certainly reasonable to expect certain sets of genitalia to generally gravitate towards one pole of the masculine/feminine continuum, at least for now). Borrowing from the idea that real human history will only begin once a genuine socialism has taken root, I would argue that authentic masculinity will only begin once genuine nonviolence has won the day. But in a time when “world peace” has been reduced to a clichéd punch line, global nonviolence isn’t even a glimmer in our collective third eye. So, we need to take the very first baby steps towards reinventing the masculine and undoing the current chokehold of evil. In planning these very first steps we would be wise to note that 98% of pre-school and kindergarten teachers are women, as are 81% of elementary and middle school teachers. Is it any wonder that boys are defining themselves in opposition to women, when there are quite literally little to no men around to model themselves after? If you are a man with even the vaguest sense of commitment to nonviolence, and if you want to (again quite literally) save the world, you should take a job in any primary school that will have you. It is a place to begin now, in a hurry, before we find that the show is abruptly over.

Sunday, October 26, 2014

The Wisdom of Leisure

In his provocative 1927 essay, The Future of an Illusion, Sigmund Freud outlines his take on the root source of humanity’s seemingly unflagging suffering, finding it in “the sacrifices which civilization expects… in order to make a communal life possible.” These sacrifices consist, for Freud, in a coerced “suppression of the instincts,” foremost among which instincts are sex and aggression, with a nod to death. But a crucial passage, and, as I will argue, a crucial error, in the opening pages of The Future of an Illusion points us in an entirely different direction from Freud’s theory of civilization as the ground of an inevitable discontent. And, while not a guarantor of unguarded optimism (our earth is far too close to the brink for anything but the most closely guarded forms of optimism), this opposite direction is, or has the potential to be, hopeful.

The passage reads thusly:

“For masses are lazy and unintelligent; they have no love for instinctual renunciation, and they are not to be convinced by argument of its inevitability; and the individuals composing them support one another in giving free rein to their indiscipline.”

His near-bottomless contributions to our understanding of the human psyche notwithstanding, Freud here has it exactly backwards. Because masses are (quite often individually and without question collectively) so intelligent that I would call them wise, and, furthermore, their genius consists precisely in what Freud calls laziness, but what I would describe as the wisdom of leisure.

In short, I am suggesting that the root source of suffering isn’t Freud’s suppression of the instincts (or, for that matter, Buddhism’s desire, or Christianity’s original sin, etc., etc.), but simply the fact that civilization (as we know it) is so much damn work. (There is some irony in the fact that Freud, a self-described “godless Jew,” sounds, in the passage quoted above, like a mouthpiece for the God-subscribing, goyish “Protestant work ethic.”)

While exercising caution that we don’t romanticize hunter-gatherer civilizations as some kind of Garden of Eden, it is nevertheless instructive to consider the relative workloads of our ancient forebears. To do so, we turn to Charles Eisenstein’s The Ascent of Humanity: “Ethnographic studies of isolated Stone Age hunter-gatherers and premodern agriculturalists suggest that ‘primitive’ peoples, far from being driven by anxiety, lived lives of relative leisure and affluence.” Eisenstein then describes anthropologist Richard Lee’s study of the !Kung of the Kalahari Desert, which found that, for the !Kung, “an average workweek consisted of approximately twenty hours spent in subsistence activities,” and that “Moreover, much of the ‘work’ spent on these twenty hours of subsistence activity was by no means strenuous or burdensome.” Compare this to the typical forty-hour workweek, which forty hours is a paltry sum when one considers the hours clocked by anyone with any real aspirations to climb the career ladder. (America is run by workaholics; we know this because they are emailing the rest of us between midnight and 3:00 AM.) Ask yourself also if some or much of the work you perform in your forty-plus hours isn’t “strenuous or burdensome.”

In sum, “premodern” civilizations worked a lot less than us and were also a lot less anxious. Work less, feel better. A simple formula that we, in all our technological glory, just can’t seem to grasp. But, of course, civilization isn’t going anywhere anytime soon. Leaving us stuck with Freud’s “love and work,” when what we really need is “love and play.” But the closest we seem to be able to get to the latter, as reported by The Washington Post, are employers (usually outdoorsy activity gear companies) who encourage us to take a half-hour break for a hike, or a five-day paid vacation to go camping somewhere really pretty, with a nod to the bottom line. (http://www.washingtonpost.com/business/a-company-that-profits-as-it-pampers-workers/2014/10/22/d3321b34-4818-11e4-b72e-d60a9229cc10_story.html?hpid=z5) Happy workers make productive workers. Just so, in our civilization, play is always in the service of work (and profits). We will know the revolution has finally come when we can each devote our lives to working really hard on something out of sheer pleasure, i.e. in a spirit of play. As Terry Eagleton says, socialism is about “leisure, not labor.”

Of course, Eagleton is quick to remind us of Oscar Wilde’s wry observation that “The problem with socialism is that it takes up too many evenings.” Wilde’s insight provides much needed ironic distance from the cruel truth that the never-ending tide of strenuous and burdensome work can only be undone by yet more such work, making the dismantling of labor in favor of leisure something of a damned if you do, damned if you don’t proposition. All of which makes going through the automatic motions of showing up for work every day in order to put food on the table a symptom of the paralysis that comes from knowing that the only possible cure for what ails us is more of what ails us. In assessing the depth of this paralysis I would submit that the double bind diagnosed by Wilde has as much to do with the failure of really existing 20th-century socialism to displace capitalism as do the atrocities perpetrated by Stalin and his ilk, if not more. Whereas the latter marked socialism as a brutal failure in Eastern Europe and Asia, the former made it a non-starter in the west, meaning it was over before it ever even started.

In place of socialism’s cure, perhaps the best we can hope for at this moment in time is a therapy, with at least one eye always watching for the event that will break open the possibility of playful leisure, which is the possibility of the reemergence of the human. We need new words for socialism anyway, given the tragic emptying of the term via unstaunched 20th-century bloodletting, and playful leisure just might do. We need words that can never again lead to the gulag and the show trial, but that are still apposite to the soul-crushing age in bloom on both sides of where the old Berlin Wall once stood. If the revolution is to succeed, it has to be funny (i.e. playful).

In the meantime, our therapy takes the axiom for our age, “Work less, feel better,” and translates it into a question that can be applied at every last decision point: How much is enough?


Tuesday, October 14, 2014

Making the Unpredictable Inevitable

As I make my daily internet check of the news headlines, of which the outbreak of Ebola and the brutal beheadings by ISIS followed by the subsequent renewal of perpetual American bombing are but the latest typical installments, I am now realizing that I can’t even imagine a potential major news event that would qualify unequivocally as good news. The only recent exception, and an exception exactly because its glad tidings were so unexpected, was the announcement by LeBron James that he would be returning to his hometown Cleveland Cavaliers after publicly jilting them four years prior, in one of the biggest PR blunders of all time, by infamously “taking (his) talents to South Beach.” Perhaps there were just enough overtones of the Parable of the Prodigal Son to evoke that quintessential book of good news, the gospels. I would also venture that LeBron’s return to Cleveland qualifies as one of Alain Badiou’s events, which are described by Clayton Crockett and Jeffrey W. Robbins in Religion, Politics, and the Earth thusly: “when a singular event occurs, it is an event, because there is something completely unpredictable or unforeseen, and it enables people to invent new ways of thinking and living in response.” Just so, LeBron’s return to Northeast Ohio from South Beach contains the radically new (for us) perspective that the best and most important place in the world is wherever one’s soul happens to be rooted. Before LeBron, moving from Northeast Ohio to South Beach and then, voluntarily, back again to Northeast Ohio was as unthinkable as reversing the flow of time. And while LeBron hasn’t reversed time, the event of LeBron’s return to Cleveland has shifted the flow of this particular river of spacetime that we call home in a slightly, but also noticeably, better direction.

But outside of LeBron, the daily rundown of news headlines has begun to feel like a countdown to The End, each news item the tick of one more second off the fast expiring clock. When the news media, and the world, are experienced precisely this way, one particular piece of bad news mutates into the exception that proves the rule found in the inverse of “no news is good news,” i.e. all news is bad news. That exception, the bad news which gets translated into good news by the filtering effect of all the other bad news, is, of course, climate change. I know this because when I scan the daily headlines I always click on the articles about climate change in the hopes of hearing the good news that there is more bad news about climate change. And I know this because the people writing these climate change update pieces can barely contain their glee, too. Case in point, a recent Salon article reporting that the oceans are heating up more quickly than previously realized. (http://www.salon.com/2014/10/06/the_oceans_are_heating_up_a_lot_more_quickly_than_we_thought/) The article was subtitled “New data is bad news for anyone who hoped global warming was on hiatus,” but is understood by the Salon readership to actually mean “New data is good news for anyone who feared that global warming was on hiatus.” After explaining how the slowing of global warming is more than offset by a rise in the oceans’ temperatures “about 24 to 58 percent more quickly than models suggested,” the article quotes oceanographers Gregory Johnson and John M. Lyman’s assertion that “One could say that global warming is ocean warming.” (Oceanographers renaming global warming as ocean warming does sound a little bit like a football team’s offensive coordinator exclaiming that “the best defense is a good offense.”) From there, it is short work for article author Lindsay Abrams, whose byline identifies her as “reporting on all things sustainable,” to close by declaring that “In other words it (climate change) isn’t over. It’s just getting started.” Which, like her subtitle, comes off as appropriate handwringing about climate change, if, that is, one ignores the subjective heartbeat thrumming throughout the entire article. Subjectively speaking, Abrams’s closing thought is, quite simply, ecstatic.

This ecstasy is grounded in the political left’s understanding of climate change as the guarantor of the left’s belief that the future belongs to us. It is a vision grounded in the mythological narrative that the left has conflated with the brute facts of climate change. This mythology pictures climate change as the flames of late capitalism burning itself down, and then, rising from the ashes like a phoenix, voila, a new democratic socialism. Or anarcho-syndicalism, which is fine because it isn’t capitalism. As someone who traffics in this teleological fantasy as often as I scan the news headlines, “It’s just getting started” sounds like a battle cry of freedom. The fantasy hinges on the seemingly commonsensical logic that since it is abundantly obvious that globalized capitalism caused climate change, and since climate change is horrifically bad, then capitalism will at last be seen for the malignant cancer that it is and, at long last, be consigned to the dustbin of history.

The problem with this is that it ignores the first axiom of common sense, which is that the best predictor of future behavior is past behavior. Which makes it highly probable that capitalism will succeed in turning climate change into yet another business opportunity, e.g. by commodifying “sustainability” via a “green economy.” This process is already in full swing at your local Whole Foods and Toyota Prius dealership. Which is not to say that eating organic or conserving energy are bad in and of themselves, but that, like recycling, they not only won’t save us, but will also pave the road to hell with good intentions. (That intention, of course, is a “free market economy” that works for all of us, which is akin to running on a platform of “Elitism for everyone!”, making us the gullible fools who really believe the carnival barker when he says of capitalism, “Step right up, everyone’s a winner!”) This is not to say that climate change won’t ultimately render the earth uninhabitable, but to say that climate change may not present a “limit to growth” until that rendering is a fait accompli. Two metaphors may help. The first is of a giant balloon that keeps expanding right up until the moment it has sucked in the very last drop of air. Or, if you prefer, the Blob, which continues expanding until it has consumed every last molecule.

In short, we should be seeking our own liberation, and hope that in the process we save the earth, as opposed to counting on the death of the earth to save and liberate us. The problem is that just as I can’t imagine any good news other than more climate change, we can’t think of anything we can do to liberate ourselves. Our challenge, then, in imagining our liberation is in thinking the inconceivable. We should ask ourselves, “What is an impossible future?”, and start there. And if we lose our nerve, recall that LeBron is already back in Cleveland. So, because it would only be appropriate to close by putting a paradoxical spin on Badiou’s events with a line inspired by a sports movie, Field of Dreams’ “If you build it, he will come,” I would say this:

If we can think it, the completely unpredictable and unforeseen event is inevitable.

Sunday, October 05, 2014

Definitely Maybe

You know an ideology is totalizing when the only conceivable resistance is more of the very same ideology. Case in point: the Baltimore Sun recently reported on a City Council bill to outfit all 3,000 BCPD officers with body cameras. (http://articles.baltimoresun.com/2014-09-22/news/bs-md-ci-police-cameras-20140922_1_body-cameras-police-brutality-baltimore-police-officer) The bill comes on the heels of several instances of alleged police brutality, including the death of “44-year-old Tyrone West, who died while he was in police custody,” (ibid) and an incident in September in response to which “Baltimore police officials suspended an officer shown on camera beating a man at a North Avenue bus stop.” (ibid)

To get a sense of how the proposed body cameras fit into the ideology of total surveillance, one must first situate the police within the structure of the surveillance state. Police officers contribute very little to the actual surveillance of the American citizenry; that function is filled quite capably by a slew of other actors, most notably the NSA by way of its monitoring of our email and cell phone accounts, but also by corporate America through the monitoring of its employees’ and consumers’ behavior. Regarding the former, if your job involves a computer your boss essentially knows your every move. And as to the latter, cameras are ubiquitous in brick and mortar stores, and every click on the internet is reduced to raw data. So, whether one is on-line, at work, or in the (“virtual” or “real”) marketplace, or even just in range of a cell phone camera, one is essentially in Bentham and Foucault’s panopticon, where a “single watchman (can) observe (-opticon) all (pan-) inmates of an institution without the inmates being able to tell whether or not they are being watched.” (http://en.wikipedia.org/wiki/Panopticon) But if the police aren’t the panopticon’s “single watchman,” what, then, is their role? Which is where police violence, exponentially increased by police militarization, comes into play. Because if surveillance is the brains of the operation, police are the muscle.

Foucault was perhaps most famous for Discipline and Punish, a work exploring our panopticon culture. Playing off the title of that work, I would say that if we are disciplined by surveillance, then we are (corporally) punished by the police. Violence always carries a message. Where domestic violence says “This is our little secret,” and where terrorism of all stripes (i.e. including State terror) says “This could be you,” police violence now says, to borrow a famous phrase from George Orwell, “Big Brother is watching.” Police violence need only explode intermittently to serve its purpose; like the watchman’s gaze in the panopticon, the mere fact that it might happen upon you is enough. Because, as Orwell explains in his masterpiece, Nineteen Eighty-Four, “there was of course no way of knowing whether you were being watched at any given moment… you had to live… in the assumption that every sound you made was overheard, and, except in darkness, every movement scrutinized.” The NSA might be watching, and the police might decide to treat me like Abner Louima. The whole point of these “mights” is that, as Orwell astutely points out, it is safer to live in the assumption; maybe is, paradoxically, definitive.

You know an ideology is totalizing when the only conceivable resistance is more of the very same ideology. This is just a fancy way of saying if you can’t beat ‘em, join ‘em. By pinning our hopes for relief from police brutality on police body cameras, we are endorsing the very surveillance that police brutality announces. In doing so we are continuing to follow Orwell’s script, retreating from the rushing darkness by escaping into an enveloping darkness.

The old question, “Where are the police when you really need them?”, has taken on new meaning in our surveillance state cum militarized police. Once a rhetorical question, it now has an answer. They are on camera. And if you’re wondering whether this means you should be preparing for your big close-up, check out the title of Oasis’ (bitchin’) first album, Definitely Maybe.

Sunday, September 28, 2014

Asking the Impossible

I should be clear from the start that football has been a part of my life for as long as I can remember. More to the point, it has been a pleasure, a source of joy, even. As a kid, I played back yard football whenever I could, which was more often than you might think given that the closest pick-up game was over a mile away, most of which was down a long dirt driveway. When I couldn’t get a game I spent hours punting and kicking back and forth across the front yard, which wasn’t much smaller than a proper gridiron (we lived in the country). During this same period a picture of the Heisman Trophy cut from a magazine hung from my bedroom door for years. In high school I finally got to play organized football, and I can still remember the twinge of melancholy whenever our coach got us focused in practice by reminding us that these eight or nine games a year were the last football games in which almost every last one of us would ever get to play. (We weren’t bad, top ten in Delaware my last year, but Delaware isn’t Texas so only a couple of us would go on to play Division III college football, and I wasn’t one of them.) I have watched football on Saturdays and Sundays in the fall since at least 1980, the year of the first Super Bowl to enter my consciousness, XV between the Raiders and Eagles, when we still lived in Pennsylvania. (My childhood dog, Riggins, was named after the hero of Super Bowl XVII.) And though we haven’t had a TV in nine years, I’ve still managed to watch football on Sundays at my mother-in-law’s house. And any time my favorite teams win, I get a blast of dopamine in my head. I listen to sports radio, which is at least 75% football-focused in-season or out, every day that I drive my car. I check the sports page on the internet every day, and probably 75% of the content is similarly devoted to football. Most years I have a (losing) fantasy football team. (Like sands through the hourglass, so far this year my squad, Aqua Velva, has one win to two losses.)

I once heard it said that one shouldn’t make a religion out of literature, because it is at the same time both more than and less than a religion. Without knowing exactly what that means, I would both agree and say that this is also exactly how I feel about football.

All of which is to say that for me, like tens of millions of other Americans, football matters. It has emotional heft and occupies significant chunks of the synapses in my brain. Which has made the revelation in the last several years of the brutal head trauma suffered by so many professional football players, a pattern of trauma which simply isn’t captured by the word concussion, more than a little troubling, twisting football from a pleasure into a guilty pleasure, and not the kind we mean when we say we get a guilty pleasure out of watching Jersey Shore; i.e. we don’t think “Isn’t it funny that I’m watching this,” but instead think “Maybe I shouldn’t really be watching this anymore.” And that was before this month’s epochal scandal, the video footage of Ray Rice’s explosive domestic violence, through which we have arrived at a defining moment. But the question remains: definitive of exactly what?

The phrase looping around and around in my head as I try to come to grips with the punch Ray Rice threw at his beloved is this: the compartmentalization of violence. We expect professional football players to engage in ultraviolence on the field of play, and comport themselves as gentlemen everywhere else, which means that we expect the impossible. In a schizophrenic arrangement (as the term is used colloquially), one compartment holds David Banner, the other the Hulk. Unfortunately, aggression leaks and eventually someone pisses off David Banner. There is an old saying that if you keep going to see a surgeon, eventually you’re going to get cut (i.e., it’s what they do). By the very same logic, if you hang around professional football players long enough, you’ll see someone get hit. Inside and outside the lines. (The latter of which, ironically, is the name of the ESPN news outlet which has alleged that the NFL’s league office and the management of the Baltimore Ravens have engaged in an ongoing cover-up of what they knew about Ray Rice and when they knew it, which, if true, is a topic for another day. I would only point out here that the fact that the cover-up is always worse than the crime is the universe’s way of insisting that we learn from our mistakes instead of denying and/or burying them. Okay, I will also point out that I don’t know which is worse, if I am Roger Goodell’s employer: that Goodell knew the videotape of Ray Rice punching his fiancée was in the league office, note that it has been factually established that the tape was in the office, and he has been lying about it, or that the tape was in the league office without Goodell having a clue. He’s a liar or a fool, that is, and a commissioner from the planet Krypton who supervised underlings without expressly warning them about the possible effects of Kryptonite on their beloved boss would be a fool, indeed. But I digress. Okay, okay, one more digression: another tragic aspect of all of this is that prior to the incident Ray Rice would have been the very last Baltimore Raven anyone would have imagined punching his significant other’s lights out. He seemed to be the embodiment of the ideal of compartmentalization, lethal on the field and a model citizen in the community, an officer and a gentleman, if you will.)

If the logic of “if you hang around professional football players long enough, you’ll see someone get hit, inside and outside the lines” is valid, it would stand to reason that the arrest rates for NFL players involved in violent assault would be off the charts. In fact, the arrest rate for NFL players in cases of non-domestic assault is only 16.7% of the national average. (http://fivethirtyeight.com/datalab/the-rate-of-domestic-violence-arrests-among-nfl-players/) So, at first blush, pro football players seem to be the kind of guys you actually would want your daughter to bring home. But that picture drastically and tragically changes when you account for domestic violence:

“Domestic violence accounts for 48% of arrests for violent crimes among NFL players, compared to our estimated 21% nationally. Moreover, relative to the income level (top 1 percent) and poverty rate (0 percent) of NFL players, the domestic violence rate is downright extraordinary.” (http://fivethirtyeight.com/datalab/the-rate-of-domestic-violence-arrests-among-nfl-players/)

Given that research studies have found that less than 1% of American domestic violence cases may be reported (http://en.wikipedia.org/wiki/Domestic_violence_in_the_United_States), and given that the economic incentive alluded to above (“top 1 percent”) provides the victims of NFL domestic violence literally millions of reasons not to report, the depth and breadth of the NFL’s domestic violence problem could very well be unthinkably vast. The tendency of domestic violence victims not to report is also a stark reminder that NFL players are, in one very grim sense of the term, rational actors in the perpetration of domestic violence; “we only hurt the ones we love” has been replaced by “we only hit the ones who won’t press charges.”

Although perhaps “rational actor” is a bit of a misnomer, implying as it does the process of conscious reasoning, when what’s really in play here is the unconscious process of displacement, defined by Norman A. Polansky in Integrated Ego Psychology as “turning one’s impulse aside from its original unacceptable target to one that involves less anxiety.” And, it goes without saying, there is certainly less anxiety involved in targeting those who won’t call the police, i.e. players’ wives/girlfriends, than everyone else they have the impulse to punch in the mouth. The classic example of displacement is the worker who takes crap from his boss all day, only to come home and take out (displace) his anger on the family dog with a swift kick. It is, then, the cruelest of ironies that the women getting punched and kicked by professional football players are also tagged by machismo locker room culture with the moniker “bitches.”

I am left wondering how to dole out the blame between the pro football players throwing the left hooks and us adoring masses. It is we, after all, who expect (and reward) the compartmentalization of violence, which is akin to expecting water to flow uphill. If we can say anything definitive about violence it is that aggression spreads. In this, it is no different than love. Violence, it turns out, cannot be compartmentalized; the closest it can come to disappearing is to be aimed at the vulnerable in the hopes that their invisibility will rub off on the violence. Which is why it is no surprise that within days of the release of the Ray Rice videotape, pictures surfaced documenting star Minnesota Viking Adrian Peterson’s alleged felony abuse of his son. In the National Football League, the compartmentalization of violence is code for violent displacement onto women and children. If Ray Rice and Adrian Peterson have accomplished anything, it is the removal of every last shred of our collective plausible deniability about that. Meaning Hannah Storm’s eloquent question to the NFL can be asked of all of us who love the game and feed the machine: What exactly do we stand for?

Sunday, September 21, 2014

When Safe is Just a Feeling

Two key markers have appeared in what is almost certainly a response to the waves of school shootings that crested first at Columbine High School and then again in Newtown, Connecticut, but which have lapped steadily at the national consciousness since that fateful April morning in Littleton, Colorado: 1) We have begun the process of arming our teachers (or, as we shall see in at least one case, they are arming themselves, or at least their vice principal is), and 2) We are beginning to militarize our school police, who, it should be noted, already come equipped with the standard-issue Glock sidearm.

As to #1, The Washington Post reports that “The Argyle Independent School District in north Texas has started the 2014-15 school year, as KDAF-TV noted, ‘with guns blazing’ — or, rather, with newly armed teachers who have been given the right to use them ‘to protect our students.’” (http://www.washingtonpost.com/blogs/answer-sheet/wp/2014/09/06/texas-school-district-arms-teachers-and-posts-warning-signs/) The Post adds that “In fact, nearly 20 states have laws allowing adults to carry licensed guns into schools.” And where schools aren’t proactively arming teachers, individual school personnel may be taking matters into their own hands, as, per Salon, in the case of one California public school administrator:

“Kent Williams is the vice principal of Tevis Junior High School in Bakersfield, California, and ever since he got his concealed-carry permit in 2010, he’s been bringing a handgun with him to work. Until recently, this wasn’t an issue (chiefly because other school administrators didn’t know he was doing it).” (http://www.salon.com/2014/09/10/middle_school_administrator_fights_for_right_to_bring_handgun_to_work/)

The Salon article goes on to explain that while Williams is currently on paid leave while the Panama-Buena Vista Union School District investigates matters (even as Williams’ lawyer threatens a lawsuit if Williams isn’t returned to the job), Williams faces no legal recriminations, with authorities “having concluded that his permit did not have any restrictions.” (ibid)

And regarding #2 above, the militarization of the school police, The Washington Post has also reported that “some school police in Compton will be permitted to carry semiautomatic AR-15 rifles — the same kind of rifle used in a recent Oregon school shooting – in schools… ‘in response to situations that clearly evidence a need or potential need for superior firepower to be used against armed suspects.’” (http://www.washingtonpost.com/news/morning-mix/wp/2014/08/21/in-compton-school-police-can-use-semiautomatic-weapons/) One can only imagine the potential crossfire, and wonder as to exactly how much better off we will be when our schools are battlefields instead of killing fields, i.e. will the post-battle carnage be any less than post-massacre?

Still, we’re used to cops with guns, and, increasingly, cops with military grade weaponry; the American police state is something of a fait accompli, sold to us as the cost of doing business in the post-9/11 world, and part and parcel of the national security state. No one says “Yes, sir” or “Yes, ma’am” anymore, but we all know how to say “Yes, officer.” But putting guns in the hands of your friendly neighborhood seventh grade social studies teacher, whose private life could heretofore be politely ignored as long as the teacher’s use of social media stayed within certain unspoken boundaries but whose every sick day must now be parsed for hints of distress and/or despair, gives one pause. Does Mr. Johnson have a nasty head cold, or does his recent breakup with his fiancée have him reaching nihilistic conclusions about the point of it all such that I think I will just keep the twins home from middle school for the rest of the week? How many of America’s 3.1 million public school teachers could hold it together long enough to, e.g., meet the Argyle Independent School District’s requirements to “obtain a license to carry it (the gun), pass a psychological evaluation and get training in how to use the weapon” (http://www.washingtonpost.com/blogs/answer-sheet/wp/2014/09/06/texas-school-district-arms-teachers-and-posts-warning-signs/), yet still have absolutely no business packing heat on lunch duty? (Anyone who can effectively run lunch duty will have long since mastered the Jedi Mind Trick, and, it goes without saying, guns are much too “clumsy and random” for Jedis.)

When one reads, again in The Washington Post, that an Idaho State University “instructor carrying a concealed gun accidentally shot himself in the foot in the chemistry lab,” and that “students were in attendance at the time but luckily none of them were hit” (http://www.washingtonpost.com/blogs/answer-sheet/wp/2014/09/06/texas-school-district-arms-teachers-and-posts-warning-signs/), one begins to get a sense of how misguided the effort to protect children by arming their teachers really is. One is immediately reminded of the infamous data, accessible via a quick Google search, that a gun in the house makes homicides 2.7 times more likely. (The pro-gun websites pooh-poohing this data are just as easy to find on Google, but the critiques are robustly parried here: www.huppi.com/kangaroo/L-kellermann.htm) And since there’s no reason to think that the schoolhouse is immune to all of the contingencies of the household (as the accidental shooting in the Idaho chem lab exemplifies), it is obvious to any rational observer that we don’t actually know how to make our children safe from school shootings and instead choose to place them at increased risk. So what are we really on about when we arm our teachers?

The answer, praise Jesus (or the Sacred of your chosen tradition ☺), is not that we want to place our children at increased risk, but that this increased risk is a (still deeply troubling) side effect of the adults’ own efforts to feel safe. We are pointed to this conclusion by two further Google-accessed data points. The first of the two involves a classic case of belief’s conquest of the facts on the ground; pace “2.7 times more likely,” “for the first time a majority of Americans say having a gun in the household makes it a safer place to be, according to a new Washington Post-ABC News poll. By a wide 51 to 29 percent margin, more people say a gun in the house makes it safer rather than more dangerous.” (http://www.mediaite.com/online/poll-guns-make-people-who-own-them-feel-really-safe-everyone-else-not-so-much/) In other words, guns make those who possess them feel safer, even as gun ownership increases the risk of gun violence (e.g., “for every time a gun in or around the home was used in self-defense, or in a legally justified shooting, there were four unintentional shootings, seven criminal assaults or homicides, and eleven attempted or completed homicides. That’s one self-defense shooting for twenty-two accidental, suicidal, or criminal shootings, hardly support for the notion that having a gun handy makes people safer.” (http://www.jsonline.com/blogs/purple-wisconsin/184209741.html))

Our second follow-up data point highlights an inverse relationship between gun owners’ feelings and those of everyone around them: “By a margin of five to one, Americans feel less safe rather than more safe as more people in their community begin to carry guns.” (http://www.hsph.harvard.edu/hicrc/firearms-research/firearms-archives/) Apparently, “2.7 times more likely” means different things to different people. The confusion even extends into the home of the gun owner:

“Drill down on the 75% of the people in gun-owning households who think it makes their house a safer place to be, which leaves 25% who do not. Of those people who live in gun-owning households, 31% do not own a gun themselves. If you make the fair assumption that almost all of the people who actually own the guns think they make their home safer, that leaves almost all of the people who don’t personally own a gun, but live in a household with one, don’t think it makes them safer.” (http://www.mediaite.com/online/poll-guns-make-people-who-own-them-feel-really-safe-everyone-else-not-so-much/)

In sum, school shootings are horrifying, and teachers faced with the possibility that such an event could happen in their school and in their classroom understandably reach for the feeling of safety that comes with a gun, most likely telling themselves that the aura of safety fills the classroom, when in fact they have only ramped up the fear for the children they believe they are protecting. This process, coping with fear, anger, and pain by attempting to block those feelings in ways that amplify those very feelings in others, is, tragically, the common coin of our stunted social realm. It is elemental, to speak in the Christian idiom, to our fallenness. It doesn’t have to involve guns or even physical violence; as part of our everyday human scenery it is called blaming. I do it all the time, especially around the house, no matter that when I am whining yet again about who moved my stuff where I can’t find it, I almost invariably discover that I moved it myself. And even if it wasn’t me, big deal, let’s try for some perspective here.

Since perspective is in such short supply when one is busy blocking out pain by offloading it onto others (see the whining), I have in the last few weeks started a daily practice for some help in this area. Each morning I visit www.random.org to receive a randomized number between 1 and 59, which is how many Lojong slogans there are. These slogans are, according to the dust jacket for Always Maintain a Joyful Mind, the collection which houses all 59 slogans with commentaries by Pema Chodron, “a collection of 59 pith teachings to help… develop wisdom and compassion amid the challenges of daily living.” Random.org promises me that the numbers it provides are truly random, not, for example, like the random function on my old CD player which always cued up the same sequence of songs on Tom Petty’s masterpiece, Into the Great Wide Open. Nevertheless, in the first weeks of my new practice several of the numbers have been repeating, i.e. I am hardheaded and need to hear things more than once before getting the point. (I think the Buddhists call this ego.) Yesterday, I got #34 for at least the third time. It is a pithy prescription for those who would otherwise cope with fear, anger, and pain by attempting to block those feelings in ways that amplify those very feelings in others, i.e. for teachers who would carry guns and for forty-year olds who can’t shut up about who had the audacity to move their cell phone charger cord.
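(A brief aside for anyone who would rather script the morning draw than click: below is a minimal sketch, in Python, of the same daily practice. The random.org URL and its parameters are my assumption based on that site’s plain-text integer interface, not anything the practice prescribes; and if the site is unreachable, the sketch falls back on Python’s garden-variety pseudo-randomness, minus the promise that the number is truly random.)

# A minimal sketch: fetch one random number between 1 and 59
# (one per Lojong slogan) from random.org's plain-text integer
# service. The URL and its parameters are an assumption about
# that interface, not something taken from this post.
import random
import urllib.request

RANDOM_ORG_URL = (
    "https://www.random.org/integers/"
    "?num=1&min=1&max=59&col=1&base=10&format=plain&rnd=new"
)

def todays_slogan_number():
    try:
        with urllib.request.urlopen(RANDOM_ORG_URL, timeout=10) as resp:
            return int(resp.read().decode().strip())
    except Exception:
        # No internet, no problem: settle for pseudo-randomness.
        return random.randint(1, 59)

print("Today's Lojong slogan: #%d" % todays_slogan_number())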

So, for all of us, #34, with Pema Chodron’s commentary in parentheses:

“Don’t transfer the ox’s load to the cow. (Don’t transfer your load to someone else. Take responsibility for what is yours.)”

Sunday, September 07, 2014

What's in a Name?

In the current edition of ESPN Magazine, Howard Bryant’s column delves into exactly why, in the wake of the events surrounding Michael Brown’s death in Ferguson, Missouri, so few professional athletes, including the many African Americans in their ranks, have spoken out. Bryant’s piece is highly recommended, touching lucidly on “fault lines of race and class,” and “the growing culture of militarism that is now everywhere in America.” But I’d like to use one peculiar, provocative element in Bryant’s column as a jumping off point for an entirely different conversation, if one that also sits unsteadily on “fault lines of race and class.” That subtle provocation occurs smack in the middle of Bryant’s report on the near-silence from pro jocks re: Ferguson:

“In the wake of curfews, arrests and tear gas, the St. Louis Rams offered tickets to the youth of Ferguson; some of the Washington football players held their hands up as they emerged from the tunnel before a preseason game, adopting the ‘hands up, don’t shoot’ symbol of protest in solidarity with a community roiling.”

How ironic that virtually the only professional athletes in America willing to take the risk of making public their support for the Black folks, and especially the young Black males, of Ferguson, Missouri are the very athletes who play for a team whose mascot Bryant cannot in good conscience even mention: the Washington Redskins.

The controversy surrounding the Washington football club’s mascot has built to a crescendo during a summer which saw the team lose its trademark protection from the United States Patent and Trademark Office (a ruling the team is, predictably, appealing), fifty United States senators sign a letter urging the NFL to take action and force a name change, and the announcement this week by the New York Daily News that it will no longer include the word Redskins in its newspaper (http://www.nydailynews.com/opinion/sack-article-1.1926865), an announcement that follows on the heels of the hometown Washington Post’s editorial page announcing that it will no longer print the word Redskins either (though the Post’s sports page will continue to refer to the team as the Redskins). (http://www.washingtonpost.com/opinions/washington-post-editorials-will-no-longer-use-redskins-for-the-local-nfl-team/2014/08/22/1413db62-2940-11e4-958c-268a320a60ce_story.html)

But before delving into the reasons that 71% of Americans still believe the team should not change its name (http://www.washingtonpost.com/sports/redskins/new-poll-says-large-majority-of-americans-believe-redskins-should-not-change-name/2014/09/02/496e3dd0-32e0-11e4-9e92-0899b306bbea_story.html), and before noting that team owner Daniel Snyder has, per NBC’s Al Michaels, stated that the team will change its name “over my (Snyder’s) dead body,” some personal history is in order. I have been a Washington Redskins fan since moving within their TV broadcast territory as a 6-year-old just at the dawn of the team’s golden age, a ten-year burst in which the franchise would bring home three Super Bowl championships under the guidance of likably obnoxious rich guy (team owner) Jack Kent Cooke, California surfer dude (general manager) Bobby Beathard, and nascent NASCAR kingpin (head coach) Joe Gibbs. Everything about the team was fun, especially the winning, but also the nicknames (Hogs, Smurfs, Fun Bunch, etc.) and the cast of characters, e.g. the performance art of Hall of Fame running back John Riggins passing out drunk on the floor of a White House dinner after telling Justice O’Connor “Loosen up, Sandy Baby.”

But all good things must come to an end, and before long Cooke was dead, Beathard was guiding the San Diego Chargers to their first ever Super Bowl appearance, and Joe Gibbs was winning the Daytona 500. The Daniel Snyder era, with perpetual mediocrity punctuated by brief bursts of total incompetence, had begun. Not even Gibbs, in his brief second run with the team, could put things right; he got the hell out of Dodge after squeaking into the playoffs enabled him to leave with his dignity, and reputation, largely intact. It hasn’t been much fun to be a Redskins fan for the last twenty years, and without the fun to distract you, there’s that name. Just sitting there. On my 1987 Super Bowl t-shirt. On my 1982 Super Bowl Coke bottle. Talking to me. Telling me that the reason I was a Redskins fan for all the good years without once even thinking about the implications of the team’s name was because I could. Doing things because you can, without even thinking about them, is the very definition of white privilege. And I continued not even thinking about it even several years into the losing, until a girlfriend (now ex) who had worked actively to change the University of Illinois’ Chief Illiniwek mascot as an undergraduate at Champaign-Urbana looked at me a certain way whenever I used my MBNA Washington Redskins Visa card. Which is how my ’87 t-shirt is looking at me right about now. My privilege, singing “Hail to the Redskins” without ever once thinking about the words coming out of my mouth, was undone because my girlfriend made me think. Getting white folks to stop and think, and then hopefully feel, is the only way we ever change, and was the chief strategy of Martin Luther King’s nonviolent resistance movement.

Fast forward to 2014 and, slowly, more folks, white and otherwise, are beginning to stop and think about the name of the football franchise that represents our nation’s capital. While 71% of Americans still think it’s hunky-dory if the Redskins keep their name, that number is significantly lower than 1992’s 89%. And, his “over my dead body” stance notwithstanding, Daniel Snyder may be starting to see the handwriting on the wall, as rumors have begun to circulate that Snyder may be willing to change the name for the right stadium deal (http://sports.yahoo.com/blogs/shutdown-corner/former-gm--dan-snyder-might-drop-redskins-name-for-new-stadium--super-bowl-144942689.html). But given that the latter is currently mere internet speculation, as well as the fact that Snyder and his minions have been mounting an aggressive defense of their beloved mascot, including a creepy new charity campaign aimed at select American Indian tribes, the task of answering the arguments floated in favor of keeping the name is still at hand. The four most oft-repeated arguments turn out to be, respectively, false, (almost) real, stonewalling, and ironic.

The false argument, and one put forward in bad faith, is that the name Redskins is intended to honor American Indians. This is patently absurd, mirroring other lies such as segregation’s “separate but equal” and Fox News’ “fair and balanced.” In all three of these cases, the lie functions to announce the real intentions (i.e., respectively, making sport of a dominated civilization, terrorizing Blacks, and shilling for the neoliberal militarists) in a socially acceptable manner. The truth is that using and maintaining the name Redskins is just a perpetual game of Cowboys and Indians, a game that has never been about honoring American Indians, but about casting them as brutal savages to be overcome by western civilization. Cowboys and Indians is the children’s version of a ritual, here being played by grown men and women, the sole purpose of which is to absolve (European) Americans of any lingering guilt over genocide.

The only (quasi) real argument being made in favor of keeping the name goes something like this: “You’re just a bunch of uptight liberals trying to ruin our harmless fun.” Real as in honest, if only honest to a point. Honest that it is a lot of fun playing Cowboys and Indians (Redskins fans claim to hate the Cowboys, when really we’re all Cowboys fans too, enjoying with our Dallas brethren the thrill of a victory that delivered an entire continent; the only difference is that we get to play the Indian while our Dallas brethren get to play the Cowboy), but deeply disingenuous in its denial of the incalculable harm done to an entire people.

The (not quite) real argument is closely related to the argument that is used more frequently than all of the others combined, i.e. the stonewalling maneuver of stating some version of “Man, I’m not into all that political correctness stuff.” This strategy depends on the logic that 1) all reasonable people recognize political correctness for what it is: a liberal mind control device that deflects our attention from attending to matters that are actually important, 2) and therefore anything judged to be politically correct can be dismissed out of hand, 3) and furthermore the concern over the Redskins name is a classic case of political correctness run amok. Repressed and implicit in this logic is the nefarious belief that the bloody history between European colonists and indigenous American Indians, not to mention the current state of relations, is of negligible importance. Which is to say, it doesn’t matter because the victors have already accrued their spoils.

The final, ironic argument is perhaps the darkest: “We must keep the Redskins name because it is a source of unity for the people of the District of Columbia and the surrounding suburbs.” Putting aside the fact that this so-called unity is entirely non-contingent on the name Redskins, i.e. it has been just as readily provided by the Ravens in Baltimore despite the city’s historic connection to the name Colts, let us note what we are really saying when we claim that the price of the unity between Blacks, Whites, Latinos and Asians is paid with the name Redskins: It (North America) is ours now.

Monday, September 01, 2014

Back Soon

If, like me, one’s spiritual foundation was laid down in the Christian tradition, one must reckon with a number of events which are, if one is in a religious mood, plainly miraculous, or, if one is feeling secular and (post?)modern, rather kooky. Among these are, of course, the virgin birth, walking on water, sundry healings, the resurrection, and the second coming. All of these, save the last, are in the past tense, the upshot of which leaves Christianity hanging on a prediction which can be paraphrased in two words: Back soon. Which two words, interestingly enough, are also at the center of a rather amusing passage in The House at Pooh Corner that may help sort out whether Christianity leans over into the abyss or leans back to pull us out of our own.

But, before we get to Pooh, a brief summary. Jesus’ account of the end times and his prophecy of “the Son of Man coming on the clouds of heaven,” i.e. the prediction of his very own second coming, appears in all three synoptic gospels. The denouement at first seems to find Jesus putting all of his chips in the middle of the table as he tells us exactly when this will all go down: “Truly I tell you, this generation will not pass away until all these things have taken place.” But then, one sentence later, it devolves into what can only be described as history’s biggest mixed message: “But about that day and hour no one knows, neither the angels of heaven, nor the Son, but only the Father.” A few verses later, the confusion is compounded: “the Son of Man is coming at an unexpected hour.” “Back soon” is scrambled into a garbled mess before the ink of “Truly I tell you…” is even dry; we are seemingly left to make do with “hurry up and wait.”

It is the garbled mess which brings us to Pooh. Twice. Because the key words, “Back soon,” don’t just appear in A.A. Milne’s second volume of Pooh stories, published in 1928. They are back again, one might say with a vengeance, in the 2011 animated film, Winnie the Pooh. And strangely enough, the two different readings of “Back soon” in the Winnie the Pooh universe just might inform our reading of Jesus’ own “Back soon,” which reading, on a grand enough collective scale, just might in turn help make the difference in the direction our material universe takes as we (Christians, Jews, Muslims, Hindus, Buddhists, Pagans, Seculars, etc.) stand at the proverbial fork in the road.

On our first pass we turn to the passage in The House at Pooh Corner, in which Christopher Robin leaves his garbled version of Back Soon in a note, happened upon by Rabbit:

“GON OUT
BACKSON
BISY
BACKSON
C. R.”

Rabbit takes the indecipherable Message to Owl, whose modus operandi is to maintain an air of deep wisdom. Since putting on these kinds of airs always depends on making things up (see all of western philosophy; not that this is a knock on western philosophy, it’s actually what makes it so damn fun and helpful, therapeutic even, as long as we remember that it’s all made up), Owl does exactly this:

“’It is quite clear what has happened, my dear Rabbit,’
he said. ‘Christopher Robin has gone out somewhere with
Backson. He and Backson are busy together. Have you seen a
Backson anywhere about in the Forest lately?’
‘I don't know,’ said Rabbit. ‘That's what I came to ask
you. What are they like?’
‘Well,’ said Owl, ‘the Spotted or Herbaceous Backson is
just a—‘
‘At least,’ he said, ‘it's really more of a----‘
‘Of course,’ he said, ‘it depends on the----‘
‘Well,’ said Owl, ‘the fact is,’ he said, ‘I don't know
what they're like,’ said Owl frankly.
‘Thank you,’ said Rabbit. And he hurried off to see
Pooh.”

We should note several crucial elements of this passage before moving on to the return of the Backson in 2011. Perhaps most importantly, there is nothing whatsoever about the Backson that is threatening. In fact, Owl from the very first imagines the Backson as a friend for Christopher Robin. Completely befuddled by Christopher Robin’s garbled “Back soon,” Owl has filled in the gap with something good. But something that is no less mysterious for its certain goodness. The “Spotted or Herbaceous Backson” is a friend, but beyond that we can say no more. Even Owl must admit “I don’t know what they’re like.” (Happily, Owl isn’t quite as far gone as your standard-issue narcissistic humanities professor.)

Fast forward eighty-three years, and the scene in the Hundred Acre Wood has taken an alarming turn. With boss man Disney now calling the shots, both the Backson and his back story have mutated beyond recognition. To begin with, Rabbit has been elbowed to the periphery; with apologies to Piglet, it is Pooh who brings home Disney’s bacon, so it is Pooh who has the honor of discovering Christopher Robin’s note. Apparently Pooh is now the early Michael Jordan, and has to take every big shot, making Rabbit into John Paxson. (Tigger would have to be Dennis Rodman, and I think Eeyore would make a really convincing Bill Cartwright.) This much I could stomach; would that our only problem was the celebrity cult of personality. For a moment it even seems like we’re back in safe, familiar territory, as Pooh consults Owl about Christopher Robin’s mysterious note.

But then we remember that it’s the second decade of the 21st Century and violence is the air we breathe. We should note that it’s not as if A.A. Milne wrote the original version of the Backson story in the Garden of Eden; 16 million had died just a decade before in World War I. But if violence was already the out breath then, it is now also the in breath. Where Tolstoy once wrote of war and peace, we now have war and war, i.e. wars hot and cold and a subsequent peace dividend in the form of the war on terror. And where Owl once pictured Christopher Robin “gone out somewhere with Backson. He and Backson are busy together,” Owl now informs Pooh and friends “of their new enemy. He is a ferocious creature who enjoys torturing others and creating misfortune.” (http://disney.wikia.com/wiki/The_Backson) That the torture is put in Hundred Acre Wood context (the Backson is “responsible for holes in socks, broken teeth, aging, theft, catching colds, etc.”) makes not a lick of difference. Winnie the Pooh now lives in the same world as us, which is where Abu Ghraib and ISIS are. So, not surprisingly, Pooh and friends make their martial preparations for the Backson, preparing to trap it (evoking extraordinary rendition and Guantanamo), or “to battle the beast if necessary” (evoking Iraq and Afghanistan). If this sounds daft or melodramatic, I would but ask whether it is really mere coincidence that a 1928 Pooh story rewritten in 2011 includes torture as a central narrative element. (If Phase 1 of the American shift to the techniques of violence without limits was accomplished in the foundational act of the Cold War, i.e. the atomic bombing of Hiroshima and Nagasaki qua shot across Moscow’s bow (http://www.newscientist.com/article/dn7706-hiroshima-bomb-may-have-carried-hidden-agenda.html#.VASeFaCdCFI), Phase 2 began with the establishment of western democratic state-sponsored torture as the boundary condition for the Global War on Terror. In considering that the US was the first to think both of these unthinkables we should acknowledge, first, that the first is rarely the last, and, second, hope that this week’s revelation of ISIS’ use of the waterboarding they learned from watching us (http://www.washingtonpost.com/world/national-security/captives-held-by-islamic-state-were-waterboarded/2014/08/28/2b4e1962-2ec9-11e4-9b98-848790384093_story.html) is not a foreshadowing of a similar symmetry with Phase 1; ISIS, with its terrifying the-best-defense-is-a-good-offense stratagem, doesn’t seem likely to follow the logic of nuclear weapons as deterrent that we’ve all been clinging to, despite evidence to the contrary from 8/6 and 8/9/45.)

I would suggest that the twinned surges of relativism and fundamentalism (twins explored in the previous post, “Doubt without Doubt,” on this very blog) between 1928 and 2011 go a long way toward explaining the link between the interpretations of Christopher Robin’s “Backson” that Owl offers first to Rabbit and then to Pooh, and the contemporaneous respective interpretations of Jesus’ own “Backson.” In other words, A.A. Milne and Disney were each, without knowing it, doing theology, and doing the kind of theology native to their time and place. In 1928, even after The War to End All Wars, it was still possible to believe without knowing; the Backson could be both Christopher Robin’s friend and, per Owl, something we know absolutely nothing about. And that other Backson, Jesus’ second coming, could still be understood exactly as it was described in Mark, Matthew, and Luke, which is to say our dear friend is on his way, he’s almost here in fact, although he’s altogether the type of fellow who’s likely to get held up at the train station for God knows how long, so we better use this time to get things in as good an order as we can.

By 2011, with relativism and fundamentalism in full bloom, one can either believe that it isn’t possible to know anything at all (relativism), or believe that one knows everything (fundamentalism); and since Christianity has by and large become the purview of the latter, Disney’s Backson is “a ferocious creature who enjoys torturing others and creating misfortune.” Which, natch, sounds a lot like the kind of (fundamentalist) Jesus whose “Back soon” involves the Tribulation, i.e. a “period of time where everyone will experience worldwide hardships, disasters, famine, war, pain, and suffering, which will wipe out more than 75% of all life on earth before the Second Coming takes place.” (http://en.wikipedia.org/wiki/Great_Tribulation) Except, of course, “those who choose to follow God,” i.e. the fundamentalist Christian God, “will be Raptured before the tribulation, and thus escape it.” Lucky them.

All of which is not to say that there weren’t people before 1928 who believed in the Tribulation, or that there aren’t Christians in 2011 or 2014 who believe that God’s infinite and reckless love and mercy extend to every last one of us. But it is to say that the latter has been divorced from what the word Christian symbolizes in the popular imagination, which I’d like to believe as recently as 1928 was (often) an imagining of a close friend who was nevertheless likely to appear to us as a stranger on whichever road we’re traveling. Instead, the symbol of Jesus on the cross has been reduced to a straw man for the relativists to stick their pitchforks in, and the cross itself has been inverted into the fundamentalists’ sword, whose literal reading of “I came not to bring peace, but to bring a sword” is roughly as nuanced as understanding the “Open joints on bridge” road signs as an invitation to toke up.

As C.S. Lewis once said, “if you have taken a wrong turning then to go forward does not get you any nearer… Going back is the quickest way on.” In this case that means going back to The House at Pooh Corner (this being a discussion of the second coming, it is hoped that the reader will indulge ending with a bit of repetition):

“’It is quite clear what has happened, my dear Rabbit,’
he said. ‘Christopher Robin has gone out somewhere with
Backson. He and Backson are busy together. Have you seen a
Backson anywhere about in the Forest lately?’
‘I don't know,’ said Rabbit. ‘That's what I came to ask
you. What are they like?’
‘Well,’ said Owl, ‘the Spotted or Herbaceous Backson is
just a—‘
‘At least,’ he said, ‘it's really more of a----‘
‘Of course,’ he said, ‘it depends on the----‘
‘Well,’ said Owl, ‘the fact is,’ he said, ‘I don't know
what they're like,’ said Owl frankly.
‘Thank you,’ said Rabbit. And he hurried off to see
Pooh.”