I hate wearing neckties. My loathing began in high school, when I attended a boarding school that required boys to wear jacket and tie. Unlike The Olive Garden, where my wife waited tables and was required to wear a tie just like les garcons, the girls at St. Andrew’s had sartorial carte blanche from the neck up; in addition to their naked napes they could, of course, grow their hair as long as they wished, while we boys had to keep our manes above the collar. Despite the apparent privilege of these feminine liberties, the injunction against growing facial hair, while technically only applicable to the boys, somehow still managed to feel like a twist on the old idea that “Gentlemen may remove their hats; all others must”: “Gentlemen may refrain from having beards and moustaches; all others must.” Despite the school’s efforts to undo patriarchy by way of having male and female co-presidents for each class, diligent guarding against facial hair reminded us that the world consisted of gentlemen and everyone else.
Gentlemen, of course, wear neckties. Or, apparently, turtlenecks, which were a permissible substitute. Given that the turtleneck’s bisection of the head and body mainly succeeds in bringing to life Descartes’ surreal image of brains floating in vats, the dons at St. Andrew’s likely trusted that the turtleneck would remain in its supporting role, one from which it was never a threat to steal any scenes. And since my singular accomplishment at St. Andrew’s took the form of being consistently and remarkably unthreatening, I was a natural for turtlenecks. By senior year, I was wearing them every day, and pairing them with the black K-Swiss tennis shoes that were as forbidden as they were unnoticed, my act of rebellion so manifestly harmless that if any faculty members ever even noticed my sneakers it would never have occurred to them to either make me change or give me demerits (“marks” in the local lexicon); my sneakers were the exception that only proved how slavishly I had always followed the rules. In them, I padded down the corridors of Founder’s Hall those last few months without making a sound. To bastardize T.S. Eliot, and at the risk of sounding melodramatic, a necessary risk whenever discussing one’s adolescence:

This is the way high school ends
This is the way high school ends
This is the way high school ends
Not with a bang, nor a whimper, but with a forgetting
I don’t wear turtlenecks anymore, but I still don’t wear neckties. And, just as I hope to a) re-unsilence the S on the end of Illinois, b) get everyone to pronounce ketchup as catsup, and c) get everyone to pronounce the Atlanta Falcons such that the mascot rhymes with Balkans, I hope to d) foment a movement to replace neckties with funky socks. I go forth armed with the following talking points:
• Where neckties are pure decoration, funky socks are equal parts form and function, the perfect marriage of aesthetics and utility, the reconciliation of doing and being.
• Where neckties say business must always come before pleasure, funky socks hint at the erotic, a flash of neon green argyles like a glimpse of a lady’s ankle in Victorian England.
• Where a decent silk necktie costs at least fifty bucks, two pairs of funky socks can be had for a mere five dollars at Nordstrom Rack. (Finding the funky sock stockpile at Nordstrom Rack was, for me, a lot like that moment in the Blind Melon music video when the bee girl finally discovers the field full of bee people.)
• Neckties have become a fashion crutch for men, and too often amount to the equivalent of lipstick on a pig. Funky socks have their limits, and are not a fashion cure-all. But they are a place to start, which is exactly what men need, as opposed to the built-in finish line of neckwear.
• If you must wear something around your neck, rock a manly scarf like Doctor Who.
• And if you insist upon silk, there’s a whole world of pocket squares.
• No one can choke you by your funky socks, a not insignificant benefit if, like me, you have ever worked on a locked psychiatric ward.
It is tempting to build on this last bullet by observing that when the white collar worker wears a tie to the office his boss has him by the neck. But this would be making the white collar worker into a victim, when, in fact, he is complicit. The correct image is that of the white collar worker standing in front of his mirror affixing his tie; he has himself by the neck, symbolically making the international gesture for choking. As he wraps the silk around his neck he accomplishes a micro-stroke, cutting off just enough of the oxygen to the brain to “have difficulty thinking, making judgments, reasoning, and understanding concepts,” (http://www.mayoclinic.com/health/stroke/DS00150/DSECTION=complications) all of which, of course, are more likely to impede than facilitate the salaryman’s credo of getting the job done. Choking is also the term used in sports for over-thinking at just the wrong moment, what Arthur Ashe described as “paralysis by analysis.” Note that the white collar worker wears his necktie every single day, preempting the possibility of thinking at just the wrong moment by never thinking at all. He is the permanent choke artist.
All hope is not lost for the gentleman in the tie. According to the Mayo Clinic, the stroke symptoms listed above “may improve with rehabilitation therapies.” The reference to therapy brings to mind Ludwig Wittgenstein, whose later work was explicitly therapeutic. Anat Biletzki describes it thusly: “Philosophy - traditional, classic philosophy - is an ailment, and good philosophy - revolutionary philosophy as Wittgenstein would have us do it - is therapy for the ailment.” (http://ndpr.nd.edu/news/35797-philosophical-delusion-and-its-therapy-outline-of-a-philosophical-revolution-2/) Having already bastardized Eliot, I will do so again for Wittgenstein (by way of Biletzki): “Work - traditional, classic work - is an ailment, and good work - revolutionary work - is therapy for the ailment.” The therapy and the revolution begin in the same place: the worker’s mind. Since good things come in threes, let’s go one better and bastardize Marx: White collar workers of the world unite, you have nothing to lose but your neckties!
Blue collar workers and women are encouraged to slip into funky socks and open their minds as well. But since white collar males still make up the bulk of the corporate Man On Our Backs, it is important to frame funky socks primarily as a replacement for neckties, as opposed to a fashion accessory for anyone. Simply put, white collar men have a harder time thinking than any other demographic, else the world wouldn’t be in the mess that it is, and therefore require full restoration of blood flow to the brain as soon as possible. Pairing funky socks with neckties would do as little to start revolutionary work as has Casual Friday. (The absence of ties on Friday is a guarantee of their return on Monday. And co-opting funky socks under the aegis of neckties would enable everyone to go around thinking that every day is Friday, when in fact it would make every day feel like Sunday, which is the day that sounds happy but, per Morrissey, is infinitely sad.)
When asked what was his aim in doing philosophy, Wittgenstein famously answered, “To show the fly the way out of the fly bottle.” Wittgenstein thought these flies were philosophers doing bad philosophy because they were thinking about the wrong things and asking the wrong kind of questions. As suggested above, I would universalize this observation. Gasping for air, we ask “How can we get this job done?” Instead, we need to breathe deeply, and then think about an entirely different question: “What exactly is it that we should be doing?” The easiest way out of the bottle is to widen the bottleneck. To do so we must release the chokehold on our own necks. And, like sneezing and peeing, it is impossible to be funky and choke at the same time. To bastardize one more line, this one from the Beastie Boys: Hey ladies and gentlemen, get funky
Tuesday, December 31, 2013
Friday, December 27, 2013
Find Your Passion!
A piece in the current edition of Johns Hopkins Magazine highlights JHU alum and motivational speaker Jay Shepherd’s “‘Five P’s’ of occupational happiness,” the first of which, of course, is passion. If we are to believe Shepherd and basically everyone else who has ever opined about the proverbial color of one’s parachute, passion is to workplace satisfaction as winning is to sports, i.e. it is something of a cure-all. The first and most important step in landing one’s dream job is, per the “passion for your work” belief system, dreaming about said job. Like many of our contemporary reigning ideologies, the “passion for your work” narrative is total poppycock, and ignores the fact that most people dream about not having a job at all, not from a lack of insight into their own passions, but in order to devote themselves full time to these very passions. We all know what color our parachutes are. It’s just that for most of us, The Wall Street Journal fanatics notwithstanding, that color isn’t green.
In his wonderful collection of aphorisms, The Bed of Procrustes, Nassim Nicholas Taleb succinctly captures why we dream about giving up work for our passions instead of translating our passions into paid labor, why, if we restrict ourselves to the primary colors, we avoid dreaming in a combination of yellow and blue. But before turning to Taleb, I should note that if one first learns, as I did, that yellow and blue make green from the Glad sandwich bag advertising slogan, is it any wonder that the boundaries around permissible happiness are erected at the outer limits of one’s dream job? As a child when I dreamed of happiness it was most often as a multimillionaire athlete - instead of enjoying sports for sport’s sake I fantasized it into a profession. As a college student my friends and I gave serious consideration to opening a tavern together - it wasn’t enough to sit around drinking and enjoying one another’s company. And as an adult I imagine waking up to find my name on the New York Times bestseller list - writing for pleasure could only ever be masochistic because the only possible pleasure would be the pain of not getting published. So, pace what I said above in paragraph 1 and the start of paragraph 2 about what we’re really dreaming of, perhaps the truth is that while we’d all like to imagine that if we won the lottery we’d quit our jobs and devote ourselves full time to whatever our own version is of e.g. photographing the indigenous wildlife, we’re all much more likely to end up becoming someone’s boss. Which makes Taleb’s take on what it means to be alive all the more damning: “You exist if and only if you are free to do things without a visible objective, with no justification and, above all, outside the dictatorship of someone else’s narrative.”
If Taleb is correct, as I believe he is, then “the dignity of work” is one of the greatest lies ever told, and there can only ever be “the dignity of leisure.” As members of the unhappiest profession (per official Forbes magazine rankings), lawyers know this. See Shepherd’s description of how a bunch of lawyers reacted to one of his motivational speeches on the “Five P’s”, as quoted in Johns Hopkins Magazine: “Nobody threw any tomatoes.” In the magazine Shepherd’s line is described as a joke, and perhaps it is; I have yet to figure out a way to tell my wife something serious without first joking about it (in what I’m sure she would describe as a painfully unfunny fashion). Shepherd’s speech to the lawyers about the importance of passion reminds me of the time I was at a Jewish gathering in celebration of the holiday Tu B’Shevat, the “new year” celebration for trees. Several folks had been asked to prepare short readings for the celebration, and one individual, who happened to be Israeli, began a nice story about a tree that was used to make a cradle for a baby, which sounded lovely, but the tree was then used to make a boat for a miracle worker who could calm the seas with a word, at which point everyone started to get a little antsy. When the story continued and the tree was being used to make a crucifix the entire room erupted in a single voice: “That’s a story about Jesus!!” No one threw tomatoes at the reader, who, by way of explanation, had only read the first paragraph of the story before deciding it was a perfect tale to tell at Tu B’Shevat, but, needless to say, we didn’t get to hear how the story ended. Just so, I’m surprised Shepherd made it through all “Five P’s” without the lawyers exclaiming in unison, “Hey, this story’s about happiness!”, and heading for the door.
But it’s not just the lawyers who are unhappy. A recent Gallup poll found that amongst the global workforce, only 13% of workers like their jobs. (http://www.washingtonpost.com/blogs/on-leadership/wp/2013/10/10/only-13-percent-of-people-worldwide-actually-like-going-to-work/) 63% are described as “not engaged,” while the remaining 24% are “actively disengaged,” which is a polite way of describing the difference between those who dislike their jobs and those who hate them. When I read the dust jacket for Taleb’s book I found the description of his central project, unmasking “modern civilization’s hubristic side effects - modifying humans to satisfy technology, blaming reality for not fitting economic models, inventing diseases to sell drugs, defining intelligence as what can be tested in a classroom, and convincing people that employment is not slavery” (emphasis added) a touch hyperbolic. In America, after all, we are not even 150 years removed from the war that ended slavery qua slavery. But as I read more about the Gallup poll it occurred to me that it just might be my status as an American that was blinding me to the truth of Taleb’s assertion that “There is no intermediate state between ice and water but there is one between life and death: employment.” The poll found that the United States, along with Canada, had the highest proportion of workers who reported liking their jobs, at 29%. That figure, 29%, is uncomfortably close to the 30.4% of Americans who hold a Bachelor’s degree. (http://www.nytimes.com/2012/02/24/education/census-finds-bachelors-degrees-at-record-level.html?_r=0) And because I live and work among the happy 30%, I have no real notion of what it’s like to be a wage slave, my teenage summer gigs at McDonald’s and my quarter-life crisis nadir during a two-week stint stocking the shelves at a liquor store notwithstanding. No real notion that Taleb is waving the truth in front of a nose that I don’t even realize I’ve stuck up in the air.
The 13% of global workers who like their jobs is a figure just close enough to another number to give me pause. As defined by Wikipedia, Stockholm Syndrome is “a psychological phenomenon in which hostages express empathy and sympathy and have positive feelings towards their captors, sometimes to the point of defending them.” (http://en.wikipedia.org/wiki/Stockholm_syndrome) And just how many hostages experience this “phenomenon”? Again per Wikipedia: “The FBI’s Hostage Barricade Database System shows that roughly 8% of victims show evidence of Stockholm Syndrome.” And almost exactly halfway between 13% and 8% we find 10.9%, which is the exact percentage of Americans with advanced degrees, who, I humbly submit, are the pool from which the very happy, the 8%, are drawn. Subtracting the lawyers, who are incapable of happiness, should easily get us down from 10.9 to 8%.
To be crystal clear, I am suggesting that the happiest of the happy, those who don’t just like their jobs, but those who actually love them, are suffering from a form of Stockholm Syndrome, one in which they (or we, given my advanced degree and overall job satisfaction, which easily meets the 13%’s “engaged” criteria) have so aligned themselves with their captors that they take “What’s good for IBM is what’s good for America” as an article of faith. In his novel The Pale King, David Foster Wallace captures exactly how this happens in a passage describing “the soldier personality, the type that believes in order and power and respects authority and aligns themselves with power and authority and the side of order and the way the whole thing has got to work if the system’s going to run smoothly.” If you’re a soldier, you will, assuredly, “want to line yourself up with the real power. Have the wind at your back. Tell them listen: Spit with the wind, it goes a whole lot further.” And if you’re a soldier with an advanced degree, you’re already this close to being one of the 8%. To put a twist on a phrase from Slavoj Zizek: Enjoy your syndrome! Or, to translate it into motivational self-help speak: Find your passion!
Saturday, December 21, 2013
When It Takes 70,000 Signatures to Get to Second Base
Last week I received an email encouraging me to sign a petition requesting fair media coverage of the Affordable Care Act, coverage that would include the perspectives of millions of Americans for whom the odds of staying alive and well have suddenly improved. The petition was the work of a father-daughter team, and they planned to deliver their signatures, 70,000 strong at the time I received the email, directly into the hands of CBS News. And for all of the problems with the ACA, from its dysfunctional website to the built-in incentives for the unholy alliance between M.D.’s and Big Pharma to engage in further profiteering, it remains, on balance, a good thing; like the old saying about 1 billion Red Chinese, 44 million newly insured Americans can’t be wrong. But despite my solidarity with Obamacare and my respect for this honorable effort to pressure the media into “fair and balanced” reporting, I didn’t sign.
It would be easy to chalk up my abstention to the busyness of life. The petition seemed to fall into the category of benevolent could-do’s that are almost always trumped by daily have-to’s. But if that had really been the case the petition, like the steady stream of requests in the mail from local charity groups for the donation of gently used clothing, would have quickly been forgotten. Instead, several days later, I am still thinking about it. And what I’ve realized is that I abstained from signing the petition not because I simply didn’t have to, in the way that I never seem to give to the destitute folk with signs at city intersections because nothing at all will change about my day if I keep my window up and the sports radio on, my absence of charity resting on a foundation of nothing to lose, but because something about the petition felt off kilter. (This “it costs me nothing if I ignore my neighbor” orientation is both a) what some might plainly call sinful, and b) strange bedfellows with another belief that informs my praxis, which is that “it costs me nothing to smile”; apparently one can be both very friendly and selfish at the same time, which combination is likely how we can remain self-centered enough to survive in an increasingly unfettered free market economy while still thinking of ourselves as basically good people.)
Reading the petition felt strangely like what it must have been like trying to get laid at Antioch College in the early 1990’s. It was then that Antioch put into effect the following infamous policy: “Obtaining consent is an ongoing process in any sexual interaction. Verbal consent should be obtained with each new level of physical and/or sexual contact/conduct.” The Antioch policy is very much like the petition in that both are born out of good intentions, the former to address the very real problem of sexual assault against women, and the latter, of course, to confront our staggering cultural disregard for the basic needs of others. (To see how far the total disregard for the poorest among us has gone, one need only look as far as the Hawaiian State Representative who spent several weeks toting around a baseball bat to smash up the personal belongings of homeless people, with whom he is “disgusted.” And no one did anything to stop him. Please note that this guy is a Democrat, bringing to this registered Democrat’s mind the old saying that with friends like these, who needs enemies. I’m not making this up: http://thinkprogress.org/economy/2013/11/19/2966371/hawaii-homeless-smash/) Despite these good intentions, the authors of Antioch’s policy lack the most basic level of insight into human sexual behavior. This is not to suggest that men should take what they want without first obtaining consent (and, to be clear, no does indeed ALWAYS mean no), but the devil is in the details, that devil being the fact that some questions can only ever be answered, but never asked (unlike those questions which can be asked but never answered, e.g. “Does this dress make my ass look fat?”). These unspeakable questions include “Do you love me?”, “Will you forgive me?”, “Are you going to tell the truth?” (which can be asked in the courtroom, but nowhere else), and, in play here, “Do you want me?”
The art of seduction requires that one ask that last question without ever verbalizing it; “Do you want me?” can only ever be pantomimed. It is, like the Tao, a truth that can never be spoken. And everyone on planet earth, but for the well-meaning folks at Antioch, knows this, thereby collapsing the policy into absurdity. It is no less farcical than requiring everyone to stop and think before each breath. Not to mention how confusing it would be to put into practice. Would the traditional first, second, third, and home base progression suffice for defining “each new level of physical and/or sexual contact/conduct,” or would it require questions such as “May I now press my thigh into your mons veneris?”
The similarity between the petition and the policy goes much deeper than the mere fact that both result from a good faith effort to make the world a better place. To understand how the petition asks a question as unspeakable as “Do you want me?”, we must turn to Noam Chomsky, the Grand Poobah of that endangered species, the leftist American public intellectual. (After Chomsky and Cornel West, I am drawing a blank; I can’t think of anyone else with the chops to qualify. As Tony Kornheiser would say, “That’s the list.”) A vigorous opposition being part and parcel of a functioning democracy, the near vacuum one finds when looking for substantial thinkers who question the reigning American ideology is but one more reason that Bill Moyers may be right when he says, in a Salon article last week, “We are this close to losing our democracy.” (http://www.salon.com/2013/12/12/bill_moyers_we_are_this_close_to_losing_our_democracy/)
Chomsky is crystal clear in his take on what he thinks the media should be doing, which contrasts with the role he sees it actually playing, and his formula for the former is as simple as it is clear: “what the national press ought to be doing is looking at the world from the point of view of its population.” (http://www.chomsky.info/articles/199710--.htm ) But instead, “the product of the media, what appears, what doesn’t appear, the way it is slanted, will reflect the interest of the buyers and sellers, the institutions, and the power systems that are around them. If that wouldn’t happen, it would be kind of a miracle.” (Interestingly, if we fuse these two thoughts, Chomsky is essentially saying that miracles ought to be occurring. And if there is one thread in human history that seems universal, it is that the miracle of our existence is never enough. There is an urban legend that in China if you save another person’s life they don’t owe you a thing, but you are suddenly obligated to care for them for the rest of your life. Perhaps God is one of these mythical Chinese, and by indulging in the miracle of creating us, She is forevermore on the hook for swinging us more and more miracles.)
Chomsky’s analysis then divides the corporate media in two. In the first camp are the mass media, who are “basically trying to divert people.” If you have ever watched five minutes of The Jerry Springer Show you know exactly what Chomsky is talking about and know everything you ever need to know about the mass media; Springer’s formula of gawking at poor people for the first 55 minutes of his show and then tying it all up in a bow by delivering a moral platitude in the show’s last five minutes can rightly be categorized as pure evil. (In a depressing echo of the bat-wielding Hawaiian fascist, Springer too was once a Democrat in elected office, in his case as mayor of Cincinnati. The only difference is that rather than smashing up the belongings of the impoverished, Springer demolishes their dignity.) Chomsky’s second group, among whom he explicitly includes CBS News, serves an altogether different function: “the elite media set a framework within which others operate.” The elite media establish and guard the boundaries of the ruling ideology. Looking back over our shoulders at the petition, we might say that the elite media determine the unspeakable. Money, of course, is where the ideological rubber meets the road; Chomsky draws on George Orwell’s observation that “the press is owned by wealthy people who only want certain things to reach the public.”
Before connecting the remaining dots, a word in defense of Chomsky’s assault on the vanguard of the elite corporate media. One man’s leftist public intellectual is another man’s radical left wing nut bag, and it would be easy to dismiss Chomsky’s media critique as the tired rant of an old man whose definitive characteristic is a cancerous animosity towards his own country’s way of life, and perhaps just towards his own country. I’m just saying. So to put Chomsky’s analysis to the test, I thought of two questions: when did the labor movement, along with voting rights the greatest tool the people ever did have, peak, and when did television, the greatest tool wealthy people who only want certain things to reach the public ever did have, become widespread? The answer: The labor movement, legalized in the 1930’s under FDR, saw widespread growth in power and influence right up into the 1950’s, when membership peaked, the same decade in which that ultimate tool in the manufacture of consent first sunk its teeth into our collective skulls. The correlation in the subsequent decline in union rolls with the increased consumption of televised media is as rock solid as the correlation between the rise in CO2 gasses and the rise in global temperatures. Noam Chomsky is right as rain, while, to borrow an image from David Foster Wallace, we’re all busy staring at the furniture.
Delivering a petition to CBS News for fair coverage of the Affordable Care Act, then, is no different than asking “Do you want me?” Asking an unspeakable question provides its own answer, and, whether one is asking after love, forgiveness, truth, desire, or fairness, the answer is always already no. The rich have their own way of putting the very same thing. “If you have to ask how much it costs, you can’t afford it.” The rest of us need to put down our impotent petitions while also laying down our arms in the pointless culture wars that only serve to distract us from the only unspeakable that matters: One never asks for permission to go on strike. (Or, for that matter, to turn off one’s television.)
It would be easy to chalk up my abstention to the busyness of life. The petition seemed to fall into the category of benevolent could-do's that are almost always trumped by daily have-to's. But if that had really been the case, the petition, like the steady stream of requests in the mail from local charity groups for the donation of gently used clothing, would have quickly been forgotten. Instead, several days later, I am still thinking about it. And what I've realized is that I abstained from signing the petition not because I simply didn't have to, in the way that I never seem to give to the destitute folk with signs at city intersections because nothing at all will change about my day if I keep my window up and the sports radio on, my absence of charity resting on a foundation of nothing to lose (this "it costs me nothing if I ignore my neighbor" orientation is both a) what some might plainly call sinful, and b) strange bedfellows with another belief that informs my praxis, which is that "it costs me nothing to smile"; apparently one can be both very friendly and selfish at the same time, which combination is likely how we can remain self-centered enough to survive in an increasingly unfettered free market economy while still thinking of ourselves as basically good people) but because something about the petition felt off-kilter.
Reading the petition felt strangely like what it must have been like trying to get laid at Antioch College in the early 1990s. It was then that Antioch put into effect the following infamous policy: "Obtaining consent is an ongoing process in any sexual interaction. Verbal consent should be obtained with each new level of physical and/or sexual contact/conduct." The Antioch policy is very much like the petition in that both are born out of good intentions, the former to address the very real problem of sexual assault against women, and the latter, of course, to confront our staggering cultural disregard for the basic needs of others (to see how far the total disregard for the poorest among us has gone, one need only look as far as the Hawaii State Representative who spent several weeks toting around a baseball bat to smash up the personal belongings of homeless people, with whom he is "disgusted." And no one did anything to stop him. Please note that this guy is a Democrat, bringing to this registered Democrat's mind the old saying that with friends like these, who needs enemies. I'm not making this up: http://thinkprogress.org/economy/2013/11/19/2966371/hawaii-homeless-smash/ ) Despite these good intentions, the authors of Antioch's policy lack the most basic level of insight into human sexual behavior. This is not to suggest that men should take what they want without first obtaining consent (and, to be clear, no indeed ALWAYS means no), but the devil is in the details, that devil being the fact that some questions can only ever be answered, but never asked (unlike those questions which can be asked but never answered, e.g. "Does this dress make my ass look fat?"). These unspeakable questions include "Do you love me?", "Will you forgive me?", "Are you going to tell the truth?" (which can be asked in the courtroom, but nowhere else), and, in play here, "Do you want me?"
The art of seduction requires that one ask that last question without ever verbalizing it; "Do you want me?" can only ever be pantomimed. It is, like the Tao, a truth that can never be spoken. And everyone on planet Earth, but for the well-meaning folks at Antioch, knows this, thereby collapsing the policy into absurdity. It is no less farcical than requiring everyone to stop and think before each breath. Not to mention how confusing it would be to put into practice. Would the traditional first, second, third, and home base progression suffice for defining "each new level of physical and/or sexual contact/conduct," or would it require questions such as "May I now press my thigh into your mons veneris?"
The similarity between the petition and the policy goes much deeper than the mere fact that both result from a good faith effort to make the world a better place. To understand how the petition asks a question as unspeakable as "Do you want me?", we must turn to Noam Chomsky, the Grand Poobah of that endangered species, the leftist American public intellectual. (After Chomsky and Cornel West, I am drawing a blank; I can't think of anyone else with the chops to qualify. As Tony Kornheiser would say, "That's the list." A vigorous opposition being part and parcel of a functioning democracy, the near vacuum one finds when looking for substantial thinkers who question the reigning American ideology is but one more reason that Bill Moyers may be right when he says, in a Salon article last week, "We are this close to losing our democracy" (http://www.salon.com/2013/12/12/bill_moyers_we_are_this_close_to_losing_our_democracy/).)
Chomsky is crystal clear in his take on what he thinks the media should be doing, which contrasts with the role he sees it actually playing, and his formula for the former is as simple as it is clear: “what the national press ought to be doing is looking at the world from the point of view of its population.” (http://www.chomsky.info/articles/199710--.htm ) But instead, “the product of the media, what appears, what doesn’t appear, the way it is slanted, will reflect the interest of the buyers and sellers, the institutions, and the power systems that are around them. If that wouldn’t happen, it would be kind of a miracle.” (Interestingly, if we fuse these two thoughts, Chomsky is essentially saying that miracles ought to be occurring. And if there is one thread in human history that seems universal, it is that the miracle of our existence is never enough. There is an urban legend that in China if you save another person’s life they don’t owe you a thing, but you are suddenly obligated to care for them for the rest of your life. Perhaps God is one of these mythical Chinese, and by indulging in the miracle of creating us, She is forevermore on the hook for swinging us more and more miracles.)
Chomsky’s analysis then divides the corporate media in two. In the first camp are the mass media, who are “basically trying to divert people.” If you have ever watched five minutes of The Jerry Springer Show you know exactly what Chomsky is talking about and know everything you ever need to know about the mass media; Springer’s formula of gawking at poor people for the first 55 minutes of his show and then tying it all up in a bow by delivering a moral platitude in the show’s last five minutes can rightly be categorized as pure evil. (In a depressing echo of the bat-wielding Hawaiian fascist, Springer too was once a Democrat in elected office, in his case as mayor of Cincinnati. The only difference is that rather than smashing up the belongings of the impoverished, Springer demolishes their dignity.) Chomsky’s second group, among whom he explicitly includes CBS News, serves an altogether different function: “the elite media set a framework within which others operate.” The elite media establish and guard the boundaries of the ruling ideology. Looking back over our shoulders at the petition, we might say that the elite media determine the unspeakable. Money, of course, is where the ideological rubber meets the road; Chomsky draws on George Orwell’s observation that “the press is owned by wealthy people who only want certain things to reach the public.”
Before connecting the remaining dots, a word in defense of Chomsky’s assault on the vanguard of the elite corporate media. One man’s leftist public intellectual is another man’s radical left wing nut bag, and it would be easy to dismiss Chomsky’s media critique as the tired rant of an old man whose defining characteristic is a cancerous animosity towards his own country’s way of life, and perhaps just towards his own country. I’m just saying. So to put Chomsky’s analysis to the test, I thought of two questions: when did the labor movement, along with voting rights the greatest tool the people ever did have, peak, and when did television, the greatest tool wealthy people who only want certain things to reach the public ever did have, become widespread? The answer: The labor movement, legalized in the 1930s under FDR, saw widespread growth in power and influence right up into the 1950s, when membership peaked, the same decade in which that ultimate tool in the manufacture of consent first sank its teeth into our collective skulls. The correlation between the subsequent decline in union rolls and the increased consumption of televised media is as rock solid as the correlation between rising CO2 levels and rising global temperatures. Noam Chomsky is right as rain, while, to borrow an image from David Foster Wallace, we’re all busy staring at the furniture.
Delivering a petition to CBS News for fair coverage of the Affordable Care Act, then, is no different than asking “Do you want me?” Asking an unspeakable question provides its own answer, and, whether one is asking after love, forgiveness, truth, desire, or fairness, the answer is always already no. The rich have their own way of putting the very same thing. “If you have to ask how much it costs, you can’t afford it.” The rest of us need to put down our impotent petitions while also laying down our arms in the pointless culture wars that only serve to distract us from the only unspeakable that matters: One never asks for permission to go on strike. (Or, for that matter, to turn off one’s television.)
Monday, December 16, 2013
What an Asshole!
Growing up in my family we were all nuts for George Carlin. In particular, we loved a bit from one of Carlin’s classic comedy albums called “Asshole, Jackoff, Scumbag!,” in which Carlin plays the host of a game show where the contestants try to accurately guess whether the subjects, e.g. a rancher who lives in Texas and works for an oil company or a lawyer who lives on Long Island and is a US Congressman, are each assholes, jackoffs, or scumbags. We quickly realized that these categories were as universal to human beings as introversion and extroversion, and that they could easily be utilized to add a fifth letter to any given Myers-Briggs result; an ESTJ might become an ESTJA or an ISTP an ISTPJ.
We had a lot of fun playing our own little versions of “Asshole, Jackoff, Scumbag!,” managing our anger towards those who had run afoul of one or another of our little clan by marking them with the appropriate label. But, perhaps wisely, we never labeled each other, invoking the “present company excluded” clause as a loophole in our assertion that the typology was indeed universal. But I knew full well that the loophole we had exercised was a fraud, and that I too was an asshole, jackoff, or scumbag, my family’s silence on the matter enabling me to engage in some equally fraudulent self-analysis.
So it was that I went around for years thinking of myself as an INFPJ, a real jackoff. Jackoffs, of course, are the least offensive of the three types. They don’t get much done, but this deficit was easily dressed up as “being laid back,” as if I was an old soul transplanted from a series of previous incarnations lived on island time. I mightn’t get out of bed until noon or do my laundry more than once a trimester, but my sloth was restoring balance to The Force in a world where doing always seemed to trump being. (Could Darth Vader have been a misunderstood jackoff?)
I bought into my own BS until two Thanksgivings ago, when word got back to me that my wife had gotten together over turkey, gravy, stuffing, and mashed potatoes with my mom and my big sister and, like Truman, Churchill, and Stalin dividing up Europe after World War II, unanimously and unilaterally assigned me to the asshole side of the “Asshole, Jackoff, Scumbag!” map (which, to extend the metaphor, makes me a lot like France). In doing so, they saved me several years on the analyst’s couch, psychoanalysis being a process described by famed sociologist Peter Berger as “a prolonged rewriting of the patient’s biography – until, finally, he or she ‘gets it right.’” Correctly sensing that I was quite happy to maintain a self-serving false consciousness in which I fancied myself a righteous jackoff, the women who love me most got it right on my behalf.
So, I’m an asshole. Not, I would hasten to add, a royal asshole, but an asshole nonetheless, and one who still engages in your garden variety asshole behavior on pretty much a daily basis. Just this morning, as my wife got the kids ready to head out the door with her, leaving me to my own devices for the next blessed six hours, I refused to multitask by plopping a sausage in the toaster oven for my youngest daughter as I watched the scrambled eggs turn yellow in the frying pan. Watching the eggs was all I could handle, or such was my claim as I shouted to my wife to handle the business with the sausage so that I could give my full attention to the egg I was cooking for her. Only a real asshole, of course, would inconvenience his wife under the guise of being too busy meeting her needs. I followed this up by refusing to make my oldest daughter a cup of tea because I feared that the duration of boiling and brewing might slow down their departure and delay the onset of my me-time, triggering a bout of whining from my oldest that my wife had to extinguish on her way out the door. As I watched her slog through the slush to the van with our children, she became the embodiment of the old adage that no good deed goes unpunished.
Between moments like these I do enough so that my wife and children still love me and I can avoid feeling like the world’s biggest asshole, which is a label that no one wants to cop to. In the current edition of ESPN Magazine, Lance Armstrong admits to being just that, but after his interlocutor goes with it and repeats the title that Armstrong has just bestowed upon himself, he hedges: “I’m not sure I was the biggest asshole in the world, but I played one on TV.” Armstrong’s hemming and hawing about being the world’s biggest asshole, which everyone knows he truly was, and which doesn’t have to be that big a deal given that the world is full of world’s biggest assholes, only serves to make him look like more of an asshole.
Being an asshole isn’t really all that different than being a jackoff, or even a scumbag, whom I always considered the worst of the lot. “I may not be the world’s best husband, but I would never cheat on my wife” is, after all, just another way of saying “I may be an asshole, but at least I’m not a scumbag.” But what assholes, jackoffs, and scumbags have in common, i.e. what we all have in common, is a self-centeredness that is as pervasive and as invisible as the air we breathe. Philosopher Mark Johnston, in his brilliant Saving God: Religion After Idolatry, posits that original sin is real and that it consists of our inborn self-absorption. Following Johnston, I would say that Freud, who was often accused of being a pansexualist for his tendency to trace everything back to the libido and the Oedipus complex, wasn’t quite on the mark and that the satisfaction of the sex drive is just one instance of the larger human project of, to borrow a line from the old Burger King ads, having it our way.
Viktor Frankl famously countered Freud with his theory that human behavior is spurred not by Freudian drives but by a search for meaning, a theory that provided the title of his classic work Man’s Search for Meaning, a book I have failed to read despite having had it on my list for over a decade. That failure, if I may be indulged in some self-analysis despite the track record established by my self-identification as a jackoff, is likely a form of resistance to the optimism inherent in Frankl’s theory and in his personal history of having survived the Holocaust through his own search for meaning. Instead of reading Man’s Search for Meaning I read Our Final Invention: Artificial Intelligence and the End of the Human Era, a book with a Terminator motif of super-intelligent machines consigning us to the dustbin of history, and a book I have out from the library right now. Looking through a scanner darkly at the end of the world, I am mesmerized. Frankl, living through the end of the world, was liberated. So perhaps my resistance is nothing but, to borrow a term from Erich Fromm, an escape from freedom.
Regardless, the fact remains that I have read Freud and not Frankl. So in my quest to become a reformed asshole, taking my inspiration from Haleakala, the world’s largest dormant volcano, I turn to Freud’s theory of sublimation, the notion that we can channel our sexual drive into creative endeavors, a process that is likely the very wellspring of civilization. Since I have subsumed Freud’s pansexualism into the idea that we all want what we want and we want it now, I may as well commandeer sublimation while I am at it. And, as a writer, I am perfectly positioned to do so.
In her remarkable Tiger Writing, novelist Gish Jen shares a story from grad school in which she was told by her professor that there was no such thing as a nice writer. As an asshole, then, I would seem to be well suited to the task. Jen goes on to say that she was also told in grad school that all good writing is subversive. And subversiveness could quite possibly be the only way to sublimate this asshole writer’s selfishness, harnessing it towards the greater good.
Finally, my resistance to Frankl may grow out of a fundamental misunderstanding of the subversive project. Somewhere along the way, I picked up the idea that only pessimists can be real subversives, the holy grail of this outlook being Freud’s own Civilization and Its Discontents. But what could be more subversive than the idea that even the Holocaust couldn’t annihilate meaning, couldn’t, dare I say it, defeat love? I am left thinking about the Le Tigre song lyrics I heard this morning: “Is it time for me to act mature? The only words I know are more, more, more.”
Is it time for me to grow up and read Viktor Frankl?
Wednesday, December 11, 2013
The Failed Artist's Manifesto
As reported recently by Salon (http://www.salon.com/topic/blue_is_the_warmest_color/), a backlash is brewing against this year’s Palme d’Or winner, Blue is the Warmest Color, a film that depicts a love affair between two young women. The backlash described by Salon has to do with the fact that the film’s auteur, Abdellatif Kechiche, is not only a man, but also one who doesn’t happen to be gay, in addition to the fact that it stars “two seemingly straight women.”
The author of the Salon piece, Chelsea Hawkins, describes the problem thusly: “What was missing on the set of Blue is the Warmest Color were the people who actually engage in same-sex relationships, people who understand lesbianism and queerness in a way that someone who is heterosexual does not.” Hawkins quotes a blogger (identified simply, and somewhat mysteriously, as Kate) to flesh out this critique: “A narrative about queer people as directed and portrayed and produced by straight people cannot be considered a work of queer cinema in the same sense that a film written, directed, and portrayed by queer people is.”
On a political level, I completely get this. Perhaps the greatest privilege accorded to the dominant group is that of writing the history books and all of the trimmings in the form of literature, theater, music, etc. And in the case of queer people, up until only the very, very recent past, they were summarily written out of the entire story, unceremoniously left on the cutting room floor. Under such circumstances, it would be surprising if LGBT folk weren’t a bit possessive of their narrative, in order to a) exercise some self-determination over the stories that get passed off as queer, b) express a certain amount of pique over the fact that straight storytellers, who for millennia pretended there was no such thing as queer, are suddenly queuing up to tell those very stories now that there is a bona fide market share to be had, and c) wonder why it took a straight director to win the Palme d’Or for a queer flick.
I am also reminded of the hullabaloo surrounding Hollywood’s original plan of signing up a white male director to make the major motion picture of The Autobiography of Malcolm X. Controversy erupted, and things were set right when Spike Lee was ultimately hired on to direct. Again, politically, this makes perfect sense for essentially the same reasons alluded to in the Salon piece about Blue. Part of me rejoices that Lee was in charge of Malcolm X, and it is the same part of me that wonders, given the descriptions of the sex scenes in Blue as soft core porn, if Kechiche was capable of depicting anything other than a male fantasy. (To explain why I haven’t yet seen the film I would draw upon the wisdom of Samuel L. Jackson’s character in Pulp Fiction, who explains “My girlfriend is a vegetarian, which pretty much makes me a vegetarian.” My wife doesn’t see movies, which pretty much means I don’t see movies.)
But as an artist, I find the backlash against Kechiche’s Blue more than a little disheartening. To understand why, it might help to explain that I spent the last two years of my life writing a novel centered around two main characters, one of whom is a woman, as oppressed a group, of course, as ever there was. Getting inside her head and giving her an authentic voice was the hardest thing I’ve ever tried to do as an artist. In comparison, spouting off on this blog about the (hopefully) odd angle from which I see the world in a way that is reasonably entertaining and provocative is mere child’s play. And I have been advised by a reader whose opinion I trust and value that my efforts to bring my character Shoshana to life were successful, if successful is a word that starts with the letters u and n. I never do things halfway; oh no, writing about a woman wasn’t enough for this white male Episcopalian. Shoshana, if you didn’t happen to know it, is a Jewish name. For an Episcopalian, a lovely middle name to pair with Shoshana would be Rose. Except Shoshana means rose, which makes Shoshana Rose sound a lot like naming your daughter Boutros Boutros, which has little to recommend itself other than possibly predestining her for a career as Secretary-General of the UN. All that said, it boggles the mind to think of all the things I don’t know about being either Jewish or female. Which is both a) a perfectly good reason to never write a novel about a Jewish woman, and b) the best reason anyone who isn’t a Jewish woman could possibly have to write a novel about a Jewish woman.
Because one of the very best ways of dismantling the violence of cultural dominance is the good faith effort to incorporate the perspective and voice of the Other in one’s works of art, especially if one, like me, is inescapably a member of the dominant group. This is more and not less true given that one may ultimately be, again like me, doomed to failure in one’s efforts to depict the Other, and is one more occasion when Beckett’s famous equation (the humanities’ equivalent to Einstein’s E=MCsquared), “Try again. Fail again. Fail better.”, rings absolutely true.
I would also argue that it is well nigh impossible to make art today without crossing the Rubicon of Otherness. The commitment to making art in our times may very well consist of fidelity to just that. And thank God for the sublime failures in that quest: Woody Allen’s Annie Hall, Henry James’ aptly titled Portrait of a Lady and the aforementioned Spike Lee’s She’s Gotta Have It are indispensable examples of men trying their damnedest to write women. Going the other way, it would be impossible to deploy more convincing male characters than did Eliot and Austen in their respective masterpieces, Middlemarch and Pride and Prejudice. Clearly, with a nod to Orwell, us artists are all failures, but some failures are more equal than others.
In writing a novel it was, of course, my dream to rival James, Eliot, and Austen. The astronomical odds against my doing so don’t in any way detract from the nobility of my undertaking. Go big or go home. And it is this intentional reckless abandon in constructing one’s ambitions, an act necessary, if not sufficient, for the making of great art, that is lost if we fence off certain stories for certain artists. In fencing off one’s turf, as blogger Kate is quoted doing above, one also fences one’s self in. Such an artist can no longer shoot for the moon of making great cinema, cinema so confident that if it was a basketball team it would play anybody, anywhere, anytime. Instead these artists make queer cinema, which sounds as unheroic to me as scheduling a bunch of teams you know you can beat. I’d rather be a failure.
I would note that John Waters happens to be gay, but no one thinks of his movies as queer cinema. They just think of his movies as great cinema because his movies are bloody brilliant. And if it is true that all great artists have big egos (though not all artists with big egos are great), then here’s to blind ambition that fails to see all the things it doesn’t see, allowing one to imagine what it might be like to be someone else. I want to live in a world that celebrates Tracy Chapman’s “Fast Car,” but also makes room for Elvis Presley’s “In the Ghetto,” lest we find ourselves relegated to artistic ghettos where there’s no point in making art anymore because you already know exactly what it’s like to be me.
The author of the Salon piece, Chelsea Hawkins, describes the problem thusly: “What was missing on the set of Blue is the Warmest Color were the people who actually engage in same-sex relationships, people who understand lesbianism and queerness in a way that someone who is heterosexual does not.” Hawkins quotes a blogger (identified simply, and somewhat mysteriously, as Kate) to flesh out this critique: “A narrative about queer people as directed and portrayed and produced by straight people cannot be considered a work of queer cinema in the same sense that a film written, directed, and portrayed by queer people is.”
On a political level, I completely get this. Perhaps the greatest privilege accorded to the dominant group is that of writing the history books and all of the trimmings in the form of literature, theater, music, etc. And in the case of queer people, up until only the very, very recent past, they were summarily written out of the entire story, unceremoniously left on the cutting room floor. Under such circumstances, it would be surprising if LGBT folk weren’t a bit possessive of their narrative, in order to a) exercise some self-determination over the stories that get passed off as queer, b) express a certain amount of pique over the fact that straight storytellers, who for millennia pretended there was no such thing as queer, are suddenly queuing up to tell those very stories now that there is a bona fide market share to be had, and c) wonder why it took a straight director to win the Palme d’Or for a queer flick.
I am also reminded of the hullabaloo surrounding the original plan in Hollywood of signing up a white male director to make the major motion picture of The Autobiography of Malcolm X. Controversy erupted, and things were set right when Spike Lee was ultimately hired on to direct. Again, politically, this makes perfect sense for essentially the same reasons alluded to in the Salon piece about Blue. Part of me rejoices that Lee was in charge of Malcolm X, and it is the same part of me that wonders, given the descriptions of the sex scenes in Blue as soft-core porn, if Kechiche was capable of depicting anything other than a male fantasy. (To explain why I haven’t yet seen the film I would draw upon the wisdom of Samuel L. Jackson’s character in Pulp Fiction, who explains, “My girlfriend is a vegetarian, which pretty much makes me a vegetarian.” My wife doesn’t see movies, which pretty much means I don’t see movies.)
But as an artist, I find the backlash against Kechiche’s Blue more than a little disheartening. To understand why, it might help to explain that I spent the last two years of my life writing a novel centered on two main characters, one of whom is a woman, as oppressed a group, of course, as ever there was. Getting inside her head and giving her an authentic voice was the hardest thing I’ve ever tried to do as an artist. In comparison, spouting off on this blog about the (hopefully) odd angle from which I see the world in a way that is reasonably entertaining and provocative is mere child’s play. A reader whose opinion I trust and value has assured me that my efforts to bring my character Shoshana to life were successful, if successful is a word that starts with the letters u and n. I never do things halfway; oh no, simply writing about a woman wasn’t enough for this white male Episcopalian. Shoshana, if you didn’t happen to know it, is a Jewish name. For an Episcopalian, a lovely middle name to pair with Shoshana would be Rose. Except Shoshana means rose, which makes Shoshana Rose sound a lot like naming your daughter Boutros Boutros, which has little to recommend itself other than possibly predestining her for a career as Secretary-General at the UN. All that said, it boggles the mind to think of all the things I don’t know about being either Jewish or female. Which is both a) a perfectly good reason to never write a novel about a Jewish woman, and b) the best reason anyone who isn’t a Jewish woman could possibly have to write a novel about a Jewish woman.
Because one of the very best ways of dismantling the violence of cultural dominance is the good-faith effort to incorporate the perspective and voice of the Other in one’s works of art, especially if one, like me, is inescapably a member of the dominant group. This is more and not less true given that one may ultimately be, again like me, doomed to failure in one’s efforts to depict the Other, and is one more occasion when Beckett’s famous equation (the humanities’ equivalent of Einstein’s E=mc²) rings absolutely true: “Try again. Fail again. Fail better.”
I would also argue that it is well nigh impossible to make art today without crossing the Rubicon of Otherness. The commitment to making art in our times may very well consist of fidelity to just that. And thank God for the sublime failures in that quest: Woody Allen’s Annie Hall, Henry James’ aptly titled The Portrait of a Lady, and the aforementioned Spike Lee’s She’s Gotta Have It are indispensable examples of men trying their damnedest to write women. Going the other way, it would be impossible to deploy more convincing male characters than did Eliot and Austen in their respective masterpieces, Middlemarch and Pride and Prejudice. Clearly, with a nod to Orwell, we artists are all failures, but some failures are more equal than others.
In writing a novel it was, of course, my dream to rival James, Eliot, and Austen. The astronomical odds against my doing so don’t in any way detract from the nobility of my undertaking. Go big or go home. And it is this intentional reckless abandon in constructing one’s ambitions, an act necessary, if not sufficient, for the making of great art, that is lost if we fence off certain stories for certain artists. In fencing off one’s turf, as blogger Kate is quoted doing above, one also fences oneself in. Such an artist can no longer shoot for the moon of making great cinema, cinema so confident that if it were a basketball team it would play anybody, anywhere, anytime. Instead these artists make queer cinema, which sounds as unheroic to me as scheduling a bunch of teams you know you can beat. I’d rather be a failure.
I would note that John Waters happens to be gay, but no one thinks of his movies as queer cinema. They just think of his movies as great cinema because his movies are bloody brilliant. And if it is true that all great artists have big egos (though not all artists with big egos are great), then here’s to blind ambition that fails to see all the things it doesn’t see, allowing one to imagine what it might be like to be someone else. I want to live in a world that celebrates Tracy Chapman’s “Fast Car,” but also makes room for Elvis Presley’s “In the Ghetto,” lest we find ourselves relegated to artistic ghettos where there’s no point in making art anymore because you already know exactly what it’s like to be me.
Sunday, December 08, 2013
Pretty in Pink?
It is a strange time to be pink. On the one hand, never before has the assignment of blue to boys and pink to girls been more rigid. As pointed out in The Washington Post last week (http://www.washingtonpost.com/blogs/compost/wp/2013/12/04/every-gift-for-children-this-year-is-terrifying-a-walk-over-the-thin-pink-line-in-target/) and as any visitor to the Target toy department can attest, the strict gender coding of children’s toys (and apparel) by color is as stark in its contrast as the opposite-hued fans in the home and visitor sections at a Michigan vs. Ohio State game, where maize and blue compete with scarlet and gray. On the other hand, in the last several years pink has made deep inroads into men’s fashion and sports culture. Roger Federer and Rafael Nadal have both taken the court in pink shirts (though, oddly, the only match that Nadal has ever lost at the French Open, a tournament he has won eight times, he lost in pink). Professional and college football teams put more and more pink in their uniforms during their annual breast cancer awareness efforts, a practice that culminated with the University of Oregon Ducks, who have grown into a de facto division of Nike’s fashion empire, donning entirely pink helmets in a contest this fall against Washington State. And the kind of teenage boys who wear expensive basketball high-tops, at least the ones in Baltimore, have been mixing bright pink kicks into the rotation. Pink has never been more feminine, while at the same time becoming increasingly masculine.
This schizophrenic identity is perhaps not surprising when we consider how very recently it was that the color came to be identified with gender, and even less surprising given the shifts that occurred in that short history. In the 1800s babies and toddlers of both genders were clothed in white dresses; it wasn’t until the first part of the twentieth century that color entered the picture. And over those first several decades pink was more frequently linked to boys, if only because, as quoted in a Smithsonian Magazine piece (http://www.smithsonianmag.com/arts-culture/When-Did-Girls-Start-Wearing-Pink.html), the thinking of the day held that “pink, being a more decided and stronger color, is more suitable for the boy, while blue, which is more delicate and dainty, is prettier for the girl,” an observation which, lest we think ourselves beyond all that, doesn’t sound very far off from “sugar and spice and everything nice, that’s what little girls are made of,” a nursery rhyme we repeat as often and as thoughtlessly as Jack and Jill or Humpty Dumpty. This trend didn’t reverse itself until the 1940s, when the current code, pink for girls and blue for boys, began to predominate.
The odd cultural space inhabited by pink is best illuminated by the controversy surrounding the University of Iowa football stadium’s visiting locker room, the walls of which are painted pink. If pink is feminine, then the protests (reported at http://www.sportsgrid.com/ncaa-football/iowa-pink-locker-room/) of former UI professor Jill Gaulding that the painted walls are a form of “pink shaming” designed to mark the visiting Boilermakers or Cornhuskers as “a bunch of ladies/girls/sissies/pansies/etc.” are absolutely correct. But if pink, per the Ducks and all the other young men whose color palettes Nike is radically expanding, is now as masculine as it is feminine, and as masculine as it once was prior to the 1940s, then the University of Iowa is standing on firm ground in asserting that it paints its visiting locker room walls pink for the calming effect it has on its opponents, becalmed Boilermakers presumably being less capable of punching their opponent, i.e. the hometown Hawkeyes, in the mouth, which is football-speak for a job well done. Indeed, a quick Google search (http://psychology.about.com/od/sensationandperception/a/color_pink.htm) reveals that pink’s calming effect has been well established in color psychology research (was Nadal becalmed at Roland Garros when he suffered his only loss while wearing pink?), and that the University of Iowa, while perhaps first, is not alone in its practice (http://www.milb.com/news/article.jsp?ymd=20130717&content_id=53916674&fext=.jsp&vkey=news_milb).
As recently as ten years ago, perhaps even five, Gaulding would have been right, and the University of Iowa would have been wrong. But now that pink is hyper-feminine at the same time that it is uber-masculine, Gaulding and Iowa are both right, as if the rules of quantum physics, where an electron can be in two places at one time, have invaded the realm of ethics. In football, when electrons are simultaneously in two places, i.e. when there are penalties on both the offense and the defense on the same play, the penalties offset, resulting in a “do-over,” a concept that is as alien to our observable Newtonian world as the behavior of electrons (which is why every husband foolish enough to try to win an argument with his wife knows that once you’ve said something, even if you immediately realize you should never have said it, you can’t take it back). So perhaps Gaulding and Iowa should engage in a do-over and work through this whole pink locker room thing. Because think what might happen if Gaulding visited the locker room while the Oregon Ducks were in town, putting on pink helmets that were a perfect match for the locker room walls. If, sensing Gaulding’s presence, the Ducks asked themselves, “Are we not men?”, they would be left doing quantum ethics and calculating probabilities while inhabiting a Newtonian world where, as in the effort to definitively locate those elusive electrons, the only meaningful final answer is maybe. For if pink is both feminine and masculine, then it really is neither here nor there.
Thursday, December 05, 2013
Mr. Roboto
The old saying, originally coined by Ben Franklin, holds that “in this world nothing can be said to be certain, except death and taxes.” Franklin’s clear-eyed genius was proven yet again with the announcement earlier this week that Amazon will soon be deploying a fleet of drones to deliver merchandise right to the consumer’s doorstep. Drones, of course, are already famously linked with the death half of the equation as the first line of offense in the American war on terrorism. With their emergence as agents of commerce, drones can begin to reach their full Franklinian potential as cogs in the wheels of a growing economy whose end result, natch, is more tax revenue.
I found the following line in a Bloomberg.com report on Amazon’s drones chilling in its matter-of-fact description of the US practice of executing “suspected” terrorists, as if killing people based on mere suspicion were as unremarkable as dropping a new duvet cover from Amazon on someone’s front porch: “Experimentation with delivery by drones is part of a shift from the craft’s use by the U.S. military to spy on and kill suspected terrorists.” Jean Baudrillard once famously opined that the only real freedom left to us is what to purchase when we go shopping. But Baudrillard must not have read his Poor Richard’s Almanack, because in America, as demonstrated by our drones’ twin functions, we still have two choices: what to purchase and whom to kill. Taxes and death.
Amazon’s drones have a second implication which, unfortunately, will do little to lighten the mood. Delivery drones spell the demise of the delivery man or woman, thereby rendering the divorce between labor and consumption final. Until now, in getting one’s hot little hands on one’s new duvet cover from Amazon one had at the very least to encounter the physical presence of labor in the person of the UPS guy or gal, or, if nothing else, the specter of his or her presence lurking surreptitiously next to the package resting on your front porch. But with the advent of delivery drones, labor and consumption split off into parallel universes, and, henceforth, never the twain shall meet. Yet the divorce between labor and consumption only reinforces the marriage between death and taxes; Amazon’s drones can’t talk, but if they could they would surely ignore Asimov’s first law of robotics, “A robot may not injure a human being or, through inaction, allow a human being to come to harm,” and never tell us about the Chinese factory worker whose lungs are coated in the plastic she labors to produce for our new laptop.
Storks delivering newborn babies to our doorsteps have given way to drones delivering consumables, as the enchantment of myth gives way to the certainty of brute facts, none more brutal than the only two, per Franklin, that are never in doubt.
Tuesday, December 03, 2013
You Can Only Win if You Don’t Play
On my morning commute I pass a billboard that displays the current jackpot for both the Mega Millions and Powerball lottery games, and this morning Mega Millions was up to 257 million dollars. I am unable to report on the Powerball, as the figure doesn’t register in my mind until it tops 200 million dollars, at which point I, like almost everyone else, begin to think about stopping in at the nearest convenience store to purchase a ticket. Life’s busy treadmill usually wins out, and I don’t get really serious about buying a ticket until the lottery zooms past 300 million. This despite the ample anecdotal evidence that hitting the jackpot will more than likely ruin your life.
I am reminded of Shirley Jackson’s classic short story, “The Lottery,” in which the citizenry of a small town are all entered in an annual lottery, with the catch that the “winner” is stoned to death. But in Jackson’s story the lottery is compulsory, whereas we’re all signing up voluntarily for a chance to be, if not precisely stoned to death, then buried alive under our sacks of dough. It is tempting to believe that winning the lottery simply sets in motion an equal and opposite reaction: oh, how nice, you won tens of millions of dollars; so sorry about your dead son (a real-life turn of events that happened to my former co-worker’s friend, and exactly the kind of anecdotal evidence about the lottery we are all privy to). Such a belief is grounded in devout bitterness, bitterness that is fed by the inevitable envy towards those who do win the lottery, and, perversely, also fed when you learn of the winner’s equally inevitable demise, even if that demise is simply having a reputation as the world’s biggest rich asshole. Even though you rejoice that the winner’s life remains a piece of shit like your own, your bitterness only grows as you realize that should Lady Luck smile on you, the Lord nevertheless follows up the giveth with the taketh away, twisting life’s greatest mystery into the following: “Why do bad things happen to lucky people?”
Avoiding the temptation of this equal-and-opposite-reaction worldview requires that we re-inspect our notion of luck. Mega Millions is more like than unlike Jackson’s “Lottery” because the people who are lucky enough to be truly happy would never in a million years play Mega Millions. This means that every last one of the pool of potential Mega Millions winners is, to one degree or another, basically unhappy, at least in the moment they purchase their ticket. One of only two real changes that occurs for the Mega Millions winner is that he or she goes from being unhappy to being unhappy with money, the other change being that he or she never again has to worry about paying the bills or putting food on the table, which, while not insignificant, is, per the wisdom found in “Man does not live by bread alone,” not by itself enough to overcome a basic dissatisfaction with life. One who doubts that everyone who plays the lottery is to some degree basically unhappy must answer the question of why on earth a happy person would play the lottery. For a happy person to do so would be no different from deeply loving one’s spouse and then turning around and cheating on him or her. And infidelity, like playing the lottery, is born of dissatisfaction.
This second perspective, while discomforting to me as someone who occasionally plays the lottery (because really, what the bleep do I have to complain about re: my own circumstances, which are the only circumstances in question given that I’m certainly not playing the lottery while fantasizing about all the philanthropy I’m going to get up to? I should add, for the sake of clarity, that despite buying lottery tickets, I have never and would never cheat on my spouse, who is the main reason I am happy enough to buy lottery tickets so rarely, and who, if I just opened my eyes a little wider, would keep me out of the checkout counter at Royal Farms for the rest of my life), has the advantage of nullifying the question of “Why do bad things happen to lucky people?” The truly fortunate are those lucky enough to realize how good they have it. But even these folk still face, at a minimum, old age, sickness, and death, which leaves us yet wondering “Why do bad things happen to good people?” If I can just find an answer to that, I’ll make millions.
I am reminded of Shirley Jackson’s classic short story, “The Lottery,” in which the citizenry of a small town are all entered in an annual lottery, with the catch that the “winner” is stoned to death. But in Jackson’s story the lottery is compulsory, whereas we’re all signing up voluntarily for a chance to be, if not precisely stoned to death, then buried alive under our sacks of dough. It is tempting to believe that winning the lottery simply sets in motion an equal and opposite reaction: oh, how nice, you won tens of millions of dollars- so sorry about your dead son (a real life turn of events that happened to my former co-worker’s friend, and exactly the kind of anecdotal evidence about the lottery we are all privy to). Such a belief is grounded in devout bitterness, bitterness that is fed by the inevitable envy towards those who do win the lottery, and perversely, also fed when you learn of the winner’s equally inevitable demise, even if that demise is simply having a reputation as the world’s biggest rich asshole. Even though you rejoice that the winner’s life remains a piece of shit like your own, your bitterness only grows as you realize that should lady luck smile on you, the Lord nevertheless follows up the giveth with the taketh away, twisting life’s greatest mystery into the following: “Why do bad things happen to lucky people?”
Avoiding the temptation of this equal-and-opposite-reaction worldview requires that we re-inspect our notion of luck. Mega Millions is more like than unlike Jackson’s “Lottery” because the people who are lucky enough to be truly happy would never in a million years play Mega Millions. This means that every last one of the pool of potential Mega Millions winners is, to one degree or another, basically unhappy, at least in the moment they purchase their ticket. One of only two real changes that occurs for the Mega Millions winner is that he or she goes from being unhappy to being unhappy with money, the other change being that he or she never again has to worry about paying the bills or putting food on the table, which, while not insignificant, is, per the wisdom found in “Man does not live by bread alone,” not by itself enough to overcome a basic dissatisfaction with life. One who doubts that everyone who plays the lottery is to some degree basically unhappy must answer the question of why on earth a happy person would play the lottery. Doing so would be no different than deeply loving one’s spouse and then turning around and cheating on him or her. And infidelity, like playing the lottery, is born of dissatisfaction.
This second perspective, while discomforting to me as someone who occasionally plays the lottery (because really, what the bleep do I have to complain about re: my own circumstances, which are the only circumstances in question given that I’m certainly not playing the lottery while fantasizing about all the philanthropy I’m going to get up to. I should add, for the sake of clarity, that despite buying lottery tickets, I have never and would never cheat on my spouse, who is the main reason I am happy enough to buy lottery tickets so rarely, and, if I just opened my eyes a little wider, would keep me out of the checkout counter at Royal Farms for the rest of my life), has the advantage of nullifying the question of “Why do bad things happen to lucky people?” The truly fortunate are those lucky enough to realize how good they have it. But even these folk still face, at a minimum, old age, sickness, and death, which leaves us yet wondering “Why do bad things happen to good people?” If I can just find an answer to that, I’ll make millions.
Monday, December 02, 2013
Helping Man's Best Friend
I was playing doubles with my tennis buddies earlier this week, courtesy of some free indoor court time via a Parks and Rec program that has at least temporarily restored my faith in government of, by, and for the people. In the friendly post-match chat we got to talking about our dogs and one of my buddies mentioned that his dog was so anxious and whiny that his vet had put the dog on Prozac, which prescription my friend, without any apparent hesitation, has dutifully filled and refilled at ten bucks a pop for a several-month supply of the ubiquitous selective serotonin reuptake inhibitor.
After wondering whether there was a pill that could make my otherwise perfect pooch Sy stop taking dumps in the children’s play room, I got to thinking about the notion of administering antidepressants to dogs. My thinking was problematized by James Davies’ excellent Cracked: The Unhappy Truth about Psychiatry, in which Davies reports that meta-analyses of both published and, crucially, unpublished (as in withheld by the pharmaceutical companies because the outcomes didn’t suit their purposes) clinical trials demonstrate that antidepressants, but for the most severely depressed patients, do not result in statistically significantly better outcomes than placebo.
My initial question might well have been why we should presume that a medicine which is efficacious for the human mind would be similarly so for the canine “mind,” when it is not at all clear that words like “mind” and “mental illness” can even be used meaningfully with dogs. But having read Davies I find myself instead asking whether dogs themselves are benefitting from their own form of placebo effect, which I find rather unlikely (the placebo effect being bound up with beliefs, and while I would agree that a dog may very well believe that the ground its master walks on is holy, I wouldn’t go so far as to say that a dog might have a set of beliefs, either conscious or unconscious, about western medicine and, more particularly, psychopharmacology), or, if not, whether the placebo effect is so powerful that it can not only cure human depression by way of the patient’s belief in a pill and all it represents, but also cure a dog’s emotional anguish by way of the dog owner’s belief in the pill his or her dog is taking. If the latter suggestion is true, then we can only conclude (while temporarily tabling our confusion as to whether we can refer to a dog’s consciousness as a “mind”) that the placebo effect can “jump” between minds, even if the barrier between those minds is that between species.
This barrier is not insignificant, and is best captured by one of Wittgenstein’s pithy western koans, “If a lion could talk, we could not understand him.” But perhaps because, unlike lions, we live with dogs, and because they seem so able to understand us when we talk, whether with words, gestures, or just a look (the fact that we both show the whites of our eyes, I’ve read, is an important element of the human-dog connection), we presume that we know our dogs as well as they know us, and that, e.g. when they are sitting contentedly at our feet chewing a bone while we indulge in a bowl of Neapolitan ice cream it is essentially no different than when we share a cup of coffee with our spouse. I do, however, wonder if the relationship between dogs and humans isn’t more like that between women and men, with women understanding the open-book half of the species as intuitively as a dog does its master, while men are left wondering “What does a woman want?” in the same way that we humans can’t really know a dog’s thoughts any better than a lion’s.
But even if dog consciousness remains inscrutable to us talking apes, this wouldn’t necessarily prevent the placebo effect from “jumping” between human and dog, which, presumably, would only require the dog to continue reading its master as well as it always has. It is the dog and not us, after all, who is performing the Vulcan Mind Meld. I just wish we homo sapiens believed in the talking cure as much as we believe in our little pills, especially since it is a known fact that one cures a dying plant by talking to it. When you notice your hound looking anxious, you could just use the Cognitive-Behavioral Therapy technique of thought stopping and let the placebo effect work like jumper cables attached to Fido. If nothing else, this would spare Fido the drowsiness, dizziness, nausea, vomiting, constipation, weight changes, insomnia, decreased sex drive, impotence, difficulty having an orgasm, dry mouth, severe blistering and rash, high fever, uneven heartbeats, tremors, diarrhea, loss of coordination, headaches, memory problems, confusion, hallucinations, fainting, seizures, or breathing that stops, all of which are the side effects that may occur when Fido takes his Prozac in order to get his secondhand placebo effect.
Wednesday, November 27, 2013
The Case For (or Against?) Gluten Free Socialism
Last Friday, having failed to pack a lunch and racing between meetings from one side of town to the other, and with nary a fast food drive thru in sight, I succumbed to that food of last resort, the 7-11 hot dog. I paired the quarter pound dog, minus the bun per my gluten free lifestyle but slathered in yellow mustard, with a bag of plain potato chips, Snapple, and M&M’s. Arriving at my destination a few minutes before the scheduled start of a 12:00 meeting, I sat down in the conference room and dug in. Someone happened by to use the copier machine and, using the friendly authoritative tone unique to the giving of unsolicited advice to strangers (their status as strangers cancelling out their authority; it remains impossible to have authority over someone whilst being their actual friend, which is why everyone knows you shouldn’t work for your friend or date your boss), exhorted me to eat something healthy that night, preferably a mélange of leafy green vegetables, to make up for the disaster of a lunch lying prone on the conference room table.
I played along, even as I fought back the urge to retort “But it’s all gluten free!” This urge had nothing to do with self delusion or denial; I was fully aware of the nutritional contents of both my tube steak and handpicked side dishes. But my unaired protest had nothing to do with the meal set before me. I believed my meal to be healthy because, like life itself, it was constituted by what it lacked, i.e. gluten, which, I have come to be persuaded, is to the gut as smoke is to the lungs.
Although I am beginning to think that I was thusly persuaded in order to satisfy a longing much deeper than the not insubstantial need for some relief from the emotional peaks and valleys of my hypoglycemic carb-loading days. This is related to the notion I slipped in above about lack and its place at the center of the universe. Lest this sound nihilistic, no less a sage than the Kabbalah itself teaches that in order to create the universe, God first, to make room for it, had to absent Him/Herself from the scene. (The gender-inclusive pronouns are mine; not sure where the Kabbalah stands on the question of whether God goes to the men’s room or the ladies’ room, or both.) The Kabbalah, from the little I know of it, goes on ad infinitum to explain how despite God’s constitutive absence we nevertheless remain connected through what sounds to me like an elaborate version of the lifelines on Who Wants to Be a Millionaire? I will leave the parsing of the details on that to the Kabbalists, but the take-home message is clear to me. The universe began with God’s absence. We exist because, at least right here and now, God, at least in all His/Her fullness, does not. We are all donuts bent around a primordial lack.
Two of the three Abrahamic faiths, Islam and Judaism, deal with this constitutive lack, at least in part, through their respective halal and kosher dietary laws. Food, especially when there is a plenitude, is the opposite of lack. The cornucopia, on our minds this week as we celebrate Thanksgiving, is the perfect symbol of God spilling over into creation in all His/Her abundance. This sounds like the ultimate good thing, until we remember that it was God’s very absence that made room for us to begin with. At which point the cornucopia is transformed into The Blob, and God’s abundance spilling over into creation, in the form of a table very much like the one we all plan to sit down to this Thursday, threatens to squeeze us into oblivion. Islam and Judaism defuse this threat by inscribing lack into nourishment, placing God at a safe remove under the guise of upholding His/Her law, like parents stealing off to work each day purportedly to put food on the table for their children but really just to get some time away from them.
Christianity, alone among the Abrahamic faiths, makes do without comprehensive dietary restrictions, but for the faint echo heard in meatless Fridays for Roman Catholics. Instead, Christianity made the radical move of inscribing lack on our very persons, in the form of Original Sin. It is perhaps no surprise that the cure for Original Sin, God the Son, made only a brief appearance, exiting the scene before His abundance Blob had time to grow into an existential threat, leaving behind the third member of the Holy Trinity, the Holy Ghost. Ghosts, of course, are pure lack, meaning that Christianity has made all the necessary arrangements for an ongoing surfeit of inner (Original Sin) and outer (Holy Ghost) lack, at least until the Second Coming.
Through its three major religions, western civilization had eased the tensions stemming from its missing foundation, the ground of all being that was, in fact, pure groundlessness. But after several centuries along came capitalism, overturning the “straight and narrow path” of lack’s prohibitions and replacing it with “everything in moderation,” which, if the last 250 years has taught us anything, can only ever lead to everything in excess. Where the three Abrahamic faiths have sanctified lack, capitalism has duped us into thinking we can become our own Blobs, consuming lack out of existence, even as in doing so we feed lack until it has grown into the existential threat that God once was.
People appear to be choosing one of three paths forward:
1. Continue trying to consume lack out of existence until it consumes us,
2. Return to the traditional straight and narrow path of one’s preferred Abrahamic faith, or
3. Like me, cede Christianity (or the Abrahamic faith you happened to be born into) to the Evangelical right (or its Jewish or Muslim equivalent), replacing it with a progressive secularism tinged with individualized spirituality, an arrangement that inevitably proves as unsatisfying as being “friends with benefits.” Then, unwilling to engage in option 1 or 2, reinscribe lack into one’s individualized creed by becoming either vegan or gluten free, and/or socialist, which is just spending most of your time thinking about how much people are lacking.
Personally, I recommend gluten free socialism, although my wife is covering all of her bases by trying to go gluten free (option 3) and become an observant Jew (option 2) at the same time. One thing she doesn’t lack is chutzpah.
Thursday, November 21, 2013
Capitalism with Asian Values, American Style
Earlier this week The Washington Post reported that the Dow Jones Industrial Average was approaching an inflation-adjusted all-time high on the same day that NPR’s All Things Considered reported that national approval ratings of our elected government officials in Washington, across both major parties, were at an all-time low. This juxtaposition was striking in that bull markets have for decades been reliable fuel for the signature American optimism. A traditional African saying holds that if you want to know whether or not times are good simply ask “How are the children?”; in America we have more often asked “How’s the economy?”, and we have turned to the numbers on Wall Street to get much, if not all, of our answer.
But the times, as reflected by our general disgust with the folks we have sent to Washington to represent our interests, seem anything but good, even as Wall Street chugs right along. One obvious narrative is that the rich are getting richer, but that has ever been the case. What has changed is the fact that the have-nots, who have always cut the rich some serious slack based on their own caviar dreams, are fed up because not only are they still not rich, now they don’t even have a functioning democracy.
All of which means that the Asian Century is right on schedule, if we understand that the “Asian Century” is a term carefully selected because it is less threatening to the west than the “Chinese Century.” China, of course, is home to a thriving brand of capitalism that has no truck with democracy. This is politely referred to as capitalism with Asian values. Tuesday November 19th, 2013, the day the Post and NPR made their twin reports, marks the beginning of the Asian Century in America, a mere 13 years and change after the putative beginning of the 21st century, making us strangely like Orthodox Christians celebrating Christmas 13 days after December 25th.
Tuesday, November 19, 2013
Let's Talk About It
With recent revelations that the NSA’s subterfuge includes spying on allies, as in the case of German Chancellor Angela Merkel, and on ourselves, via snooping into the data centers of Google and Yahoo sans court approval, the old adage that “We have met the enemy… and he is us” has taken on new layers of meaning. More noteworthy, if not at all surprising, is the fact that in my day to day travels since this news hit I haven’t heard a peep about it from anyone, making us, as in the case of the Afghanistan and Iraq wars we collectively discussed only slightly more frequently than the Spanish-American War, our own worst enemies all over again. The few people with whom I have broken the code of silence by asking their opinion have, in so many words, asked me how I could be so daft as to express surprise at the headline “Spy agency caught spying.”
I am left wondering why almost no one is thinking about the implications of this, or why those who do stop and think write the whole thing off as spies being spies, as if this were no different than Manny being Manny or any other version of boys being boys. It may, perhaps, have something to do with the rate at which we broadcast the minutiae of our lives on social media; what can the NSA uncover that I haven’t already posted on Facebook? But it is more deeply rooted in the fact that the vast majority of us wake up in the morning and get the kids off to school, then go to work to make a living so that when the kids get home from school there is food on the table. Even those of us with political leanings towards the outer limits of the bell curve have difficulty imagining that the government could unearth anything more damning than a pattern of checking out books at the library which call into question the current trajectory. My Tea Partying next door neighbor may be checking out the books accusing Obama of socialism while I check out the books wishing he were, but we both have bills to pay. And in a country where credit is as omnipresent as death and taxes, what could be more American than that?
But it is my experience of the credit industry that gives me pause. My wife and I were recently alerted by our credit card company that someone had gotten hold of our credit card numbers and used them to make on-line purchases. The credit card company contacted us because they knew, quite accurately, that my wife and I would never in a million years have made the purchase in question. This is, quite simply, the practice of profiling. And the credit card companies are batting a thousand in their profiling of me and my true love. They have signed off on every single purchase but one in the ten years we have shared an account, even during the years when Jen and I were vying to see whether she could trump my accumulation of tennis racquets (always bought in pairs) with her collection of baby wearing wraps. Maybe the fact that I purchase the same twelve items at Trader Joe’s every single weekend (the gluten free frozen pancakes make up for what they lack in texture with an accurate flavor reminiscent of the idea at the core of the I Can’t Believe It’s Not Butter brand identity) makes me an easy mark for the credit card company algorithms, but one hundred percent is one hundred percent. Socrates’ advice to know thyself rings a little hollow when MasterCard already knows me better than I ever could.
Now if my credit card company has this much of a bead on me, what might the NSA have gleaned from my internet footprints? I am reminded of the Tom Cruise vehicle Minority Report, based on the near future sci-fi story by Philip K. Dick, in which police apprehend criminals prior to the committing of crimes, based on the input of psychics. It is all too easy to imagine a near future in which the NSA, relying, in lieu of psychics, on the pattern seeking software it surely already uses on your Google account, begins making accusations of threat prior to crimes, the precedent for which already exists in the form of the preemptive strikes taken in the aforementioned Iraq and Afghanistan, and in the everyday experience of racial profiling endured by Black men everywhere. Now imagine taking the stand to defend yourself against a District Attorney who, like my credit card company, is never wrong. In the words of Maryland’s own Stephen L. Miles, criminal defense attorney extraordinaire, “Let’s talk about it,” lest Miles and his ilk be rendered permanently extraneous (making “Save the lawyers” the new “Save the whales”).
I am left wondering why almost no one is thinking about the implications of this, or, if they do stop and think, write the whole thing off as spies being spies, as if this were no different than Manny being Manny or any other version of boys being boys. It may, perhaps, have something to do with the rate at which we broadcast the minutiae of our lives on social media; what can the NSA uncover that I haven’t already posted on Facebook? But it is more deeply rooted in the fact that the vast majority of us wake up in the morning and get the kids off to school, then go to work to make a living so that when the kids get home from school there is food on the table. Even those of us with political leanings towards the outer limits of the bell curve have difficulty imagining that the government could unearth anything more damning than a pattern of checking out books at the library which call into question the current trajectory. My Tea Partying next door neighbor may be checking out the books accusing Obama of socialism while I check out the books wishing he were, but we both have bills to pay. And in a country where credit is as omnipresent as death and taxes, what could be more American than that?
But it is my experience of the credit industry that gives me pause. My wife and I were recently alerted by our credit card company that someone had gotten hold of our credit card numbers and used them to make on-line purchases. The credit card company contacted us because they knew, quite accurately, that my wife and I would never in a million years have made the purchase in question. This is, quite simply, the practice of profiling. And the credit card companies are batting a thousand in their profiling of me and my true love. They have signed off on every single purchase but one in the ten years we have shared an account, even during the years when Jen and I were vying to see whether she could trump my accumulation of tennis racquets (always bought in pairs) with her collection of baby wearing wraps. Maybe the fact that I purchase the same twelve items at Trader Joe’s every single weekend (the gluten free frozen pancakes make up for what they lack in texture with an accurate flavor reminiscent of the idea at the core of the I Can’t Believe It’s Not Butter brand identity) makes me an easy mark for the credit card company algorithms, but one hundred percent is one hundred percent. Socrates’ advice to know thyself rings a little hollow when MasterCard already knows me better than I ever could.
Now if my credit card company has this much of a bead on me, what might the NSA have gleaned from my internet footprints? I am reminded of the Tom Cruise vehicle Minority Report, based on the near-future sci-fi story by Philip K. Dick, in which police, acting on the input of psychics, apprehend criminals before any crime is committed. It is all too easy to imagine a near future in which the NSA, relying, in lieu of psychics, on the pattern-seeking software it surely already uses on your Google account, begins making accusations of threat prior to crimes, the precedent for which already exists in the form of the preemptive strikes taken in the aforementioned Iraq and Afghanistan, and in the everyday experience of racial profiling endured by Black men everywhere. Now imagine taking the stand to defend yourself against a District Attorney who, like my credit card company, is never wrong. In the words of Maryland’s own Stephen L. Miles, criminal defense attorney extraordinaire, “Let’s talk about it,” lest Miles and his ilk be rendered permanently extraneous (making “Save the lawyers” the new “Save the whales”).
Tuesday, November 12, 2013
When the Forest Goes All Incognito
As I write this, the Jonathan Martin/Richie Incognito imbroglio is building to a crescendo as the Miami Dolphins, the organization whose workplace is either, in the words of a third Dolphins offensive lineman, Mike Pouncey, home to a “band of brothers,” or, conversely, as toxic as downtown Chernobyl, or, possibly, both simultaneously, take center stage on the national broadcast of Monday Night Football. Among the Martin/Incognito story’s many facets, which have been discussed ad nauseam on every sports radio broadcast I have tuned into during my daily commute for the last week, the element I find most revealing is that when Martin’s agent complained to Dolphins General Manager Jeff Ireland about Incognito, Ireland’s proposed solution was for Martin to punch Incognito in the face.
Ireland’s response tells us exactly how the Dolphins “family,” building on Pouncey’s idea of a “band of brothers,” functions. Ireland, from his position in senior management, is the father who greets his son’s report that the neighborhood bully just bloodied his nose by telling his son to get his butt back outside and not to even think about coming home until he has settled his score with the bully. The countless fathers who have used this approach always believe that they are doing what’s best for their sons, who, for all of the obvious reasons, must learn how to handle themselves. Just so, Ireland thought he was doing Martin a favor when, via his exchange with the agent, he essentially sent Martin back out the front door with his bloody nose.
What is so odd and, ultimately, galling about this scenario is that an organization worth at least half a billion dollars by the most conservative assessments, an organization that pays its players tens of millions of dollars, and its coaches and management millions as well, would subscribe to an ethos that is the product of endemic poverty. The cycle of violence in which a father sends his son out to the streets to sink or swim, desperately hoping that his son will prove just violent enough to swim (too little and too much violence both placing one at risk of drowning), is rooted in streets from which there is literally no way out. Each son thrown to the wolves by his father (for those lucky enough to have a father to do the throwing) serves as one more example of systemic economic injustice playing out one violent episode at a time.
The Miami Dolphins are rolling in money while mimicking the desperate and violent practices of those who have little or no choice, and who have been deprived of that choice through the violence perpetrated by a system set up to benefit the Miami Dolphins of the world at everyone else’s expense. It is the ultimate form of hubris and one we don’t even talk about as we natter on about whether Incognito is a racist bully or if Martin is a mentally unstable wuss. It is exactly where we are at the tail end of 2013, when we have never been less capable of seeing the forest for the trees.
Monday, November 11, 2013
Shut Down Nostalgia
In the condensed timescape of the twenty-four-hour news cycle, it is now officially appropriate to begin feeling nostalgia for the October government shutdown. We have, it seems, reached the point where the only storyline that captures our collective attention is one of existential threat, which, with the looming specter of federal government debt default, the shutdown provided in spades. For just a moment, before House Speaker John Boehner let it slip that he would not, in fact, allow us to renege on our debt and launch a global economic meltdown on his watch, it was beginning to feel like late 1991 in the Soviet Union. For those of us not out a paycheck, this moment was, among many other things, spectacularly entertaining. It was like watching an overtime NFL playoff game, except that instead of the possibility of the end of the road for Ravens Nation (until next season) we were actually witnessing the possible final act for our really existing nation state, without the “just wait ‘til next year” safety net. This made for some real Must See TV, an archaic phrase popularized by the National Broadcasting Company’s dominant Thursday night comedy lineup in the 1990s, an institution ultimately done in by reality TV. The government shutdown is, of course, nothing but the ne plus ultra form of reality television.
To get a sense of just how entertaining the government shutdown was, one need only tune in to the news media’s current sky-is-falling account of the difficulties in implementing the Affordable Care Act. It turns out that “Hey look, Obamacare isn’t working” isn’t anywhere near as compelling as “Hey look, our way of life hangs in the balance,” although many on the right and many in the media would seek to conflate the two, both judging, perhaps correctly, that doing so is good for their bottom line. But, as I heard Tony Kornheiser say on his radio show the other day in response to Ted Cruz and Sarah Palin’s grandstanding about the World War II Memorial’s shutdown-enforced closure, the American people aren’t stupid. As with pornography, we know looming catastrophe when we see it. And, not unlike the eight billion dollars’ worth of porn we consume annually, when we see it we can’t stop watching. (And if that isn’t testament to Freud’s pairing of Eros and Thanatos, the libido and the death instinct, then perhaps nothing is.)
It would be nice to close by saying “Wake me up when Obamacare is as taken for granted as Social Security.” But that presumes a future, one with both a functioning democracy and something resembling a modest social safety net, that was, if I am being optimistic, placed in jeopardy by the government shutdown, or, in my more pessimistic moods, was actually foreclosed by the halfway point of Ronald Reagan’s first term in office. If the latter, then our inability to stop watching the government shutdown was already a form of nostalgia for something long since lost. But since this perspective is simultaneously maudlin, defeatist, and realistic, I choose optimism. In the wake of the government shutdown, and with the frontal assault on the Affordable Care Act in full swing, it may be the peak of naivete to give thanks and exclaim “We just dodged a bullet,” but I prefer the term chutzpah. The only way forward for the left is to have more chutzpah than the right and their corporate media acolytes, who have effectively pronounced Obamacare dead on arrival, to which I can but say “Obamacare is dead. Long live Obamacare!”
Tuesday, January 08, 2013
The Code
The verdict in the court of public opinion is in: Mike Shanahan was a fool for allowing his franchise quarterback to play on an increasingly gimpy knee in the Redskins’ first-round playoff loss to Seattle on Sunday. His risk was rewarded with the nauseating sight of RGIII lying prone on the grass, his immediate, and perhaps long-term, future in doubt. Pundits everywhere have characterized Shanahan’s failure to protect the most valuable asset the long-suffering Redskins have had in at least twenty years as an epic fail.
The public and the pundits are doing the automatic Monday morning quarterback thing we all do so well, pretending that the actors involved in the moment as the shit went down were operating out of the same moral universe that these critics awakened to on Monday morning. But the world of professional sports, especially that of professional football, has its own set of moral imperatives. First and foremost among these is the requirement to man up, at all times and in all places. The code of manhood that rules professional sports is perhaps best exemplified by the fact that in 2013, when gay marriage is increasingly available at the nearest courthouse, and when gays are breaking down barriers left and right, including most recently that uber-exclusive club the United States Senate, we have yet to see an openly gay athlete compete in any of the four major professional men’s sports leagues in North America (NFL, NBA, MLB, and NHL).
Two recent examples will suffice as exhibits of the manhood code that rules pro football. In the Dallas-Washington game just one week prior to Griffin and Shanahan’s partnership in crime (just who enabled whom here remains unclear), superstar Dallas pass rusher DeMarcus Ware played with a chronically separating shoulder essentially duct-taped in place. Hours prior to the game, Ware tweeted that “Pain is temporary; quitting is forever.” And in the NFL playoffs just one year ago, as astutely pointed out by former pro hoopster Etan Thomas in The Washington Post, Chicago Bears quarterback Jay Cutler was universally excoriated as a quitter for coming out of the game with the same basic injury as Griffin, a balky knee. The requirement to man up in the playoffs is trebled, at least. One either complies with the code, like Ware risking the future functioning of his arm, or, like Cutler, one is labeled a quitter and, much worse, less than a man (it goes without saying that being less than a man means being a woman, so that the word quitter is code for girl).
It is only within the confines of this code that Shanahan’s outrageous gamble, which appeared to be the thoughtless risk of at least a decade of future winning seasons in hopes of securing one wildcard victory (reminding me of the story of the husband who showed up with his unaware wife to close on their new house having gambled away the check from the bank at the racetrack), comes into focus as a decision made with Griffin’s best interest in mind (given the strictures of the code). Baseball managers, as in the case of the Washington Nationals’ shelving of Stephen Strasburg this summer to protect his surgically repaired throwing arm, can get away with rational precautionary measures. But baseball is a pastime, whereas professional football is a proxy for war. Had Shanahan sent Griffin to the bench at any point that Griffin could still limp onto the field, he would have sent the message to Griffin, the team, and the millions watching on TV, that he didn’t think Griffin was man enough to tough it out, play through the pain, etc., pairing Griffin forever with Cutler (since “quitting on his team,” Cutler has become persona non grata to the point that, e.g., ESPN The Magazine, wishing to make a case for why two-time Super Bowl winner Eli Manning isn’t actually that great, simply compared his stats to Cutler’s on the magazine cover, in a move remarkably like Republicans’ efforts to smear Obama by comparing him to Jimmy Carter). Sending Griffin to the bench would have been to effectively emasculate him as the team’s field general. It would have been like Lee sending Stonewall Jackson to the rear guard with a flesh wound. Seen through the lens of the code, Shanahan’s decision to keep Griffin in the game ultimately came down to this: he risked having Griffin cut off at the knees rather than personally cutting off his balls.