Thursday, December 30, 2010

Keep Your Daughters Out of Ballet

Darren Aronofsky and I have a troubled relationship. I think his films are brilliantly constructed, but so viscerally disturbing that I can never watch them more than once. At the same time, there’s a ridiculousness to them that I find funny in those few moments of relief from bodily terror. The Wall Street and Kabbalistic conspirators who hunt the protagonist in Pi are so amateurish that they become fools. If Ellen Burstyn’s character in Requiem for a Dream weren’t so pathetic, her hallucinations of a jumping and belching fridge would set me cackling. If the endings of these films weren’t so unsettling, they’d be black comedies. Maybe that’s the best way to characterize the general tenor of Aronofsky’s work (The Fountain aside): black comedies for sociopaths.

Dennis Lim at Slate examines the weird sense of humour in Black Swan, which I saw last week and thought was pretty amusing in that ridiculous way. I wasn’t really sure what to make of Natalie Portman’s performance, though. I have a good friend who has gone through the gauntlet of professional ballet, and from what she’s told me, Portman’s psychological breakdown is horrifyingly accurate to the mental state of the average professional ballet dancer.

These are people treated more as machines than humans, driven into anorexia and hideous personal collapse. Nina in Black Swan is a perfect illustration of the double bind of ballet dancers. She’s pressured to stay grotesquely thin, while her work requires tense athleticism. She’s molded into a concept of femininity as innocent childish asexuality, then sexually manipulated by her director and the demands of a role that she has no concept of how to portray.

Her hallucinations as she loses her grip on reality are multifaceted and fascinating, and constitute the character just as much as the actual performance and dialogue. The self-mutilation is a typical Aronofsky stomach-churner, and the autonomous mirror images are another of his signature techniques to unsettle you mentally. Some of her swan transformations are actively hilarious, and the final black swan dance sequence is genuinely beautiful, a triumph of the character, which, because this is an Aronofsky film, doesn’t last.

What Lim describes as the biggest aesthetic puzzle of the film is where it lands in the matrix of camp. Everyone in this film is an over-the-top caricature except Natalie Portman. Mila Kunis plays the less-talented, oversexed party girl. Barbara Hershey is the overbearing, self-obsessed, hyper-possessive mother. Vincent Cassel is a walking cliché of a genius greaseball director. It helps that he’s French. Winona Ryder’s bitter, forcibly retired diva is the most obvious throwback to Showgirls, which I am increasingly convinced is slowly becoming one of the most influential films of the end of the twentieth century.

Now look at Natalie Portman in the middle of all this. Her character has no sense of humour at all: every situation she’s in is heightened to an incredible emotional intensity. Everyone else in the film understands, to some degree, the ridiculousness of the world of professional ballet, the most campy, over-the-top, laughably zany of all the traditionally high arts. Everyone can see at least some of this world’s silliness. Except Nina.

She takes all this with an immense seriousness, giving weighty significance to everything that happens to her. Her mother, sitting in her studio of mediocre self-portraits, has some sense of what a cartoon she is: that’s what she paints. Her director talks the pretentious talk when he’s wooing investors, but he knows it’s all a matter of kissing ass. Her rival just loves looking good on stage and getting laid. But for Nina, Swan Lake is the culmination of her existence. The tragic dimension of the movie is seeing that such a serious, perfectionist attitude ultimately gains you nothing. If there’s a lesson to be learned here, it’s that if you’re considering enrolling your four-year-old daughter in ballet, you should watch this movie first, and make sure she understands what kind of thinking will turn her into Nina.

The next day, I ended up seeing Tron, which was awesome, and 3D, and had Jeff Bridges in it.

Thursday, December 16, 2010

Crippled By Moral Sensitivity

A very funny moment happened during my first public reading of my short fiction. A friend was outside stumping for me, trying to get passersby along the Hamilton Art Crawl where I was performing to come inside and listen to me. One person asked my friend who I was, and she said that I was a PhD student in philosophy. This person then walked away faster.

I can understand that reaction. Academics and literature rarely go well together. It’s a very strange development to watch university MFA programs become the thriving new home for American short fiction. But those programs are actual creative writing programs, there to teach people how to write literature itself. They’re more like trade schools than academic institutions. And the MFA creative writing programs I’ve visited myself are free of a lot of the pretension and elitist attitudes of high-level academic institutions. Academics are often taught to keep their language dry, free from controversy, easily understandable, and unchallenging, and to stay away from ambition or broad scopes of meaning. I’ve never gotten along well with these academics, and I’ve worked best with philosophers who are just as hostile and apathetic toward the boring aspects of academic writing as I am.

But now that I have enough stories for a full set list myself, I’ve actually looked at my completed works so far and noticed an interesting trend. Half (or more, depending on whether you include writing about students and not just academic professionals) of my completed stories so far are about philosophers. Perhaps I’ve internalized the stereotypical adage of ‘Write what you know,’ because I’ve definitely gotten to know university and academic culture pretty well over the last few years. However, I think there’s a larger point that has snuck into my thoughts, which has to do with what kind of stories and what kind of characters I find interesting.

I’m most intrigued, as a writer, by hypocrisy. I’m not against hypocrisy per se; I never explicitly denounce hypocrisy in any of my fiction works – neither the stories nor my novel. I’m a hypocrite myself. But I find that hypocrisy and inconsistency of character make for the most intriguing literature. I’ve never been all that interested in literature about characters who have no internal conflicts and just deal with problems that arise around them. I’m not into plots. I don’t like narratives structured around things happening. I’m far more fascinated by narratives that reveal strange, multifaceted characters. Inconsistency in the beliefs and desires that are most important to your character makes for an amazing literary exploration.

I think this is the more profound reason why my ideas for stories keep coming back to philosophers. We’re the so-called lovers of wisdom. It’s in the etymology of the name of the fucking profession. A wise person is supposed to be a person without serious internal conflict, a person without hypocrisy. We call people wise who can guide others out of personal conflict and into more harmonious lives. And philosophers study ethics itself, so we often hold our own ethical beliefs to a higher standard than those of people outside the profession.

The ethical and personal obligations of a philosopher for consistency in living and freedom from internal conflict are at their highest intensity. A philosopher who becomes aware of his or her own hypocrisy or inconsistency of character is going to have the most intense conflict, because, of all professions, philosophers have the sharpest skills for analyzing these concepts, and so understand their own internal conflicts more deeply than those who haven’t been trained to be as articulate with ethical concepts. “Mobilization of the Oppressed,” which I just submitted to the University of Toronto’s Echolocation fiction journal this afternoon, explores the disconnects from reality that can happen when you firmly believe that knowledge makes one moral. “My Perfect Lover” explores the hypocrisy of a man whose desires and emotions lead him to use his skills of reasoning and argument to defend a regime of slavery that he knows to be wrong. I have an untitled story in draft form about a professor whose drive to discover through philosophical argument the nature of a perfectly benevolent God turns him into a bitter old man incapable of love.

I thought of another idea today about a philosopher, the idea that made me realize that my profession was such a frequent subject for my fiction. This philosopher is deeply committed to his utilitarian ethical beliefs and arguments: that the rich should give almost all they can to alleviate poverty, that the North is morally obligated to bankrupt itself to feed the South. But as he comes to this ethical stance, he realizes that the institution of the university is incorrigibly inegalitarian: according to his deeply held ethical beliefs, he shouldn’t hold a position that trains the upper-class elites of affluent North Americans, paid from profits gained by forcing thousands of young people each year into crushing student loan debt. But by the time he figures this out, he has his own family to support: children to feed and put through school. By his own philosophical beliefs, he should sacrifice the well-being of his three children to alleviate the pain of suffering millions. But when he goes home to see his own kids, he can’t. So he goes back to a job he hates every day.

Perhaps one day, I’ll publish a collection of stories about philosophers and their conflicts and hypocrisies. I might call it Thinkers. Perhaps it will be valuable.

Sunday, December 12, 2010

Arrogance Is Philosophy’s Most Widespread Paradox

Over the past couple of years, I’ve been building myself a tidy little transdisciplinary specialty that I like to call Critical Theory of Knowledge. The essays I’ve presented and published through the Book Conferences in 2010 in Switzerland and 2009 in Edinburgh are my major public efforts in this so far. But a kindly old professor once told me that it’s always good for a philosopher’s career if you can put something to do with knowledge and/or epistemology somewhere prominent on your CV. This suits me pretty well.

My main thrust is, at heart, to take knowledge and rationality off their high horse, without falling into the traps of post-modernism that would keep me from being read by people who still venerate rationality. Preaching to the choir might be an easy way to sell books, but I never took the easy way out. In my two published essays, I examine how peer review works in academic journals, and how attitudes of arrogance on the part of professors, editors, and article reviewers can stifle creative, unorthodox ideas and render a field of study moribund and stagnant. My critique goes something like this: if someone has worked hard enough to become widely recognized as an expert in their field, then they tend to take their own ideas as gospel. They’re the expert, after all, so their perspective on their field is the same as the truth. When someone disputes that perspective, the first response of a typical expert, working under this premise, is that the disputer is wrong. I wrote about this last week, so you can just scroll down to 3 December for a more detailed treatment of the argument.

Attitude matters far more to disseminating and taking seriously novel, creative approaches to a field than most people realize. With this focus on attitude in mind, I remembered a curious commonality in my academic life. Keep in mind that this is entirely anecdotal, but what’s most important to take away from this story is not a certain truth, but an intriguing idea, a particular point of view, a conceptual nudge to the ribs.

Some of the most arrogant, curmudgeonly professors that I’ve ever met, the quickest and most vicious attackers of ideas that didn’t fit with their own established conclusions, were all devotees of Benedict de Spinoza.

A confession: until this past week, I had never read a book by Spinoza in its entirety. I was emphatically turned off Spinoza’s philosophy by the egotistical and pretentious way it was presented to me in a class I took when I was 19. Spinoza wasn’t even on the class curriculum, but the professor would go on and on about “the divine beauty of Spinoza” in a way that communicated none of the important ideas, and just delayed us from covering the actual course material. With my current doctoral project using many ideas from the ontology of Gilles Deleuze, Spinoza has turned out to be important background reading. Deleuze’s big book on Spinoza, Expressionism, was the first presentation of his philosophy that made me feel good about it. This week, I’ve barrelled through Spinoza’s Ethics. And I’ve found something very intriguing.

Spinoza’s book Ethics is a philosophical guide to living. It’s written as a series of geometry-style proofs about the nature of God, existence, thought, emotion, and reason, whose ultimate goal is to indicate how one can live through the guidance of reason, and so live a life of joy and exultation in existence itself. Pretentious? Maybe more than a little. Uplifting? Inspiring? Definitely! How could such a book, written with such sincerity by such a generous, magnanimous, and admirable personality, inspire such arrogance among some of its devotees? The picture assembled itself slowly, but I’m convinced that I’ve worked out how it happens.

Spinoza has little time for people who live according to their emotions alone. They’re passive before the fluctuating situations of life, living as slaves to forces beyond their control. Part four of five, on how emotions can wreck someone’s thinking and personality, is actually titled “Of Human Bondage.” And he’s a master of the subtle burn. Reading the Ethics, I find myself laughing at a book laid out like a mathematical proof, because of the cunning ways he inserts light-hearted jabs about people who let their emotions carry them away, or who generally don’t think and live “guided by reason.”

And then it hit me. It was a sudden realization, which one should generally mistrust, but as I thought about it, the idea just made so much sense. In part four (Of Human Bondage), proposition 73, in the elaborating paragraph labelled the Scholium, Spinoza describes the strong person as one who lives guided by reason, a person who “hates nobody, is angry with nobody, envies nobody, is indignant with nobody, despises nobody, and is in no way prone to pride.” Yet when my Spinozist professors spoke to any students, colleagues, or even higher-ranked professors who expressed an idea hesitantly, or without full detail, or fuzzily, or even just experimentally, the self-declared Spinozist would respond with anger, indignation, and spite. Anyone who articulated an idea with any less than the perfect precision with which Spinoza himself wrote and argued was dismissed and insulted with great condescension and arrogance.

But sometimes, an idea needs to be given a chance to percolate in one’s thoughts, to drift around conversations, displaying roughness, but also promise. A lack of clarity may obscure a bounty of potential. These self-declared Spinozists of my anecdotes attack and dismiss an idea for lacking that perfect clarity of expression that it may not yet have had time to achieve. Spinoza’s burns and jokes are written with no cruelty, but a pleasant wit. His barbs come with the extended hand of friendship, never the spitefully dismissive spirit that I have heard from the self-declared Spinozists who ruthlessly attack all ideas in progress, unfinished, incomplete. But the same words Spinoza wrote, when delivered with a tone of anger, are words of hatred, rage, and dismissal.

In Switzerland, I spoke about humility as the most difficult, but most important task of a thinker. Humility is the ability to wonder sometimes, whether you are on the right track: The expert must sometimes question his own expertise to avoid destroying the vibrancy of the field to which he’s committed his life. Sometimes, if you dedicate yourself to Spinoza, patron saint of a life lived guided by reason, you can say to yourself, “I’m a follower of Spinoza, so I must be guided by reason. If I’m guided by reason, I must be right, and it’s my duty to stop those who are still in bondage to their lesser instincts, who are not yet guided by reason as I am!”

I hope you see the parallel structure of that thought, and my thought at the start of this post. The first sign that you’re no longer guided by reason is that you no longer think it’s required that you check to see if you’re guided by reason. Spinoza wrote that he who is guided by reason lives free from error, strife, or mistake. But the first and easiest mistake to make is to believe yourself incapable of mistake. That mistake is called pride.

Wednesday, December 8, 2010

70, 30, 40; 44, 6, 38

John, et al, "Instant Karma," 1972.

I thought of two ways to understand stars today. One is to look at how little of the sun's energy is actually absorbed by the Earth, and how much is wasted, radiating into space, never used by any intelligent creatures. It can feel like an astronomical waste, an entire star burning away to nothingness for no reason. Or you can think about an enormous body that creates a fire we only became capable of imagining a few decades ago, a gigantic ball of gas that lives, pulsating energy for billions of years. It's the difference between burning and shining.

Dimebag, et al, "Revolution Is My Name," 2000.

It's easy to be overshadowed, even though Dimebag was shot by as conscious a Mark Chapman ripoff as you could imagine. History creates some strange patterns, the shapes of which are amazingly difficult to figure out. No one could work out satisfying reasons for these killings even if they had infinite time to live.

Friday, December 3, 2010

Publication Diaries: The Problem with Subtleties

So I just sent in the publishing contract for my second essay to come out in the International Journal of the Book, “The Danger of Institutional Conservatism in the Humanities.” It will be available in the 2011 edition of the journal, and I’m quite proud of it. I’m not sure I’d say it’s the best work I’ve done so far, but it’s definitely the most experimental piece I’ve had accepted by an academic journal.

As I learn more about the peer review process, especially its problems and difficulties (for details, see my article in the Book Journal last year), I think interdisciplinary journals are best suited for a lot of my philosophical writing. I’ve come to this conclusion for reasons that will sound very self-serving if you interpret me maliciously, but quite insightful if you interpret me charitably. I've noticed in academic culture that the more specialized one’s knowledge is, the more zealously one tends to guard one’s perspectives from critique. In learning more and more about an increasingly specific subject matter, one comes to acknowledge one’s own expertise: at a particular point, different for everyone, one begins to presume that one’s own perspective on the subject matter is the right perspective. “I am the expert,” says the expert, “so my own knowledge is the standard of my field. If it wasn’t the standard, then I wouldn’t be an expert.” These people are very often submission reviewers for the academic journals in their specialty.

This attitude creates a potentially terrible problem for creative thinkers, especially people who are younger and/or less experienced, still trying to establish themselves in their field. Such a young person, a new entrant, may have ideas that differ from those of the established experts. Being newer to the field, they don’t yet have the experience or prestige that a long career in a specialized field offers. But they may also have innovative new ideas and approaches, which may not be compatible with those of the experts. And if the established expert has come to identify their own way of thinking as the only way of thinking, then that new writer will be rejected. The expert will hold them to be wrong, when the new writer may simply disagree or take a different approach. The expert will reject their work, preventing an innovative approach from being disseminated.

At this point, I think it should be clear that the person I’ve been calling a specialized expert is better titled an academic curmudgeon.

I think this attitude becomes more prevalent, or at least more likely to be encountered, in highly specialized academic environments. This is, right now, just a matter of anecdotal evidence, but the anecdotal evidence is beginning to stack up. What this has to do with my mutually beneficial relationship with interdisciplinary journals is that one is less likely to encounter this attitude in a less specialized academic environment. So my own strange ideas and approaches are more likely to be given a chance than they would be in a highly specialized journal with a greater probability of curmudgeonliness.

My forthcoming essay is more experimental in form than any essay I’ve yet put in the public view. One of my former professors who read it described it as uncategorizable into any typical genre or division of philosophy. I took this to be a compliment. He also called it cranky decades beyond my years, which I considered a backhanded compliment. When I presented it at the Book Conference in Switzerland last month, it was received with gaping mouths, and it took a while for the ideas to sink in for the audience. It’s a very dense essay for 4,000 words, with some subtleties in its tone and language that may go unnoticed.

The essay is a continuation of my critique of how academic knowledge is generated, and contains potential solutions to the ways in which a field of knowledge can become moribund, uncreative, and boring. Key to the solution, which I note – there and here – is much easier to talk about than actually to achieve, is an attitude of humility. One of my reviewers had no critiques of the content of my essay, but often told me to remove what s/he called ‘self-referencing,’ sentences starting with ‘I.’ I will admit that I didn’t follow this direction in every case, because I didn’t want to give the essay a tone of pure objectivity and distance, which is one of the signs of the arrogance of the expert. Where I describe the attitude of humility, the reviewer annotated that I should re-write my introductory sentences to display more of that attitude. It was cheeky, and I laughed, but s/he also didn’t understand the subtle point I was trying to make with my cranky tone.

The most difficult part about inculcating an attitude of humility into academic professionals is that our personalities, and academic society generally, are shaped to make it immensely difficult to have actual humble attitudes. We’re rewarded for being distinctively smarter than our colleagues, and especially the general public. There’s a casual disdain for undergraduates and ordinary students in academic culture that I never really noticed in universities until I was no longer one of those ordinary students. And I’m still uncomfortable with bragging in a non-professional context. It’s difficult for me to accept compliments about my work in philosophy and literature, because of the conflicts it gives me: I want to be a humble, easily-relatable person, but I also want to produce remarkable, superior, inspiring writing.

I tell my friends in the philosophy department how many different and intriguing ideas I have in the course of a week, and I feel awkward when they tell me they don’t have nearly so active a brain. If there’s one thing I don’t want to become, it’s an insufferable genius, even though I can see myself eventually heading for near complete Rain Man territory as I get older. Academics are not humble people, and our increasingly exclusive social circles of other graduate students and eventually other academics and highly educated professionals only encourage that attitude of superiority to everyday people.

So I wrote my essay about encouraging humility in a very superior, bordering on arrogant, tone. It’s an illustration, in the tone of the writing itself, of how genuinely difficult the task of humility is. It’s written by an arrogant man who knows, despite his own instincts, that his goal of encouraging innovation and works of brilliance (among which he counts much of his own work) will only be achieved by inculcating widespread attitudes of humility. The paradox unfolds along many different levels of articulation.

Brilliant, isn’t it?
In other news, the new Kanye album is absolutely fantastic, and I don’t use the word absolute in a positive sense very often. It’s a very appropriate clip to end a post that talks about the importance of humility.