Tuesday, September 29, 2009

Alain Badiou on the State of Capitalism

From "Is The Word 'Communism' Forever Doomed?" (via):
Just like maybe after 1840, we are now confronted with absolutely cynical capitalism, more and more inspired by the ideas that only work backwards: poor are justly poor, the Africans are underdeveloped, and that the future with no discernable limit belongs to the civilized bourgeoisie of the Western world. All kinds of phenomena from the 19th century reappear, extraordinarily extended forms of misery within these countries themselves. Forever growing in inequality, the radical cut between the people of the working classes, of the uninformed, and the middle class, the complete dissolution of political power in the service of property and capitalist profits. Several years of ratiocination, disorganization of revolutionaries, and the nihilist despair of large portions of the youth, the servility of the large majority of them, and the experience of the base obsequiousness of formal groups in the quest of the contemporary means to establish, re-establish, find new definitions for the Communist hypothesis.

All these characteristics are very close to the political situation which was dominant in Europe in the middle of the 19th century. Which is why the apparent victory of capitalism, occasion to the second sequence of the Communist hypothesis [1917 to 1976], had been, in fact, a very strong reaction, a very strong return to something very old. The politicization of contemporary capitalism is as you see the return to the cynical capitalism of the 19th century. And it is probably why after the 19th century the question is not for us the victory of the Communist hypothesis, but the conditions of its identity. Our problems are much more the problems of Marx than the problems of Lenin, and that was the great question of the revolutionaries of the 19th century.
Badiou is arguing that the problems confronting the "communist hypothesis" today (and it is a very broad hypothesis) are not like those that gave rise to the Bolshevik revolution and the subsequent authoritarianism of the Communist Party, but like those that led to the positing of the hypothesis in the middle of the 19th century. The problem for the Bolsheviks was victory: how to get beyond local action and local successes to the triumph of communism generally. The problem of the 1800s was how to overcome capitalism and its iniquities and create a better society. That's where we are today, Badiou argues, and the question once again begs for an answer.

I've often wondered if the crises of today will lead to action and change, or merely a reaffirming of the present institutions. Capitalism, at least in the West, has been quite resilient. When pressed, it bends, but only in order to curve around and tackle the very forces pressing it. And what presses it most, ultimately, is itself. So it often seems as though capitalism will be the death of itself, only to rise again even more dominant, even more cynical, even more controlling, even more reactive. And what would replace it? I sometimes share Conrad's cynicism in Under Western Eyes. Meet the new boss; same as the old boss? I wish I had more hope. I'm not sure we even have a blueprint for radical change. "Communism" as a hypothesis, as Badiou expresses it, is too abstract. It's little more than "human flourishing" with a ring of equality. And that's barely even a hypothesis. It's more like an impression, and a vague one at that.

How I Lost My Accent

It's disappointing to see progressives acting as elitists, and particularly so when dealing with people who should be targets of progressive reform, not progressive mocking, as in this post at Pandagon, for example. Many of the comments focus on the guy's accent, with almost gratuitous references to inbreeding, instead of on the guy's hate and ignorance.

I once had a pretty heavy southern accent. It still comes out now and then, and my short vowels are forever muddled. If I were ever in a life-or-death situation that required quickly distinguishing between a "pen" and a "pin," I would be screwed. Most of the time, though, my accent is all but completely gone, and has been since I was an undergraduate. The reason for this is pretty simple: once I left home, where pretty much everyone had an accent like mine, I got made fun of for my accent. This was especially true when I was among smarter folks, most of whom were from the Midwest. They associated my accent with ignorance and stupidity, and made this very clear to me.

The first time I realized my accent was "different" came during my second week of college. I was in an Intro to Sociology course with about 50 students, one of whom was Antoine Walker. For basketball fans, this is a big deal. I was shy, and generally didn't speak up in class, even in high school. However, when the professor asked if there was anyone in the class who thought he or she didn't have an accent, I raised my hand, thinking both that I didn't have an accent, because everyone where I'm from talked just like me, and that a bunch of other people would raise their hands too. Unfortunately, I was wrong on both counts. I was the only one who raised his hand, and when the professor then asked me where I was from, I was forced to speak. I replied simply, "Nashville," but in my dialect that word is pronounced "Nayshvull." The entire class burst into laughter at that one word. I was horrified. I was being laughed at in front of 50 peers, one of whom happened to be a basketball player of whom I was a big fan.

From that moment on, I was constantly and deeply conscious of my accent, and when people continued to make fun of it, including faculty and fellow students (particularly in the philosophy department, where being snooty is raised to an art form), I consciously chose to get rid of it. And I did.

And that's a shame, because my accent was a big part of who I am, because where I'm from was and remains a big part of who I am.

Monday, September 28, 2009

Football Quote of the Day

From a post on Mississippi State's near upset of LSU this past weekend, by John Wilkersley at Kentucky Sports Radio (emphasis mine):
On paper, MSU is the worst team in the league. Throw in the fact they're transitioning from Sly Croom's punt-focused offense to the spread, and that State's already won a conference game and came within a simple outstretch of the football from beating LSU, is pretty damned impressive.
"Punt-focused offense." That's good stuff.

Sunday, September 27, 2009

Working With "Human-centered Metaphor"

Humans are evolved animals. In order to produce not only a deeper understanding of humans, but also a deeper understanding as humans, this must be taken as a given. What this means is that humans are situated, involved beings, who don't simply come to the world around them as tabulae rasae, directly connecting with things as they are, or even with themselves as they are. Humans are part of the world, dependent on it, tasked with navigating it to survive, and as a result, their representation of the world will be geared for navigating and surviving. We enter the world with an understanding designed for use. This representation, opaque as it is as a result of being the product of investment, is further complicated and obscured by the fact that we are not just evolved animals, but evolved social animals. This means that our representations are further situated, not only in what we naively refer to as the natural world, but also in the social world, the world of culture and civilization. As animals, this is of course of great benefit to us. Not only does it allow us to learn from those around us, but also, indirectly, from all of those who came before us and who are out there now but to whom we do not have direct access.

As a result of our evolved nature, our thought and our language are structured around what we need to know, and do, to survive in the world, in culture, in civilization, etc. Our minds are usage-based and usage-built. Whatever they don't need, they discard, and whatever they do need, they need for a reason, which is to say, there's some environmental cause for its necessity, its existence, and its persistence. There is no escaping this: no matter how hard we try to refine language, to clarify it and rigorize it, as science and positivism have tried to do, the beings who are representing, comprehending, and understanding it will still be beings with minds built for use. There is no ideal language, because there is no ideal mind, and without minds, there is no language.

The upshot of all of this is that any human endeavor, be it cooking, passing laws, building bridges, carrying out scientific investigations, or doing philosophy, is valuable only to the extent that it helps us to better navigate the world. Knowledge for knowledge's sake is pointless, because there is no such thing as knowledge for knowledge's sake. The more we try to abstract knowledge from our situatedness, and therefore our humanness, the more we remove knowledge from anything like understanding. Knowledge just becomes a mere plaything, an object of entertainment, and as such, paradoxically, the sort of knowledge most connected to our humanness.

In one sense, then, our situatedness privileges what we would generally think of as the practical knowledge of the sciences and engineering over the, let's say, more mediate knowledge of philosophy or art. What we usually think of as practical knowledge is more directly involved with navigating the world. It puts bridges over rivers, makes farms more productive, cures diseases, etc. But philosophy and the arts have their roles as well, and whether we like it or not, their knowledge is ultimately practical knowledge too. They help us to better navigate the world of concepts, the world of representations, the world of emotions and expression and symbols. They impart knowledge and experience indirectly, often, at least when compared to the sciences, but one of the consequences of being human is that all knowledge is in some ways indirect, or mediated, and some things -- at least for now -- have to be approached more indirectly, more mediately, than others.

I bring all of this up because I recently came across the following quote from this site, at this site, via this one (speaking of the indirect):
Vernacular language is simply too infused with human-centered metaphor to ever be useful in talking about how bridges are when no one is looking [my example of the task of speculative realism, which had Latour and Harman in mind, though the latter especially shows how this gets much more complex]. So while I agree that onticology is not a priori a materialistic ontology, nevertheless, mathematical-physical discourse is arguably more useful than philosophical language when it comes to discussing bridges. After all, to paraphrase Dretske, engineering discourse will actually allow you to construct a bridge whereas object-oriented "philosophy" will not. Which one then is more truly object-centered?
This in the context of a critique, or at least a criticism, of speculative realism.

To which Mike Johnduff (second link) replies:
When you say “Vernacular language is simply too infused with human-centered metaphor to ever be useful in talking about how bridges are when no one is looking,” I think a speculative realist or OOP [Object-Oriented Philosophy] person would say that this proves their point exactly: we need to change vernacular language to bring it away from its use-function, or that which actually ties it to a scientific language that is based, ultimately, on use.
And continues:
Of course the "change" involved is not primarily changing language itself, but rather changing how we see what language (among other things) is already doing.
As the preceding probably makes clear, this last bit is something with which I could not agree more. Any philosophy that takes language seriously as a problem should recognize that it's not a problem to be overcome, but simply one to be understood. What is language doing? It's doing something related to our situatedness, to our needs as determined by our situatedness. If it weren't, we wouldn't have language. As such, it will always be dominated by human-centered metaphors, and it should be, because that's what makes language valuable: its being centered around us.

Furthermore, a good philosophy will recognize that there's something to these metaphors. We understand and talk about things metaphorically, or analogically, or however you want to describe it. Good, now what? There must be a reason why particular metaphors/analogies/whatever stick. Maybe a good way of getting at objects, as the object-oriented philosophers and speculative realists seem to want to do, would be to analyze these metaphors and figure out what aspects of the world, and our being in it, they capture for us (and the for us is important: "for whom is it?" is almost always a better question than "what is it?"). What is it that thinking about X in terms of Y, or as related to Y, buys us? And what does that say about our understanding of X and Y? Or about the way in which X and Y relate to us? This is of course easier said than done, because our metaphors don't just reflect our relations to the things that comprise them, but also social and cultural relations, and which metaphors stick is in large part a result of culture. Still, if we're stuck with language as it is, and we are so stuck, then wouldn't it be smart to learn as much as we can about what that language is doing for us, and by extension, what it says about our relations with the things we use language to talk about?

Walt Whitman Commercials

Levi's is using Walt Whitman poems in a couple of its jeans commercials. The first uses an actual recording of Whitman reciting the poem "America":



The second uses "Pioneers! O Pioneers!":



This is disturbing. What's next, a Walt Whitman mall? Oh, yeah.

Memory Metaphors: Remembering as Light


As I said, I like memory metaphors. One I hear a lot, since the song is frequently on the radio (including just now):
I seem to recognize your face
Haunting, familiar, yet I can't seem to place it
Cannot find the candle of thought to light your name.
From Pearl Jam's "Elderly Woman Behind a Counter in a Small Town." I sometimes think of remembering as shining a flashlight on some memory stored somewhere in my head. A candle is a bit more poetic, I suppose.

Saturday, September 26, 2009

Ortega y Gasset on Possibility, Freedom, and Identity

From "Man as a Project":

Regarding these possibilities of being it is important to comment upon the following:

First: That they are not given to me, rather I must invent them for myself, be it in an original manner or by transmission from other men, even in the ambience of my life. I plan projects of being and doing in light of the circumstances. Circumstance is the only thing that I find which is given to me. It is too often forgotten that man is impossible without imagination, without the capacity of inventing for himself a life-figure, and of idealizing the person which he will become. Man is the novelist of himself, be he original or a plagiarist.

Second: I must choose among these possibilities. Therefore, I am free. But understand well: I am free by obligation. I am free whether I want to be or not. Liberty is not an activity exercised by a being which, apart from and before exercising it, already has a fixed being. Being free means lacking constitutive identity, not pertaining to a determined being, being able to be something other than that which one was, and not being able to adhere oneself finally and eternally on any determined being. The only thing that must be fixed and stable in the free being is constitutive instability.

Memory Metaphors: Memories as Pigeons


I am fascinated by memory and its role in our identities, so I am always on the lookout for talk of memory. And one of the things that fascinates me the most about memory is just how difficult it is to talk about directly. Most talk of memory is couched in figurative language: images, metaphors, similes, analogies. There's even a book on that subject.

I suppose the reasons that memory is so difficult to talk about are twofold. First, any talk of memory requires memory: memory for concepts, memory for the words that refer to concepts, memory for what the concepts represent, and memory of memory itself: the experience of it, examples of remembering, and so on. So the very act of talking about memory is circular and muddled. The second is that we don't seem to have much conscious access to the inner workings, if you will, of memory. It's not just that we don't get to "see" how memories are stored or accessed; even the act of triggering a memory is opaque. Why is it that seeing a particular shape or object, smelling a particular smell, or hearing a particular song elicits the particular memories it does? Nothing about the experience of remembering answers this question. Or if it does, the answer is not obvious in that experience. All of this leaves memory in a cloud of vagueness and confusion. So the best way to talk about memory, at least outside of a scholarly context (and often even in a scholarly context), is to relate it to something more vivid, something clearer.

I told you all of that to tell you this. When I come across interesting metaphors for memory, I write them down in a little notebook. Today I was reading a book that is largely about memory: Remainder, by Tom McCarthy. It's the incredibly strange story of a man who, after being struck by a mysterious (to the reader, at least) falling object, and spending some time in a coma, has some memory issues. On several occasions, the man describes his memories coming back to him as scenes, like in a soap opera (his metaphor). That's not an entirely original metaphor, as memory has been referred to as theater for centuries, and as pictures (moving or otherwise) for some time as well. But on pages 91-92, I came across a metaphor for memory that I'd never seen before, and I thought I'd share it with you. Here it is (emphasis mine):
After the accident I forgot everything. It was as though my memories were pigeons and the accident a big noise that had scared them off. They fluttered back eventually...
Memory as startled pigeons. Now that's interesting.

Nietzsche Myths

Philosophy Bites has an informative interview with Brian Leiter on myths about Nietzsche. The interview focuses on four main myths: the Ubermensch, Nietzsche's antisemitism, the Will to Power, and Nietzsche the postmodernist. Dispelling the notion that Nietzsche was an anti-Semite is obviously the most important in general, but for philosophers, lay and professional alike, clarifying the roles of the Ubermensch and Will to Power in his philosophy is very important as well.

I always thought it was fairly obvious that the Ubermensch was a tool to make a particular point in one particular book (the only book in which the concept of the overman has a significant role, Thus Spoke Zarathustra), and that, since Nietzsche himself seems to get rid of the Ubermensch in that very work, replacing him with the Eternal Return, focusing on the Ubermensch when interpreting Nietzsche was a sign of a very careless reader1.

The Will to Power myth is even easier to dispel, since, as Leiter notes in the interview, Nietzsche only mentions it a few times in his published work, and the book with that title (which casual Nietzsche readers should probably not read at all) was published, after being cobbled together and heavily edited by his anti-Semite sister, against his wishes. Also, I wish Leiter had suggested that no one, even Heidegger fans, ever read Heidegger's multi-volume work on Nietzsche. It's the best way to get bad ideas about Nietzsche, even if you've read Nietzsche extensively.

As for the postmodernist Nietzsche, Leiter does an OK job of suggesting where this myth came from. My first serious readings of Nietzsche took place in a class on Nietzsche taught by Dan Breazeale, who, it just so happens, edited and published a bunch of Nietzsche's early, unpublished writing under the title Philosophy and Truth. So as you might imagine, we talked a lot about those early writings, including the essay that Leiter blames for the postmodernist reading of Nietzsche, "On Truth and Lies in an Extra-Moral Sense" (Breazeale translates it as "Non-Moral Sense"). I like that essay, but Leiter's probably right, it hasn't helped. But I think he elides the role of one of Nietzsche's published works (Leiter highlights that the "On Truth and Lies" essay was never published), The Birth of Tragedy. Again, a careful reader of Nietzsche's will see that this early work was written and published when Nietzsche was under the spell of Schopenhauer and, more perniciously, Richard Wagner, a spell he began to cast off sometime between the Untimely Meditations and Human, All Too Human, and will therefore not assign that work too much importance in interpreting Nietzsche's entire corpus2. However, in my experience at least, the aestheticism that many readers find in The Birth of Tragedy has had a huge influence on the postmodernist reading of Nietzsche. And as it was a published work of Nietzsche's (we might also add a biased reading of Part 1 of Beyond Good and Evil) that played a big role in the postmodernization of Nietzsche, Nietzsche himself shoulders some of the blame.

1 A careful reader might also note that Zarathustra says quite explicitly that there has never been an Ubermensch, and afterward discovers the Eternal Return, which says that everything that will be has been. This pretty much excludes the possibility of the Ubermensch, doesn't it? It's undoubtedly one of the reasons Zarathustra is initially so distraught at the idea of the Eternal Return.
2 Reading the 1886 preface to The Birth of Tragedy would also help anyone to understand the role that work should play in any interpretation of Nietzsche. It's probably not unimportant that he begins that "attempt at self-criticism" with the phrase (emphasis mine), "Whatever may be at the bottom of this questionable book", and later writes of it (emphasis still mine):

But the book in which my youthful courage and suspicion found an outlet—what an impossible book had to result from a task so uncongenial to youth! Constructed from a lot of immature, overgreen personal experiences, all of them close to the limits of communication, presented in the context of art—for the problem of science cannot be recognized in the context of science—a book perhaps for artists who also have an analytic and retrospective penchant (in other words, an exceptional type of artist for whom one might have to look far and wide and really would not care to look); a book full of psychological innovations and artists' secrets, with an artists' metaphysics in the background; a youthful work full of the intrepid mood of youth, the moodiness of youth, independent, defiantly self-reliant even where it seems to bow before an authority and personal reverence; in sum, a first book, also in every bad sense of that label.

Wednesday, September 23, 2009

I AM The Superfluous Man!


This could now be a portrait of this blog's author.


When I was 16 or 17, I read Turgenev's Fathers and Sons, because I had to, being of a certain political and literary persuasion, and I loved it. I loved it so much that I went out and read every novel I could find by Turgenev, from the wonderful On the Eve to the somewhat dated Virgin Soil, but it wasn't until college that I discovered his short stories, which I immediately decided were perfect, especially First Love. But it was his story The Diary of a Superfluous Man that had the greatest impact on me. Not only did I love it, but I immediately recognized myself in it -- as did, I imagine, pretty much anyone who has spent entirely too much time in academia. It was this story that was my entry into 19th century Russian literature, which has since dominated my fiction-reading life. From there, I went to Lermontov's A Hero of Our Time, Dostoevsky's Notes from the Underground, Pushkin's Eugene Onegin, and of course the apotheosis of the superfluous man genre, Goncharov's Oblomov. I was in heaven: here a superfluous man, there a superfluous man, everywhere a superfluous man man. In other words, in my less confident moments at least, here a me, there a me, everywhere a me me!

Over the last week, I have been laid up with a wickedly obstinate ear infection, which decided to merely laugh at the puny prescription ear drops I first tried throwing at it, and thus required much stronger drops and oral antibiotics. Lying in bed, as I've done pretty much constantly since last Tuesday, I've done nothing but think about all of the new projects I'm going to start when I finally have enough energy and get rid of the pain: I'm going to finally get around to writing that paper I've wanted to write, but have been putting off; I'm going to read all of that literature on that philosophical topic I've wanted to read but haven't gotten around to; I'm going to finally fix those things around the house that I've been meaning to fix since pretty much forever; I'm going to do this, and that, and the other. In other words, my whole life has become making plans for when I'm not just lying in bed making plans. I have finally become Oblomov. I am the superfluous man! There's no denying it anymore. I don't know if this is a good thing.

It also should be noted that it was during this time of mere planning that I started this blog. Perhaps blogging is the ultimate instantiation of superfluity.

All Things Go

Here in Austin, there's a bit of controversy brewing, or at this point fully brewed, over the semi-removal of two programs from the local public radio station, KUT. You can read about the hubbub here. I say semi-removal because, while Paul Ray's Jazz and the Phil Music Program, hosted by Larry Monroe, were removed from the station's lineup, along with the late-night program that Ray and Monroe hosted together, both DJs still have programs on the station, including Monroe's "Blue Monday" (which I listen to most weeks). The shows that were removed have been replaced with programs that are basically commercial radio programs without commercials.

A group composed of an ex-mayor of Austin, several local musicians, and KUT listeners is upset at the change in programming, and has started a Facebook group and a website called Save KUT. They argue that this isn't just about getting rid of Ray's and Monroe's programs, but about the overall change in the programming at KUT. But let's face it, it really is about Ray and Monroe (something the website tacitly admits over and over again).

I don't really have a dog in this race, because the one program other than NPR that I listen to regularly on KUT, "Blue Monday," is still on the air. I occasionally caught Ray's jazz program, and only stopped to listen to Monroe's when I heard Robert Earl Keen ("The road goes on forever, and the party never ends!"). I do hope that, if enough KUT listeners really do want the Phil Music Show and Paul Ray's Jazz back, they get it, either by showing their disapproval through their donations, or lack thereof, or by making enough of a fuss to convince KUT that if it doesn't put them back on the air, it will suffer the consequences come donation time.

However, I honestly don't see the big deal. They were two radio shows, one of which (the Phil Music Show) was uniquely Austin, I'll admit, and the two talented DJs are still on the air every week. But ya know, to me it just feels like, well, what happens with stuff in the media all the time. I mean, most of the radio stations I've liked over the years don't even exist anymore, to say nothing of the programs I liked on those stations. TV shows come and go, radio programs come and go, DJs come and go. All things go. C'est la vie.

The amount of money that's been put into these protests -- at least enough for a website, town meetings, benefit shows to raise more money, and $1,200-a-week full-page ads in the Chronicle -- seems a bit over the top to me in the midst of an economic crisis, when I can't walk three steps from my apartment door without seeing people suffering because they were laid off and can't find work, or walk past a food kitchen without noticing how much longer the lines are now than they were this time last year. How many of those people do you think are sitting around worrying about the current direction of KUT programming? There are better uses for this energy, this money, and this time than protesting because two guys didn't really lose their jobs, when so many others really are losing theirs, and because the programming direction of a station isn't to your liking. I mean, log on to the friggin' internet and listen to some jazz or Texas music, or hell, create your own jazz/Asleep at the Wheel station on Pandora. And give your money to a real cause, where it's really needed.

Tuesday, September 22, 2009

I Know Jack About Economics, But...

Brian Leiter has a post containing a long response by Alexander Rosenberg to John Cochrane's response to this New York Times article by Paul Krugman on the failure of economics. A quick summary of the situation: Krugman says that, by failing to predict the recent economic crisis, economics itself has failed, and economists need to seriously reexamine some of their cherished beliefs and assumptions, and Cochrane responds that no they don't, because. You can read Rosenberg's entire response to Cochrane at Leiter's blog, but this particular passage from it caught my eye:

The efficient markets thesis is that the market makes complete use of all relevant information, and the “proof” is roughly that in a perfectly competitive market among perfectly rational agents prices invariably and instantaneously reflects all agents’ real beliefs and real desires. Any one who knows anything that can make him or her money acts on it—buys or sells—and that signal is picked up by every one else, who also acts on it, thus preventing any one from making excess profits—rents--long-term.

The first thing a philosopher notes about this notion is that since most people have false beliefs, especially about the future, an efficient market doesn’t internalize knowledge, but only beliefs. If they are mostly false, then the market isn’t efficient at internalizing (correct) information, it’s efficient at internalizing mostly false beliefs. If false beliefs are normally distributed around the truth, then they’ll cancel out and the proof of a probabilistic version of the efficient markets theorem will go through—market prices reflect the truth most of the time. Too bad false beliefs don’t always take on this tractable distribution. Even worse, when enough people notice the skewed distribution of false beliefs, they can make rents, as the markets crash. This is what Cochrane seems to think can't happen. How many times will it have to happen for the Chicago School to give up the efficient markets hypothesis?


When I read this, I was immediately reminded of a talk I accidentally attended (it was in a modeling session, and came between the two talks I was actually there to hear) a few years ago. Unfortunately, I don't remember the speaker's or his co-authors' names, so I can't credit them, and I don't even remember the specifics of the model, but I do remember what the model showed. What the modelers did, basically, was have a bunch of individual actors behave irrationally (i.e., sub-optimally), and then look at whether the behavior of the system as a whole was irrational in a corresponding way. The conclusion they came to, after running the thing a whole bunch of times, as modelers are wont to do, was that even when most of the actors behave irrationally, the system as a whole arrives at a surprisingly rational end state.
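I can't reconstruct their model, but the general flavor of the result is easy to sketch. Here's a toy simulation of my own (in Python; it is emphatically not the speakers' actual model, and the price-formation rule, names, and parameters are all my inventions for illustration) in which every individual belief about an asset's value is false, yet the market-level price comes out nearly right when the errors are symmetric around the truth, and badly wrong when they're skewed, which is essentially Rosenberg's point about distributions:

```python
import random
import statistics

def market_price(beliefs):
    """Toy price formation: agents bid their private valuations,
    and the median belief clears the market."""
    return statistics.median(beliefs)

def average_gap(true_value, belief_sampler, n_agents=1000, n_runs=100):
    """Average distance between the clearing price and the true value
    over many simulated markets."""
    gaps = []
    for _ in range(n_runs):
        beliefs = [belief_sampler(true_value) for _ in range(n_agents)]
        gaps.append(abs(market_price(beliefs) - true_value))
    return statistics.mean(gaps)

TRUE_VALUE = 100.0

def symmetric_error(value):
    # Every agent is wrong, but the errors are centered on the truth.
    return value + random.gauss(0, 20)

def skewed_error(value):
    # Every agent is wrong in the same direction (think: a bubble).
    return value + abs(random.gauss(0, 20))

print("symmetric errors:", average_gap(TRUE_VALUE, symmetric_error))  # small gap
print("skewed errors:   ", average_gap(TRUE_VALUE, skewed_error))     # large gap
```

The median-clears-the-market rule is a deliberate oversimplification, but it makes the point: individual-level irrationality and system-level irrationality can come apart, and whether they do depends on the shape of the error distribution, not merely on its existence.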

Now, this doesn't mean that Rosenberg's wrong in the above-quoted passage. For one, under at least some views of rationality, one can have false beliefs and still behave rationally based on those beliefs. So rationality does not necessarily mean efficiency, especially to a philosopher (which Rosenberg is). And it is one model, presented at a conference by people whose names I can't even remember, after all. But it does make me wonder, and I'm not an economist so this may be a long-ago answered question, whether the move from recognizing that many, if not most, actors in a system like the market have (mostly) false beliefs to the position that the behavior of the system as a whole will be inefficient is a logical necessity. Specifically, if people behave rationally, but based on false beliefs (which, to a naive modeler, would be equivalent to irrational behavior), or behave sub-optimally in general, does this necessarily doom the system (market) to irrationality? On the surface it seems obvious that it does, but as science has often shown, surface appearances can be deceiving, and the model the speaker presented was designed to show that this particular appearance may in fact be deceiving.

I don't mean any of this as a defense of the efficient market hypothesis. Again, I'm not an economist, but from what little I do know of economics, and of the recent crisis, it seems pretty clear that the EMH has been empirically falsified. But an interesting question, particularly when picking up the pieces of economics and rebuilding it, is why it failed, and if we simply assume that it's because people have false beliefs, or because people behave sub-optimally (i.e., irrationally), and that this therefore dooms the system to behave sub-optimally, we may be missing other possible explanations. I know EMH supporters tend to assume that actors are rational, but what I'm asking, I suppose, is whether those trying to figure out why the market hasn't been efficient need to treat that assumption as the only way to arrive at the concept of an efficient market, particularly when we're trying to figure out why markets aren't, in fact, all that efficient. Theories, after all, didn't create the crisis (even if acting in accordance with them facilitated it), so when we're trying to answer practical questions, we can't just address the theories people actually hold and, having shown that a theory's reasoning is wrong, be convinced that we've thereby shown why its conclusions are wrong.

In other words, as with most questions, this seems like an empirical one.

Monday, September 21, 2009

The Virtues of Disunity or Why Civility Is More Important Than Unity

On the 8th anniversary of the September 11th attacks, I was reading my favorite sports blog when I came across a post remembering those attacks. The post had links to a couple of videos, including Paul Simon's memorable performance of "The Boxer" on Saturday Night Live, backed by New York firefighters. The post's comments quickly devolved into politics, as you might imagine if you've read anything about the September 11th attacks in the last few years, and several of the commenters remembered with fondness the political unity that followed the attacks, lamenting the relative lack of unity we're seeing in America today. I couldn't have agreed less.

Now I will agree, for the sake of discussion, to forget what that unity actually bought us: two wars, at least one of which was a tragic mistake (and which was sold to us with lies that would likely have been much less convincing without that unity), the Patriot Act, which most members of Congress failed to read (likely for the sake of unity), Guantanamo Bay and indefinite detentions, a second Bush term1, etc., etc., etc. But even with such agreed forgetfulness, I'm still loath to believe that the post-September 11 unity was a good thing. For one, democracy only works, to the extent that it does, with healthy debate and discussion, and the sort of unity this country experienced after September 11, 2001 made debate and discussion nearly impossible. Can you imagine what would have happened to politicians who had vocally opposed our going to war in Afghanistan? Oh wait, I'm supposed to be forgetting that stuff.

The other, related problem with unity of this sort is that a lot of stuff that should get done, or should at least be the subject of discussion, doesn't, because it would threaten the unity. Politicians, particularly those who are members of the party that is not in power, are unlikely to broach certain subjects unrelated to the cause of the unity (e.g., abortion, health care, etc.) that are likely to spark dissent. It's probably not a coincidence that Bush's popularity took a nosedive when he went after Social Security. By doing so, he alienated the American political middle who had, by and large, stuck with him since they became Bush fans on September 12, 2001, and thereby killed the last vestiges of the post-9/11 unity. We were then left with the old divisions -- left, right, and middle/undecided. And it's not as though Bush hadn't been planning to go after Social Security all along. It's just that he waited until after reelection, so that his reelection bid could benefit from his residual post-9/11 popularity. With unity comes complacency.

If you ask me (and it's not as though anyone actually did, but still), what we need more than unity is civility. The only thing that can rival unity's ability to stifle political debate and discussion is incivility. Now don't get me wrong, I'm not a champion of civility for civility's sake: the next time some asshole driver turning right, but only looking left, nearly runs me over while I try to cross the street, the finger I extend and the words that freely flow from my mouth will show just how committed I am to civility in that context. I don't even believe that civility is called for in every discussion. The purveyors2 of Young Earth Creationism, for example, deserve nothing but derision dripping with mocking and smothered in secret disrespect sauce. But in political discourse, the goal at least should be to convince people that your position is the correct one. And incivility, of the sort that talk radio and, in "today's 24-hour news cycle" (scare quotes cannot express how much I hate that phrase), TV news channels thrive on, that inspires political signs painting Bush or Obama as Hitler, or causes books and articles proclaiming liberalism to be based on fascist principles or conservatism to be based in mental instability to be widely read and agreed with, makes convincing anyone impossible. Or at least, it leads to a situation in which the only way to convince anyone of anything is to first convince them that your opposition is a bigger villain than the villain your opposition is portraying you to be. This ultimately gets us nowhere.

Witness the health care debate, with its socialist/Marxist/fascist bogeymen and threats of death panels. Sure, this has seemingly changed certain people's minds: until recently, the vast majority of Americans favored universal health care, and even single-payer health care, but many are now convinced that Democrats are out to kill their grandmothers. But where do we go from here? The discussion, or lack thereof, doesn't leave us with many outs, even should we decide that government-run health care, in the form of a public option or whatever, isn't the way we as a country want to go. The Baucus bill, which is a result of this political "discussion," is exactly the sort of meaningless result this incivility produces. Meanwhile, health care remains in crisis. Unity wouldn't help us here, because if we were united as we were after September 11, 2001, we wouldn't even be talking about health care (we sure as hell didn't talk about it much from 2002-2007), but a little bit of civility might. At least then we'd actually be able to discuss single-payer health care, a public option, co-ops, or any number of other potential changes to our health care system, and decide where all of that leaves us. Where it leaves us can't be worse than where we are now.

1I know, I know, Bush was reelected in one of the most contentious elections ever, winning by one of the lowest margins (percentage-wise) ever, particularly for a war president, but ask yourself this question: would Bush have had any chance of reelection, given how close the election in fact was, if he hadn't benefited from the residual popularity he had from that post-September 11 unity?
2The purveyors, but not the believers. The purveyors are to be derided, but the believers are to be educated. Education requires trust, and trust requires, at a minimum, civility.

Sunday, September 20, 2009

The Use and Abuse of Richard Dawkins

Brandon of the excellent blog Siris links to a post at the Intelligent Design blog Uncommon Descent, which presents the following argument (as laid out by Brandon):
(1) Atheistic naturalism is true. (assumption)
(2) One can’t infer an "ought" from an "is." (assumption)
(3) All that is is the natural world, and the natural world is all there is. (from 1)
(4) There is nothing in the natural world from which we can infer an "ought." (from 2 and 3)
(5) For any action, there is nothing from which one can infer that one ought to refrain from performing that action. (from 4 and 3)
(6) For any action, it is permissible if and only if it’s not the case that one ought to refrain from performing that action. (assumption)
(7) For any action, it’s permissible to perform that action. (from 5 and 6)
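For what it's worth, the argument's final step is formally trivial. Here's a sketch in Lean (my own hypothetical formalization, not anything from the U.D. post or from Brandon) of the inference from (5) and (6) to (7), just to make clear that all of the real work is done by the premises that get us to (5):

```lean
-- A hypothetical formalization of steps (5)-(7) of the argument.
variable {Action : Type}

-- `Ought a` reads: "one ought to refrain from performing a".
variable (Ought : Action → Prop)

-- Premise (6): an action is permissible iff it is not the case
-- that one ought to refrain from performing it.
def Permissible (a : Action) : Prop := ¬ Ought a

-- Premise (5): for no action can one derive that one ought to
-- refrain from it. Conclusion (7) then follows immediately.
theorem every_action_permissible (h5 : ∀ a : Action, ¬ Ought a) :
    ∀ a : Action, Permissible Ought a :=
  fun a => h5 a
```

In other words, the argument is valid but only as interesting as its premises.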

Brandon and his commenters do a fine job of showing what's wrong with this argument, but I was intrigued by something else in the U.D. post (emphasis mine):

(1) That atheistic naturalism is true.

(2) One can’t infer an “ought” from an “is.” Richard Dawkins and many other atheists should grant both of these assumptions.

Richard Dawkins is a biologist. Anyone who's read his non-biological writings knows that he is not, by any stretch of the imagination, a philosopher. So building an argument based on a philosophical position espoused by Dawkins, whatever the ultimate validity of that argument, seems like cheating, doesn't it? Why not begin with assumptions that atheists who've actually read Hume might (should? should seems ironic given the content of the argument) make?

To be fair to the U.D. poster, Barry Arrington (himself a lawyer, and not a philosopher), atheists seem to have propped Dawkins up as their spokesperson on all matters, be they biological, theological, sociological, psychological, or philosophical (the cultish Dawkins worshiping is somewhat disturbing, in fact), so I suppose they're getting what they deserve.

Dostoevsky vs. Rand


This is undoubtedly not a unique insight, but doesn't Crime and Punishment strike you as a preemptive polemic against pretty much everything Ayn Rand ever wrote? If nothing else, placing Rand in the light of Raskolnikov brings into stark relief her (ironic?) Napoleonism.

Saturday, September 19, 2009

Testing 1,2,3

Testing... testing... is this thing on?