A common-enough believer’s comeback to the atheist who says ‘I don’t believe in God’ is the retort: ‘ah but He believes in you!’ Setting aside the hubris of believing one can descant upon the private beliefs of the divine, we might want to consider the inverse of this. Imagine what it would be like if you believed in God but He didn’t believe in you. Not that He was indifferent to you, or dismissive, but that He actually didn’t believe in you. God the ahumanist. What exactly is there, after all, in human existence that makes it so believable?
Or to put this another way: the true Christian may consider her/himself (in relation to God) a worm, a sinful blot, crusted all over with ghastliness, unworthiness and so on; but s/he will not take this extra step, and consider her/himself fundamentally incredible. We might wonder why not.
Thursday, 31 March 2011
Wednesday, 30 March 2011
MarinE.T.
'This fetishism of the future crops up on almost every page of 100 Artists’ Manifestoes ... Marinetti’s Futurist Manifesto of 1909, which as Danchev points out founded not only Futurism but the very idea of the artistic manifesto, celebrates “the beauty of speed”. “A racing car, its bonnet decked with exhaust pipes like serpents with galvanic breath ... is more beautiful than the Winged Victory of Samothrace” ... Like Romanticism, the revolutionary avant-garde was staffed by the young, full of contempt for their experimentally challenged elders. In its more flamboyant moments, of which it had more than a few, it raised adolescence to an ideology.' [Terry Eagleton, ‘Fast Forward’, TLS March 25 2011]
So, too, SF! (Raising adolescence to an ideology ... very good) What’s cool about the Marinetti image is the way its ‘future’ is now so very retro ... exhaust pipes on the bonnet, ‘galvanic’ instead of ‘electric’ and so on. This in turn is, we might say, ‘steampunk’.
Tuesday, 29 March 2011
On Being an Atheist Christian
Doing some reading with respect to this: and taking the prompts from this fascinating post by Brad Johnson, of An Und Fur Sich. My immediate thought is: Johnson's position (and, judging by the comments thread, the positions of various other people) is that they are 'atheist Christians' in the sense that their belief in a transcendental divine principle underpinning the legitimacy and orthodoxy of their religion is, shall we say, on the continuum from complicated to nonexistent, but they are still members of their church, they still see merit and positive potential in being part of that community. That makes a lot of sense to me; but it's not what I'm interested in arguing myself. It's the being part of a community, defined by commonality and tradition, that seems to me radically Unchristian, not the belief or nonbelief in a transcendental God. Still, Johnson very usefully points me to: Thomas Altizer's Genesis and Apocalypse (1990) and Dorothee Soelle's Christ the Representative (1967). I could add: obviously I am not an Atheist Christian.
Monday, 28 March 2011
The Interplanetary Rebel's Hymn
You who govern Venus, where the disk is smooth and grey:
The Ulanovs rule your System—but you’re greater, far, than they!
Now as the laws are questioned and the police sloops blast and glide,
Mithras, lord of the planets, give strength to those who died.
You who govern mottled Earth, a disk of white and blue,
The doorway men knocked first at, and which many have passed through;
(Left behind some millions stumbling, g-force dulled and drowse)
Mithras, lord of the planets, keep us all true to our vows !
You who govern Mars, where rust has reddened the terrain,
There you died immortal; immortal there you rose again!
Where thin air and low g corrode the strength of gods of war
Mithras, lord of the planets, make them mighty as before!
Asteroid governor and shepherd, where worlds cross and clash
And billions eke out life in caves of granite and of ash,
Subjected, spurned, though full of heart; tied by the Ulanov rope:
Mithras, lord of the planets, give our Revolution hope!
You who govern Jupiter, cold simulacrum star,
God of midnight spaces: here your truest faithful are.
Give us word that you will lead us rushing back into the Light
Mithras, lord of the planets, let us stand up for your right!
Sunday, 27 March 2011
Saturday, 26 March 2011
Space
Two chapter epigraphs from this book:
"The immensity of the universe becomes a matter of satisfaction rather than awe; we are citizens of no mean city." Sir James Jans, British physicist. [9]Something wrong with his awe-gland, maybe? Though the cosmos as a city is an intriguing idea.
"It is a brazen conceit to suppose that a machine can be built even one-half as capable as the mind of man." Scott Crossfield, X-15 test pilot. [51]This is the same Crossfield who flew throught the sky unaided, twice as fast as an X-15, by the power of his mind alone!
Friday, 25 March 2011
Direction
Interesting:
The speed of light is a dimensional quantity and so, as has been emphasized in this context by João Magueijo, it cannot be measured. Measurable quantities in physics are, without exception, dimensionless, although they are often constructed as ratios of dimensional quantities. For example, when you measure the height of a mountain you really measure the ratio of its height to the length of a meterstick. The conventional SI system of units is based on seven basic dimensional quantities, namely distance, mass, time, electric current, thermodynamic temperature, amount of substance, and luminous intensity.
I'm missing something obvious, of course, or perhaps merely displaying my slow-wittedness; but where is orientation on this list? (I mean, for instance, the angle in a triangle's corner). Is 45° the ratio between two dimensional quantities? Surely not. Another way of putting it: from the point of view of a photon, time and space are effectively dimensionless; but direction of travel still matters.
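By way of a toy illustration of the 'measurement as ratio' idea in that quotation -- the numbers and the little Python sketch below are mine, invented purely for illustration -- here is the mountain-and-meterstick example alongside the radian convention, on which an angle is itself a ratio of two lengths (arc over radius) and so dimensionless. Whether that convention really settles the question about orientation I leave open.

```python
import math

# Toy sketch: invented numbers, purely illustrative.
mountain_height_m = 1085.0   # a hypothetical mountain, in metres
meterstick_m = 1.0           # the reference length, in metres

# On the account quoted above, what we "really" measure is the dimensionless ratio:
height_in_metersticks = mountain_height_m / meterstick_m
print(height_in_metersticks)          # 1085.0 -- a pure number

# The radian convention treats an angle as arc length divided by radius,
# which is likewise a ratio of two lengths. 45 degrees sweeps out an arc
# of (pi/4) * r on a circle of radius r.
radius = 2.0
arc_length = (math.pi / 4) * radius
angle = arc_length / radius           # dimensionless: length / length
print(math.degrees(angle))            # 45.0
```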
Thursday, 24 March 2011
In the coffee shop
Is that woman shaking a sachet of sugar before tearing it open for her coffee -- or is she playing air guitar?
Wednesday, 23 March 2011
Why is the universe so much bigger than us?
Imagine humanity is the only intelligent and self-aware lifeform in the universe. With this supposition, isn't it the disparity of scale that boggles the mind? Such an unimaginably vast universe, temporally and spatially; such a tiny portion of that universe given over to us. But here's an SFnal hypothesis: imagine if the universe processes information by embodying it; and that every individual human branches into alternate versions (possibly two; perhaps four) of her/himself at every moment. The 'supplemental' universe could just be as big as it needs to be to handle such vast quantities of data.
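Some back-of-the-envelope arithmetic for that hypothesis -- a sketch only, with a branching factor of two per person per 'moment' assumed purely for illustration, and each branch counted as a single unit of bookkeeping:

```python
# Toy arithmetic for the branching hypothesis: all figures illustrative.
PEOPLE = 7_000_000_000      # rough human population
BRANCHING_FACTOR = 2        # "possibly two; perhaps four" versions per moment

def branches_after(moments, k=BRANCHING_FACTOR):
    """Alternate versions of a single person after `moments` branchings."""
    return k ** moments

for moments in (10, 50, 100):
    per_person = branches_after(moments)
    total = PEOPLE * per_person
    print(f"after {moments} moments: {per_person:.2e} branches per person, "
          f"{total:.2e} across humanity")
```

Even on the modest branching factor the 'supplemental' universe outgrows any fixed scale almost immediately; which is rather the point.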
Tuesday, 22 March 2011
Monday, 21 March 2011
Kant's sublime
Kant defines his sublime by its boundlessness: the infinity of space in his mathematical sublime, the unlimited power and motion in the dynamical sublime. Astonishment, awe, reverence, terror and horror. Immortality is more sublime than mortality; freedom than slavery. I wonder if this isn't one-eighty-degrees about. Endlessness is extension, and therefore repetition, without end: it is familiarity not astonishment, boredom not terror. It is the abruptness, the ontological arbitrariness of borderlines that is properly sublime: the always-inevitably unexpected, terrifying astonishment of one's own death.
Sunday, 20 March 2011
Homo S(F)acer
I've been thinking a little more about the discussion begun here; or more precisely, I have been thinking about one aspect of it.
In a recent LRB essay ‘I am the decider’, Hal Foster reviews the new translation of Derrida’s The Beast and the Sovereign, and in doing so rehearses the recent theorising of ‘homo sacer’ via Agamben, Derrida himself and Eric Santner (via that old Nazi Carl Schmitt). Derrida:
At the two extreme limits of the order, the sovereign and homo sacer present two symmetrical figures and have the same structure and are correlative: the sovereign is the one with respect to whom all men are potentially homines sacri, and homo sacer is the one with respect to whom all men act as sovereigns.
It's interesting, especially the paragraphs on Santner's On Creaturely Life (2006), a book I hadn't hitherto come across.
Creaturely life, as Santner defines it -- 'life abandoned to the state of exception/emergency, that paradoxical domain in which law has been suspended in the name of preserving law' -- is close to bare life. But he adds two important touchstones of his own, Kafka and W.G.Sebald, some of whose characters, caught between human and nonhuman states, or stranded in the vertiginous space of exile, allow Santner to imagine bare life from the position of homo sacer, 'on the threshold where life takes on its specific biopolitical intensity, where it assumes the cringed posture of the creature.'
Not to go from the negative sublime to the ridiculous, but I'm interested in the 'exceptional' state of SF with respect to other genres of literature; the 'cringe' of embarrassment it can't shake off, howsoever proudly its adherents proclaim its princely supremacy. SF is the genre sacer, outside the law as a way for 'genre' itself to uphold the law.
This, I think, has to do with one of the elephants in the futuristic room of SF: its juvenility. That it is in many ways an adolescent mode of art seems to me not a thing to deplore or hide, still less a thing to be purged in the evolution of the genre into some notional full aesthetic 'maturity'. It seems to be precisely the ground of the genre's potential for true greatness. Alone amongst the genres of contemporary literature, SF understands that the energies informing contemporary life, its kinetic restlessness, its tech-facility, its cyclotropic moods, its to-the-bone fascination with sex and violence, are precisely adolescent ones. At the same time this is the quantity about which contemporary thought and culture is most ashamed.
To bring in a parallel, this is what Foster says about 'my own field, modernist art':
...in particular its persistent fascination with the art of the child, the insane and the primitive. For the most part the [critical] inquiry into this has been conducted in terms of the unconscious and the other, that is, in the languages of psychoanalysis and anthropology. This is not wrong as far as it goes, but might we not also view these identifications as creaturely expressions of a 'fissure in the space of meaning' opened up by 'exposure to a traumatic dimension of political power'?
Mutatis mutandis, this commitment to a version of 'the child, the insane and the primitive' defines SF too.
Saturday, 19 March 2011
Politics begins with ...
Glen Newey's ‘Limits of Civility’ [LRB 17 March 2011] precedes an account of ‘Uruk [springing] from the alluvial plains of Mesopotamia, its walls founded, according to legend, by Gilgamesh’ with this snappy apothegm: ‘politics begins with walls, and death.’ Nice, but too neat. Might we not equally want to say: ‘politics begins with gateways, and life’ …?
Friday, 18 March 2011
Religion as weakness
Young religions assume that their priority is strength, even if this means (killing others in religious wars to assert their primacy, torturing and executing heretics to illustrate the power of orthodoxy) violating the central ethical tenets of the religions themselves -- Christianity's love-thy-enemy, the centrality of Peace and Mercy to Islam. The tacit assumption, I suppose, is not worldly glory for the sake of it -- there certainly were some people driven by such base motives, of course, but that's not what I'm interested in here. It's the sense that what differentiates the great world religions from the many thousands of other religions that have come and gone is that they proved themselves more single-mindedly self-assertive.
There may be something in this, although times have changed.
This is what has got me thinking about this: though some people are hot against it, most people are indulgent about astrology (a pseudo-religion, perhaps, but there you go). To believers it is a valid transcendental meaning-system and, to a certain extent, a community. To most it appears too feeble a cultural discourse to be worth getting upset over ('what harm does it do?' after all). But I'm not sure the long-term survival of religion is assured by keeping one's head down. The recent, terrible, heart-hurting earthquake and tsunami in Japan provoked this monstrous response from the Daily Mail astrologers, who 'explain' the disaster in terms of astrological conjunctions -- although those conjunctions did not enable them to predict the disaster ahead of time. But when a weak system like this is confronted by pain and death on this scale, it wilts (the obvious human reaction to the DM article is outrage, of course). Christianity, with two thousand years of putting to the sword and burning at the stake, and (of course) with the icon behind everything of a human male tortured to death, is better able to endure the shock when the Titanic of human existence strikes the iceberg of human suffering.
Thursday, 17 March 2011
Headless Democracies: New Model Army, Ignorance, Politics and Art
[On Friday 5th February this year I gave the New Year's Lecture at the Universiteit Leiden, Campus Den Haag. I had been invited by the LUC Brill Nijhoff Writing Institute to talk about my latest novel, New Model Army, in any way I liked that connected with the institute's wide range of interests in 'international law & politics, ethics, literature, anthropology and natural science'. I'm grateful to Prof. Chris Goto-Jones for the invitation, and the institute staff, especially Dr. Hyowon Kim (and her charming, superbright husband), were models of hospitality. The rest of this post is the talk I gave; although in the event I ended up ad-libbing a fair amount. Plus, as you'll see, parts of the talk had already appeared on this very blog. Still, for what it's worth, here it is. AR]
**
I sat down in England a little while ago to think about what I was going to say to you all this evening, trying to keep in mind the remit I had been given: ‘to show how writing is an essential part of academic expression’ and to steer my thoughts via ‘LUC interests: global justice—sustainability—human interaction.’ Conscious of the very great honour you have done me by inviting me to address you, I am at the same time aware of how little I know, really, about the disciplines of political theory, social policy and international jurisprudence. This, though, need not be a disadvantage. I am a subject of the British crown and a citizen of Europe—such a mournful chasm, there, between subject and citizen—and as such democracy, politics, international law, environmental sustainability and ‘human interaction’ in the broadest sense ... all these things determine my being-in-the-world. The same is true of all of us. Humans are social and political animals: claiming ignorance of social policy is like a fish claiming ignorance of brine, or a tiger claiming ignorance of hunting. The point, very germane here and now, is that there are different modes of knowledge: the detailed expertise that comes from careful study of a discipline is one, though by virtue of its codes of research it tends towards a kind of passivity (familiarising oneself with data fields and conceptual models). I don’t use ‘passivity’ in a pejorative sense, by the way: after all the word is linked, etymologically, with passion, and this body of scholarly expertise can be passionately pursued and equally passionately disseminated. But there are other modes of knowledge, and some of those are active. Sports scientists also (generally) play sport; but sportsmen are by necessity sports scientists, if only in an intuitive way. Musicologists almost always play musical instruments, but professional pianists and violinists must actualise and embody musicology or they wouldn’t be any good. I mention this right at the beginning because it seems to me rather obvious—we can almost, indeed, take it as axiomatic. I’m going to guess that we’re all prepared to agree from the beginning that ‘writing is an essential part of academic expression’. It is only foolish academics who hold the opinions of those who have not immersed themselves in years of minute scholarship in low esteem. As Samuel Johnson once noted, you need not be trained as a cobbler to be able to say that your shoes are pinching your toes.
But there’s more to say here, to do not only with the different modes of knowledge but their different magisteria—not simply whether a body of knowledge (let’s say: global justice—sustainability—human interaction) is considered from a theoretical or a practical point of view, but whether the ground of either sort of knowledge is collective or individual, whether it is political or aesthetic. It is one thing to say that professional academics don’t have a monopoly on knowledge; it is quite another to say that social policy should be guided by a principle of ignorance. This, though, is often what happens. In the recent general election in my country, the parties competed on their different plans for addressing the economic calamities of the credit crunch. And the British population voted, despite the fact that almost all of us know almost nothing about economics. Or to take a slightly more contentious argument: by what criterion of expertise is it the democratic will of the United States to go to war in Iraq when a large proportion of the American electorate can’t even identify where Iraq might be found on a map of the world? To ask this question is not to insist the war was wrong, or right; and nor am I trying to score cheap shots at the expense of Americans (we might ask, for instance, why should an office worker in Ohio or Nebraska need to know the intricate political situation that obtains in Iraq? She has plenty of other things to occupy her life; indeed, she has remarkable expertise about a dozen areas of knowledge.) My point here is to suggest that an inevitable part of the way democracy works is that popular will is predicated not upon knowledge but intuition, not upon ‘truth’ but—that fantastically useful theoretical category—‘truthiness’. Not that decisions are made on the basis of ignorance, but certainly that ignorance need not be dissolved away by the acid of actually finding shit out before decisions are made. One (minor) manifestation of democracy in my country is radio phone-in shows: a host poses a hot topic of the day—immigration, sexual morality, the economy, religion, whatever—and then ‘ordinary members of the public’ phone in with their opinions. The parameters are lightly drawn in these forums: if callers actually incite hatred on the grounds of race, religion or sexual orientation they will be rebuked or censored. But otherwise their views, no matter how oddball or ignorant, are given equal platform space. The implicit premise behind these shows is that merely by voicing an opinion, one is performing democracy. There is no requirement that opinion be modified by actual knowledge. To quote Mitchell and Webb’s expert parody of the form: ‘What about global warming? How should the situation in the middle east be resolved? Almost certainly you know nothing about these issues, but I’m sure you reckon something. Give us a call!’
Contemporary democracy, in other words, is a performative rather than a constative statement. To talk about nuclear physics in a lecture hall with a view to teaching students, one needs at least a basic understanding of nuclear physics. But to utter the classic examples of performative utterances—to say ‘I do’ during a marriage ceremony, to say ‘you’re fired’—your discourse needs no actual expertise. Or to put it another way, the knowledge is in the performance only. The statement ‘carbon 12 is an isotope of carbon 14’ includes knowledge, in the sense that it is possible to get it wrong. The statement ‘I do’, uttered during a marriage service, is perfectly ignorant in terms of marriage.* I perform democracy by (let’s say) voting, going on demos, writing angry letters to the newspaper and so on. All these things, being performative, cannot be wrong—I may vote Labour or Conservative, but my vote cannot be wrong.**
I need to qualify that statement, briefly. We live, in our various European countries, under representative democracies. The nature of representative democracy is, in essence, that it has a head and a body. The body (the populace) empowers the head (the political classes; MPs, senators, etc) by periodic elections; but whilst the populace performs democracy the group voted into power must act in an informed and expert manner—the discourse of democracy at this level is referential not performative. But it’s a common enough criticism that democratically elected leaders shrink ‘democracy’ in the largest sense to a much smaller concept—accountability—and then go about the business of ruling in much the same way that absolute rulers always do. It may be that there are good reasons for this. I’m interested, though, in what would happen if this head actually followed the logic of the body. Or if it were removed altogether.
[* Note 1: To be clear: I suppose it is possible to get the ‘I do’ wrong in the sense that one thereby marries the wrong person (years of fights and bitterness leading to a rancorous divorce). But this is a different kind of getting things wrong, I think. It’s not possible for the performative ‘I do’ to be wrong about what it performs—the act of becoming married, whether to your ideal life partner or some horrible individual. On the other hand a referential statement can easily be wrong on its own, referential terms: as I would be, if I began a lecture on geography with ‘Holland is a mountainous country.’]
[** Note 2: This, actually, unpacks into large and important questions. One common criticism of the West’s project to ‘democratise’ the rest of the world is precisely that it violates this core principle—that the voters of ‘Palestine’ or Egypt or (potentially) Iraq and Afghanistan vote for the wrong people when they vote for radical Islamist politicians. I don’t have time to go into this here, I suppose, although it seems to me that such a position (‘Palestinian democracy means the perfect freedom of the Palestinians to elect whomsoever they choose provided it’s not Hamas’) has everything to do with international relations and nothing to do with democracy.]
This question of relative knowledge and relative ignorance is a key one for me, at least in terms of what I do as a writer. So on the one hand I am an academic, after all—professor of nineteenth-century literature at the University of London—as well as being a creative writer. My chosen field of creative endeavour is science fiction, and unusually amongst my fellow practitioners of SF my educational background is not in the sciences. My first degree (Aberdeen) was in English and Classics; my PhD (Cambridge) 'Robert Browning and the Classics'. I am not wholly ignorant when it comes to science, but I know no more than anybody else who reads the various popular science books and magazines like New Scientist. I don’t have time to talk about ‘science’ more broadly—although it has been my experience that the (academic) scientists I know are generally much better informed about art and culture than the (academic) humanities scholars I know are about science. But I don’t want to stray from the point.
In late 2009 and early 2010 I wrote a novel called New Model Army. My starting point was an article I read in the Jan 2009 London Review of Books called ‘Bouncebackability’, a David Runciman review of Josiah Ober’s Democracy and Knowledge: Innovation and Learning in Classical Athens (Princeton Univ. Press, 2008). Here’s a blog-post I made immediately after reading it:
Josiah Ober’s Democracy and Knowledge (Princeton 2008) (or, to be precise, this LRB review of it by David Runciman) raises some interesting questions. The premise is that ‘knowledge aggregation’ (the wisdom of crowds, infotopia, wikinomics) is a positive feature of contemporary life: ‘lots of people knowing many small things can result in a very big deal for everyone’. Democracy ought to be the paradigm for this, but isn’t. Modern democracies, unlike the successful bottom-up collective endeavours such as wikipedia, are not truly democratic: ‘they are not direct but representative, which makes them top-down leader-oriented popularity contests, not exercises in knowledge aggregation’. Ober, though, argues that ancient Athens was precisely this sort of democracy, and that it owed its success as a polis to that fact. Runciman:
Athens had many things going for it—philosophy, oratory, drama, magnificent buildings—but it was also a violent, faction-ridden, capricious, war-mongering slave-owning society, clinging precariously to its privileged position and regularly picking fights it couldn’t win. It doesn’t exactly sound like the Google (company motto: ‘don’t be evil’) of the ancient world.
Ah, but:
Josiah Ober is here to tell us that we have this last point completely wrong … Athenian democracy really was an open, flexible, dynamic and remarkably successful political society, able to marshal its resources and outperform its rivals. … Essentially his argument has two parts. First, he needs to show that Athens did indeed outperform its rivals to become the most successful polity of its age. Second he needs to show that this advantage was a direct result of its being a democracy, because as a democracy it was able to acquire, aggregate and codify knowledge in ways that its non-democratic rivals couldn’t match.
Ober thinks he can demonstrate both these points; Runciman isn’t quite so sure. But it’s a fascinating, and of course relevant, question.
One thing it makes me think is the way political debates of the 1930s and early 1940s sometimes restated these premises. For instance, a good number of people believed that World War II was effectively a fight to see whether a democratic system could beat an authoritarian one. Fascists argued that democracy was necessarily riddled by internal contradiction; that for instance no democracy could focus the will to stay in a long, destructive and expensive war. On the other side the allies’ victory was taken by many precisely to be the victory of democracy over authoritarianism. We might object that the allies ran their democracies in pretty authoritarian ways—people who opposed the war tended to be silenced, or locked up, or if they disagreed vocally enough shot. But this in fact speaks to the effectiveness of ‘representative democracy’ rather than actual democracy to do things like, for instance, win wars.
Around 500 BC Athens got democracy, but less than twenty years later they also got lucky, and rich, with the opening up of a new group of silver mines in southern Attica which produced a substantial windfall profit for the state. … Ober weaves this big slice of natural advantage into his story of democratic achievement by pointing out that when the assembly had to decide what to do with the first influx of extra wealth it chose to spend it on building the navy that went on to defeat the Persians at Salamis in 480BC rather than distributing it among individual citizens. Compare and contrast, say, with Sarah Palin’s Alaska (admittedly one of the least plausible candidates ever for that hotly disputed title ‘Athens of the North’). With oil prices high early last year, Palin decided to use the extra state income to fund $1000 credits to every Alaskan to help with their fuel bills. Ancient democracies used their good fortune to take tough decisions in the common interest; modern democracies use it to bribe the voters with handouts.
I’m not sure about that last line (neither, as it goes, is Runciman). But the article as a whole is very stimulating, and I’ve been wondering about joining up its dots. New e-democracy utopianism is fuelled by new technologies that make it much simpler to canvass everybody’s opinion quickly and efficiently. One of the shaping ideological forces of the second half of the twentieth century is that democracy is not just ethically better than dictatorship, it is practically superior—viz. the number of wars fought between the two regimes and always won by the former. This is fair enough; and personally I’m very glad that the democratic allies won WWII rather than the fascists. But although armies from democratic nations (USA, UK) fought armies from authoritarian nations (Germany, Italy, Japan) and won, nobody suggested that the armies themselves should be run on democratic lines. There has never been in the history of humankind a properly democratic army.
But why not? The obvious objection—that it would be impracticable to orchestrate the trappings of democracy, the hustings and votes, in the heat of battle—is rendered null by new technologies. The conceptual objection (that soldiers would tend to vote to run like cowards rather than engage the enemy) seems to me equally unfounded: the history of democracy suggests the reverse. Indeed, morale (military code for: making sure that feudal soldiers don't feel too much like slaves led by people who don't especially care if they live or die) would be much less of a problem; logistics would be easier -- new model soldiers would not specialise; specialisation is the bane of feudalism ... all would have net-access to enormous bodies of expertise, practical, medical, tactical, and all would wield it. They'd revolutionise warfare.
I'll write a book about it to show what I mean.
And so I did.
A properly headless democracy
One of my aims in NMA was to behead democracy; or to put it a little more exactly, to imagine a democracy that could not be beheaded because it never had a head in the first place. In Alice in Wonderland, the Cheshire cat manifests over the Queen of Hearts’ garden as a head. When the Queen insists the cat be beheaded, there is disagreement among her servants. The headsman takes one view, saying that without a body to behead it from, a head cannot be beheaded. The Queen takes the contrary view, arguing that anything with a head can be beheaded. The Queen, I think, is right.
This is a very important and rather profound point, actually. The Cheshire cat, as he manifests in Alice, can be beheaded. But if he had the foresight to manifest as a body only, he could not. To put this another way: the invincibly ignorant archetype of the headless man sidesteps this dilemma. He cannot be beheaded, because he is always-already-beheaded. He is ahead of you.
Ho ho.
The effacement of democracy; the notion of an acephalic democracy
What do we mean when we talk of the giant’s head? The most obvious way of elaborating on that would be to itemise the dictator, the popularly-elected figurehead (figure + head). In some cases, this figured headishness is so prominent, and the individual ‘faces’ the democratic process that got him to that position so emphatically, he can simply dispense with democracy itself. The logic here is that the figure of the head organically ‘completes’ the demos’s body, obviating the need for further democratic process. So Napoleon III, or Hitler, or Mubarak wins power by plebiscite and then effaces the process of the vote as superfluous, declaring himself dictator for life. The democracy now has a head, and the head becomes a Mekon-head, a brain-in-a-vat, such that all the cells of the body exist only to provide cephalic service.
But there is another way in which we might talk about the giant’s head—one, indeed, not incompatible with the pretensions of a dictator-for-life. This is to think of democracy, society and history as embodying (in a strict sense) a cephalic ideal. Let’s say: Fuhrerprinzip, the charismatic genius-leader, the epitome of the will-of-the-people, Hegel’s ‘world-spirit’, whatever it might be. Let me quote some Adorno:
What is irrational in the concept of the world-spirit, it borrowed from the irrationality of the course of the world. In spite of this it remains fetishistic. History has to this day no total subject, however construable. Its substrate is the functional context of real individual subjects ... In the concept of the world-spirit the principle of divine omnipotence was secularized into that which posited unity, the world-plan into the pitilessness of what occurs. The world-spirit is worshipped like a deity; it is divested of its personality and all its attributes of providence and grace. [Negative Dialectics, ‘Part III. Models. World-spirit and Natural History. Excursus on Hegel’ 299-300]
We could put it this way: God must have a head. We might even go further and say: God must be a head—must approach a perfected spherical cephalic embodiment of will, thought and reason. The Wizard rules Oz as a giant, spectral head, appearing as if to Moses on the mountainside wreathed in green fire. The actual principle of running the city is otherwise, but we must pay no attention to the figure behind the curtain. (That’s always struck me as a nicely anti-performative statement, like me saying ‘I shall not think of elephants!’ Is there a name for that rhetorical trick?). In secularised social terms, we still live under a social-philosophical logic that sees the point of society to be the head.
This is part of a larger critique, of course, of the dangers of ‘majoritarian thinking’. The introduction to Negative Dialectics doesn’t mince its words (not that Adorno is normally very mincing, of course):
The construction of the truth according to the analogy of the volonté de tous—the most extreme consequence of the subjective concept of reason—would betray everyone of everything which they need ... The criterion of truth is not its immediate communicability to everyone. [50]
The old USSR model of the Party as the archetypal democratic ‘Head’ is amongst the purest examples of what I am talking about. What’s interesting, and what Adorno puts his finger on, is the way the old Soviet rhetoric worked by arrogating to the head the function of the body. Let’s think instead in terms of removing the politburo altogether.
The Party is supposed to have a cognitive power that is a priori superior to that of every individual solely due to the number of its members, even if it is terrorized or blinded. The isolated individual however, unencumbered by the ukase, may at times perceive the objectivity more clearly than a collective, which in any case is only the ideology of its committees. Brecht’s sentence, the Party has a thousand eyes, the individual only two, is as false as any bromide. The exact imagination of a dissenter can see more than a thousand eyes wearing the same red-tinted glasses. [56]
The idea of a ‘headless’ democracy is not a new one, of course; it is only that it has almost always appeared in previous social and political discourse in negative terms. Its avatar is The Mob. Destructive, purposeless, violent, mindless, terrifying. In NMA one of my main aims was to lay out the notion ‘a headless democracy need not be a mob’. And here we come to the nub of the matter: how writing is an essential part of academic expression. Because the academic way of supporting that statement, researching its supporting evidence, is hamstrung by the fact that history provides almost no examples of headless democracy. Received wisdom is that social groups must have leaders or they will revert to unproductive chaos. It is, however, possible to engage in speculative thought-experiment about these things—that phrase, indeed, is a nice thumbnail definition of SF.
Creative work often begins by jamming together various previously unjuxtaposed ideas, concepts or observations. In the case of NMA I wondered if participatory democracy (on the model of the Greek polis) necessarily gave way to representational democracy because the group size grew too large to fit in one ekklesia. So I posited a Greek-polis-sized group, a small army. But of course, the technologies of computing and network connectivity mean that our ekklesia need no longer be a physical space. So, the novel became as much about the modes of social interaction and the possibilities of political action the Net has enabled. Social networks are—really—extraordinary things.
It turns out that there is such a thing as the wisdom of crowds. There’s such a thing as the foolishness of crowds too, of course; but I’ve come to the conclusion that the former significantly outweighs the latter.
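The statistical core of that claim is easy to demonstrate with a minimal simulation -- a sketch of my own, not anything from the novel or the lecture, and it assumes what the foolishness-of-crowds cases precisely lack: independent, unbiased individual errors.

```python
import random

random.seed(1)

TRUE_VALUE = 100.0   # the quantity being guessed at (beans in a jar, say)
CROWD_SIZE = 1000
NOISE = 30.0         # spread of individual error
TRIALS = 2000

crowd_wins = 0
for _ in range(TRIALS):
    guesses = [random.gauss(TRUE_VALUE, NOISE) for _ in range(CROWD_SIZE)]
    crowd_estimate = sum(guesses) / len(guesses)   # the aggregated answer
    individual = random.choice(guesses)            # one randomly chosen guesser
    if abs(crowd_estimate - TRUE_VALUE) < abs(individual - TRUE_VALUE):
        crowd_wins += 1

print(f"crowd average beat a random individual in {crowd_wins / TRIALS:.0%} of trials")
```

When the individual errors are correlated -- when everyone wears, in Adorno's image, the same red-tinted glasses -- the advantage evaporates; which is the foolishness-of-crowds case.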
Received wisdom, since the days of the Enlightenment philosophes, was that an encyclopedia can only be assembled by experts. It is a cornerstone of modern economics that people work when motivated to do so by financial reward. Yet Wikipedia is now the world’s greatest encyclopedia, open-source software has revolutionised computing and I get all my breaking news through Twitter. There’s a tendency to want to force these networks back into the procrustean beds of old media models, of course. Take Wikileaks, for instance. News reporting on that phenomenon has focussed on the troubled circumstances of Julian Assange, which makes for a thoroughly dramatic, salacious and individualised narrative. But the assumption that Wikileaks is ‘the body’ and Assange ‘the head’ of this phenomenon is surely mistaken ... I mean no personal derogation of Mr Assange, and I know too little about his personal circumstances to judge him; but to think (as the authorities seem to) that Assange is Wikileaks is to perpetrate a kind of category error.
There is a genuine problem on the level of representation here, of course, and I don’t mean to gloss over it. The media represent Wikileaks as ‘the dramatic and diverting narrative of Julian Assange’ not because they are pawns of the government authorities embarrassed by the Wikileaks revelations, but because the formal, narrative logic of news broadcasting is heavily biased towards the story arc of the individual. This is true for deep, systemic reasons; the same structural forces that shape novel writing, film-making and so on. Indeed, so deeply implicated are our narrative possibilities in telling the story of the head that it is a struggle to find alternate modes. A few years ago I wrote a story called Land of the Headless that wrestled with this dilemma, to only partial success. In NMA I try to inflect the central consciousness through which the novel is filtered in certain ways, to prepare the ground for the shift in the book’s final section to a kind of playful collective headlessness. My writing more generally is often criticised (in reviews and so on) for its lack of conventionally ‘appealing’ or identifiable characters. There’s some truth in this criticism, although in my defence I’d like to suggest that it could be, as the contemporary phrase goes, a feature not a bug—in that this is one of the things I’m interested in trying to do. The literary-critical way of putting this would be to say that I’m interested in deconstructing the forms and assumptions of (what remain) fundamentally 19th-century novelistic preconceptions of what constitutes a ‘good novel’. This doesn’t mean that I necessarily succeed, of course; but it is to suggest that criticising my attempts to decapitate the novel simply because one disapproves of the idea of decapitating a novel may be to miss the point of what I’m about. The Queen of Hearts’ slogan—off with his head!—makes for a cruel and unjust political or judicial programme, but is an inspiring call to arms on the metaphorical and conceptual level. At the heart of the new Revolution, then: a conceptual guillotine.
Wednesday, 16 March 2011
Theotemporality
Following up a little on the theology of this post, I've been reading as much of Gregory Ganssle and Paul Helm's God & Time: Four Views (2001) as Google Books lets me. It's interesting, although the conjunction of the belief in an eternal God and the absolute consonance of God and Christ leads precisely to a rather Miltonic view of the incarnation.
The incarnation is a unique case of God’s acting in time. One thing to note is that if God the Son is timelessly eternal and yet incarnate in Jesus Christ, there is no time in his existence when he was not incarnate, though since he became incarnate at a particular time in our history there were times in that history before the incarnation, and times since. … The incarnation is the “projection” of the eternal God [quoting Herbert McCabe]. There is therefore no sense in talking of the eternal Son of God apart from the incarnation except to make the point that the incarnation was logically contingent. That is, there is no point to it if by this we mean there was a time when the eternal Son of God existed unincarnated … the point is, as Herbert McCabe says, there is no pre-existent Christ with a life history independent of and prior to the incarnation. There is no time when the eternal God was not Jesus of Nazareth. [54]

This, it seems, is a more widespread view amongst theologians than I had realised; but it still leaves me with the problem: how can one reconcile that with a notion that what makes Christianity unique is its predication of the radical novelty of grace? But this also makes me wonder: have any sects lived by the belief that time itself began with the birth of Christ, and ended with the ascension? That Christ created the world he was born into (complete with built-in backstory), and that by killing Christ humanity ended that world? That we might think we are living in 2011, but 2011 is actually only a moment existing, in some refracted sense, within 4BC-AD33? [Philip K Dick believed, according to his own coiling Exegesis, that time ended in AD33 and started again in AD1974; but for complicated reasons.]
An easier belief might be: God is 'outside time' and eternal; Christ was 'inside time' and wasn't. But then you've got to dissipate the integrity of the trinity, and wars were fought and many people killed by earlier generations of Christians over that question.
Tuesday, 15 March 2011
The Past Tense of To Be poem
The mural in the tomb
depicts a less exalted life:
his dog, his shoe, his comb.
His trouble. His strife.
His heart and pineal gland,
his drinks and foods,
his everything-I-leave-behind
funereal goods.
The tomb roof, though, is turfed
and sunlit and green.
It says: 'man's unwept and unlaughed
now he's gone and been'.
Monday, 14 March 2011
The speed of light
I've been thinking about this subject for something I'm writing; and I'm struck (as many have been) by the thought that from a photon's point of view, time does not pass -- light goes from the Andromeda Galaxy into my eye and it seems, to me, to have taken millions of years to have made the journey ... although for the photons no time has passed at all. This in turn makes me think: we tend to conceptualise light as the ne plus ultra of acceleration; but from the photon's point of view it is quite the opposite. Decelerating from c can be thought of as a process of accelerating into time. From the photon's point of view it is amazing that we are able to travel at the breathtaking speed of one hour into the future every hour.
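The standard special-relativistic relation behind this (my gloss; the post doesn't spell it out) is that the proper time experienced by a traveller moving at speed v over a coordinate interval dt is

$$\mathrm{d}\tau = \mathrm{d}t \, \sqrt{1 - \frac{v^{2}}{c^{2}}}$$

which shrinks towards zero as v approaches c: the couple of million years the Andromeda light spends in transit, measured in our frame, corresponds in the limit to no elapsed time at all for the light itself, while a body at rest racks up proper time at the full rate of one hour per hour.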
Sunday, 13 March 2011
Gagyard
And in studied contrast to yesterday's post ... gags.
Those signs saying 'no Flash Photography' don't bother me. My photos are consistently dour and understated.
I never know whether 'i' is short or long when followed by two 's's. Ah well: ignorance is bleeess.
Dr Seuss. Did the Vicar sneeze when Christening him, or something?
Russian folk music. Or 'Country & Eastern' as I like to call it.
The O2 Arena is named after Dermot O'2, the famous Irish numeral.
Apparently the CPS is taking my Subconscious to court! Luckily I'm eligible for legal id.
The last Carry On film, *Carry On Kenneth Williams Committing Lonely Suicide*, didn't enjoy the commercial success of the earlier films.
Best Picture featuring Sean Connery in a Leather Loincloth. And the winner is ... #Zardoscar
Best Adaptation of an Early Stephen King Novel. And the winner is ... #Oscarrie
Fucking Best Fucking Picture Nominations? Say hello to mah Leedle Friend! #Oscarface
According to a book on the shelf at my left, The Major Works Tennyson. That's a neat trick for a military man.
I've hoiked my underpants up and scratched my buttocks on a thousand planets! Honestly, I could write an Itch-hoikers Guide to the Galaxy.
Right: off to the corner shop. I don't know why we call it that. It doesn't even sell corners.
Next on my to-do list: 'stop grasping my own bottom'. I've been meaning to get to this for ages, but I'm afraid I've been very behindhand.
A depressed cat; a cauldron of gruel; drizzle; a spell to endure the ashen misery of another day. Yes. I practice Bleak Magic.
The phone rang, and a man offered me human respiratory syncytial virus. I told him: 'I don't buy anything from cold callers.'
Shakespeare pondering which container would be best for his ale: 'Toby jug? Or not Toby jug?'
For heterosexuals who want to zip about -- the Segway! And for people of other orientations -- the Seg-gay! Zoom, zoom!
Time to wake up and smell the coffee. As opposed to that time I was farting from my mouth, when it was time to wake up and cough the smelly.
I feed the thread in at my right ear and out through my left; then I jiggle it assiduously. Works wonders. I call it 'mental floss'.
On entering the immaculate officers-only club, Lieutenant Hardy turned to Captain Laurel and said, admiringly: 'here's another fine mess!'
I've memorised a good bit of T S Eliot's poetry; but I know his Greek contemporary 'T S Oiliost' backwards.
Ecce Homo is a great work of philosophy; although it's not as compelling as Nietzsche's follow-up volume, Ecce Thump.
Breaking News: NASA to be funded by sweet manufacturers! Next mission: A-Polo 1.
'Pynchon on the Kindle' sounds like a small Bedfordshire village
I said to the man in Robert Dyas 'I want a piece of leather to sharpen my razor.' He said: 'Strop.' I said: 'Hammer time!'
I remember when Charlie Sheen was just starting out, flying that miniature plane to advertise furniture polish.
I've never understood the phrase 'hip! hip! hooray!' Surely one's hooray is located in between one's hips, not at the end?
The White Stripes do seem to be milking their split for publicity. Still, there's no use crying over a milked split.
Soviet Russia had the A-Bomb. British Marxism has the Hobs-bawm.
Yogi Bear wasn't actually a yogi. Unlike His Holiness the Maharishi Donald Duck.
Simon and Garfunkel's great song of conflict in the Somerset skies: 'Trouble over Bridgewater'
What a shame there are only a couple of working German spa-towns! It's just too bad.
Norm from Cheers bought half a pair of jeans. As I said, sarcastically, to him: 'Oh! Good buy, Norm -- *a* jean!'
Men! I know HOW YOU CAN CREATE A SENSATION AMONGST THE LADIES! Simply walk in and perform as if it is the Gents.
Men! I have the secret of HOW TO ATTRACT WOMEN LIKE FLIES! The main part of the secret is -- you have to not mind women like flies.
Why has nobody created a Pork-based version of Bovril? We could call it Pigsvil.
Combining the music of Sun Ra, Dio, and the soundtrack to the Monkees' film 'Head' ought to sound like Radiohead. But it doesn't. Odd.
Newts. So smooth skinned! But then again, you should look through the ponds for some Oldts -- they're wrinkly as old bollocks.
I refuse to use the LOLcronym 'ROFL'. Out of respect for one of my favourite Muppets.
New children's book from Chuck Palahniuk: *The Very Angry Caterpillar*
I've invented a device for communicating with dead window cleaners. I'm calling it a Squeegie Board.
I'm breaking up with you. We're too incompatible; me with Twitter, you with your geometry. Honestly, it's not Euclid. It's meclid.
I'm drinking whisky from a glass! Yeah! I'm living the dram!
We English call it a Library. Because our lies are buried there.
Erik the Red discovered America. Other Vikings had failed to do so, but Erik was a Norse of a different colour.
I have just eaten a Creme Egg. That's one Creme Chicken that's never going to grow and live in the world.
I'm doing a Spike Milligan impression right now. Start as you mean to goon, that's my motto.
Saturday, 12 March 2011
Gosse's Supplementarity of Nachträglichkeit
Isn't that a splendid title for a blogpost?
The kernel of an essay on Gosse's Father and Son occurs to me, which would take the following passage from Jameson's Valences of the Dialectic as a theoretical jumping-off point:
The dialectic [in Marx's Eighteenth Brumaire] is a constant reversal of older stereotypes of causality, of historical and narrative efficacy and efficiency ... But here we now draw closer to an understanding of the dialectic as a set of operations, in situation, rather than as some static "view" or even "philosophy"--in this case of history itself. ... now at a certain point along the way, it seems to me that such dialectical narratives froze over and became codified; so that new narrative procedures had to be invented to undermine them in their turn. The two I would above all wish to single out for twentieth-century historiography are Freud's Nachträglichkeit, his so-called retroactive effect, whereby the arrival of puberty, for example, triggers events that have in some sense already happened at the age of three, but in another sense have not yet happened at all; and alongside that, Derrida's notion of supplementarity, in which, following Jakobson's notion of the synchronic, a new moment in the system comes complete with its own brand-new past and (as in Husserl) reorders our perception around itself as a center (the permanence is then projected back endlessly into time). I believe that both these historiographic forms have occasionally been understood as critiques of the dialectic, and I understand why; but I hope it will also be clear why I now perversely consider both as contributions to a dialectical rewriting of history rather than some newfangled post-dialectical inventions. [287-88]

Father and Son might not appear, at first glance, terribly dialectical: recall how it opens:
This book is the record of a struggle between two temperaments, two consciences and almost two epochs. It ended, as was inevitable, in disruption. Of the two human beings here described, one was born to fly backward, the other could not help being carried forward. There came a time when neither spoke the same language as the other, or encompassed the same hopes, or was fortified by the same desires. But, at least, it is some consolation to the survivor, that neither, to the very last hour, ceased to respect the other, or to regard him with a sad indulgence.

But what interests me particularly about the memoir is the centrality of its account of Gosse père's attempts to synthesise religious and scientific truth. This from the fifth chapter:
So, through my Father's brain, in that year of scientific crisis, 1857, there rushed two kinds of thought, each absorbing, each convincing, yet totally irreconcilable. There is a peculiar agony in the paradox that truth has two forms, each of them indisputable, yet each antagonistic to the other. It was this discovery, that there were two theories of physical life, each of which was true, but the truth of each incompatible with the truth of the other, which shook the spirit of my Father with perturbation... This was the great moment in the history of thought when the theory of the mutability of species was preparing to throw a flood of light upon all departments of human speculation and action ... In the year before, in 1856, Darwin, under pressure from Lyell, had begun that modest statement of the new revelation, that 'abstract of an essay', which developed so mightily into 'The Origin of Species'.

Gosse sr's intervention into this debate was the Omphalos, a really most extraordinary book.
Gosse's 'solution' is that the world is both c.6000 years old (as it says in the Bible) and millions of years old (as the fossil record implies); that God created the world in exactly the way described in Genesis, but that he created it already old. The geological and fossil evidence is there because that's what you'd expect to find in a really old world, which is just what God wants, even though the world is actually relatively young. This, I think, is a neat thesis (and unfalsifiable, to boot). But Victorian society rejected it:
Never was a book cast upon the waters with greater anticipations of success than was this curious, this obstinate, this fanatical volume. My Father lived in a fever of suspense, waiting for the tremendous issue. This 'Omphalos' of his, he thought, was to bring all the turmoil of scientific speculation to a close, fling geology into the arms of Scripture, and make the lion eat grass with the lamb .... But, alas! atheists and Christians alike looked at it, and laughed, and threw it away. In the course of that dismal winter, as the post began to bring in private letters, few and chilly, and public reviews, many and scornful, my Father looked in vain for the approval of the churches, and in vain for the acquiescence of the scientific societies, and in vain for the gratitude of those 'thousands of thinking persons', which he had rashly assured himself of receiving. As his reconciliation of Scripture statements and geological deductions was welcomed nowhere, as Darwin continued silent, and the youthful Huxley was scornful, and even Charles Kingsley, from whom my Father had expected the most instant appreciation, wrote that he could not 'give up the painful and slow conclusion of five and twenty years' study of geology, and believe that God has written on the rocks one enormous and superfluous lie',—as all this happened or failed to happen, a gloom, cold and dismal, descended upon our morning teacups.

This is a shame, I think; because Gosse sr's theory is precisely a Nachträglichkeit-isation of our understanding of the world. Martin Gardner:
This is not as ridiculous as it may seem at first. Consider, for example, the difficulty which faces any believer in a six-day creation. Although it is possible to imagine Adam without a navel, it is difficult to imagine him without bones, hair, teeth, and fingernails. Yet all these features bear in them the evidence of past accretions of growth. In fact there is not an organ or tissue of the body which does not presuppose a previous growth history.... The same is true of every plant and animal. As Gosse points out, the tusks of an elephant exhibit past stages, the nautilus keeps adding chambers to its shell, the turtle adds laminae to its plates.... In short — if God created the earth as described in the Bible, he must have created a 'going concern.' (Martin Gardner 1957: 126)

And here are Gosse sr's own words:
The life of the individual consists of a series of processes which are cyclical … the life of the species consists of a series of processes which are cyclical … It is certain that, when the Omnipotent God proposed to create a given organism, the course of that organism was present to his idea as an ever revolving circle. He created it at some point in the circle, and gave it thus an arbitrary beginning, but one which involved all previous rotations of the circle, though only as an ideal, or in other phrase, prochronic.

Phase II (in which Doris gets her oats) will follow, in which this circularity, this Freudo-Derridean calcification of the dialectic, is read back into Edmund Gosse's own account of origins, creation, circles and retrospectivity. But that's for another day.
Friday, 11 March 2011
Seen Perspectively by Finitude
Coleridge was rather fond of his own formulation ('I thought I exprest my Thoughts well when I said...'): 'Religion is only Reason, seen perspectively by a finite Intellect' [543 in Perry's Notebooks: Selections -- this dates from 'Feb 23rd 1816']. Of course it's possible that it is all the things that Coleridge loves in his faith that will disappear when the (mortal) Mercator's projection is folded back into (infinite) reality; that Greenland will shrink to being a relatively small, unimportant, barren island and not the heavenly Île verte of human fantasy.
Thursday, 10 March 2011
Wednesday, 9 March 2011
A definition of science fiction
SF marks a point, or many points, of difference between the world we actually live in and the world of the text. For this reason, the grain of SF is dystopian. 'For this reason?' you ask. 'Surely not! A point of difference in the text can as well be an improvement over the real world as a disaster!' And so it is, obviously; we can all think of many utopian or quasi-utopian examples. But this is the thing: human sensibility is geared to notice negative differences more acutely than positive ones. If our lives improve, incrementally, we barely notice it; if they go into incremental decline we whine and whinge and raise all hell. If a man loses all his earthly possessions he will raise his arms to heaven and cry, with tear-stricken face, 'why me, O Lord? Why me?' But if a man should double his net worth he will (of course) be pleased, and may even raise a glass of champagne. But he won't -- and this is the crucial thing -- he won't cry to the sky 'why me, O Lord? Why me?' The gradient runs the other way. And so for SF.
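Behavioural economics puts a number on this asymmetry (my gloss, not the post's): in Kahneman and Tversky's prospect theory the value people attach to a change x is usually written

$$v(x) = \begin{cases} x^{\alpha} & \text{if } x \geq 0 \\ -\lambda\,(-x)^{\beta} & \text{if } x < 0 \end{cases}$$

with the loss-aversion coefficient \(\lambda\) estimated at roughly two: an equal-sized loss weighs about twice as heavily as a gain, which is the gradient running the other way.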
Tuesday, 8 March 2011
Master-slave dialectic in Tolkien
I've been asked by more than one person to say a little more about this rather throwaway blog comment.
The so-called 'master-slave dialectic' is from Hegel's Phenomenology of Spirit: google it and you'll find a wealth of sites and definitions and analyses. It has been widely interpreted, actually; a fact which is in part an index to the extent that nobody's entirely sure what Hegel means by it. So: there's a Wikipedia entry on the term:
The Master-Slave dialectic (Herrschaft und Knechtschaft in German; also translated Lordship and Bondage) is a famous passage of Hegel's Phenomenology of Spirit. It is widely considered a key element in Hegel's philosophical system, and has heavily influenced many subsequent philosophers. It describes, in narrative form, the encounter between two self-conscious beings, who engage in a "struggle to the death" before one enslaves the other, only to find that this does not give him the control over the world he had sought.

The 20th-century French (well, Russian) philosopher Alexandre Kojève made a great deal of the master-slave dialectic. In Michael S Roth's words:
'As in all his works, [Kojève] adopted the master/slave dialectic from Hegel's Phenomenology as the schema for organizing change over time. In this schema, man is defined by his desire for recognition -- a desire that can be satisfied only with the conservation of its object -- and his will to risk his life in order to satisfy this desire. Kojève used the master/slave dialectic as an allegory of human development: There is bloody battle followed by the rule of the master over the working slave. The master, however, cannot satisfy his human desire, because he is recognized by a mere slave. Eventually, the slaves take over, but they remain in servitude in relation to their work. Real freedom comes only through universal recognition of all and each as citizens. Each moment of the master/slave dialectic is at the origin of an ideal type of justice: Mastery is the basis for "equality" (Droit aristocratique); victory of the slave is the basis for "equivalence" (Droit bourgeois); citizenship is the basis for the synthetic justice of equity. Thus, Kojève extracted what he took to be the crucial dialectic in the Phenomenology, and used it to emplot the evolution of right and the connection of this evolution to things human.' ['A Note on Kojève's Phenomenology of Right']

Other thinkers have taken it as a kind of allegory for the relationship between subjectivity and objectivity; others (more justifiably, I think) argue that the social context of what Hegel is saying is vital. Here's the relevant bit of Terry Pinkard's Hegel's Phenomenology: The Sociality of Reason (CUP 1994), or rather Howard Tuttle's summary of it:
In Chapter 3, "Masters, Slaves, and the Subjective Point of View" Pinkard shows that it is necessary to understand Hegel's famous master-slave dialectic as something more than an epistemological relation between a knowing subject and an independent object. Such a relation must always be defined in a social context. The dominance of the master over the slave is a fact that demands the social recognition of both parties. Each has found, according to Pinkard, "that he cannot identify what is his own without reference to the other's point of view -- without, that is, reference to the sociality common to both" (p. 62). The objective point of view which seems to be established by a higher dialectical reason is actually grounded in the social assumptions of the parties involved.I could go on, but won't. Indeed, I could write many thousands of words about the master-slave dialectic in Lord of the Rings, but
Which is to say: Lord of the Rings is clearly a book very much concerned with questions of mastery and slavery, a novel precisely about the proper and improper boundaries of power, about relationships between masters and servants. Sauron's evil manifests predominantly in a desire for mastery; and the novel construes pairs in dialectical power-relationships that work through antithetical inversions, often violently, or in the larger context of violence and death -- Frodo and Sam (where the salient moment is the section where Sam thinks his master dead and takes the ring, becoming the master at exactly the point where Frodo is becoming more and more like Gollum); Saruman and Gandalf; Saruman and Wormtongue; Frodo and Gollum; Aragorn and Boromir; Aragorn and Denethor. In each case, and to different degrees, consciousness becomes self-consciousness, partial knowledge is sublated into a higher, more coherent knowledge (wisdom, we might say, although it is rarely, in Tolkien, a thing of comfort). Indeed, part of the mode of eucatastrophic writing is the delay of the getting of wisdom to the very end: as with Sauron, moments before the destruction of the ring, 'all the magnitude of his own folly was revealed to him in a blinding flash' [III:223].
All of this takes place in the novel under the 'higher' rubric (formally speaking) of the Ring, and the way its 'serving' function -- the usefulness of being able to make oneself invisible -- necessarily turns about to the point where it is the ring that is the master and the wearer who is the servant. The ring, of course, stands as a sort of emblematisation of the principle of master-servant domination, or oppression; and one of the memorable aspects of it is the way its domination becomes written on the body via a particular mode of Hegelian 'death struggle', an ageless withering, a nightmare-death-in-life existence ... the "abstract negation" of life itself. Contradiction and resolution can only happen, in the novel, via a moment of what amounts to self-mutilation: Frodo having become Gollum, the master the servant -- or rather, a kind of conceptual short circuit is effected with Frodo and Gollum literally struggling (as master/servant) in order to discover who will become the servant of a higher master (the Ring). The contradiction is resolved not via a Hegelian negotiation and recognition of interdependence, but by the literal dissolution of (Gollum's) body and the ring in fire.
This in turn opens the book to a much more radical reading (politically, or ideologically speaking) than has often been the case with critics, fixated as they often have been on Tolkien's personal traditionalist, Catholic and conservative affiliations. It's not Epic Pooh; it's sword-and-dialectic-sorcery, Hegel-Fantasy.
One more thing: reading the text this way, it seems to me, opens a different perspective on the way the novel ends (something rather splendidly, if ludicrously, brought out by the Return of the King movie, with its interminable, groundhog-day-ish cycle of recirculating endings). That is to say: LotR is a book that ends its own history; it stages 'the end of history' in its imaginative world in a more thoroughgoing way than any other novel I can think of. Kojève (of course) had his own ideas about 'the end of history and the last man'; the last man, for our purposes, being Sam; and history ending as a synthetic working through of a particular historical dynamic (master-slave).
Monday, 7 March 2011
Dolce & Gabbana est
The gold knife in the double-breast
Is much more mendace than gory:
Brand new lies -- Dolce & Gabbana est
Pro couture mori.
Sunday, 6 March 2011
Anthropomorphic, Abstract
Reading Jameson’s dense account of Ricoeur’s Time and Narrative in Valences of the Dialectic.
Ricoeur’s polemical target here is, to be sure, complicated by the presence on his agenda of not one but two key texts of Aristotle, namely the Poetics and the Physics, which are evaluated positively and negatively respectively: the first staging an essentially anthropomorphic account of human time in terms of narrative, while the second offering [sic], as we have already seen, a philosophical description of temporality which omits the distinctiveness of the human or existential dimension. The crucial attack on narrative semiotics which lies at the heart of this essentially traditionalist project of Ricoeur is aimed explicitly at what he considers to be semiotics' wilful and perverse substitution of abstract categories for anthropomorphic ones. [Valences, 488]

Jameson adds ‘we must leave aside the obvious retort that Augustine does not resolve the aporia of objective versus subjective time either’ (he kind of does, though, doesn’t he? In the sense that ‘God’ can be posited as a resolution for every dilemma of this sort).
This is all good, of course; except that abstraction is itself an anthropomorphic category. It is unique to humans, after all; the projection of a particular form of humanocentric thought onto the cosmos as a whole (maths, philosophy, certain forms of religion). We could go further and say: it is precisely in the extent to which religions resist abstraction—the focus on the incarnated physicality of Christ for instance—that they manage, paradoxical though it may seem, to escape the merely anthropomorphic.
Saturday, 5 March 2011
A Theory of Beauty
I'm fascinated that all these faces are so conventionally beautiful: as if beauty itself is nothing more than an approximation to the ideal average ...
Friday, 4 March 2011
Something important is wrong with Paradise Lost
I don't mean the usual grounds of criticism; and I'm not trying for a deliberately low-brow grunting objection to Milton's Latinisms, or anything like that. I mean something more specific; something at its theological core. It has to do with the poem's deepest relationship to newness. This spins on from what I say here: the point of Christianity, in a deep sense -- the main thing it adds to the body of human religious thinking -- is its apotheosis of the novum. God created the world many thousands (or, as we would now say, billions) of years ago ... but he waited until 0BC/AD (or 5 BC; let's not split hairs) to incarnate himself into that world. Of course this is a little problematic: if God created the world and the people in it, why did he wait so long to come into it himself? We might imagine him creating the cosmos as a home for himself, in which case He would be there from the beginning; or we might imagine him (as some early Christians did; and many sects of contemporary Millenarian Christians do too) waiting until very near the end of things to come on stage, as a sort of climactic act. But that's not what we see in Christianity: we see a God happy for the world to trundle along, unredeemed, for thousands -- or billions -- of years; and then, at an otherwise unremarkable period in history, to insert himself into creation. Islam doesn't have this problem, for all that its most revered prophet appears even later in the historical narrative. I say this despite the fact that Islam specifically considers its faith complete, universal and primordial (as Milton did Christianity). This is because, although Mohammed has, of course, a special place in the traditions of Muslim religious observance, Islam is actually perfectly hospitable to other, earlier prophets (including Abraham, Moses and Jesus) who have all over time played their part in the gradual perfection of the revelation of God's purpose to the world.
This, then, is one of Milton's problems. He regards his own Christian faith as complete, universal and primordial; but at the same time he sets out to write a poem set thousands (billions) of years before the key event of his own faith. It's all the prelude to a prelude. But Milton can't have that; so he decides to violate one of the things -- the structure of time -- that is at the delicate heart of the beautifully narrative and temporal revelations of the New Testament. In Paradise Lost everything, from the Creation to the Crucifixion, happens all at once. Characters spend a goodly portion of the poem either retrospecting or proleptically anticipating key events; and Jesus is there from the get-go, a character in the poem, debating with God the Father in Book 3. Consecutivity is dissolved away, or forced through a scrap-metal-yard's car-crusher.
What does this mean? Well it flirts with absurdity, in a bad way: not only the temporal paradox issue of how we can have free will in a universe in which everything is divinely ordained and has (as it were) already happened -- scholars have taken immense pains in debating that thorny notion, in the poem and in the broader theological sense. I have no wish to stick my own hand into that metaphorical Gom Jabbar. No, I mean the way in which Milton ends up staging a debate between a father who does not temporally precede his son; the way the whole perspective of the larger cosmic narrative is crushed and crowded into a ludicrously small space. The way Book 3 exhibits, in manifold ways, a contempt for the very idea of newness as such.
Milton entombs the newness of Christianity in a sepulchral oldness (a primordialness, an ancientness) that kills it. But for Milton death-in-life is not the nightmare figure it was for Coleridge. Death is the gate back to Eden; the mode by which salvation and redemption is effected. Christ at the beginning of Paradise Lost -- at the beginning of the world, the beginning of the human story -- is already dead:
I offer, on mee let thine anger fall;
Account mee Man; I for his sake will leave
Thy bosom, and this glorie next to thee
Freely put off, and for him lastly die
Well pleas'd, on me let Death wreck all his rage;
Under his gloomy power I shall not long
Lie vanquisht; thou hast givn me to possess
Life in my self for ever, by thee I live,
Though now to Death I yield, and am his due
All that of me can die, yet that debt paid,
Thou wilt not leave me in the loathsom grave
His prey, nor suffer my unspotted Soule
For ever with corruption there to dwell;
But I shall rise Victorious, and subdue
My Vanquisher, spoild of his vanted spoile;
Death his deaths wound shall then receive, & stoop
Inglorious, of his mortall sting disarm'd. [III:237-53]

Christ dies right here ('now to Death I yield'); but does so only so as to turn himself into a kind of metadeath, a death that preys on death. This is styled according to the larger 'resurrection' logic of Christian belief, of course; but its effect here is rather a claustrophobic reduplication of death. And what is that, except the crystallisation of a larger aesthetic and theological project to de-bone the temporal specificity of the Incarnation, and make it something always already accomplished and ended? If, in the Areopagitica Milton sees no contradiction in simultaneously hymning books as newly and vitally alive and as dead and embalmed ('a good book is the precious lifeblood of a master spirit, embalmed and treasured up on purpose to a life beyond life') -- it is because his whole project wants to make no distinction, on a deep level, between being alive and being dead; or to put it more carefully, between being at the beginning of something and being at the end.
That's fair enough; but we can step outside Christian doctrinal squabbles far enough to say: it's not Christianity. If Christianity is a religion founded on the transcendental assertion 'everything is different now', then Paradise Lost is a text that sets itself at the opposite pole from that. It's a poem about the end of things, for all that it presents itself as set at the beginning. It is, indeed, a poem that rather splendidly fetishizes endings as such. The clue is in the title, I suppose.
Thursday, 3 March 2011
An Atheist's Apology for Christianity
I'm thinking of writing this (as you can tell from my last few posts on this blog); although I did wonder whether 'An Atheist's Defence of Christianity' mightn't be a better title.
One key thing is to make clear that this isn't an exercise in a mode of lost-faith nostalgia, or sepia-tinted affection for the charming architecture and rites of (for instance) the Church of England. As it happens, I am neither nostalgic nor especially affectionate for that institution. My mother's father was a Vicar, but my mother was herself an atheist from an early age, and my father lost his evangelical faith in his teens. I was raised, non-dogmatically but effectively, in disbelief, which leaves me with nothing (personally speaking) about which to be nostalgic, and no especial emotional connection with the community of Anglicans.
So what would the point be? It would in part be about what has become a pretty polarised and aggressive 'pro-faith' vs 'Dawkinsian anti-faith' debate. The heat, here, is of a similar temperature to (although more civilised mores, or more social inertia, has resulted in a lower body count than) the old debates on doctrine, in which two groups who shared not only religious faith but belief in the same God and saviour, the same Holy Book and many other points of identity, nevertheless made war upon one another in the name of suppressing heresy and slaughtered each other by the thousand. I have a heresy of my own: it is that atheism is the only non-blasphemous Christianity. I'll need more than a blog post to argue it, though.
Of course, a Christian might say: the last person qualified to "apologise" for Christianity is an atheist! You're further removed from the core of what Christianity entails -- belief in God, a soul, the divinity of Christ, the incarnation, substitutionary atonement and so on. Or else a Christian might say, in less hostile mode: but why write An Atheist's Apology for Christianity? Why not just ... become a Christian! Both objections touch on a key point.
In a nutshell, the argument would be that Christianity is radically concerned with the excluded, the un-chosen people, the scums and bums; and that once it becomes the dominant religion on the planet, the key category of excluded becomes 'the unbelievers'. So only an atheist can truly be a Christian. In appropriately paradoxical mode. This may come over merely as glib; but I hope not. It is (or it seems to me) an observation both profound and important.
On balance I think 'An Atheist's Defence of Christianity' isn't as good a title as 'An Atheist's Apology for Christianity'. 'Apologia' in Latin means 'defence' or 'apology' but 'apologo' means 'reject, spurn'.
Wednesday, 2 March 2011
On Pascal's Wager
Two problems here. One is that the famous wager invites each of us to make a bet with monopoly money that, once made, saddles us with immediate real-money (as it were) debts in the world, and in our day-to-day lives. But the other is precisely the currency (hah!) of the metaphor, once we unpack it, of a 'wager'. One wagers wealth; but Christianity is the religion of the poor. There's a mismatch there which is more than just rhetorical.
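For clarity, the decision-theoretic skeleton of the wager being objected to here (my paraphrase of the standard textbook rendering, not Pascal's own notation): for any non-zero credence p that God exists,

$$\mathbb{E}[\text{believe}] = p \cdot \infty + (1 - p)(-c) = \infty \;>\; \mathbb{E}[\text{don't believe}] = \text{finite},$$

where c is the finite worldly cost of belief. The post's complaint lands exactly on that c: the 'stake' is not notional at all, and a religion of the poor is an odd thing to model as a rich man's bet.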
Tuesday, 1 March 2011
Illativity again
A conversation with a friend sent me back to thoughts on Newman's Grammar of Assent (about which I had previously blogged, here). On balance, I think the 'Black Adam', in that post, was a cheap shot (right, but cheap), and I've been thinking again about why I find Newman's 'Illative sense' so wrongheaded -- even I have to concede that it is a heartfelt and genuine attempt by a man of deep faith to square the circle of one of the 19th-century's Big Problems: the increasing chasm between scientific truths and the revealed truths of religion. Thinking again, I've come to the conclusion that my disagreement with Newman has to do with the sense that the Illative sense necessarily plays into that with which we are familiar and that with which we are comfortable. If it's something we're used to, we're more likely 'Illatively' to feel it to be right. It's not just that I mistrust that, personally (although I do); it's that it seems to me radically to contradict the whole point of Christianity -- which before it is anything else is a faith of 'Everything is Different Now', concerning which I have also blogged.
To pick a not-uncontentious example: homosexuality. Many people feel sure, 'illatively', that this is wrong, a sin, an affront to God. They don't actively persecute homosexuals, and they can't justify their animadversion rationally; but they don't need to -- the Illative sense closes the gap between the way the world is and their 'assent' in the prevalent, longstanding discourses of homophobia. Those people are incorrect; but so long as the feedback loop of 'this is what I'm used to' - 'this is what I'm comfortable with' powers their Illative instincts on this matter, they're not going to realise they're wrong, and a great injustice, and enormous human misery, will continue to be propagated.