Maybe I should be watching more TV but I honestly had no idea what Kate Middleton looked like. There must have been times I would have heard references to her and thought she was some kind of actress or model. And frankly, I still don’t care.
But I do care about her dress. I am quite naive in the way of things (despite my interest in the ethnography of the everyday), so the fact that it may matter what the future queen will be wearing on her wedding day was not obvious to me. And matter more to many than whether a former prime minister or two were invited. I heard about it on the Today programme this morning and realised that I had been pretty silly not to have thought about it. And then all was clear when I got a text from a female friend who professed that the dress was the main thing of import.
And there’s really nothing all that wrong with that. The fate of two luminary has-beens is really of no more ritual import (probably quite a bit less) than the dress of one of the key participants in the nuptial proceedings. There are all sorts of legitimate concerns such as the allusions to fashions and traditions, the nationality and status of the designer, the possibility of a radical statement being made, and so on. Dresses obviously matter. And the fact that they matter more to women than men should make them no less important.
But then I saw it. It was by accident on a page talking about online streaming records (the sort of thing I take an interest in of an afternoon). And I remembered how much I hate female dresses (almost as much as women’s shoes). First, I hate them because they look uncomfortable (but then a certain amount of discomfort is to be expected during a ritual). But more importantly, I hate them because of the actual ritual symbolism. Ms Middleton was standing next to Mr Windsor, who was clad in some sort of military get-up as befits the protector of a nation. And she was smiling, presumably to indicate some sort of emotional attachment to the man (whether a genuine affection or a public performance or a mixture of both is of no relevance to me). However, whereas his dress spoke of conquest and the spoils of war, hers was all captured fecundity. She was the symbol putting all of the women whose gaze was firmly fixed on every frill and curve in their procreative and caring place. The Royal family had captured the youth and fertility of the nation in its net and was displaying its prize.
I probably would have hated any wedding dress but this one seemed to me particularly insidious in the understated manner with which it suggested tradition and a sexual act of possession being performed right before the adoring eyes. Why not break with tradition and wear Clintonesque trousers? You can allude to the past with a frilly hat if absolutely necessary! OK, perhaps here I am imputing my revolutionary desire onto a woman who probably quite enjoys the opportunity to play the role of a traditional princess. And I don’t believe there’s anything wrong with women playing traditional roles if that’s the sort of game they either enjoy or from which they can derive power. Also, since I don’t speak the language of dresses, it is entirely possible that I misunderstood the symbolism of this one. Maybe it is one merging modernity with the past in a way I’m not able to understand. But even then I believe my general point stands: we need new rituals of formality for new times, not reliant on gender roles just for the sake of them. And since the roles of men and women in marriage have changed, shouldn’t the way in which they are consecrated change also?
All dress is a mask in the performance of one role or another. But it is important that, like any actor, a woman can step away from the role she is playing. There’s lots of evidence that a burqa can be an empowering statement fraught with social and political as well as gender classifications. Only it is too easy a mask to weld onto a woman’s face. Just as much as a dress stressing sexual availability while the adjacent man’s demands executive power. There is nothing in our society that justifies that division of power, so why not try to invent new rituals that promote the kind of public equality we want to see?
I have nothing against men and women signalling their sexual identity through dress in the appropriate situations like clubs and pubs. But I think at formal public functions there’s no need to promulgate this stereotype. Dapper still indicates easy prosperity while resplendent or ravishing implies sexual possession. Why not have men accentuate their genitalia (as they do in many cultures) in these situations rather than pretend they’re there just to grab a woman for a bit of procreation and get back to their busy lives managing trade in the East Indies?
I started Metaphor Hacker to question the easy over-interpretation of metaphor as a trope of inevitable consequence. It is of course possible for a woman to code switch and perform an important executive role while dressed in a conventional female manner. The historical interpretation of the female formal dress need not apply today (as it may not have straightforwardly applied in the past). What I’m asking for here is a political act relying on that historical interpretation and saying ‘no more’ or at least ‘only if we really feel like it’.
However, one feature of human communication I’m particularly keen to discuss more is the notion of an inventory of patterns of expression. So although the conceptual underpinnings of the ritual may not be straightforward, if the constant unvaried performance is all that there is, women may not be able to reach into the inventory for an effortless alternative. So while dresses are something that women talk about and should not be thought of as being of lesser value, it is all too easy to see them that way. Women should not be apologetic for this and I’m sure they’re not. But we all need to be careful not to go back to the time of Odysseus when men went to war and returned to slay suitors of the women who stayed behind to weave and keep chaste!
Withered wisteria, old tree, darkling crows –
Little bridge over flowing water by someone’s house –
Emaciated horse on an ancient road in the western wind –
Evening sun setting in the west –
Broken-hearted man on the horizon.
And indeed, he is right. The poem exudes timelessness (if a lack of something can be exuded). It is more difficult for some languages than others to avoid certain grammatical commitments (like gender or number) which makes translation more difficult but there’s always a way.
What struck me about the poem was something different. It is so rich in imagery and yet so poor in figurative language. This is by no means unique but worth a note. In fact, there is no figurative language there at all if we discount such foundational figures as “sun setting in the West”, “broken-hearted” or “man on the horizon”. Indeed, had I not known this was a Chinese poem, I could have easily believed it was a description of a 17th-century Dutch master’s painting or even something by Constable.
But of course, the conceptual work we’re doing while reading this poem is not that different from the work we would do if it were full of metaphor. I’m working on a post on how adjectives and predicates are really very similar to metaphors and this is one example that illustrates the point. In order to appreciate this poem, we have to construct a series of fairly rich images and then we have to blend them with each other to place them in the same scene. We have to interpret “the broken-hearted man on the horizon”: is it just another image, the poet or ourselves? In other words, we have to map from the images presented by the poem to the images available to us from our experience. Which, in short, is the same work we have to do when interpreting metaphors and similes. But the title is the clincher: “Autumn Thoughts” – what if the whole poem is a metaphor and the elements in it just figures signifying age, loneliness, the passage of time and so on and so forth? There are simply too many mappings to make. And there the escape from metaphor ends.
I am an old atheist and a new agnostic. I don’t believe in God in the old-fashioned Russellian way – if I don’t believe in Krishna, Zeus, water sprites or the little teapot orbiting the Sun, I don’t believe in God and the associated supernatural phenomena (monotheism my foot!). However, I am agnostic about nearly everything else, and everything else in the new atheist way is pretty much science and reason. If history is any judge (and it is), most of what we believe to be scientific truths today is bunk. This makes me feel not at all superior to people of faith. Sure, I think what they believe is a stupid and irrational thing to believe, but I don’t think they are stupid or irrational people for believing it. The smartest people believe the most preposterous things; just look at Newton, Chomsky or Dawkins.
But one thing I’m pretty certain about is religion. Or rather, I’m pretty certain it does not exist. It is in many ways an invention of the Enlightenment and, just like equality and brotherhood, it only makes sense until you see the first person winding up the guillotine. Religion only makes sense if you want to set a certain set of beliefs and practices aside, most importantly to deprive their holders of power and legitimacy.
But is it a useful concept for deliberation about human universals? I think on balance it is not. Religion is a collection of stated beliefs, internal beliefs and public and private practices. In other words, religion is a way of life for a community of people. Or to be etymological about it, it is what binds the community together. The nature of the content of those beliefs is entirely irrelevant to the true human universal: a shared collection of beliefs and practices develops over a short amount of time inside any group of people. And when I say beliefs, I mean all explicit and implicit knowledge and applied cognition.
In this sense, modern secular humanism is just as much a religion as rabid evangelicalism.
On the mundane nature of sacred thought
So, why, the scientist asks, do all cultures develop knowledge systems that include belief in the supernatural? That’s because they don’t. For instance, as Geertz so beautifully described in his reinterpretation of the Azande, witchcraft isn’t supernatural. It is the natural explanation after everything else has failed. We keep forgetting that until Einstein, everybody believed in this (as Descartes pointed out) supernatural force called gravity that could somehow magically transmit motion across vast distances. And now (as Angel and Demetis point out) we believe in magical sheets that make gravity all nice and natural. Or maybe strings? Give me a break!
What about the distinction between the sacred and the mundane, you ask? Well, that obviously exists, including the liminality between them. But sacred/mundane is not limited to anything supernatural and magical – just look at the US treatment of the flag or citizenship. In fact, even the most profoundly sacred and mystical has a significant mundane dimension necessitated by its logistics.
There are no universals of faith. But there are some strong tendencies among the world’s cultures: Ancestor worship, belief in superhuman and non-human (often invisible, sometimes disembodied) agents, sympathetic magic and ritual (which includes belief in empowered and/or sapient substances and objects). This is combined with preserving and placating personal and collective practices.
All of the above describes western atheists as much as the witchcraft-believing Azande. We just define the natural differently. Our beliefs in the power of various pills and the public professions of faith in the reality of evolution or the transformative nature of the market fall under the above just as nicely as the rain dance. Sure, I’d much rather go to a surgeon with an inflamed appendix than a witch doctor, but I’d also much rather go to a renowned witch doctor than an unknown one if that was my only choice. Medicine is simply witchcraft with better peer review.
They pretty much put to rest some of the evolutionary notions and the innateness of mind/body dualism. I particularly like the proposition Helene de Cruz made building on Pascal’s remark that some people “seem so made that [they] cannot believe”. “For those people” continues de Cruz, “religious belief requires a constant cognitive effort.”
I think this is a profound statement. I see it as being in line with my thesis of frame negotiation. Some things require more cognitive effort for some people than other things for other people. It doesn’t have to be religion. We know reading requires more cognitive effort for different people in different ways (dyslexics being one group with a particular profile of cognitive difficulties). So does counting, painting, hunting, driving cars, cutting things with knives, taking computers apart, etc. These things are susceptible to training and practice to different degrees with different people.
So it makes perfect sense on the available evidence that different people require different levels of cognitive effort to maintain belief in what is axiomatic for others.
In the comments Mitch Hodge contributed a question to “researchers who propose that mind-body dualism undergirds representations of supernatural entities: What do you do with all of the anthropological evidence that humans represent most all supernatural entities as embodied? How do disembodied beings eat, wear clothes, physically interact with the living and each other?”
This is really important. Before you can talk about content of belief, you need to carefully examine all its aspects. And as I tried to argue above, starting with religion as a category already leads us down certain paths of argumentation that are less than telos-neutral.
But the answer to the “are humans natural mind-body dualists” does not have to be to choose one over the other. I suggest an alternative answer:
Humans are natural schematicists and schema negotiators
What does that mean? Last year, I gave a talk (in Czech) on the “Schematicity and undetermination as two forgotten processes in mind and language”. In it I argue that operating on schematic or in other ways underdetermined concepts is not only possible but it is built into the very fabric of cognition and language. It is extremely common for people to hold incomplete images (Lakoff’s pizza example was the one that set me on this path of thinking) of things in their mind. For instance, on slide 12 of the presentation below, I show different images that Czechs submitted into a competition run online by a national newspaper on “what does baby Jesus look like” (Note: In Czech, it is baby Jesus – or Ježíšek – who delivers the presents on Christmas Eve). The images ran from an angelic adult and a real baby to an outline of the baby in the light to just a light.
This shows that people not only hold underdetermined images but that those images are determined to varying degrees (in my little private poll, I came across people who imagined Ježíšek as an old bearded man, and personally, I did not explicitly associate the diminutive ježíšek with the baby Jesus until I had to translate it into English). Discussions like those around the Trinity or the embodied nature of key deities are the results of conversations about which parts of a shared schema it is acceptable to fill out and how to fill them out.
It is basically metaphor (or as I call it frame) negotiation. Early Christianity was full of these debates and it is not surprising that it wasn’t always the most cognitively parsimonious image that won out.
It is further important that humans have various explicit and implicit strategies to deal with infelicitous schematicity or schema clashes, one of which is to defer parts of their cognition to a collectively recognised authority. I spent years of my youth believing that although the Trinity made no sense to me, there were people to whom it did make sense and to whom, as guardians of sense, I would defer my own imperfect cognition. But any study of the fights over the nature of the Trinity is a perfect illustration of how people negotiate over their imagery. And as in any negotiation, it is not just the power of the argument but also the power of the arguer that determines the outcome.
Christianity is not special here in any regard but it does provide two millennia of documented negotiation of mappings between domains full of schemas and rich images. It starts with St Paul’s denial that circumcision is a necessary condition of being a Christian and goes on into the conceptual contortions surrounding the Trinity debates. Early Christian eschatology also had to constantly renegotiate its foundations as the world stubbornly refused to end, and was in that no different from modern eschatology – be it religion or science based. Reformation movements (from monasticism to Luther or Calvin) also exhibit this profound contrasting of imagery and exploration of mappings, rejecting some, accepting others, ignoring most.
All of these activities lead to paradoxes and thus to the spurring of heretical and reform movements. Waldensians or Lutherans or Hussites all arrived at their disagreement with the dogma through painstaking analysis of the imagery contained in the text. Arianism was in its time the “thinking man’s” Christianity, because it made a lot more sense than the Nicene consensus. No wonder it experienced a post-reformation resurgence. But the problems it exposed were equally serious and it was ultimately rejected for probably good reason.
How is it possible that the Nicene consensus held so long as the mainstream interpretation? Surely, Luther could not have been the first to notice the discrepancies between liturgy and scripture. Two reasons: the inventory of expression and the underdetermination of conceptual representations.
I will deal with the idea of inventory in a separate post. Briefly, it is based on the idea of cognitive grammar that language is not a system but rather a patterned inventory of symbolic units. This inventory is neither static nor does it have clear boundaries, but it functions to constrain what is available for both speech and imagination. Because of the nature of symbolic units and their relationships, the inventory (a usage-based beast) is what constrains our ability to say certain things although they are possible by pure grammatical or conceptual manipulation. By the same token, the inventory makes it possible to say things that make no demonstrable sense.
Frame (or metaphor) negotiation operates on the inventory but also has to battle against its constraints. The units in the inventory range in their schematicity and determination but they are all schematic and underdetermined to some degree. Most of the time this aids effortless conceptual integration. However, a significant proportion of the time, particularly for some speakers, the conceptual integration hits a snag. A part of a schematic concept usually left underdetermined is filled out and it prevents easy integration and an appropriate mapping needs to be negotiated.
For example, it is possible to say that Jesus is God and Jesus is the Son of God even in the same sentence, and as long as we don’t project the offspring mapping on the identity mapping, we don’t have a problem. People do these things all the time. We say things like “taking a human life is the ultimate decision” and “collateral damage must be expected in war” and abhor people calling soldiers “murderers”. But the alternative to “justified war”, namely “war is murder”, is just as easy to sanction given the available imagery. So people have a choice.
But as soon as we flesh out the imagery of “X is son of Y” and “X is Y”, we see that something is wrong. This in no way matches our experience of what is possible. Ex definitio, “X is son of Y” OR “X is Y”. Not AND. So we need to do other things to make the nature of “X is Y” compatible with “X is the son of Y”. And we can either do this by attributing a special nature to one or both of the statements. Or we can acknowledge the problem and defer knowledge of the nature to a higher authority. This is something we do all the time anyway.
So to bring the discussion to the nature of embodiment, there is no difficulty for a single person or a culture to maintain that some special being is disembodied and yet can perform many embodied functions (like eating). My favorite joke, told to me by a devout Catholic, begins: “The Holy Trinity are sitting around a table talking about where they’re going to go for their vacation…” Neither my friend nor I assumed that the Trinity is in any way an embodied entity, but it was nevertheless very easy for us to talk about its members as embodied beings. Another Catholic joke:
A sausage goes to Heaven. St Peter is busy so he sends Mary to answer the Pearly Gates. When she comes back he asks: “Who was it?” She responds: “I don’t know, but it sure looked like the Holy Ghost.”
Surely a more embodied joke is very difficult to imagine. But it just illustrates the availability of rich imagery to fill out schemas in a way that forces us to have two incompatible images in our heads at the same time. A square circle, of sorts.
There is nothing sophisticated about this. Any society is going to have members who are more likely to explore the possibilities of integration of items within its conceptual inventory. In some cases, it will get them ostracised. In most cases, it will just be filed away as an idiosyncratic vision that makes a lot of sense (but is not worth acting on). That’s why people don’t organize their lives around the dictums of stand-up comedians. What they say often “makes perfect sense” but this sense can be filed away into the liminal space of our brain where it does not interfere with what makes sense in the mundane or the sacred context of conceptual integration. And in a few special cases, this sort of behavior will start new movements and faiths.
These “special” individuals are probably present in quite a large number in any group. They’re the people who like puns or the ones who correct everyone’s grammar. But no matter how committed they are to exploring the imagery of a particular area (content of faith, moral philosophy, use of mobile phones or genetic engineering) they will never be able to rid it of its schematicity and indeterminacies. They will simply flesh out some schemas and strip off the flesh of others. As Kuhn said, a scientific revolution is notable not just for the new it brings but also for all the old it ignores. And not all of the new will be good and not all of the old will be bad.
Not that I’m all that interested in the origins of language, but my claim is that the negotiation of the mappings between underdetermined schemas is at the very foundation of language and thought. And as such it must have been present from the very beginning of language – it may have even predated language. “Religious” thought and practice must have emerged very quickly; as soon as one established category came into contact with another category. The first statement of identity or similarity was probably quite shortly followed by “well, X is only Y, in as much as Z” (expressed in grunts, of course). And since bodies are so central to our thought, it is not surprising that imagery of our bodies doing special things or of us not having a body and yet preserving our identity crops up pretty much everywhere. Hypothesizing some sort of innate mind-body dualism is taking an awfully big hammer to a medium-sized nail. And looking for an evolutionary advantage in it is little more than the telling of campfire stories of heroic deeds.
To look for an evolutionary foundation of religious belief is little more sophisticated than arguing about the nature of virgin birth. If nothing else, the fervor of its proponents should be highly troubling. How important is it that we fill in all the gaps left over by neo-Darwinism? There is nothing special about believing in Ghosts or Witches. It is an epiphenomenon of our embodied and socialised thought. Sure, it’s probably worth studying the brains of mushroom-taking mystical groups. But not as a religious phenomenon. Just as something that people do. No more special than keeping a blog. Like this.
Post Script on Liminality [UPDATE a year or so later]
Cris Campbell on his Genealogy of Religion Blog convinced me with the aid of some useful references that we probably need to take the natural/supernatural distinction a bit more seriously than I did above. I still don’t agree it’s as central as is often claimed but I agree that it cannot be reduced to the sacred v. mundane as I tried above. So instead I proposed the distinction between liminal and metaliminal in a comment on the blog. Here’s a slightly edited version (which may or may not become its own post):
I read with interest Hultkranz’s suggestion for an empirical basis for the concept of the supernatural but I think there are still problems with this view. I don’t see the warrant for the leap from “all religions contain some concept of the supernatural” to “supernatural forms the basis of religion”. Humans need a way to talk about the experienced and the adduced and this will very ‘naturally’ take the form of “supernatural” (I’m aware of McKinnon’s dissatisfaction with calling this non-empirical).
On this account, science itself is belief in the supernatural – i.e. postulating invisible agents outside our direct experience. And in particular speculative cognitive science and neuroscience have to make giant leaps of faith from their evidence to interpretation. What are the chances that much of what we consider to be givens today will in the future be regarded as little more sophisticated than phrenology? But even if we are more charitable to science and place its cognition outside the sphere of that of a conscientious sympathetic magician, the use of science in popular discourse is certainly no different from the use of supernatural beliefs. There’s nothing new here. Let’s just take the leap from the science of electricity to Frankenstein’s monster. Modern public treatments of genetics and neuroscience are essentially magical. I remember a conversation with an otherwise educated philosophy PhD student who was recoiling in horror from genetic modification of fruit (using fish genes to do something to oranges) as unnatural – or monstrous. Plus we have stories of special states of cognition (absent-minded professors, en-tranced scientists, rigour of study) and ritual gnostic purification (referencing, peer review). The strict naturalist prescriptions of modern science and science education are really not that different from “thou shalt have no other gods before me.”
I am giving these examples partly as an antidote to the hidden normativity in the term ‘supernatural’ (I believe it is possible to mean it non-normatively but it’s not possible for it not to be understood that way by many) but also as an example of why this distinction is not one that relates to religion as opposed to general human existence.
However, I think Hultkranz’s objection to a complete removal of the dichotomy by people like Durkheim and Hymes is a valid one, as is his claim of the impossibility of reducing it to the sacred/profane distinction. Nevertheless, I’d like to propose a different label and consequently framing for it: meta-liminal. By “meta-liminal” I mean beyond the boundaries of daily experience and ethics (a subtle but to me important difference from non-empirical). The boundaries are revealed to us in liminal spaces and times (as outlined by Turner) and what is beyond them can be behaviours (Greek gods), beings (leprechauns), values (Platonic ideals) or modes of existence (land of the dead). But most importantly, we gain access to them through liminal rituals where we stand with one foot on this side of the boundary and with the other on the “other” side. Or rather, we temporarily blur and expand the boundaries and can be in both places at once. (Or possibly both.) This, however, I would claim is a discursively psychological construct and not a cognitively psychological construct. We can study the neural correlates of the various liminal rituals (some of which can be incredibly mundane – like wearing a pin) but searching for a single neural or evolutionary foundation would be pointless.
Nemeroff and Rozin’s definition of “the supernatural” as that which “generally does not make sense in terms of the contemporary understanding of science” sums up the deficiency of the normative or crypto-normative use of “supernatural”. But even the strictly non-normative use suffers from it.
What I’m trying to say is that not only is religious cognition not a special kind of cognition (in common with MacKendrick), but neither is any other type of cognition (no matter how Popperian its supposed heuristics). The different states of transcendence associated with religious knowing (gnosis), ranging from a vague sense of fear, comfort or awe to a dance- or mushroom-induced trance, are not examples of a special type of cognition. They are universal psychosomatic phenomena that are frequently discursively constructed as having an association with the liminal and meta-liminal. But can we postulate an evolutionary inevitability that connects a new-age whackjob who proclaims that there is something “bigger than us” to a sophisticated theologian to Neil DeGrasse Tyson to a jobbing shaman or priest to a simple client of a religious service? Isn’t it better to talk of cultural opportunism that connects liminal emotional states to socially constructed liminal spaces? Long live the spandrel!
This is not a post-modernist view. I’d say it’s a profoundly empirical one. There are real things that can be said (provided we are aware of the limitations of the medium of speech). And I leave open the possibility that within science, there is a different kind of knowledge (that was, after all, my starting point, I was converted to my stance by empirical evidence so I am willing to respond to more).
About 10 years ago, I was looking through a book on population changes in the Czech lands. It consisted of pretty much just tables of data with little commentary. But I was shocked when I came across the life expectancy charts. Not at how short people’s lives had been, but at how long. The headline figure of life expectancy in the early 1800s was pretty much on par with expectations (I don’t have the book to hand but it was in the high 30s or low 40s). How sad, I thought. So many people died in their 40s before they could experience life in full. But unlike most of the comparisons reporting life expectancy, this one went beyond the overall average. And it was the additional figures that shocked me. Turns out the extremely short life expectancy only applies right at birth. Once you make it to 10, you have a pretty good chance to make it into your late 50s, and at 30, your chances of getting your ‘threescore and ten’ were getting pretty good. The problem is that life expectancy rates at birth only really measure child mortality, not the typical lives of adults. You can see from this chart: http://www.infoplease.com/ipa/A0005140.html that in 1850, the US life expectancy at birth was a shocking 38 years. But that does not mean that there were a lot of 38-year-olds around dying. Because if you made it to 10, your life expectancy was 58 years, and at 30, it was 64 years. Now these are average numbers, so it is possible that for any age cohort, exactly half the people died at the start of it and exactly half died at the end of it. But that was not the case after a certain age. Remember, a population where exactly half the people born die at or near birth (age 0) and exactly half live to be 60 will have an average life expectancy of 30. If you reduce child mortality to 10%, you will have an average life expectancy of 54. If you reduce it to 1%, the average life expectancy will be 59.4 years. Most people still die at sixty but very few die at 1.
Massive gains in reducing child mortality make no difference at all to the age at which adults actually die.
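The arithmetic behind those figures is simple enough to sketch in a few lines of Python. This is purely a toy model of the hypothetical population described above (everyone either dies at birth or lives to 60), not anything taken from the demographic tables themselves:

```python
# Toy cohort model: a fraction of the cohort dies at birth (age 0),
# everyone else dies at the same adult age. Life expectancy at birth
# is just the mortality-weighted average of the two ages at death.

def life_expectancy(child_mortality, adult_age_at_death=60):
    """Average age at death for the two-outcome cohort."""
    return (child_mortality * 0
            + (1 - child_mortality) * adult_age_at_death)

for cm in (0.5, 0.1, 0.01):
    print(f"child mortality {cm:>4.0%}: "
          f"life expectancy at birth = {life_expectancy(cm):.1f}")
# child mortality  50%: life expectancy at birth = 30.0
# child mortality  10%: life expectancy at birth = 54.0
# child mortality   1%: life expectancy at birth = 59.4
```

Note that the adult age at death (60) never changes across the three runs; only the weight given to deaths at age 0 does, yet the headline number doubles.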
In reality, as the US charts show, life expectancy at birth doubled, but life expectancy at 10 went up by only about a third. That’s still a significant gain, but it shows a much different life span profile than the headline figure would have us believe. It was not unusual to live into the late 50s and early 60s. And a large enough number of people still lived into their 70s and 80s. There are exceptions, of course: during famines, epidemics and wars, and for certain groups in society, life spans were significantly shorter (notice the life expectancy of whites vs. non-whites in the US). But for most populations throughout history, the most common age of death for any given person born was before the age of 10, not in their 30s.
I don’t understand why this is not commonly known. Even many historians (particularly the ones who write popular history) either don’t know this or are unwilling to disturb their narrative of how young people died in the past (in other words, they lie). I certainly was not taught this during my brief (long-ago) studies of ancient and medieval history.
What brought all this to mind is a most disturbing example of exactly this in the just-published book Civilization by the prominent public historian Niall Ferguson. In the preface he quotes a poem about death and suffering by John Donne and comments on it:
“Everyone should read these lines who wants to understand better the human condition in the days when life expectancy was less than half what it is today.”
To say I was aghast is an understatement. I nearly threw my Kindle against the wall. Here is this public intellectual, a historian who goes around preaching about how important it is to understand history, and yet he peddles this sort of nonsense. If he had said “days with high child mortality and a shorter typical life span”, I’d have no problem with it. But he didn’t, and didn’t even hint that that’s what he meant.
He then goes on blathering about how awful it is that all these historical luminaries died so young (Spinoza at 44, Pascal at 39), saying:
“Who knows what other great works these geniuses might have brought forth had they been granted the lifespans enjoyed by, for example the great humanists Erasmus (69) and Montaigne (59)?”
Come on! Bringing forth great works! Really?!? Pathos much? He then goes on comparing Brahms (died old, disliked by Ferguson) and Schubert (died young, liked by Ferguson). So much for academic distance. Why on earth would Ferguson think that listing artists who died young means anything? Did he never hear of Jimi Hendrix or Kurt Cobain?
But more importantly, he doesn’t seem to notice his own counterexamples. Erasmus died almost a hundred years before Spinoza was born. What does that tell us about life expectancy and historical periods?
Or what does any of that have to do with how much people might have contributed had they lived longer? I don’t think longevity can serve as a measure of intellectual or cultural productivity. Can we compare Plato (80) and Aristotle (60)? It seems to me that Aristotle produced far more, and more varied, work than Plato with 20 fewer years to do it in. Aquinas (49) was no less prolific than St Augustine (75). Is it really possible to judge that the impact of the inventive John L Austin (who died at 49, in the 20th century!) is any less than that of the tireless and multitalented Russell, who lived pretty much forever (97)?
But there are still more counterexamples. Let’s look at the list of longest-reigning monarchs. The leader of that board is a 6th-dynasty pharaoh who arguably ascended to the throne as a child but still managed to live to a hundred (in 2200 BC!). And most other long-lived monarchs were born during times when life expectancy was about half of what it is now. Sure, they were privileged, and they are relatively rare. And there were a lot of other rulers who went in their 50s and 60s. But not typically in their 40s! Maybe there is already a study out there that measures the longevity of kings in relation to their time, but I doubt a straightforward correlation can be found.
Finally, I can match Ferguson poem by poem. From the ancient:
Our days may come to seventy years,
or eighty, if our strength endures;
yet the best of them are but trouble and sorrow,
for they quickly pass, and we fly away.
to the modern:
Sadness is all I’ve ever known.
Inside my wretched body it’s grown.
It has eaten me away, to what the fuck I am today.
There’s nothing left for me to say.
There’s nothing there for me to hate.
There’s no feelings, and there’s no thoughts.
My body’s left to fucking rot.
Life sucks, life sucks, life sucks, life sucks. http://www.plyrics.com/lyrics/nocash/lifesucks.html
Clearly all that medicine made less of an impact on our experience of life than Ferguson thinks.
Perhaps I shouldn’t get so exercised about a bit of rhetorical flourish in one of many books of historical cosmogony and eschatology. But I’m really more disappointed than angry. I was hoping this book might have some interesting ideas in it (although I came to it with a skeptical mind), but I’m not sure I can trust the author not to surrender the independence of his frame of mind and bend the facts to suit his pet notions.
A very common metaphor in the political discourse on war is that of doves (peaceniks) and hawks (war-mongers). It has been around at least since the Cold War. But it stops at “doves=peaceful” and “hawks=aggressive”. It completely ignores other properties of the animals, e.g. the fact that hawks hunt and kill doves. I did a lot of searching and could not find any such extensions of the metaphor.
This is relatively unusual. Most metaphors in public discourse get dissected every which way. Look at “Iraq=Vietnam”, “Saddam=Hitler”, “Current financial crisis=X other financial crisis”. All of these were dissected and very inventively reassembled mapping by mapping. E.g. if Saddam is Hitler, then we can choose between being France or Britain. If Iraq is Vietnam, then we must count the wounded as dead and must look for media swaying public opinion. Etc., etc. We might argue that doves and hawks are relatively dead metaphors, but Palin’s crosshairs got dissected in minute detail, and “having somebody in your crosshairs” no more implies that you’re going to shoot them than being a dove implies you have wings.
So why didn’t anybody in the contentious debate over the Iraq war come up with this obvious extension of the “doves/hawks=democrats/republicans” metaphor? I would expect at least something along the lines of “the hawks went fiercely after the doves and tore them to shreds in the elections”. But there’s nothing. So, what gives?