Epistolary Frost
Robert Frost c.1905
Book Details
Donald Sheehy, Mark Richardson and Robert Faggen, editors
THE LETTERS OF ROBERT FROST
Volume 1: 1886–1920
848pp. Belknap Press. £33.95 (US $45).
978 0 674 05760 9
Robert Frost's letters are lessons in poetry, personality, disclosure and holding back
DAVID BROMWICH
Robert Frost thought of himself as a poet at the age of twenty, but he had passed forty before he received the recognition that would allow him to write for a living. The Letters of Robert Frost: Volume I: 1886–1920 takes him to the age of forty-six and the publication of his third book, Mountain Interval. He was then already much in demand for readings and informal talks on poetry in general, and would eventually hold a position as a teacher of writing at Amherst College. Frost could not be called prosperous, and he still lacked the comfort of a safe profession, but by 1918 he had climbed out of the limbo of expectation described in the first lines of "The Investment":
Over back where they speak of life as staying
(You couldn't call it living, for it ain't) . . .
By the end of the First World War, Frost had ascended halfway to the eminence he would mark in the title of a later poem, "How Hard It Is to Keep from Being King When It's in You and in the Situation". The tone, however, of most of these letters is anything but kingly. Frost as a young man is working to convince others that he is a poet; and he does it partly in order to convince himself.
Frost dropped out of Dartmouth after one semester in 1892 and would never finish college, but he seems to have felt little regret. An early admirer and sympathetic correspondent was Susan Hayes Ward, an art critic who, as poetry editor of the New York Independent, published the first poem that Frost set much store by. His letters to Ward are open and impulsive, and full of thoughts. At the age of twenty, in 1894, he tells her who his favourite poets are. Thomas Hardy comes first, for teaching "the good use of a few words". There follows an idiosyncratic list: "Keats' 'Hyperion,' Shelley's 'Prometheus,' Tenneson's [sic] 'Morte D'Arthur,' and Browning's 'Saul' – all of them about giants. Besides these I am fond of the whole collection of Palgrave's". By 1896, he is ready to impart a more abstract discovery. Speech and the motive for speech are, he thinks, related but originally separate. The motive may be detected by interpreting the words of the speech. It is then the job of the poet to indicate (without himself following) the path on which a reader's interpretation might travel. Of course, this is also a discovery about the ways of irony: how words may frame a thought that cannot be captured by words; how words may suggest a "tone of meaning" prior to words themselves, and larger. Frost's early thoughts about poetry are indistinguishable from thoughts about himself; and they show an overwhelming interest in the possibility of fame. He is not thinking about other things, even things that should be important. "As nothing that happens matters much and as most of my thoughts are about myself I am always at a loss for likely subject matter. I am the father of a son if that is anything."
More than a decade later, he wrote to Ward from his farm in Derry: "I can't look at my little slope of fields here with leaves in the half dead grass, or at the bare trees the birds have left us with, and fully believe there were ever such things as the snug downhill churning room with the view over five ranges of mountains, our talks under the hanging lamp . . . . There is a pang that makes poetry. I rather like to gloat over it". This is already the voice of poems such as "October", from his first book, A Boy's Will:
O hushed October morning mild,
Begin the hours of this day slow.
Make the day seem to us less brief,
Hearts not averse to being beguiled,
Beguile us in the way you know.
What Frost learns to "gloat over" is the suppressed sigh, the conscious discipline against regret, the summons to a recovery of self-trust. He left New Hampshire for England in September 1912, under the accurate apprehension that his chances in poetry would be better there. Soon after taking up residence at The Bungalow, Beaconsfield, he writes to Ward: "To London town what is it but a run? Indeed when I leave writing this and go into the front yard for a last look at earth and sky before I go to sleep, I shall be able to see the not very distant lights of London flaring like a dreary dawn". Yet he pulls up short: "If there is any virtue in location – but don't think I think there is. I know where the poetry must come from if it comes".
These early letters to Susan Ward touch the note of sincerity by which one comes to measure Frost's attitude towards all his other correspondents. In a steady stream of letters to two younger men, John Bartlett and Sidney Cox, he strikes up terms of similar frankness, with results that are always interesting. Bartlett had been a student of Frost's in his early days teaching at the Pinkerton Academy; Cox, whom Frost came to know during his last year in America, would go on to teach English at Dartmouth. He always writes to them – being Bartlett's elder by eighteen years and Cox's by fifteen – as an understood authority. He dispenses with the tones of dogmatic instruction or officious advice. But still, advice there is and warmly meant, about life as well as art, about marriage and children and how to make writing compatible with living. Frost takes charge because he knows they already appreciate his command of the things they want to know. The closest friendship in his life, by his own account, was with Edward Thomas, but that was in a different category: an intimacy felt and sealed in person, in conversations during visits that must have had time for many long walks. And yet, Frost's letters to Thomas are rather casual and implicit; he counts on his friend to know what must be on his mind as he talks of nothing much. It is Bartlett and Cox who receive his subtlest and most thoughtful letters about poetry.
The doctrine of "sentence sounds" and "the sound of sense", which turns on the all-importance of the speaking voice, is first broached in a letter to Bartlett written in high spirits. It is headed (instead of The Bungalow) "The Bungs, Beaks, Bucks, Fourth-of-July, 1913", and sets up a written-out seminar with four sentences that can only be read rightly one way:
You mean to tell me you can't read?
I said no such thing.
Well read then.
You're not my teacher.
These sentences are audible as dramatic speech, but with a more delicate variety of emphasis than the average pair of exchanges in a produced play would normally have, or want to have. A detail not noticeable on first reading is that English and American inflections would give quite different turns to the first line: for an English speaker of English, read touches the highest note and takes the major stress; whereas for an American, the first You has that position, and tell and the second you assist with secondary accents. That there may be such occasional differences, between competent speakers of different origins, proves Frost's point all the more convincingly. A true poet must school the reader in the tone of meaning he wants the reader to hear.
A further development of the idea appears in a letter to Bartlett of February 22, 1914. Here, Frost separates out single lines for their expressive grace and individuality, and he pulls the following sentences, among others, into his mixture of specimens:
Never you say a thing like that to a man!
And such they are and such they will be found
Hence with denial vain and coy excuse
The thing for me to do is to get right out of here while I am able.
Two of these lines, Frost wants his listener to recognize, were written by poets whom you can find in Palgrave's Golden Treasury; the other two were plucked out of the stream of life's conversation by Frost. Yet all are poetry in the sense that chiefly counts with him. Or rather, they exemplify the never-to-be-spared elements from which poetry must arise.
"The simple declarative sentence", Frost came to think, "used in making a plain statement is one sound. But Lord love ye it mustn't be worked to death. It is against the law of nature that whole poems should be written in it. If they are written they won't be read . . . [But] it is the abstract vitality of our speech. It is pure sound – pure form. One who concerns himself with it more than the subject is an artist." The same idea is given a more complex formulation in a letter to John Cournos in July 1914. Frost now makes it clear that the variety of speaking sounds which the mind of the reader craves for the sake of the ear – or is it the other way around? – cannot be worked up without regard to an underlying regularity that continues. For the ground bass of recurrent sound is poetic metre. The artist who knows the tone of meaning, the "oversound" of speech, knows the grain of the expected metre and shows that awareness even when he delights to work against it. "There are", Frost writes to Cournos, on the one hand "the very regular accent and measure of blank verse; and there are the very irregular accent and measure of speaking intonation. I am never more pleased than when I can get these into strained relation. I like to drag and break the intonation across the meter as waves first comb and then break stumbling on the shingle."
Thomas, in his review of North of Boston, would select tactfully from "The Wood-Pile" to illustrate the workings of such "strained relation"; but Frost's practice seemed to other early readers, including some of the best of them, so remote from the common music of poetry that it might as well be prose. A review of North of Boston by Ford Madox Ford (in which Frost took considerable satisfaction) brought forward a bewildered admiration for the subject matter and the sound of Frost's poetry. "There are these natural objects and scenes – and always there is present the feeling of madness, of mysterious judgements, of weather-hardened odd people – people very uncouth and unlovely, but very real." Ford asked his readers not to be frightened by the "prosaic" quality of the verse, "so queer, so harsh, so unmusical", as he quoted with evident relish sixteen lines from "The Housekeeper", beginning with "I just don't see him living many years, / Left here with nothing but the furniture". Frost had a solid warrant for saying of the poems in North of Boston, in a letter to Thomas Mosher of July 1913: "I dropped to an everyday level of diction that even Wordsworth kept above. I trust I don't terrify you. I think I have made poetry".
In his two and a half years in England, which would end in February 1915 – the heyday of the Georgian poets and the first "years of the modern" – Frost formed a close and lasting friendship with Lascelles Abercrombie, and friendly alliances with Harold Monro and F. S. Flint. His letters of this period are self-conscious but never preening and not especially guarded; they track his gradual acceptance into the London milieu. Yeats, early on, has "asked me to make one of his circle at his Monday nights when he is in London" and "told my dazzling friend Ezra Pound that my book was the best thing that has come out of America for some time". Frost's opinion of Yeats would start high and gradually cool: "His candle-lit room is more like a shrine than a room; his manner is like that of a man in some dream he cant [sic] shake off. It is not a pose with him. He has to take himself that way". A year further in, there is a sharper judgment in a letter to Cox of September 1913: "I won't say that he is quite great judged either by the way he takes himself as an artist or by the work he has done . . . . Let him be as affected as he pleases if he will only write well. But you can't be affected and write entirely well".
Pound elicits a more pronounced initial show of interest followed by a steeper decline.
Ezra Pound, my fellow countryman, is one of the most describable of [the London circle of poets]. He is six inches taller for his hair and hides his lower jaw in a delicate gold filigree of almost masculine beard. His coat is of heavy black velvet. He lives in Grub Street, rich one day and poor the next. His friends are the duchesses. And he swears like a pirate and he writes what is known as vers libre and he translates from French, Provençal, Latin, and Italian. He and I have tried to be friends because he was one of the first to review me well, but we don't hit it off very well together. I get on better with fellows like [Wilfrid] Gibson who are less concerned to dress the part of a poet.
Even though favourably reviewed by Pound, Frost came to resent his attentions for two reasons. Pound, as was his wont, made an implicit claim to have discovered Frost and thereby to have saved him from the indignity of neglect by a philistine America. At the same time he presented Frost as a reminder of the fresh things still coming from the West – if not quite a modern in good standing, in any case a force the makers of the new should sympathize with. Frost heard a note of condescension probably deeper than Pound intended, but the treatment could certainly be read that way, and it seemed to place Frost in an untenable posture, superior to the country he had left but without a proper adoptive home. He also perceived the mischief of having his non-recognition by America turned into a case-in-point to prove the imbecility of American editors, critics and poets. He had always planned to go back, and suspected Pound had sown a native suspicion against him out of sheer irritability and excess of animus.
Much of the second half of this volume shows Frost occupied with arranging the reception of his poems in the US. Close friends are encouraged to circulate appreciative reviews and copies of his books. Amy Lowell writes a review that will advance his fame as much as any estimate published in the decade after his return; but Frost is uncertain what to make of this woman of letters, a poet of marked but limited originality and domineering habits. He never forgave (or quite recovered from) Lowell's observation that he lacked a sense of humour; and indeed, what she must have meant (and what he must have realized with a twinge) is elusive enough to make one pause and think. Frost did have a lively sense of "play", and it was play that he demanded of poetry, as the inborn trait that makes the larger virtues tenable. Play of mind, he meant, and play of temper – the quick alternation of moods. But his jokes are too packed with aggression to be counted as humour in the accepted sense. One thinks of the choice of recipes for ending the world in "Fire and Ice", of "God's last Put out the light" as the shorthand for apocalypse in "Once by the Pacific", and of the drawn-out anecdote of "The Code" about dumping a wagon-load of hay on a stiff manager who used his authority badly.
Occasional but palpable insincerities creep into the letters, starting in America in 1915. Joseph Warren Beach, a professor at the University of Minnesota, whom Frost liked well enough and cultivated short of friendship, was apparently wounded by Frost's criticism of some of his own poems. Frost writes back promptly: "if I didn't seem to like your poetry enough, I want to add that I liked it better than my own". Yet some way under, one can detect a real self-disgust that contributes to responses of this kind. His first two years back in America were hard: Frost was having difficulty paying for a new house and farm in Franconia, and he reproached himself for the failure in supporting his family by anything he put his hand to. Out of the same mood comes the stricture in a letter to Louis Untermeyer of July 8, 1915: "All I insist on is that nothing is quite honest that is not commercial". Then he adds defensively: "You must take that as said in character". But how often did Frost choose to inhabit this character? An unmistakable line of "Provide, Provide" tells his listener to "Make the whole stock exchange your own!"; and that strange and memorable poem runs all in the same vein of imperative exhortation. Build a fortress for self-defence. Better to die admired by paid-for friends than die without them. Frost suffered deep humiliations which he felt to the core; this was a hidden source of his affinity with Keats. Yet he had a power without peer for prolonging the resentment and recoiling from his own weakness and susceptibility.
Another letter to Untermeyer, of considerable biographical interest, ventures a personal disclosure that is both fascinating and aberrant. On the brink of publishing Mountain Interval, Frost confesses that all the poems in this third book and in North of Boston were written years before, and all of them close together. He adds that further books are to come from that early accumulation; which must mean that his critics (including Untermeyer) will stand exposed as fools if their reviews pretend to discover evidence of "development", or decline, or even a wonderful maturing of imaginative faculties. I am all one, Frost is saying, and I was everything I am a long time ago. So he has plotted it all in advance. "The poet in me died nearly ten years ago", he declares in this letter of May 1916:
Fortunately he had run through several phases, four to be exact, all well-defined, before he went. The calf I was in the nineties I merely take to market. I am become my own salesman . . . . Did you ever hear of quite such a case of Scotch–Yankee calculation? You should have seen the look on the face of the Englishman I first confessed this to. I won't name him lest it should bring you two together . . . . As you look back don't you see how a lot of things I have said begin to take meaning from this? Well.
It is an extraordinary act of self-aggrandizement and self-negation, and, knowing Frost, one concludes that it was probably in some measure true but deliberately exaggerated. This intuition is confirmed by his next letter to Untermeyer, which begins with a retraction: "Seriously I am fooling" – a playful and ambiguous sentence that still holds back a good deal.
Some of the best letters in the second half of The Letters of Robert Frost are written to his daughter Lesley. She gets cut from the tennis team at Wellesley College, for reasons she does not understand, and Frost gives her the closest of instructions how to launch an inquest into the injustice, and then how to put the humiliation behind her whatever the facts may be. Another letter generously shares Lesley's indignation at being chided by her Latin teacher for some slight grammatical infraction, and assures her that the study of Latin is in "Egypt" compared to the study of English; but he goes on to lay out a plan for improving her mastery of the grammar, in her own time and by a suitable method. Frost always hoped that his confidence would be contagious – "They would not find me changed from him they knew – / Only more sure of all I thought was true" – and it must be said that these early letters carry the burden of his poetry so finely as to be no embarrassment to the poetry. The book has been edited by Donald Sheehy, Mark Richardson and Robert Faggen with continuous tact and sensitivity to the likely demands of a literate reader; there are notes enough, and only just enough (they never strike one as intrusions pretending to be elucidations). A good index and a biographical glossary complete the authority of a book that has been printed with the care and elegance it deserves. Frost's letters seem the inevitable expressions of a personality, so that, even when a mask is on, there is interest to be found in exactly what it reveals.
A history of human prejudice
"L'expulsió dels Moriscos", 1894, by Gabriel Puig Roda
Book Details
Francisco Bethencourt
RACISMS
From the crusades to the twentieth century
464pp. Princeton University Press. £27.95 (US $39.50).
978 0 691 15526 5
The destructive legacy of racism
DAVID ARMITAGE
Racism, like nationalism, seems to be at once ancient and modern. Debates over the history of nationalism pit "primordialists", who trace it back to the Israelites, against "modernists", who see it arising as a political force only after the French and Industrial Revolutions. Primordialist students of racism find it far back in the ancient world among the Greeks; their modernist counterparts argue for origins in the emergent human sciences of the Enlightenment. Few -isms excite more passion about their invention or greater disagreement about their birth. Is there a middle way? Or do we just need a better explanatory framework?
The title of Francisco Bethencourt's masterly new book suggests just such a model for elucidating the history of racism: Racisms in the plural not the singular, anatomized in all their contingency rather than assumed to be a singular outgrowth of human nature itself. His subtitle – "From the Crusades to the twentieth century" – also points to a middle way in the Middle Ages. Bethencourt tracks the ramifications of racism across almost a millennium, delving no deeper into the past or any closer to the present.
Much hangs on what he means by racism, of course: not just any form of intolerance based on heritable difference among groups, but more precisely, "prejudice concerning ethnic descent coupled with discriminatory action". With this definition in mind, he confidently locates racism's origins, its proliferation and its slow demise – or, at least, the decline of the formal racism that is "linked to political projects" (though what "political" includes or excludes Bethencourt never quite specifies). As he notes, anti-racism is now the norm across the world, yet informal racism is still very much with us. Racisms can at least help to distinguish collective racial discrimination from its purulent place in the hearts and minds of individual racists.
Against the primordialists, Bethencourt argues that systematic discrimination on the basis of descent did not arise before the twelfth century. Against the modernists, he asserts that there was racism before there were theories of race: "classification did not precede action". Racism in his sense – relational, hierarchical, prejudicial and found in policies, not just attitudes – was the creation of Latin Christians in the Mediterranean world. Their heirs and successors carried it across the globe as European expansion penetrated into the Atlantic, Africa and Asia. This Western weed hybridized with native prejudices and hierarchies wherever it landed. It took root most firmly in the Americas, where creole societies overwhelmed or subsumed native peoples, and where slavery and the plantation system left their most toxic legacies. The western hemisphere remains the region most scarred and stratified by the residues of racism.
The deed came before the word; racial theory only arose after centuries of practical racism. Once "scientific" racism had entered an unholy alliance with blood-and-soil nationalism in the nineteenth century, the conditions were in place for humanity's most heinous project of systematic ethnic exploitation. As Bethencourt reminds us, some 12 million foreigners entered slave labour in Germany between 1939 and 1945: a figure comparable to the number of Africans enslaved over the 350 years of the Atlantic slave trade.
This modern murderousness had relatively modest medieval beginnings. Like Benzion Netanyahu, David Nirenberg and other recent scholars, Bethencourt locates racism's origins in Iberia – his own area of speciality, as a leading historian of the Portuguese empire – in communities where converted Jews and Muslims lived cheek by jowl with Old Christians. An inclusive universalism was the foundation of Christianity – "There is neither Jew nor Gentile, neither slave nor free, nor is there male and female, for you are all one in Christ Jesus", Paul wrote in Galatians 3:28 – but pressures in the peninsula on resources of status and wealth raised barriers against New Christians. The exclusion of Moriscos (converted Muslims) became "the first crucial case (alongside that of converted Jews) in the Christian community of a persistent internal divide based on the notion of descent", where Christians discriminated against fellow Christians on grounds of ethnicity rather than belief.
This example clarifies Bethencourt's choice of criteria. No theory of race was needed to justify exclusion from ecclesiastical office, for instance. Phenotypic markers like skin colour played no part: converted Jews were physically indistinguishable from their Old Christian neighbours. Discrimination arose from intimacy, not distance. And class played an important role in cementing the economic dominance of low-status members of an in-group against a threateningly adjacent out-group, as it would later in the American South. The long-term result, in Iberia at least, was a series of purity-of-blood statutes first promulgated in 1449 and not abolished until 1835–70. Not every later instance of racism would have the same features: Bethencourt admits that persecution of the Roma stemmed mostly from their mobility, not their settled proximity. His conception of relational racism still illuminates more than it obscures in the dark history of human prejudice over the longue durée.
The bulk of Racisms catalogues the inhumanity of Europeans and Euro-Americans in their treatment of those they sought to exclude, oppress, exploit or exterminate. Bethencourt's tone in treating the crimes and follies of humankind is unfailingly cool and analytically sophisticated. His long and detailed book is a syllabus of horrors; in the hands of a less careful historian, it could simply blight one's faith in human nature and lead only to outrage or despair. There are a few bright spots, such as descriptions of the social mobility of Africans in early modern courts or the appointment in 1518 of a black bishop in Congo, but they do little to relieve the gloom. (No other black African bishop held office until the twentieth century.) So comprehensive is Bethencourt's synthesis, and so unsparing is his survey, that experts will find new information and every reader will be appalled at human ingenuity when ethnicity is at stake.
Bethencourt's achievement is to show that racism, in all its forms, was contextual and ultimately reformable, not innate and hence inevitable. The reader's journey to that conclusion is grim but necessary, as Bethencourt circles the globe in search of his examples. He argues that when Jerusalem ceased to be the centre of their mental maps, Europeans could start to structure the world's populations within novel hierarchies of continents, peoples and civilizations. Depending on the balance of numbers, European colonial societies tolerated race-mixing, set barriers to miscegenation, or managed the problem. The legacies of these strategies prompted the question that initially ignited Bethencourt's study: "How is it that a person can be considered black in the United States, colored in the Caribbean or South Africa, and white in Brazil?"
In search of answers to this conundrum, Bethencourt tacks deftly between cultural and social history. His binocular vision marks Racisms out from most previous studies of its subject, which have either examined racism as ideas in the mind or as facts on the ground. Unlike many students of Western racism, Bethencourt pays little attention to its biblical underpinnings: the division of the world into three "races", Hamitic, Semitic and Japhetic (derived from the three sons of Noah), does not even appear in his index, though it pops up occasionally in the text. Practices and perceptions rather than arguments structure much of his story: racism's textual traces attract little of Bethencourt's attention until he reaches the eighteenth century.
Iconography occupies much more space in Racisms than intellectual history. (It must be said that Bethencourt's press has not served him well here: Racisms is handsomely produced, but its illustrations are murky and visual examples are not always reproduced alongside treatments in the text.) Bethencourt is an acute reader of representations of the four parts of the Earth on Sicilian tombs and Italian fountains, in the frontispieces of atlases and on the ceilings of palazzi, to illustrate how imagery propagated hierarchy, sometimes for centuries, as the same representations endlessly cycled through the mechanisms of print culture. And his treatment of eighteenth-century Spanish American casta (caste) paintings, with their variegated images of every possible combination of mixed-race ethnicities, convincingly shows their malleability: language, hairstyle and clothing could all change perceptions of ethnicity. Their shifting categories were a long way from later conceptions of immutable racism: more treatment of early racial photographs, like those created on Louis Agassiz's 1865–6 Harvard expedition to Brazil, would have confirmed this point. Yet in regard to visual evidence, as in other areas, Bethencourt's negative findings are as important as his positive results in the search for racism's tangled roots.
"L'expulsió dels Moriscos", 1894, by Gabriel Puig Roda
Book Details
Francisco Bethencourt
RACISMS
From the Crusades to the twentieth century
464pp. Princeton University Press. £27.95 (US $39.50).
978 0 691 15526 5
The destructive legacy of racism
DAVID ARMITAGE
Racism, like nationalism, seems to be at once ancient and modern. Debates over the history of nationalism pit "primordialists", who trace it back to the Israelites, against "modernists", who see it arising as a political force only after the French and Industrial Revolutions. Primordialist students of racism find it far back in the ancient world among the Greeks; their modernist counterparts argue for origins in the emergent human sciences of the Enlightenment. Few -isms excite more passion about their invention or greater disagreement about their birth. Is there a middle way? Or do we just need a better explanatory framework?
The title of Francisco Bethencourt's masterly new book suggests just such a model for elucidating the history of racism: Racisms in the plural not the singular, anatomized in all their contingency rather than assumed to be a singular outgrowth of human nature itself. His subtitle – "From the Crusades to the twentieth century" – also points to a middle way in the Middle Ages. Bethencourt tracks the ramifications of racism across almost a millennium, delving no deeper into the past or any closer to the present.
Much hangs on what he means by racism, of course: not just any form of intolerance based on heritable difference among groups, but more precisely, "prejudice concerning ethnic descent coupled with discriminatory action". With this definition in mind, he confidently locates racism's origins, its proliferation and its slow demise – or, at least, the decline of the formal racism that is "linked to political projects" (though what "political" includes or excludes Bethencourt never quite specifies). As he notes, anti-racism is now the norm across the world, yet informal racism is still very much with us. Racisms can at least help to distinguish collective racial discrimination from its purulent place in the hearts and minds of individual racists.
Against the primordialists, Bethencourt argues that systematic discrimination on the basis of descent did not arise before the twelfth century. Against the modernists, he asserts that there was racism before there were theories of race: "classification did not precede action". Racism in his sense – relational, hierarchical, prejudicial and found in policies, not just attitudes – was the creation of Latin Christians in the Mediterranean world. Their heirs and successors carried it across the globe as European expansion penetrated into the Atlantic, Africa and Asia. This Western weed hybridized with native prejudices and hierarchies wherever it landed. It took root most firmly in the Americas, where creole societies overwhelmed or subsumed native peoples, and where slavery and the plantation system left their most toxic legacies. The western hemisphere remains the region most scarred and stratified by the residues of racism.
The deed came before the word; racial theory only arose after centuries of practical racism. Once "scientific" racism had entered an unholy alliance with blood-and-soil nationalism in the nineteenth century, the conditions were in place for humanity's most heinous project of systematic ethnic exploitation. As Bethencourt reminds us, some 12 million foreigners entered slave labour in Germany between 1939 and 1945: a figure comparable to the number of Africans enslaved over the 350 years of the Atlantic slave trade.
This modern murderousness had relatively modest medieval beginnings. Like Benzion Netanyahu, David Nirenberg and other recent scholars, Bethencourt locates racism's origins in Iberia – his own area of speciality, as a leading historian of the Portuguese empire – in communities where converted Jews and Muslims lived cheek by jowl with Old Christians. An inclusive universalism was the foundation of Christianity – "There is neither Jew nor Gentile, neither slave nor free, nor is there male and female, for you are all one in Christ Jesus", Paul wrote in Galatians 3:28 – but pressures in the peninsula on resources of status and wealth raised barriers against New Christians. The exclusion of Moriscos (converted Muslims) became "the first crucial case (alongside that of converted Jews) in the Christian community of a persistent internal divide based on the notion of descent", where Christians discriminated against fellow Christians on grounds of ethnicity rather than belief.
This example clarifies Bethencourt's choice of criteria. No theory of race was needed to justify exclusion from ecclesiastical office, for instance. Phenotypic markers like skin colour played no part: converted Jews were physically indistinguishable from their Old Christian neighbours. Discrimination arose from intimacy, not distance. And class played an important role in cementing the economic dominance of low-status members of an in-group against a threateningly adjacent out-group, as it would later in the American South. The long-term result, in Iberia at least, was a series of purity-of-blood statutes first promulgated in 1449 and not abolished until 1835–70. Not every later instance of racism would have the same features: Bethencourt admits that persecution of the Roma stemmed mostly from their mobility, not their settled proximity. His conception of relational racism still illuminates more than it obscures in the dark history of human prejudice over the longue durée.
The bulk of Racisms catalogues the inhumanity of Europeans and Euro-Americans in their treatment of those they sought to exclude, oppress, exploit or exterminate. Bethencourt's tone in treating the crimes and follies of humankind is unfailingly cool and analytically sophisticated. His long and detailed book is a syllabus of horrors; in the hands of a less careful historian, it could simply blight one's faith in human nature and lead only to outrage or despair. There are a few bright spots, such as descriptions of the social mobility of Africans in early modern courts or the appointment in 1518 of a black bishop in Congo, but they do little to relieve the gloom. (No other black African bishop held office until the twentieth century.) So comprehensive is Bethencourt's synthesis, and so unsparing is his survey, that experts will find new information and every reader will be appalled at human ingenuity when ethnicity is at stake.
Bethencourt's achievement is to show that racism, in all its forms, was contextual and ultimately reformable, not innate and hence inevitable. The reader's journey to that conclusion is grim but necessary, as Bethencourt circles the globe in search of his examples. He argues that when Jerusalem ceased to be the centre of their mental maps, Europeans could start to structure the world's populations within novel hierarchies of continents, peoples and civilizations. Depending on the balance of numbers, European colonial societies tolerated race-mixing, set barriers to miscegenation, or managed the problem. The legacies of these strategies prompted the question that initially ignited Bethencourt's study: "How is it that a person can be considered black in the United States, colored in the Caribbean or South Africa, and white in Brazil?"
In search of answers to this conundrum, Bethencourt tacks deftly between cultural and social history. His binocular vision marks Racisms out from most previous studies of its subject, which have examined racism either as ideas in the mind or as facts on the ground. Unlike many students of Western racism, Bethencourt pays little attention to its biblical underpinnings: the division of the world into three "races", Hamitic, Semitic and Japhetic (derived from the three sons of Noah), does not even appear in his index, though it pops up occasionally in the text. Practices and perceptions rather than arguments structure much of his story: racism's textual traces attract little of Bethencourt's attention until he reaches the eighteenth century.
Iconography occupies much more space in Racisms than intellectual history. (It must be said that Bethencourt's press has not served him well here: Racisms is handsomely produced, but its illustrations are murky and visual examples are not always reproduced alongside treatments in the text.) Bethencourt is an acute reader of representations of the four parts of the Earth on Sicilian tombs and Italian fountains, in the frontispieces of atlases and on the ceilings of palazzi, showing how imagery propagated hierarchy, sometimes for centuries, as the same representations cycled endlessly through the mechanisms of print culture. And his treatment of eighteenth-century Spanish American casta (caste) paintings, with their variegated images of every possible combination of mixed-race ethnicities, convincingly shows their malleability: language, hairstyle and clothing could all change perceptions of ethnicity. Their shifting categories were a long way from later conceptions of immutable racial difference: more treatment of early racial photographs, like those created on Louis Agassiz's 1865–6 Harvard expedition to Brazil, would have confirmed this point. Yet in regard to visual evidence, as in other areas, Bethencourt's negative findings are as important as his positive results in the search for racism's tangled roots.
Mobility characterized racism as Europeans transported it from continent to continent. Spaniards found natives they called "Moors" in the Americas; later, the British encountered "blacks" and "Indians" in early Australia. Laboratories of racism also existed within Europe long after the Middle Ages: as Bethencourt writes in his account of Carolus Linnaeus, the eighteenth-century primitivist and taxonomer, "Lapland was indeed the Swedish West Indies", its Sami peoples as intriguing to northern ethnographers as the Caribs had been to the European inventors of "cannibalism" from Columbus onwards. Racism was nonetheless unevenly distributed. The Americas were the matrix of post-medieval discrimination, which held greater purchase there than in Africa or, especially, Asia, where European colonialism faced more competition and failed to take root so deeply. Only Oceania escapes Bethencourt's panorama, even though the region was shot through with racialized cartographies by the early nineteenth century, not least in "New Guinea" and "Melanesia".
Bethencourt's long-range view – deep in time, wide in space – puts more familiar turning points in the global history of racism into novel perspective. The anti-slavery movement did little, at least in the medium term, to "diminish prejudices and discriminatory action" more generally, while scientific racism was quite compatible intellectually with anti-slavery, as in the case of Johann Friedrich Blumenbach, a supporter of abolition and yet also the inventor of enduring racial categories such as the "Caucasian". The account of the rise of racial theory in Racisms is encyclopedic in its coverage as one thinker follows another, from Linnaeus and Immanuel Kant to Agassiz and Charles Darwin, but their contributions come late in the history of racism, not as its intellectual inspiration.
More ambitiously, Bethencourt draws out racism's purpose in combating egalitarianism in Europe after 1848, in cementing racial inequality in the United States and in promoting European incursions into Asia. He rightly notes the paradox that the expansion of democratic nationalism fomented racism in the nineteenth and twentieth centuries. Little wonder, then, that indigenous peoples across the globe looked back nostalgically to times when they were guaranteed privileges within a república de Indios or could petition a monarch for protection against land-hungry settlers and creoles.
When Bethencourt reaches the past hundred years or so, the examples of the Jewish pogroms in Russia, the Armenian genocide and the Nazi racial state amply confirm his hypothesis that political projects motivate racism. After his broad, complex narrative, these modern instances appear not as paradigmatic horrors arising from eternal hatreds, but instead as the culmination of many moments of contingent discrimination. "There is no cumulative and linear racism", he insists: even Iberia, once the cradle of racism itself, had little anti-Semitism in the modern period owing to its small, residual communities.
Race lingers in census categories and self-identifications long after any significant biological basis for it has been exploded. The dream of a post-racial society is far from being achieved anywhere in the world, even in such apparently homogeneous communities as Japan or Iceland. And politically motivated acts of ethnic discrimination and destruction have continued, in Rwanda in 1994 and in Gujarat in 2002. These are among the latest acts in racism's destructive history; if Bethencourt is right, and racism is indeed conjunctural rather than bred in the human bone, they could be among the last. Racisms breeds hope that racism is eradicable, at least formally and collectively, even though the durability of racial prejudice tempers that hope. Racism may not be age-old, but nor is it just skin-deep.
Tuesday, 22 July 2014
Lessons from the lab
Teaching contentious classics: Sherif, Milgram and Harlow revisited
CAROL TAVRIS
The fiftieth anniversary last year of the Milgram experiments on obedience to authority provides a good vantage point from which to consider the eternal dilemma for instructors and textbook writers: how much time should we devote to the classics, and how should we teach them? In every generation, certain studies or approaches rise to prominence. But once they get planted in our books and lectures, they tend to become rooted there. Over time it gets harder to decide how much to prune – let alone whether it's time to uproot them altogether. We stop looking at the original studies closely, let alone critically; they just sit there in our courses like grand historical monuments. However, it's good to reexamine them for two important reasons: one is for our own sake, to refresh our memories and rethink their contributions; the other is for our students' sake. Students today are as eager to reject unflattering or counterintuitive portrayals of humanity as students were decades ago. Teaching the classics therefore means finding new ways of persuading students that these findings do apply to them despite the errors or limitations of the original studies.
The relationship between cultural events and research is a two-way street: an event may stimulate research, and research may influence the larger culture. The story of Kitty Genovese – murdered in New York in 1964, while thirty-eight witnesses allegedly watched from their windows and failed to call for help – still occupies a prominent position in social psychology, where it launched a long and productive line of experimental studies on bystander apathy, deindividuation and intervention. Thanks to two recent books and a critical reassessment that appeared in the American Psychologist in 2007, though, we now know that almost all of the details reported at the time were wrong. Turning Genovese's death into a story of urban alienation was largely the work of A. M. Rosenthal at the New York Times, who wanted the image of the "38 witnesses doing nothing" to be in the story's headline, where it quickly went the contemporary equivalent of viral. In fact, most of the neighbours who heard her screams could not see her – let alone watch her murder from their windows – and thought it was just another drunken domestic fight. As it turns out, only three neighbours understood the attack for what it was and failed to respond. Three is too many; but a newspaper story about three craven witnesses would not have been as shocking.
Should teachers eliminate the Kitty Genovese story out of embarrassment that we got it wrong or weren't sceptical enough? Not necessarily. Although the specifics of the story were wrong, its essence was true. The accurate Genovese story, however, raises an additional social and psychological lesson for students: what cautions does it offer for thinking critically about an equally sensational crime today? What does it reveal about the societal sources of anxiety that generate an urban legend? America at the time was undergoing political assassinations, race riots, the Vietnam War, and rising crime rates – Kitty Genovese's was one of 636 murders in New York that year. People were frightened, and the headline resonated.
With Kitty Genovese, an oversimplified but emotionally compelling narrative led to good research. It often works the other way, of course: good research can lead to an oversimplified and emotionally compelling narrative. Consider the case of Walter Mischel's "marshmallow study" of four-year-old children's ability to delay gratification. (Children were given a choice between eating one marshmallow right then or waiting a little while and getting two.) The part that got so much recent public attention was that children who had resisted temptation turned out years later to score as many as 210 points higher on their SATs than their less patient peers. Bingo!
The marshmallow study captured the public imagination just as Kitty Genovese had, but with a brighter, happier moral – one that suits our current cultural concerns. We can all see those kids in our mind's eye, and imagine what we would have done, faced with a now-or-later choice. And it has such an all-American, Calvinist moral: delay gratification and heaven will be thine. Yet Michael Bourne's analysis in the New York Times revealed that one reason this story resonated so much was that inconvenient details about the actual scientific findings had been pruned away. Mischel's original studies focused on 653 children, all of them attending the Bing Nursery School at Stanford, California, for children of professors and graduate students. The studies weren't originally designed to look at long-term outcomes; that idea occurred to Mischel much later, when he asked his own children, who had attended the Bing school, how his research subjects were faring in college. Of the original 653, he tracked down 185, of whom just ninety-four provided their SAT scores.
So we have a problem with the representativeness of the sample – originally and in the follow-up. How would that affect the results? An article in Cognition in 2012 by Celeste Kidd, Holly Palmeri and Richard N. Aslin showed that some children were more likely to eat the first marshmallow when, by virtue of their previous history, they had reason to doubt the researcher's promise to come back with a second one. For children raised in an unstable environment, they wrote, "the only guaranteed treats are the ones you have already swallowed", while children raised in a more stable environment might be willing to wait a few more minutes, confident that a second treat will materialize. And not only the stability of the environment matters: I'm an only child, so the effects of siblings didn't occur to me until a friend said, "Try the marshmallow study on those of us who have three siblings. You grabbed what you could, or else one of them got it".
To offer these caveats about the way the Mischel findings are so often told is not to debunk the excellent original study but to make students more excited about it – to alert them not only to what it found, but also to what it failed to find. The study and its popular response illustrate how, in science, each finding generates new questions; the importance of critical thinking in coming up with alternative hypotheses; and the importance of not oversimplifying. An ability to resist temptation is one factor among many that shapes our lives, but if you are a child from an unstable family, living in a tough environment, in poor health, it might not be the most important. It might not be a stable character trait. It might not even cross domains of experience.
Some of social psychology's great historical monuments – studies with a long-lasting influence – could not be replicated today: Muzafer Sherif's "Robbers Cave" study, Stanley Milgram's obedience experiments, and Harry Harlow's wire-and-cloth mother studies.
Between 1949 and 1954, Sherif and his colleagues used a Boy Scout camp in Oklahoma to test their theories of the origin and reduction of intergroup animosity and prejudice. They randomly assigned the twelve-year-old boys to one of two groups, the Eagles or the Rattlers, and set up competitive activities that quickly generated "us–them" thinking and mutual hostility. Later, to undo the warfare they had thus created, they set up situations in which the boys had to work together to achieve some shared goal.
When I revisited Sherif's original papers, I found that the study was not as rigorous as I had remembered. Most of the conclusions, Sherif wrote, "were reached on the basis of observational data" – confirmed by "sociometric choices and stereotype ratings". He said: "Observations made after several superordinate goals were introduced showed a sharp decrease in name-calling and derogation of the out-group common . . . in the contact situations without superordinate goals". (One of the pleasures of going back to read original studies is the unexpected discovery: the name-calling is so charmingly outdated. In 1949, boys put each other down by saying things like "all of them are stinkers" and calling their enemies "smart alecks".) Sherif did provide some numbers and percentages, and a few chi-squares, but this was a field study, with all of the uncontrollable variables that field studies can generate, and as "science" it would not meet today's standards. Was everything all hunky-dory for the Eagles and Rattlers afterwards? The number of boys favourable towards the out-group increased, but the majority of boys in each group apparently maintained their hostility towards each other.
Yet Robbers Cave was and remains important for its central ideas. At the time, most psychologists did not understand, and most laypeople don't understand even today, that simply putting two competing, hostile groups together in the same room to, say, watch a movie, won't reduce their antagonism; that competitive situations generate hostility and stereotyping of the out-group; and that competition and hostility can be reversed, at least modestly, through cooperation in pursuit of shared goals. That's the story of Robbers Cave: it was true then, and it's true now.
In fact, just as the Kitty Genovese case spurred bystander-intervention experiments, Robbers Cave generated a great deal of research into the importance of superordinate goals. When Elliot Aronson went into the newly desegregated but hostile classrooms in Austin, Texas, where African American, Mexican American, and Anglo children were at war with each other, Sherif's findings strongly influenced his design of the "jigsaw" classroom. But Elliot went one better, using an experimental intervention and a control group. What a great coda to the Robbers Cave story – a direct link from Eagles and Rattlers, a made-up antipathy, to interethnic warfare in American schools, which is all too real and persistent.
Teaching the lessons of Stanley Milgram's experiments is far more complicated than teaching Sherif's. Again, the cultural context of the times is crucial. In 1961, when Adolf Eichmann was claiming at his trial that he was "only following orders" in the murder of Jews during the Holocaust, Milgram began his effort to determine how many Americans would obey an authority figure when directly ordered to harm another human being.
Participants came to the Yale lab thinking they were part of an experiment on the effects of punishment on learning, and were instructed to administer increasing levels of shock to a "learner". The learner was a confederate of Milgram who did not receive any shocks, but played his part convincingly: as the study continued, he shouted in pain and pleaded to be released, according to an arranged script. To almost everyone's surprise at the time, some two-thirds of the participant "teachers" administered what they thought were the highest levels of shock, even though many were experiencing difficulty or distress doing so. Milgram's experiment produced a firestorm of protest about the possible psychological harm inflicted on the unwitting participants, and as a result, it could never be done today in its extreme version.
Some people hated the method and others the message, but the Milgram study has never faded from public attention and debate about it continues. Gina Perry, an Australian journalist whose book, Behind the Shock Machine (2012), aimed to discredit Milgram and his findings, interviewed everyone she could find who was connected to the original study, along with Milgram's critics and defenders. She also went through the archives of Milgram's voluminous unpublished papers.
Reinvestigations almost invariably yield some useful discoveries. Perry found violations of the research protocol: Over time, the man playing the experimenter began to drift off-script, urging reluctant subjects to keep going longer than he was supposed to. He pressed some people eight or nine times; one woman, twenty-six times. To my own dismay, I learned that Milgram committed what researchers, even then, would have considered a serious breach of ethics: He did not fully debrief subjects at the end of each experiment. They did meet the "learner" to shake hands and be assured that he was fine, but they were not told that all those escalating levels of shocks were completely fake, because Milgram was afraid the word would get out and invalidate future participants' behaviour. It was almost a year before subjects were given a full explanation. Some never got it; some never understood what the whole thing was about.
For critics like Perry, these flaws are reason enough to kick Milgram off his pedestal and out of our textbooks. I disagree. I think we need to give the Milgram experiment the prominent position we do, and for the same reason as when it was originally done. When I first read about Milgram's experiments in graduate school, I remember thinking, "Very clever, but why do we need them? Wasn't Nazi Germany evidence enough of obedience to authority?" But that was Milgram's point: in the early 1960s, Americans – and American psychologists – deeply believed in national character. Germans obeyed Hitler, it was widely assumed, because obedience was in the German psyche: look at all those high scores on the Authoritarian scale. It could never happen here.
For me, reading Perry's criticisms made it all the clearer why the Milgram experiments deserve their prominence. "Deep down, something about Milgram makes us uneasy", she writes. There is indeed something that makes everyone uneasy: the evidence that situations have power over our behaviour. This is a difficult message, and most students – indeed, most people – have trouble accepting it. "I would never have pulled those levers!" we cry. "I would have told that experimenter what a . . . stinker he is!" Perry insists that people's personalities and histories influence their actions. But Milgram never disputed that fact; his own research found that many participants resisted. Milgram wrote: "There is a tendency to think that everything a person does is due to the feelings or ideas within the person. However, scientists know that actions depend equally on the situation in which a man finds himself". Notice the "equally" in that sentence; many critics, like Perry, don't.
One of the original subjects in the experiments, called Bill, tried to explain to Perry why the studies were so valuable, and why he did not regret participating, although he was one of those who went on to the end. He hadn't thought about the experiment for twenty years, he said, until he began dating a psychology professor. She, thrilled to have met a living link to the experiment, asked him to speak to her class. "Well", Bill told Perry, "you would have thought Adolf Hitler walked in the room. I never really thought about it that way, you know?" Bill told the students, who were silently sitting in judgment on him: "It's very easy to sit back and say, 'I'd never do this or that' or 'Nobody could ever get me to do anything like that.' Well, guess what? Yes, they can".
That, of course, is the moral of the story. But the wall of hostility that Bill felt from the students means that they, along with critics like Perry, had failed to grasp that moral. They were reading about the experiment, seeing the films – and not understanding that they themselves might have been Bill.
My final classic is Harry Harlow's demonstration of the importance of contact comfort. Harlow took infant rhesus monkeys away from their mothers and raised them with a "wire mother", a forbidding construction of wires with a milk bottle connected to it, and a "cloth mother", a similar construction but one covered in foam rubber and terry cloth. At the time, it was widely believed (by psychologists, if not mothers) that babies become attached to their mother simply because mothers provide food. But Harlow's baby monkeys ran to the terry-cloth mother whenever they were frightened or startled, and clinging to it calmed them down. They went to the wire mother only for milk, and immediately abandoned it.
Teaching contentious classics: Sherif, Milgram and Harlow revisited
CAROL TAVRIS
The fiftieth anniversary last year of the Milgram experiments on obedience to authority provides a good vantage point from which to consider the eternal dilemma for instructors and textbook writers: how much time should we devote to the classics, and how should we teach them? In every generation, certain studies or approaches rise to prominence. But once they get planted in our books and lectures, they tend to become rooted there. Over time it gets harder to decide how much to prune – let alone whether it's time to uproot them altogether. We stop looking at the original studies closely, let alone critically; they just sit there in our courses like grand historical monuments. However, it's good to reexamine them for two important reasons: one is for our own sake, to refresh our memories and rethink their contributions; the other is for our students' sake. Students today are as eager to reject unflattering or counterintuitive portrayals of humanity as students were decades ago. Teaching the classics therefore means finding new ways of persuading students that these findings do apply to them despite the errors or limitations of the original studies.
The relationship between cultural events and research is a two-way street: an event may stimulate research, and research may influence the larger culture. The story of Kitty Genovese – murdered in New York in 1964, while thirty-eight witnesses allegedly watched from their windows and failed to call for help – still occupies a prominent position in social psychology, where it launched a long and productive line of experimental studies on bystander apathy, deindividuation and intervention. Thanks to two recent books and a critical reassessment that appeared in the American Psychologist in 2007, though, we now know that almost all of the details reported at the time were wrong. Turning Genovese's death into a story of urban alienation was largely the work of A. M. Rosenthal at the New York Times, who wanted the image of the "38 witnesses doing nothing" to be in the story's headline, where it quickly went the contemporary equivalent of viral. In fact, most of the neighbours who heard her screams could not see her – let alone watch her murder from their windows – and thought it was just another drunken domestic fight. As it turns out, only three neighbours understood the attack for what it was and failed to respond. Three is too many; but a newspaper story about three craven witnesses would not have been as shocking.
Should teachers eliminate the Kitty Genovese story out of embarrassment that we got it wrong or weren't sceptical enough? Not necessarily. Although the specifics of the story were wrong, its essence was true. The accurate Genovese story, however, raises an additional social and psychological lesson for students: what cautions does it offer for thinking critically about an equally sensational crime today? What does it reveal about the societal sources of anxiety that generate an urban legend? America at the time was undergoing political assassinations, race riots, the Vietnam War, and rising crime rates – Kitty Genovese's was one of 636 murders in New York that year. People were frightened, and the headline resonated.
With Kitty Genovese, an oversimplified but emotionally compelling narrative led to good research. It often works the other way, of course: good research can lead to an oversimplified and emotionally compelling narrative. Consider the case of Walter Mischel's "marshmallow study" of four-year-old children's ability to delay gratification. (Children were given a choice between eating one marshmallow right then or waiting a little while and getting two.) The part that got so much recent public attention was that children who had resisted temptation turned out years later to score as many as 210 points higher on their SATs than their less patient peers. Bingo!
The marshmallow study captured the public imagination just as Kitty Genovese had, but with a brighter, happier moral – one that suits our current cultural concerns. We can all see those kids in our mind's eye, and imagine what we would have done, faced with a now-or-later choice. And it has such an all-American, Calvinist moral: delay gratification and heaven will be thine. Yet Matthew Bourne's analysis in the New York Times revealed that one reason that this story resonated so much was the fact that inconvenient details about the actual scientific findings had been pruned away. Mischel's original studies focused on 653 toddlers, all of them attending the Bing Nursery School at Stanford, California, for children of professors and graduate students. The studies weren't originally designed to look at long-term outcomes; that idea occurred to Mischel much later, when he asked his own children, who had attended the Bing school, how his research subjects were faring in college. Of the original 653, he tracked down 185, of whom just ninety-four provided their SAT scores.
So we have a problem with the representativeness of the sample – originally and in the follow-up. How would that affect the results? An article in Cognition in 2012 by Celeste Kidd, Holly Palmeri and Richard N. Aslin showed that some children were more likely to eat the first marshmallow when, by virtue of their previous history, they had reason to doubt the researcher's promise to come back with a second one. For children raised in an unstable environment, they wrote, "the only guaranteed treats are the ones you have already swallowed", while children raised in a more stable environment might be willing to wait a few more minutes, confident that a second treat will materialize. And not only the stability of the environment matters: I'm an only child, so the effects of siblings didn't occur to me, until a friend said, "try the marshmallow study on those of us who have three siblings. You grabbed what you could, or else one of them got it".
To offer these caveats about the way the Mischel findings are so often told is not to debunk the excellent original study but to make students more excited about it – to alert them not only to what it found, but also to what it failed to find. The study and its popular response illustrate how, in science, each finding generates new questions; the importance of critical thinking in coming up with alternative hypotheses; and the importance of not oversimplifying. An ability to resist temptation is one factor among many that shapes our lives, but if you are a child from an unstable family, living in a tough environment, in poor health, it might not be the most important. It might not be a stable character trait. It might not even cross domains of experience.
Some of social psychology's great historical monuments – studies with a long-lasting influence – could not be replicated today: Muzafer Sherif's "Robbers Cave" study, Stanley Milgram's obedience experiments, and Harry Harlow's wire-and-cloth mother studies.
Between 1949 and 1954, Sherif and his colleagues used a Boy Scout camp in Oklahoma to test their theories of the origin and reduction of intergroup animosity and prejudice. They randomly assigned the twelve-year-old boys to one of two groups, the Eagles or the Rattlers, and set up competitive activities that quickly generated "us–them" thinking and mutual hostility. Later, to undo the warfare they had thus created, they set up situations in which the boys had to work together to achieve some shared goal.
When I revisited Sherif's original papers, I found that the study was not as rigorous as I had remembered. Most of the conclusions, Sherif wrote, "were reached on the basis of observational data" – confirmed by "sociometric choices and stereotype ratings". He said: "Observations made after several superordinate goals were introduced showed a sharp decrease in name-calling and derogation of the out-group common . . . in the contact situations without superordinate goals". (One of the pleasures of going back to read original studies is that of the unexpected discovery: The name-calling is so charmingly outdated. In 1949, boys put each other down by saying things like "all of them are stinkers" and calling their enemies "smart alecks".) Sherif did provide some numbers and percentages, a few chi squares, but this was a field study, with all of the uncontrollable variables that field studies can generate, and as "science" it would not meet today's standards. Was everything all hunky-dory for the Eagles and Rattlers afterwards? The numbers of boys favourable towards the out-group improved, but the majority of boys in each group apparently maintained their hostility towards each other.
Yet Robbers Cave was and remains important for its central ideas. At the time, most psychologists did not understand, and most laypeople don't understand even today, that simply putting two competing, hostile groups together in the same room to, say, watch a movie, won't reduce their antagonism; that competitive situations generate hostility and stereotyping of the out-group; and that competition and hostility can be reversed, at least modestly, through cooperation in pursuit of shared goals. That's the story of Robbers Cave: it was true then, and it's true now.
In fact, just as the Kitty Genovese case spurred bystander-intervention experiments, Robbers Cave generated a great deal of research into the importance of superordinate goals. When Elliot Aronson went into the newly desegregated but hostile classrooms in Austin, Texas, where African American, Mexican American, and Anglo children were at war with each other, Sherif's findings strongly influenced his design of the "jigsaw" classroom. But Elliot went one better, using an experimental intervention and a control group. What a great coda to the Robbers Cave story – a direct link from Eagles and Rattlers, a made-up antipathy, to interethnic warfare in American schools, which is all too real and persisting.
Teaching the lessons of Stanley Milgram's experiments is far more complicated than teaching Sherif's. Again, the cultural context of the times is crucial. In 1961, when Adolf Eichmann was claiming at his trial that he was "only following orders" in the murder of Jews during the Holocaust, Milgram began his effort to determine how many Americans would obey an authority figure when directly ordered to harm another human being.
Participants came to the Yale lab thinking they were part of an experiment on the effects of punishment on learning, and were instructed to administer increasing levels of shock to a "learner". The learner was a confederate of Milgram who did not receive any shocks, but played his part convincingly: as the study continued, he shouted in pain and pleaded to be released, according to an arranged script. To almost everyone's surprise at the time, some two-thirds of the participant "teachers" administered what they thought were the highest levels of shock, even though many were experiencing difficulty or distress doing so. Milgram's experiment produced a firestorm of protest about the possible psychological harm inflicted on the unwitting participants, and as a result, it could never be done today in its extreme version.
Some people hated the method and others the message, but the Milgram study has never faded from public attention and debate about it continues. Gina Perry, an Australian journalist whose book, Behind the Shock Machine (2012), aimed to discredit Milgram and his findings, interviewed everyone she could find who was connected to the original study, along with Milgram's critics and defenders. She also went through the archives of Milgram's voluminous unpublished papers.
Reinvestigations almost invariably yield some useful discoveries. Perry found violations of the research protocol: Over time, the man playing the experimenter began to drift off-script, urging reluctant subjects to keep going longer than he was supposed to. He pressed some people eight or nine times; one woman, twenty-six times. To my own dismay, I learned that Milgram committed what researchers, even then, would have considered a serious breach of ethics: He did not fully debrief subjects at the end of each experiment. They did meet the "learner" to shake hands and be assured that he was fine, but they were not told that all those escalating levels of shocks were completely fake, because Milgram was afraid the word would get out and invalidate future participants' behaviour. It was almost a year before subjects were given a full explanation. Some never got it; some never understood what the whole thing was about.
For critics like Perry, these flaws are reason enough to kick Milgram off his pedestal and out of our textbooks. I disagree. I think we need to give the Milgram experiment the prominent position we do, and for the same reason as when it was originally done. When I first read about Milgram's experiments in graduate school, I remember thinking, "Very clever, but why do we need them? Wasn't Nazi Germany evidence enough of obedience to authority?" But that was Milgram's point: in the early 1960s, Americans – and American psychologists – deeply believed in national character. Germans obeyed Hitler, it was widely assumed, because obedience was in the German psyche: look at all those high scores on the Authoritarian scale. It could never happen here.
For me, reading Perry's criticisms made it all the clearer why the Milgram experiments deserve their prominence. "Deep down, something about Milgram makes us uneasy", she writes. There is indeed something that makes everyone uneasy: the evidence that situations have power over our behaviour. This is a difficult message, and most students – indeed, most people – have trouble accepting it. "I would never have pulled those levers!" we cry. "I would have told that experimenter what a . . . stinker he is!" Perry insists that people's personalities and histories influence their actions. But Milgram never disputed that fact; his own research found that many participants resisted. Milgram wrote: "There is a tendency to think that everything a person does is due to the feelings or ideas within the person. However, scientists know that actions depend equally on the situation in which a man finds himself". Notice the "equally" in that sentence; many critics, like Perry, don't.
One of the original subjects in the experiments, called Bill, tried to explain to Perry why the studies were so valuable, and why he did not regret participating, although he was one of those who went on to the end. He hadn't thought about the experiment for twenty years, he said, until he began dating a psychology professor. She, thrilled to have met a living link to the experiment, asked him to speak to her class. "Well", Bill told Perry, "you would have thought Adolf Hitler walked in the room. I never really thought about it that way, you know?" Bill told the students, who were silently sitting in judgment on him: "It's very easy to sit back and say, 'I'd never do this or that' or 'Nobody could ever get me to do anything like that.' Well, guess what? Yes, they can".
That, of course, is the moral of the story. But the wall of hostility that Bill felt from the students means that they, along with critics like Perry, had failed to grasp that moral. They were reading about the experiment, seeing the films – and not understanding that they themselves might have been Bill.
My final classic is Harry Harlow's demonstration of the importance of contact comfort. Harlow took infant rhesus monkeys away from their mothers and raised them with a "wire mother", a forbidding construction of wires with a milk bottle connected to it, and a "cloth mother", a similar construction but one covered in foam rubber and terry cloth. At the time, it was widely believed (by psychologists, if not mothers) that babies become attached to their mother simply because mothers provide food. But Harlow's baby monkeys ran to the terry-cloth mother whenever they were frightened or startled, and clinging to it calmed them down. They went to the wire mother only for milk, and immediately abandoned it.
Every textbook tells this story, along with heartbreaking photos of the infant monkeys clinging to their cloth mother when a scary moving toy is put into their cage. Wasn't this discovery, like Milgram's, something "we all knew" – in this case, that infants need contact comfort even more than they need food if they are to flourish? Didn't we have enough data from René Spitz and John Bowlby's observations of abandoned infants warehoused in orphanages?
Well, no, apparently we didn't. As Deborah Blum describes in Love at Goon Park (2002), most American psychologists at the time were under the influence of either behaviourism or psychoanalysis, two apparently opposed philosophies that nonetheless shared a key belief: that the origin of a baby's attachment to the mother was through food. Behaviourists believed that healthy child development required positive reinforcement: the mother satisfies the baby's hunger drive; the baby becomes conditioned to associate the mother with food; mother and breast are equated. Interestingly, it was the Freudian view as well: no mother need be present, only a breast. "Love has its origin", Freud wrote, "in attachment to the satisfied need for nourishment." Why would cuddling be necessary? For the eminent behaviourist John Watson, cuddling was coddling.
But whereas Milgram's findings need constant reiteration in every generation, there is nothing surprising in Harlow's any more. One might say that the very success of his research makes teaching it unnecessary: no one would argue against Harlow's findings, as many students and laypeople always want to do with Milgram's. Adult humans could choose to walk out of Milgram's experiment at any point, and a third of them did. But the monkeys were captives, tortured by their isolation. In recent decades, psychologists have learned that "torture" is not an exaggeration to describe the experience of isolation for any primate. And to torture infants? It's horrible. But the fact that so many people think it is horrible now – and didn't then – is an extraordinary story in itself. How have we extended the moral circle to include other primates?
In 1973, as a young editor at Psychology Today, I interviewed Harlow. I walked through his lab with our photographer, Rod Kamitsuka, and looked aghast at an array of monkeys, each cowering alone in its own cage, electrodes on their heads. When Rod took a picture of one, it became wildly excited and fearful, careering around its tiny cage trying to escape. Rod and I were devastated, but Harlow was amused by us. "I study monkeys", Harlow said, "because they generalize better to people than rats do, and I have a basic fondness for people." I asked him what he thought of his critics who said that taking infants from their mothers was cruel and that the results did not justify the cruelty. He replied: "I think I am a soft-hearted person but I never developed a fondness for monkeys. Monkeys do not develop affection for people. And I find it impossible to love an animal that doesn't love back". Today, that sounds like lame moral reasoning: the fact that animals don't care for us is no justification for torturing them.
When I revisited Harlow's work, however, I was reminded of how many pioneering discoveries he made, most of them lost in the telling of the main story of contact comfort. He also demonstrated that monkeys use tools, solve problems, and learn because they are curious or interested in something, not just to get food or other rewards. He showed the importance of contact with peers, which can even overcome the detrimental effects of maternal deprivation. Harlow created a nuclear family unit for some of the monkeys and found that under those conditions, rhesus males became doting fathers – something they don't do in the wild.
Harlow was hardly the first to demonstrate the power of "mother love" and the necessity of contact comfort – and the devastation that ensues when an infant is untouched and unloved. Was experimenting with monkeys, by raising them in isolation with only wire or cloth mothers and causing them anguish that no observer could fail to see, essential to make the same point that Bowlby and Spitz had done? I don't know. What Harlow did, like Milgram, was to make his case dramatic, compelling, and scientifically incontrovertible. His evidence was based not on anecdote or observation, however persuasive, but on empirical, replicated data. As Blum showed, that is what it took to begin to undermine a scientific world view in which the need for touch and cuddling – physical expressions of mother love – had been so deeply ignored.
Harlow's work is a great contribution to psychology: it shows not only how we thought about mothers, but also how we thought about monkeys. It shows how dominant psychological perspectives influence our lives – in his day, behaviourism or psychoanalysis; in our day, genetics and brain chemistry – seeping into the questions we ask and the studies we do. The classics are living history, and we are not at the end of history by any means.
––––––––––––––––––––––––––––––––
This is an edited version of a lecture given at the Association for Psychological Science's annual convention on May 24.
Bernstein or bust
The cast of On the Town (1949)
Book Details
Nigel Simeone, editor
THE LEONARD BERNSTEIN LETTERS
606pp. Yale University Press. £25 (US $38).
978 0 300 17909 5
A protean personality as displayed in his gossipy, intelligent correspondence
MORRIS DICKSTEIN
"Dear Pupil", wrote Aaron Copland to his twenty-two-year-old protégé Leonard Bernstein in 1940:
What terrifying letters you write: fit for the flames is what they are. Just imagine how much you would have to pay to retrieve such a letter forty years from now when you are conductor of the Philharmonic. Well it all comes from the recklessness of youth, that's what it is. Of course I don't mean that you mustn't write such letters (to me, that is), but I mustn't forget to burn them.
Bernstein was just beginning his conducting studies with Serge Koussevitzky, the long-time maestro of the Boston Symphony Orchestra, but those who knew him had already singled him out as the great hope of American music. "The Big Boys here", Bernstein had written a year earlier, "have it all decided that I am to become America's Great Conductor. They need an Apostle for their music." It took him only seventeen years, not forty, to become music director of the Philharmonic, frequently performing the work of American composers. Confirming Copland's warning, one unbalanced young musician did indeed try to blackmail him with his letters, though to no avail.
Far from burning Bernstein's gossipy, indiscreet letters, Copland crossed out a few names and saved these lively effusions – for some future archive and for this volume, a rich selection of letters to and from Bernstein, meticulously edited by Nigel Simeone. The book is a mine of research and helpful information. Every detail that needs explaining is tracked and annotated and every correspondent receives a capsule biography, enabling us to follow the life in vivid bits and pieces, especially in tandem with Humphrey Burton's excellent biography of 1994, where some of these letters first appeared.
Copland assumed, quite sensibly, that his reckless young friend was destined for greatness. Bernstein's gifts were already on display when they met at a party in 1937. It was Copland's thirty-seventh birthday and the Harvard undergraduate, a gifted and showy pianist, sat down at the keyboard and played Copland's difficult Piano Variations from memory. By 1939, when he graduated, Bernstein had also begun writing music and he wondered whether the Big Boys, by fastening on him as a conductor, wanted "to keep a rival composer out of the field". Soon he was studying conducting with the formidable Fritz Reiner at the Curtis Institute in Philadelphia, the best student Reiner ever had.
Copland became a mentor and lifelong friend – their correspondence is central to the first half of this collection – and Bernstein in turn would become the master interpreter of Copland's music, especially as it took a popular turn with ballet scores such as Billy the Kid, Rodeo and Appalachian Spring. Bernstein's first published work was his piano transcription of the infectious Latin rhythms of Copland's El Salón México. But Bernstein's own story – his open grid of possibility, his ambition and lusty appetite for life, his largely gay sexual adventures, even his self-absorption – fascinated Copland as much as the young man's plentiful gifts. "What a letter!" he writes from Hollywood in 1943. "I had a wonderful time with it – better than any novel. But now I want to read the next chapter." Ever determined to plumb his own depths, Bernstein was seeing a psychotherapist, as he would through most of his life; his musings about his unconscious motives made the more placid Copland wonder whether he too had "an inner psyche doing funny things without my knowing it".
The wunderkind was just on the cusp of being discovered by the world at large. In late August 1943, he was appointed assistant conductor of the New York Philharmonic and just two months later, when Bruno Walter was suddenly taken ill, Bernstein, without benefit of rehearsal, made headlines with a sensational performance broadcast on national radio. The press had been alerted, his devoted Boston Jewish family was in town, and the adulation he enjoyed became a drug he would always crave. A pattern was established: though composing was his deepest aspiration, and he took frequent sabbaticals to pursue it, he could never give up the podium for any length of time. He'd long since pinpointed one symptom of his problem: "my love of people. I need them all the time – every moment . . . . I cannot spend one day alone without becoming utterly depressed". Yet all his relationships seemed like one-night stands, transient, superficial. "I still hate being alone, and yet don't want anyone in particular", he wrote to Copland in 1942. "You're the only one that persists and persists, come hell or high water."
By the time of his debut with the Philharmonic, Bernstein had already written a symphony, Jeremiah, and a fine sonata for clarinet and piano, yet he felt restless and unfulfilled. In 1944, his work took an unexpected turn; he collaborated on a heady, rambunctious new ballet, Fancy Free, with the young Jerome Robbins. Its success soon led to a musical, On the Town, with choreography by Robbins, book and lyrics by two of Bernstein's closest friends, Betty Comden and Adolph Green – four effervescent, try-anything newcomers in their twenties reined in by one veteran director, George Abbott. The wartime setting, as in Fancy Free, was urban and contemporary – three sailors on shore leave in New York, looking for girls. The dancing was winningly fresh, the lyrics witty, and the vibrant jazz-inflected score ranged from the bluesy to the frenetic. The most memorable number, "New York, New York", buoyant with irrepressible energy, would serve as an enduring anthem for Bernstein's adopted city. The show ran for 463 performances.
Bernstein's letters reveal more about his protean personality than about his work. Strictly musical discussions are rare, for the letters are often the fruit of his travels, while his creative collaborations were usually face to face. Nigel Simeone's impeccable notes and introductions make this a virtual biography, but key moments, such as his separation from Felicia, barely figure, perhaps because they're too fraught, too sensitive, to be written about. But what the letters lack in narrative continuity they make up for with an immediacy of feeling, voice, direct exchange. The violinist Isaac Stern loved it when he and Bernstein talked "about life and music, about ideas, about family . . . . The greatest times I had with him were always alone. You could never talk with Lenny this way if one other person entered the room; [he] immediately went on stage". Alive with spontaneous intelligence, Leonard Bernstein's letters display exactly this unforced intimacy, though there were moments when he no doubt knew that posterity was listening in.
The cast of On the Town (1949)
Book Details
Nigel Simeone, editor
THE LEONARD BERNSTEIN LETTERS
606pp. Yale University Press. £25 (US $38).
978 0 300 17909 5
A protean personality as displayed in his gossipy, intelligent correspondence
MORRIS DICKSTEIN
"Dear Pupil", wrote Aaron Copland to his twenty-two-year-old protégé Leonard Bernstein in 1940:
What terrifying letters you write: fit for the flames is what they are. Just imagine how much you would have to pay to retrieve such a letter forty years from now when you are conductor of the Philharmonic. Well it all comes from the recklessness of youth, that's what it is. Of course I don't mean that you mustn't write such letters (to me, that is), but I mustn't forget to burn them.
Bernstein was just beginning his conducting studies with Serge Koussevitzky, the long-time maestro of the Boston Symphony Orchestra, but those who knew him had already singled him out as the great hope of American music. "The Big Boys here", Bernstein had written a year earlier, "have it all decided that I am to become America's Great Conductor. They need an Apostle for their music." It took him only seventeen years, not forty, to become music director of the Philharmonic, frequently performing the work of American composers. Confirming Copland's warning, one unbalanced young musician did indeed try to blackmail him with his letters, though to no avail.
Far from burning Bernstein's gossipy, indiscreet letters, Copland crossed out a few names and saved these lively effusions – for some future archive and for this volume, a rich selection of letters to and from Bernstein, meticulously edited by Nigel Simeone. The book is a mine of research and helpful information. Every detail that needs explaining is tracked and annotated and every correspondent receives a capsule biography, enabling us to follow the life in vivid bits and pieces, especially in tandem with Humphrey Burton's excellent biography of 1994, where some of these letters first appeared.
Copland assumed, quite sensibly, that his reckless young friend was destined for greatness. Bernstein's gifts were already on display when they met at a party in 1937. It was Copland's thirty-seventh birthday and the Harvard undergraduate, a gifted and showy pianist, sat down at the keyboard and played Copland's difficult Piano Variations from memory. By 1939, when he graduated, Bernstein had also begun writing music and he wondered whether the Big Boys, by fastening on him as a conductor, wanted "to keep a rival composer out of the field". Soon he was studying conducting with the formidable Fritz Reiner at the Curtis Institute in Philadelphia, the best student Reiner ever had.
Copland became a mentor and lifelong friend – their correspondence is central to the first half of this collection – and Bernstein in turn would become the master interpreter of Copland's music, especially as it took a popular turn with ballet scores such as Billy the Kid, Rodeo and Appalachian Spring. Bernstein's first published work was his piano transcription of the infectious Latin rhythms of Copland's El Salón México. But Bernstein's own story – his open grid of possibility, his ambition and lusty appetite for life, his largely gay sexual adventures, even his self-absorption – fascinated Copland as much as the young man's plentiful gifts. "What a letter!" he writes from Hollywood in 1943. "I had a wonderful time with it – better than any novel. But now I want to read the next chapter." Ever determined to plumb his own depths, Bernstein was seeing a psychotherapist, as he would through most of his life; his musings about his unconscious motives made the more placid Copland wonder whether he too had "an inner psyche doing funny things without my knowing it".
The wunderkind was just on the cusp of being discovered by the world at large. In late August 1943, he was appointed assistant conductor of the New York Philharmonic and just two months later, when Bruno Walter was suddenly taken ill, Bernstein, without benefit of rehearsal, made headlines with a sensational performance broadcast on national radio. The press had been alerted, his devoted Boston Jewish family was in town, and the adulation he enjoyed became a drug he would always crave. A pattern was established: though composing was his deepest aspiration, and he took frequent sabbaticals to pursue it, he could never give up the podium for any length of time. He'd long since pinpointed one symptom of his problem: "my love of people. I need them all the time – every moment . . . . I cannot spend one day alone without becoming utterly depressed". Yet all his relationships seemed like one-night stands, transient, superficial. "I still hate being alone, and yet don't want anyone in particular", he wrote to Copland in 1942. "You're the only one that persists and persists, come hell or high water."
By the time of his debut with the Philharmonic, Bernstein had already written a symphony, Jeremiah, and a fine sonata for clarinet and piano, yet he felt restless and unfulfilled. In 1944, his work took an unexpected turn; he collaborated on a heady, rambunctious new ballet, Fancy Free, with the young Jerome Robbins. Its success soon led to a musical, On the Town, with choreography by Robbins, book and lyrics by two of Bernstein's closest friends, Betty Comden and Adolph Green – four effervescent, try-anything newcomers in their twenties reined in by one veteran director, George Abbott. The wartime setting, as in Fancy Free, was urban and contemporary – three sailors on shore leave in New York, looking for girls. The dancing was winningly fresh, the lyrics witty, and the vibrant jazz-inflected score ranged from the bluesy to the frenetic. The most memorable number, "New York, New York", buoyant with irrepressible energy, would serve as an enduring anthem for Bernstein's adopted city. The show ran for 463 performances.
Writing for the theatre, for a popular audience, brought out the best in Bernstein even as it kept solitude and depression at bay. He relished collaborating with people as talented as Abbott or Robbins, sponging up what they knew with an astonishing facility. This cross-pollination became one key to his success. Like George Gershwin, he worked with sophisticated lyricists who steered clear of the clichés of the musical theatre, particularly those soggy boy-meets-girl love songs. Thanks to the triumph of Oklahoma!, the Broadway musical was shifting direction when Bernstein arrived. The slapdash shows of the 1920s and 30s, composed loosely of song, dance and spectacle, were giving way to the book musical, with song and dance embedded in character and integrated into the story. With Oklahoma!, Rodgers and Hart had been replaced by Rodgers and Hammerstein, just as Bernstein and his collaborators were paving the way for Hammerstein's protégé, Stephen Sondheim, who kicked off his illustrious career by writing the lyrics for Bernstein's strongest musical, West Side Story.
Sondheim, like Bernstein, would be blamed for not writing hits, hummable tunes that could be extracted from their context and take off on their own – in recordings and jukeboxes, in sheet music and on the radio, where so much money lay. In The Joy of Music (1959), Bernstein had responded defensively with an imaginary dialogue, "Why Don't You Run Upstairs and Write a Nice Gershwin Tune?" Gershwin, he says, "really had the magic touch. Gershwin made hits, I don't know how. Some people do it all the time, like breathing". His imagined accuser persists: "Your songs are simply too arty. . . . A special little dissonant effect in the bass may make you happy, and maybe some of your highbrow friends, but it doesn't help to make a hit". (Hollywood had in fact been leery of Gershwin for being too artful; this made him swear, tongue in cheek, that his only goal was to write hits.) The charge points to Bernstein's strength as a theatrical composer. What he lacks in sheer melodic invention he makes up for in harmonic subtlety and complexity. For Paul Bowles, a composer before he became a writer, On the Town had "an epoch-making score" and "its instrumentation was phenomenal in its cleverness". To Bernstein, a Gershwin showpiece like Rhapsody in Blue remained a collection of moveable parts, lacking the structure and organic development that was the glory of the symphonic tradition. "Composing is a very different thing from writing tunes." Yet Bowles saw Bernstein as Gershwin's heir: "No other composer combines the same kind of nervous energy and the same incredible degree of facility".
As music drama, West Side Story is Bernstein's Porgy and Bess, the crossover work, hard to classify, in which the writer transcended himself. Gershwin came to this territory from Tin Pan Alley, having gradually mastered orchestral writing along the way. Bernstein arrived from classical composition, passing through musical comedy and operetta. It's no accident that dance looms large in Bernstein's best shows. He provided the musical base for Robbins's choreography in Fancy Free, On the Town and West Side Story, orchestral writing with strong, syncopated rhythms, almost palpable drama, and a strikingly vernacular sound. The fierce urban tension in the music for West Side Story also infuses his memorable score for Elia Kazan's film On the Waterfront. He adapted both into symphonic suites. His most frequently performed concert work actually originated on Broadway, the four-minute overture to Candide in 1956.
Both the ingenious yet flawed Candide, musically rich (too rich, some said), and the succès fou of West Side Story came at the end of the freelance years that followed Bernstein's sensational debut at Carnegie Hall. After On the Town, Koussevitzky read him the riot act, insisting that he stick to conducting and the concert hall, and Bernstein, ever deferential to his first patron, stayed away from the theatre as long as Koussevitzky was alive. (He died in 1951.) Instead, he became everyone's favourite guest conductor, building his repertoire, preserving some of his freedom to compose, sometimes leading as many as a dozen orchestras in a single year. For him conducting was a form of lovemaking, an overflow of passion, and he took pleasure in seducing yet another set of players as well as a new audience.
Two urgent commitments influenced Bernstein's conducting life during these years. As the first American-born conductor to gain wide acceptance in many European cities (or American cities, for that matter) he proselytized for American composers whose work was still only rarely given a hearing. Sometimes he overstepped the mark, as when he programmed one of his own pieces for a Boston Symphony concert, after which Koussevitzky, who preferred Bernstein's conducting to his composing, turned on him:
May I ask you: do you think that your composition is worthy of the Boston Symphony Orchestra and the Boston organization? Can it be placed on the same level as Beethoven, Schubert, Brahms, Stravinsky, Prokofieff, Bartók or Copland?
As usual with Koussevitzky, Bernstein sent a humble, breast-beating reply:
Why do these misunderstandings happen? Is there an evil element in my nature that makes me do and say immoral things? . . . Whenever I conduct in Boston I am conducting for you, deep inside, and whatever I may do well is a tribute to you. My main concern is to make you proud of me, and justified in all your efforts for me.
Yet Koussevitzky himself was a vigorous supporter of contemporary composers, and within a few years he commissioned and premiered Bernstein's second symphony, loosely based on W. H. Auden's The Age of Anxiety. Bernstein had eaten humble pie out of deep respect, even reverence, but also from a canny self-interest, for the older man had taken him under his wing and done more than anyone to advance his career.
The other highlight of Bernstein's conducting Wanderjahre was his excited discovery of Israel; it seems to have triggered a renewal of his Jewish identity. He had grown up in a family in which religion mattered – his father was a Talmud student who came from a long line of rabbis – but cultural identification mattered even more. He had already set Hebrew texts from Lamentations in Jeremiah and would deploy biblical and liturgical texts in major compositions such as his Kaddish and Chichester Psalms. He was drawn to Israel and its musicians, many of them refugees from Nazi Europe, even before the state was declared. As early as 1947 he wrote to Koussevitzky:
If you ever wanted to be involved in a historical moment, this is it. The people are remarkable; life goes on in spite of police, bombs, everything. There is a strength and devotion in these people that is formidable. They will never let their land be taken from them; they will die first. And the country is beautiful beyond description.
He returned in 1948, during the Arab–Israeli war, to conduct forty concerts in sixty days, sometimes with artillery fire booming in the background. He wrote home of the beauty of Jerusalem, the heroism and privation of the people, and was tempted to sign on as principal conductor of the newly renamed Israel Philharmonic. He always maintained a link with the orchestra, a kind of family connection that meant the world to him, yet he would also perform with former Nazis in Vienna, one of the adopted cities of his later years. For him, it was the music that mattered most, the music and the tumultuous acclaim he invariably received.
In these years, questions of family and identity loomed large for him. Writing from Israel in 1950 to his sister, Shirley, he describes them as "years of compulsive living, of driving headlong down alleys of blind patterns, dictated by God knows what vibrations". In 1946, he met a striking woman, Felicia Montealegre Cohn, a Chilean piano student and aspiring actress, half-Jewish, but educated by nuns and raised as a Catholic, and they got engaged, much to his parents' distress. Always close to his family – and especially to his sister and brother, with whom he spoke an affectionate private language – the footloose Bernstein felt the pull of home and family. But Bernstein was bisexual, ambivalent, and the engagement foundered within a year. Still, he reckoned that his career, if he were ever to head a major orchestra, demanded a spouse, whatever his sexual inclinations. Three years later, though she was now with another man, he found himself thinking of Felicia again, testing himself. As he told his sister, who kept in touch with her:
I have been engaged in an imaginary life with Felicia, having her by my side on the beach as a shockingly beautiful Yemenite boy passes – inquiring into that automatic little demon who always springs into action at such moments.
Within a year Felicia's lover had died, and she and Bernstein were engaged again and soon married. He wondered "what security she will manage to find in a marriage contracted in insecurity". How much she knew about his past life is not clear, but after some difficult months she laid out the grounds of their relationship:
You are a homosexual and may never change. . . . I am willing to accept you as you are, without being a martyr or sacrificing myself on the L.B. altar. (I happen to love you very much – this may be a disease and if it is what better cure?) . . . A companionship will grow which probably no one else may be able to offer you. The feelings you have for me will be clearer and easier to express – our marriage is not based on passion but on tenderness and mutual respect. Why not have them?
What stabilized their marriage and turned it into more of a love story was the arrival of their children, eventually three of them, which brought out the exuberant paterfamilias in Bernstein, and replicated the cherished family structure in which he had grown up. It also provided the social respectability he needed, both professionally and psychologically. Felicia became his closest friend, the indispensable anchor of his peripatetic life, replacing Copland as his confessor. Bernstein wrote some of his best letters to her, warm, loving and keenly observant, though also rife with self-dramatization.
I miss you terribly, and love your letters. They carry a whiff of something warm and joyful and familiar. Imagine – after three years: joyful! Is it wonderful: home has always been the spot in which I happened to be: now it is a place, with all that one place connotes . . . . A new experience.
This idyll came to an end twenty-five years later, in 1976, when he left her to be with a man. "You're going to die a bitter and lonely old man", she reportedly said to him. They were reconciled when she fell ill with cancer, and her death soon afterwards left him rudderless, plagued by feelings of guilt and remorse for the rest of his life.
Though the post left him little time to compose, Bernstein's rock-star fame only increased when he became conductor of the New York Philharmonic in 1958. Some said his gestural and emotive conducting style was directed more at the audience than at the players, but no one was better at shaping an orchestral interpretation or enabling ordinary concert-goers to grasp it. He inhabited the music as if possessed by it. Though he never satisfied the New York Times critic, Harold Schonberg, or the atonal avant-garde, his tenure was a success by any measure: more American and contemporary music, ambitious touring and over 200 recordings, a younger, more enthusiastic audience. In preview performances and in televised Young People's Concerts, he demonstrated that he could expound music colloquially to laymen as well as he could perform it.
Bernstein's genius as an educator had been evident earlier, especially in Koussevitzky's summer music programmes at Tanglewood, in western Massachusetts. In 1954 he had taken on a new role that made him the voice and face of classical music in America. As part of a television series called Omnibus, funded by the Ford Foundation, he took viewers through the intricacies of Beethoven's Fifth, using the composer's discarded sketches to show them exactly the kind of development he found missing in Gershwin's Rhapsody in Blue: a symphonic progress in which every note feels inevitable. Bernstein was soon acknowledged as one of the best village explainers to set foot on the classical terrain. He was able to think from the audience's point of view, asking – and answering – questions at once simple and fundamental: what makes modern music modern? What are the musical constituents of jazz? How did the American musical evolve into a great indigenous form? He never talked down; instead he broke things down, always focusing on the notes, not merely the circumstances. It is hardly a surprise that the best of these visual essays is on the art of conducting, which ranges from the use of the baton to "the intangibles, the deep magical aspect of conducting", the passion, the timing, the sculptural sense of proportion that enables a single intelligence to mould so many disparate players and instruments into a satisfying whole.
After eleven years with the Philharmonic, Bernstein spent the best part of his life refining his conducting into utter mastery. The jury is still out on which of his own classical compositions might survive. Copland himself, early on, was in two minds about Bernstein's music, praising his "vibrant rhythmic invention" and "immediate emotional appeal" but seeing it, at its worst, as "conductor's music – eclectic in style and facile in inspiration". Bernstein's Broadway musicals, even the ill-fated Candide, have been repeatedly revived, sometimes by altering or updating the book. He became too famous, too busy, too set in his ways to write the kind of fluent, arresting letters he once did, and the later letters lack the appeal of the gifted young man's self-absorption. "Some day, preferably soon, I simply must decide what I'm going to be when I grow up", he writes in 1955.
From the 1980s, his last decade, when he continued a frenetic round of international travel, lionized everywhere he went, there are hardly any letters at all. One thing was not in doubt: his conducting became ever more personal and probing, especially with the composers with whom he identified. His role in the Mahler revival was critical. He felt akin to Mahler's grandiosity and spiritual ambition, his Jewish origins, his role as a culminating figure in the Austro-German symphonic tradition, but above all to what Bernstein described as Mahler's "doubleness", his "straddling", as a man split between composing and conducting, between sweeping, resonant emotions and kitsch. As early as 1950, Bernstein had written, "Sometimes I feel clearly that his difficulties were the same as my own. Mahler was also 'possessed' by music and his compositions, too, originated during his occupations as a conductor. . . . . With works by Mahler I seem to be playing some of my own". Though "born in the lap of Gershwin and Copland", he wrote to Karl Böhm, he became an "adopted son" of European music, as much at home in Vienna and Amsterdam as in New York and Israel.
Bernstein's letters reveal more about his protean personality than about his work. Strictly musical discussions are rare, for the letters are often the fruit of his travels, while his creative collaborations were usually face to face. Nigel Simeone's impeccable notes and introductions make this a virtual biography but key moments, such as his separation from Felicia, barely figure, perhaps because they're too fraught, too sensitive, to be written about. But what the letters lack in narrative continuity they make up for with an immediacy of feeling, voice, direct exchange. The violinist Isaac Stern loved it when he and Bernstein talked "about life and music, about ideas, about family . . . . The greatest times I had with him were always alone. You could never talk with Lenny this way if one other person entered the room; [he] immediately went on stage". Alive with spontaneous intelligence, Leonard Bernstein's letters display exactly this unforced intimacy, though there were moments when he no doubt knew that posterity was listening in.
Wednesday, 16 July 2014
The science of humans
Charles Seligman; from the book under review
Book Details
Ben Shephard
HEADHUNTERS
The search for a science of the mind
323pp. Bodley Head. £25.
978 1 84792 188 8
A network of eccentric, wilful, brilliant men prepared to try anything to advance the scientific understanding of human nature
ADAM KUPER
In the early twentieth century, a handful of Cambridge men, young medical doctors mostly, established modern anthropology, neuroscience, psychology and psychotherapy in Britain. Ben Shephard sums up their quest as "a search for a science of the mind", which was certainly a large part of it, but they were interested in a great many other things as well. They were close associates who influenced one another, but it would be a mistake to exaggerate the coherence of their projects or the extent to which they shared a common sense of what they were after. Because they were so eclectic and ranged so widely, they were not installed as ancestor figures in the disciplines into which the human sciences were beginning to fragment, even if they were influential in the committees that helped to shape the new professional institutions. Their names are therefore mostly unfamiliar today. Shephard rescues them from the oubliette of disciplinary histories and presents them as members of a cohort: a network of eccentric, wilful, brilliant men who were prepared to go anywhere, try anything, to advance the scientific understanding of human nature.
Central members of this cohort were brought together in the 1898 "Cambridge Anthropological Expedition to Torres Straits", that narrow stretch of sea, with numerous islands, which separates Australia and New Guinea. The expedition was organized by a zoologist, Alfred Haddon. Encouraged by Thomas Huxley, Haddon had visited the Torres Strait a year earlier to study tropical fauna and coral reefs. He became interested in the islanders and was tempted to take up anthropology, although Huxley did warn him that nobody could make a living at it. (Mrs Haddon pluckily remarked that "you may as well starve as an anthropologist as a zoologist".) Haddon then put together a scientific expedition to the Strait. The islanders seemed particularly interesting since they might constitute a link between the peoples of New Guinea and Indonesia and the Australian Aborigines, who represented for the Victorians the archetypal savage hunter-gatherers. And their origins were mysterious. The Cambridge team would study the anatomy, physiology, sense perception and sociology of the natives.
Haddon lectured on comparative anatomy in the Natural Sciences department in Cambridge alongside a young doctor, W. H. R. Rivers, who lectured on the sense organs. Rivers had studied experimental psychology at the leading school in the world, in Jena, in Germany, and developed an unfashionable interest in mental illnesses. Haddon invited him to lead the psychological side of the expedition, but Rivers hesitated until his two favourite students, William McDougall and C. S. Myers, signed on. He then told Haddon that after the recent death of his mother he felt run down and in need of a holiday. He would come along and pay his own way. Charles Seligman, a pathologist who was a friend and contemporary of Myers, also volunteered and was directed to join Rivers, Myers and McDougall in doing experiments on the sight and hearing of the islanders.
"I put the direction of the psychological department entirely into the hands of Rivers", Haddon reported, "and for the first time psychological observations were made on a backward people in their own country by trained psychologists with adequate equipment." Despite Haddon's optimism, their equipment turned out to be far from adequate in tropical conditions and they ended up doing quite simple experiments, but the results appeared to show that the sense perception of the islanders was not very different from that of Europeans. They might be better able to discriminate birds at a distance, but only because they had been trained to do so. Their colour vocabulary was undeveloped, but they could pick out the same shades of difference as an average Englishman. Personalities, too, were familiar. "The character of the natives appears to be as diverse as it would be in any English town", Myers remarked. Their cook was "the sturdy, plodding thick-set workman". Another man was "the best example of the high-strung nervous type: his excitement when he is telling a story is remarkable". And then there was Ulai, "the personification of cunning", who grasped what the expedition was after and sold Rivers a collection of rods that recorded his sexual conquests, explaining that at his age he would not be adding to their number.
Haddon directed the ethnological research. One coup was to persuade the Murray islanders to revive a male initiation ceremony that the missionaries had banned. Rivers found himself drawn to sociological issues, especially family and kinship. "While going over the various names which one man would apply to others, I was occasionally told that such and such a man would stop a fight, another would bury a dead man, and so on", he noted. "When the clues given by these occasional remarks were followed up it was found that there were certain very definite duties and privileges attached to certain bonds of kinship." Rivers went on to become the leading kinship theorist of his generation.
Studying hearing, Myers and Seligman collected recordings of local music. Then Seligman went off to do some ethnological work of his own in New Guinea, and Myers and McDougall made a difficult journey to Sarawak, where the administrator, Charles Hose, was orchestrating a peace conference between two groups of headhunters. Myers found it all rather trying, and objected particularly to having to do his daily rounds surrounded by pigs. "How much this distracts from the pleasures of defecation and subsequent pig eating", he complained. Nor could he get on with the local people. His one achievement was the collection of music recordings. McDougall fitted in rather better with the swashbuckling Hose and enjoyed the company of the headhunters. He was tempted to make a career of anthropology. However, he decided that it was too easy and instead went to Germany to master the latest experimental methods in psychology.
Another Cambridge friend of Rivers was a young Australian doctor, Grafton Elliot Smith, who had been studying the brains of marsupials. When the Haddon team set out for the Torres Strait, Elliot Smith went to Cairo as professor of anatomy at the medical school. Rivers turned up in Egypt in 1900, to investigate the colour vision of the Egyptians in order to establish a comparison with the Torres Strait islanders. His experimental subjects were labourers working for another former member of the Torres Strait team, Anthony Wilken, who was doing archaeological research. Rivers brought Elliot Smith in to examine the human remains that Wilken had uncovered, and especially the well-preserved brains of pre-dynastic people. Elliot Smith soon became a world authority on the human brain. He modelled the brain's structure as though it was an archaeological site, the different levels supposedly reflecting evolutionary advances. The neocortex, shared by all mammals, controlled basic functions while the prefrontal area was the seat of more advanced abilities.
Elliot Smith also began to develop theories about Egyptology and anthropology. His most famous idea was that the wonders of ancient Egypt were created not by Africans but by northern invaders, and that civilization had then spread from Egypt to all the corners of the world. Rivers was impressed, and he now decided that Melanesia had been transformed by an invasion of more advanced peoples. Seligman, who had gone on to do ethnological fieldwork in the Sudan, also adopted Elliot Smith's thesis and argued that the most advanced African civilizations were the work of light-skinned "Hamitic" invaders who had passed through Egypt. The great African pro-consul, Lord Lugard, was a convert to this doctrine, which had an obvious appeal to colonialists.
Rivers, more and more absorbed by ethnology and engaged in writing his great work, The History of Melanesian Society, found the time to collaborate in a famous experiment with his friend Henry Head, another young doctor who had studied experimental psychology in Germany and Czechoslovakia. Investigating the perception of pain, Head had two cutaneous nerves on his left forearm severed. Every Friday for the next four years, he visited Rivers in his college rooms to chart the process of regeneration and the areas of acute sensitivity. Echoing Elliot Smith's ideas about the evolutionary levels of the brain, Rivers and Head decided that the nervous system contained two layers: one older and more primitive; the other more subtle and localized. They speculated that the two systems "owed their origin to the developmental history of the nervous system. They reveal the means by which an imperfect organism has struggled towards improved functions and physical unity". And this "could be seen as a metaphor for the triumph of civilization over savagery in human history". Frederic Bartlett, a student of Rivers who went on to become a leading psychologist in the next generation, noted that this metaphor informed all Rivers's later theories in physiology, psychology and anthropology. The structure of every human organ, every social institution, revealed cumulative layers of progressive development.
Myers and McDougall stuck with psychology but developed very different approaches. Myers became an anti-racist and a humanist. McDougall was a biological determinist, although he did become president of the English Society for Psychical Research.
Like Rivers, Myers initially took an evolutionary view: "The primitive mind first or the child mind, if you like; then the industrial mind; and the abnormal last of all: that seems to me the natural order, since each in a sense implies the last". He moved from St Bartholomew's Hospital in London to Cambridge to work with Rivers, and set up a small laboratory in a cottage in Mill Lane, funded by a gift from his mother. Here he worked on the psychological basis of rhythm in music.
Psychology was looked down on by the Cambridge establishment, but Ludwig Wittgenstein was intrigued and regularly came to Mill Lane to work with Myers. "I had a discussion with Myers about the relations between Logic and Philosophy", he wrote to Bertrand Russell. "I was very candid and I am sure he thinks that I am the most arrogant devil who ever lived . . . . I think he was a bit less confused after the discussion than before." When the laboratory was opened to the public in 1913, Wittgenstein exhibited an apparatus for investigating the perception of rhythm. Perhaps influenced by Wittgenstein, Myers was moving away from biological determinism. The physiologists, he complained, "in their attempts to penetrate the reality of the known, were deliberately ignoring the knower". Experimental psychology made unrealistic assumptions, he came to believe. "The factor of feeling was expressly eliminated from our experiments on memory." Together with Rivers, Myers lobbied, unsuccessfully, for Cambridge to establish a mental health centre. He also criticized racial theories. The "mental characters" of a European peasant were "essentially the same as those of primitive communities", he insisted. "In temperament we meet the same variations in primitive as in civilized communities."
McDougall, in contrast, remained committed to biological explanations of human behaviour. He moved to Oxford, championed eugenics, and began to theorize about race and instinct. A quirky and difficult man, who had quarrelled with most of his associates, McDougall made a satisfactory marriage with the daughter of a chimney sweep in 1900 and began to repair his strained relationships with Rivers and Myers. In 1901, the three men were instrumental in the establishment of the British Psychological Society. And they shared a commitment to develop the treatment of mental illnesses. Psychoanalytic theories came to their attention with the publication of lectures Freud had delivered in 1910 at Clark University in Massachusetts. In 1913, Jung attended a medical congress in London and impressed McDougall. He arranged to visit Jung for analysis, but was frustrated by the outbreak of the First World War.
The war brought them all together again. The cohort of medical doctors, anthropologists and psychologists was now deployed in a new and contentious field: the treatment of soldiers traumatized by battle. (Shephard has published a study of military psychiatry, and this section of his book is particularly strong.) Myers had gone to France immediately the war broke out, and French psychiatrists had introduced him to soldiers who were suffering from strange symptoms – struck dumb, paralysed without any evident physical cause, or suffering from a total loss of memory. He concluded that these symptoms might sometimes be caused by the stress of battle, but that they might also have a physiological basis, resulting from close proximity to heavy explosions. In 1915, he published a description of "shell shock" in the British Medical Journal. In the following year he was appointed "Specialist in Nerve Shock" to the British army with the temporary rank of major.
The diagnosis was, however, vague, and it was also problematic in operational terms. One in six shell-shock patients were officers (who constituted only one in thirty of the fighting men). And when men diagnosed with shell shock were being evacuated from the front line, even returned to Britain, there were suspicions of faking. Manpower losses from shell shock became a serious drain. In one fortnight at the height of the Battle of the Somme, 2 Division had 2,400 wounded men and 501 cases of "shell-shock wounded". Myers was sidelined, clinics were set up near the front line to rehabilitate traumatized men, and the diagnosis of shell shock was shelved in favour of the pre-war categories of neurasthenia and hysteria.
But large numbers of soldiers had returned to Britain, wounded or psychologically incapacitated, and suffering acutely from depression: "nervous wrecks" was the common description. McDougall found himself "the head of a hospital section full of 'shell shock' cases, a most strange, wonderful and pitiful collection of nervously disordered soldiers". Elliot Smith was posted to Maghull hospital near Liverpool, and recruited Rivers. They experimented with various methods, including the new talking therapy favoured by Freud and Jung, and began to pay special attention to dreams.
Rivers moved at the end of 1916 to Craiglockhart, in Scotland, a facility for officers who had been diagnosed with shell shock. He cherry-picked the most interesting cases, one of whom, famously, was Siegfried Sassoon, who had been sent there after making an anti-war protest. Sassoon recalled that at their first session he asked Rivers if he was suffering from shell shock.
"Certainly not", he replied.
"What have I got then?"
"Well, you appear to be suffering from an anti-war complex." We both of us laughed at that.
Rivers engaged critically with Freudian ideas. He reworked Freud's dream theory in a neglected classic, Conflict and Dreams, which was published posthumously, edited by Elliot Smith. But as Sassoon apparently recognized, Rivers could well have featured as a classical Freudian case study himself, tormented by homosexual desires. In 1914, Rivers had abandoned a young student, John Layard, on a remote Melanesian island, evidently unable to cope with a physical attraction that Layard himself was willing to express. "Rivers had obviously not recognized the whole homosexual content of our relationship, probably on both sides", Layard remarked. He also recalled that Rivers had "immense quantities of clever young men around him. He told me once that it was only the affection of young men that kept him alive". (Layard returned to England suffering from depression and was attended by Rivers; many years later he attempted suicide, was treated by Jung, and eventually published a Jungian study of dream analysis.)
Rivers also suffered greatly from the strain of treating young officers only so that they could be sent back to the front, and he had to retire owing to nervous exhaustion. In 1922, while fighting the general election in the London University seat on behalf of the Labour Party, he had a fatal heart attack. In the same year the founding texts of the new functionalist anthropology were published, and his speculative history of Melanesian society was consigned to the scrapheap.
More broadly, in the 1920s structural-functional accounts replaced speculative historical approaches in the natural and social sciences. The brain was now seen as an intricate working organism, its parts functionally specialized. Societies were machines for living in, not deposits of archaeological strata. Academic psychologists set out to win respect by making psychology over into an experimental natural science, abandoning therapy.
The members of the cohort did not have the best of luck in the post-war world. Elliot Smith became a famous anatomist, but his reputation was badly damaged by his endorsement of the Piltdown forgery. (A hoaxer had put together a modern cranium and an ape's jaw and teeth, which led Elliot Smith to conclude that early hominids had highly developed brains.) McDougall was appointed to a chair at Harvard in 1919, but his advocacy of instinct theory, eugenics and racial determinism did not go down well. Nor did his enthusiasm for psychic research. American psychology was now "behaviorist" and experimental, and McDougall was regarded as a Victorian relic, "still thinking of himself as an Englishman in the British colonies", colleagues complained. He lost a small fortune in the stock market crash of 1929 and was then swindled over an oil well. ("The professor is noted for his experiments on animal behavior through tests made mostly with rats", the New York Times noted with some glee. "Seems this time he got caught by two oily ones.") Seligman became professor of ethnology at the London School of Economics, but in the 1920s he was sidelined by the charismatic Bronisław Malinowski. Myers was held responsible for the shell shock fiasco. ("There's no such thing as shell shock", General George S. Patton was to insist. "It's an invention of the Jews.")
"Although many of the answers which Rivers, McDougall and Elliot Smith gave have proved to be wrong, because in their time the data to answer them did not exist, the questions they posed are still relevant", Ben Shephard suggests. In fact their theories were discredited and their research programmes declared obsolete. And so they have largely been forgotten. Only Rivers has had an unexpected afterlife, resurrected in the imaginations of Siegfried Sassoon and Pat Barker. Yet this cohort of open-minded, cosmopolitan, adventurous young doctors was surely more interesting, more creative and more admirable than most of the narrow specialists who succeeded them.
The war brought them all together again. The cohort of medical doctors, anthropologists and psychologists was now deployed in a new and contentious field: the treatment of soldiers traumatized by battle. (Shephard has published a study of military psychiatry, and this section of his book is particularly strong.) Myers had gone to France immediately the war broke out, and French psychiatrists had introduced him to soldiers who were suffering from strange symptoms – struck dumb, paralysed without any evident physical cause, or suffering from a total loss of memory. He concluded that these symptoms might sometimes be caused by the stress of battle, but that they might also have a physiological basis, resulting from close proximity to heavy explosions. In 1915, he published a description of "shell shock" in the British Medical Journal. In the following year he was appointed "Specialist in Nerve Shock" to the British army with the temporary rank of major.
The diagnosis was, however, vague, and it was also problematic in operational terms. One in six shell-shock patients was an officer (officers constituted only one in thirty of the fighting men). And when men diagnosed with shell shock were being evacuated from the front line, or even returned to Britain, there were suspicions of faking. Manpower losses from shell shock became a serious drain. In one fortnight at the height of the Battle of the Somme, 2 Division had 2,400 wounded men and 501 cases of "shell-shock wounded". Myers was sidelined, clinics were set up near the front line to rehabilitate traumatized men, and the diagnosis of shell shock was shelved in favour of the pre-war categories of neurasthenia and hysteria.
But large numbers of soldiers had returned to Britain, wounded or psychologically incapacitated, and suffering acutely from depression: "nervous wrecks" was the common description. McDougall found himself "the head of a hospital section full of 'shell shock' cases, a most strange, wonderful and pitiful collection of nervously disordered soldiers". Elliot Smith was posted to Maghull hospital near Liverpool, and recruited Rivers. They experimented with various methods, including the new talking therapy favoured by Freud and Jung, and began to pay special attention to dreams.
Rivers moved at the end of 1916 to Craiglockhart, in Scotland, a facility for officers who had been diagnosed with shell shock. He cherry-picked the most interesting cases, one of whom, famously, was Siegfried Sassoon, who had been sent there after making an anti-war protest. Sassoon recalled that at their first session he asked Rivers if he was suffering from shell shock.
"Certainly not", he replied.
"What have I got then?"
"Well, you appear to be suffering from an anti-war complex." We both of us laughed at that.
Rivers engaged critically with Freudian ideas. He reworked Freud's dream theory in a neglected classic, Conflict and Dream, which was published posthumously, edited by Elliot Smith. But as Sassoon apparently recognized, Rivers could well have featured as a classical Freudian case study himself, tormented by homosexual desires. In 1914, Rivers had abandoned a young student, John Layard, on a remote Melanesian island, evidently unable to cope with a physical attraction that Layard himself was willing to express. "Rivers had obviously not recognized the whole homosexual content of our relationship, probably on both sides", Layard remarked. He also recalled that Rivers had "immense quantities of clever young men around him. He told me once that it was only the affection of young men that kept him alive". (Layard returned to England suffering from depression. Rivers attended him, but many years later Layard attempted suicide; he was treated by Jung and eventually published a Jungian study of dream analysis.)
Rivers also suffered greatly from the strain of treating young officers only so that they could be sent back to the front, and he had to retire owing to nervous exhaustion. In 1922, while fighting the general election in the London University seat on behalf of the Labour Party, he had a fatal heart attack. In the same year the founding texts of the new functionalist anthropology were published, and his speculative history of Melanesian society was consigned to the scrapheap.
More broadly, in the 1920s structural-functional accounts replaced speculative historical approaches in the natural and social sciences. The brain was now seen as an intricate working organism, its parts functionally specialized. Societies were machines for living in, not deposits of archaeological strata. Academic psychologists set out to win respect by making psychology over into an experimental natural science, abandoning therapy.
The members of the cohort did not have the best of luck in the post-war world. Elliot Smith became a famous anatomist, but his reputation was badly damaged by his endorsement of the Piltdown forgery. (A hoaxer had put together a modern cranium and an ape's jaw and teeth, which led Elliot Smith to conclude that early hominids had highly developed brains.) McDougall was appointed to a chair at Harvard in 1919, but his advocacy of instinct theory, eugenics and racial determinism did not go down well. Nor did his enthusiasm for psychic research. American psychology was now "behaviorist" and experimental, and McDougall was regarded as a Victorian relic, "still thinking of himself as an Englishman in the British colonies", colleagues complained. He lost a small fortune in the stock market crash of 1929 and was then swindled over an oil well. ("The professor is noted for his experiments on animal behavior through tests made mostly with rats", the New York Times noted with some glee. "Seems this time he got caught by two oily ones.") Seligman became professor of ethnology at the London School of Economics, but in the 1920s he was sidelined by the charismatic Bronisław Malinowski. Myers was held responsible for the shell shock fiasco. ("There's no such thing as shell shock", General George S. Patton was to insist. "It's an invention of the Jews.")
"Although many of the answers which Rivers, McDougall and Elliot Smith gave have proved to be wrong, because in their time the data to answer them did not exist, the questions they posed are still relevant", Ben Shephard suggests. In fact their theories were discredited and their research programmes declared obsolete. And so they have largely been forgotten. Only Rivers has had an unexpected afterlife, resurrected in the imaginations of Siegfried Sassoon and Pat Barker. Yet this cohort of open-minded, cosmopolitan, adventurous young doctors was surely more interesting, more creative and more admirable than most of the narrow specialists who succeeded them.