April 7, 2009
The End of Philosophy
By DAVID BROOKS
Socrates talked. The assumption behind his approach to philosophy, and the approaches of millions of people since, is that moral thinking is mostly a matter of reason and deliberation: Think through moral problems. Find a just principle. Apply it.
One problem with this kind of approach to morality, as Michael Gazzaniga writes in his 2008 book, “Human,” is that “it has been hard to find any correlation between moral reasoning and proactive moral behavior, such as helping other people. In fact, in most studies, none has been found.”
Today, many psychologists, cognitive scientists and even philosophers embrace a different view of morality. In this view, moral thinking is more like aesthetics. As we look around the world, we are constantly evaluating what we see. Seeing and evaluating are not two separate processes. They are linked and basically simultaneous.
As Steven Quartz of the California Institute of Technology said during a recent discussion of ethics sponsored by the John Templeton Foundation, “Our brain is computing value at every fraction of a second. Everything that we look at, we form an implicit preference. Some of those make it into our awareness; some of them remain at the level of our unconscious, but ... what our brain is for, what our brain has evolved for, is to find what is of value in our environment.”
Think of what happens when you put a new food into your mouth. You don’t have to decide if it’s disgusting. You just know. You don’t have to decide if a landscape is beautiful. You just know.
Moral judgments are like that. They are rapid intuitive decisions and involve the emotion-processing parts of the brain. Most of us make snap moral judgments about what feels fair or not, or what feels good or not. We start doing this when we are babies, before we have language. And even as adults, we often can’t explain to ourselves why something feels wrong.
In other words, reasoning comes later and is often guided by the emotions that preceded it. Or as Jonathan Haidt of the University of Virginia memorably wrote, “The emotions are, in fact, in charge of the temple of morality, and ... moral reasoning is really just a servant masquerading as a high priest.”
The question then becomes: What shapes moral emotions in the first place? The answer has long been evolution, but in recent years there’s been an increasing appreciation that evolution isn’t just about competition. It’s also about cooperation within groups. Like bees, humans have long lived or died based on their ability to divide labor, help each other and stand together in the face of common threats. Many of our moral emotions and intuitions reflect that history. We don’t just care about our individual rights, or even the rights of other individuals. We also care about loyalty, respect, traditions, religions. We are all the descendants of successful cooperators.
The first nice thing about this evolutionary approach to morality is that it emphasizes the social nature of moral intuition. People are not discrete units coolly formulating moral arguments. They link themselves together into communities and networks of mutual influence.
The second nice thing is that it entails a warmer view of human nature. Evolution is always about competition, but for humans, as Darwin speculated, competition among groups has turned us into pretty cooperative, empathetic and altruistic creatures — at least within our families, groups and sometimes nations.
The third nice thing is that it explains the haphazard way most of us lead our lives without destroying dignity and choice. Moral intuitions have primacy, Haidt argues, but they are not dictators. There are times, often the most important moments in our lives, when in fact we do use reason to override moral intuitions, and often those reasons — along with new intuitions — come from our friends.
The rise and now dominance of this emotional approach to morality is an epochal change. It challenges all sorts of traditions. It challenges the bookish way philosophy is conceived by most people. It challenges the Talmudic tradition, with its hyper-rational scrutiny of texts. It challenges the new atheists, who see themselves involved in a war of reason against faith and who have an unwarranted faith in the power of pure reason and in the purity of their own reasoning.
Finally, it should also challenge the very scientists who study morality. They’re good at explaining how people make judgments about harm and fairness, but they still struggle to explain the feelings of awe, transcendence, patriotism, joy and self-sacrifice, which are not ancillary to most people’s moral experiences, but central. The evolutionary approach also leads many scientists to neglect the concept of individual responsibility and makes it hard for them to appreciate that most people struggle toward goodness, not as a means, but as an end in itself.
April 27, 2009
End the University as We Know It
By MARK C. TAYLOR
GRADUATE education is the Detroit of higher learning. Most graduate programs in American universities produce a product for which there is no market (candidates for teaching positions that do not exist) and develop skills for which there is diminishing demand (research in subfields within subfields and publication in journals read by no one other than a few like-minded colleagues), all at a rapidly rising cost (sometimes well over $100,000 in student loans).
Widespread hiring freezes and layoffs have brought these problems into sharp relief now. But our graduate system has been in crisis for decades, and the seeds of this crisis go as far back as the formation of modern universities. Kant, in his 1798 work “The Conflict of the Faculties,” wrote that universities should “handle the entire content of learning by mass production, so to speak, by a division of labor, so that for every branch of the sciences there would be a public teacher or professor appointed as its trustee.”
Unfortunately this mass-production university model has led to separation where there ought to be collaboration and to ever-increasing specialization. In my own religion department, for example, we have 10 faculty members, working in eight subfields, with little overlap. And as departments fragment, research and publication become more and more about less and less. Each academic becomes the trustee not of a branch of the sciences, but of limited knowledge that all too often is irrelevant for genuinely important problems. A colleague recently boasted to me that his best student was doing his dissertation on how the medieval theologian Duns Scotus used citations.
The emphasis on narrow scholarship also encourages an educational system that has become a process of cloning. Faculty members cultivate those students whose futures they envision as identical to their own pasts, even though their tenures will stand in the way of these students having futures as full professors.
The dirty secret of higher education is that without underpaid graduate students to help in laboratories and with teaching, universities couldn’t conduct research or even instruct their growing undergraduate populations. That’s one of the main reasons we still encourage people to enroll in doctoral programs. It is simply cheaper to provide graduate students with modest stipends and adjuncts with as little as $5,000 a course — with no benefits — than it is to hire full-time professors.
In other words, young people enroll in graduate programs, work hard for subsistence pay and assume huge debt burdens, all because of the illusory promise of faculty appointments. But their economical presence, coupled with the intransigence of tenure, ensures that there will always be too many candidates for too few openings.
The other obstacle to change is that colleges and universities are self-regulating or, in academic parlance, governed by peer review. While trustees and administrations theoretically have some oversight responsibility, in practice, departments operate independently. To complicate matters further, once a faculty member has been granted tenure he is functionally autonomous. Many academics who cry out for the regulation of financial markets vehemently oppose it in their own departments.
If American higher education is to thrive in the 21st century, colleges and universities, like Wall Street and Detroit, must be rigorously regulated and completely restructured. The long process to make higher learning more agile, adaptive and imaginative can begin with six major steps:
1. Restructure the curriculum, beginning with graduate programs and proceeding as quickly as possible to undergraduate programs. The division-of-labor model of separate departments is obsolete and must be replaced with a curriculum structured like a web or complex adaptive network. Responsible teaching and scholarship must become cross-disciplinary and cross-cultural.
Just a few weeks ago, I attended a meeting of political scientists who had gathered to discuss why international relations theory had never considered the role of religion in society. Given the state of the world today, this is a significant oversight. There can be no adequate understanding of the most important issues we face when disciplines are cloistered from one another and operate on their own premises.
It would be far more effective to bring together people working on questions of religion, politics, history, economics, anthropology, sociology, literature, art and philosophy to engage in comparative analysis of common problems. As the curriculum is restructured, fields of inquiry and methods of investigation will be transformed.
2. Abolish permanent departments, even for undergraduate education, and create problem-focused programs. These constantly evolving programs would have sunset clauses, and every seven years each one should be evaluated and either abolished, continued or significantly changed. It is possible to imagine a broad range of topics around which such zones of inquiry could be organized: Mind, Body, Law, Information, Networks, Language, Space, Time, Media, Money, Life and Water.
Consider, for example, a Water program. In the coming decades, water will become a more pressing problem than oil, and the quantity, quality and distribution of water will pose significant scientific, technological and ecological difficulties as well as serious political and economic challenges. These vexing practical problems cannot be adequately addressed without also considering important philosophical, religious and ethical issues. After all, beliefs shape practices as much as practices shape beliefs.
A Water program would bring together people in the humanities, arts, social and natural sciences with representatives from professional schools like medicine, law, business, engineering, social work, theology and architecture. Through the intersection of multiple perspectives and approaches, new theoretical insights will develop and unexpected practical solutions will emerge.
3. Increase collaboration among institutions. Not all institutions need to do all things; technology makes it possible for schools to form partnerships to share students and faculty. Institutions will be able to expand while contracting. Let one college have a strong department in French, for example, and the other a strong department in German; through teleconferencing and the Internet both subjects can be taught at both places with half the staff. With these tools, I have already team-taught semester-long seminars in real time at the Universities of Helsinki and Melbourne.
4. Transform the traditional dissertation. In the arts and humanities, where looming cutbacks will be most devastating, there is no longer a market for books modeled on the medieval dissertation, with more footnotes than text. As financial pressures on university presses continue to mount, publication of dissertations, and with it scholarly certification, is almost impossible. (The average university press print run of a dissertation that has been converted into a book is less than 500, and sales are usually considerably lower.) For many years, I have taught undergraduate courses in which students do not write traditional papers but develop analytic treatments in formats from hypertext and Web sites to films and video games. Graduate students should likewise be encouraged to produce “theses” in alternative formats.
5. Expand the range of professional options for graduate students. Most graduate students will never hold the kind of job for which they are being trained. It is, therefore, necessary to help them prepare for work in fields other than higher education. The exposure to new approaches and different cultures and the consideration of real-life issues will prepare students for jobs at businesses and nonprofit organizations. Moreover, the knowledge and skills they will cultivate in the new universities will enable them to adapt to a constantly changing world.
6. Impose mandatory retirement and abolish tenure. Initially intended to protect academic freedom, tenure has resulted in institutions with little turnover and professors impervious to change. After all, once tenure has been granted, there is no leverage to encourage a professor to continue to develop professionally or to require him or her to assume responsibilities like administration and student advising. Tenure should be replaced with seven-year contracts, which, like the programs in which faculty teach, can be terminated or renewed. This policy would enable colleges and universities to reward researchers, scholars and teachers who continue to evolve and remain productive while also making room for young people with new ideas and skills.
For many years, I have told students, “Do not do what I do; rather, take whatever I have to offer and do with it what I could never imagine doing and then come back and tell me about it.” My hope is that colleges and universities will be shaken out of their complacency and will open academia to a future we cannot conceive.
Mark C. Taylor, the chairman of the religion department at Columbia, is the author of the forthcoming “Field Notes From Elsewhere: Reflections on Dying and Living.”
May 1, 2009
Genius: The Modern View
By DAVID BROOKS
Some people live in romantic ages. They tend to believe that genius is the product of a divine spark. They believe that there have been, throughout the ages, certain paragons of greatness — Dante, Mozart, Einstein — whose talents far exceeded normal comprehension, who had an other-worldly access to transcendent truth, and who are best approached with reverential awe.
We, of course, live in a scientific age, and modern research pierces hocus-pocus. In the view that is now dominant, even Mozart’s early abilities were not the product of some innate spiritual gift. His early compositions were nothing special. They were pastiches of other people’s work. Mozart was a good musician at an early age, but he would not stand out among today’s top child-performers.
What Mozart had, we now believe, was the same thing Tiger Woods had — the ability to focus for long periods of time and a father intent on improving his skills. Mozart played a lot of piano at a very young age, so he got his 10,000 hours of practice in early and then he built from there.
The latest research suggests a more prosaic, democratic, even puritanical view of the world. The key factor separating geniuses from the merely accomplished is not a divine spark. It’s not I.Q., a generally bad predictor of success, even in realms like chess. Instead, it’s deliberate practice. Top performers spend more hours (many more hours) rigorously practicing their craft.
The recent research has been conducted by K. Anders Ericsson, the late Benjamin Bloom and others. It’s been summarized in two enjoyable new books: “The Talent Code” by Daniel Coyle and “Talent Is Overrated” by Geoff Colvin.
If you wanted to picture how a typical genius might develop, you’d take a girl who possessed a slightly above average verbal ability. It wouldn’t have to be a big talent, just enough so that she might gain some sense of distinction. Then you would want her to meet, say, a novelist, who coincidentally shared some similar biographical traits. Maybe the writer was from the same town, had the same ethnic background, or shared the same birthday — anything to create a sense of affinity.
This contact would give the girl a vision of her future self. It would, Coyle emphasizes, give her a glimpse of an enchanted circle she might someday join. It would also help if one of her parents died when she was 12, infusing her with a profound sense of insecurity and fueling a desperate need for success.
Armed with this ambition, she would read novels and literary biographies without end. This would give her a core knowledge of her field. She’d be able to chunk Victorian novelists into one group, Magical Realists in another group and Renaissance poets into another. This ability to place information into patterns, or chunks, vastly improves memory skills. She’d be able to see new writing in deeper ways and quickly perceive its inner workings.
Then she would practice writing. Her practice would be slow, painstaking and error-focused. According to Colvin, Ben Franklin would take essays from The Spectator magazine and translate them into verse. Then he’d translate his verse back into prose and examine, sentence by sentence, where his essay was inferior to The Spectator’s original.
Coyle describes a tennis academy in Russia where they enact rallies without a ball. The aim is to focus meticulously on technique. (Try to slow down your golf swing so it takes 90 seconds to finish. See how many errors you detect.)
By practicing in this way, performers delay the automatizing process. The mind wants to turn deliberate, newly learned skills into unconscious, automatically performed skills. But the mind is sloppy and will settle for good enough. By practicing slowly, by breaking skills down into tiny parts and repeating, the strenuous student forces the brain to internalize a better pattern of performance.
Then our young writer would find a mentor who would provide a constant stream of feedback, viewing her performance from the outside, correcting the smallest errors, pushing her to take on tougher challenges. By now she is redoing problems — how do I get characters into a room — dozens and dozens of times. She is ingraining habits of thought she can call upon in order to understand or solve future problems.
The primary trait she possesses is not some mysterious genius. It’s the ability to develop a deliberate, strenuous and boring practice routine.
Coyle and Colvin describe dozens of experiments fleshing out this process. This research takes some of the magic out of great achievement. But it underlines a fact that is often neglected. Public discussion is smitten by genetics and what we’re “hard-wired” to do. And it’s true that genes place a leash on our capacities. But the brain is also phenomenally plastic. We construct ourselves through behavior. As Coyle observes, it’s not who you are, it’s what you do.
In his remarks at the Academy of Sciences in Lisbon, Portugal, on 08 May 2009, His Highness the Aga Khan highlighted areas of knowledge that are important today and for the future and that need to be developed and shared across the world.
Remarks by His Highness the Aga Khan at the Academy of Sciences, in Lisbon, Portugal
08 May 2009
Minister of National Defence, Mr Nuno Severiano Teixeira,
Minister of Culture, Mr Pinto Ribeiro,
Apostolic Nuncio, Rino Passigato,
Members of the Academy,
Ladies and Gentlemen
It is an immense honour for me to be here today and to have been admitted to the Academy of Sciences of Lisbon. It reminds me of the day on which I was given an honorary doctorate from the University of Evora. And I do not want to let this occasion pass, without recollecting that very, very special day.
You here in the Academy are guardians of old knowledge and developers of new knowledge. And I thought I would share with you today, very briefly, some of the reflections that have occurred to me since I have completed my journeys in the developing world during my Golden Jubilee.
I visited numerous countries in Africa, Asia and the Middle East and I came into contact with men and women who were intelligent, mature, responsible and who were seeking to build nation states – nation states which would be autonomous, which would be well governed, whose economies would be competent, but these builders were seeking to build on the basis of an enormous knowledge deficit. These men and women in public office simply did not have access to the demography of men and women who are sufficiently educated to be able to man the institutions of state.
And I have come away with another question stemming from my point that there is a deficit of knowledge. The key question is: a deficit of what knowledge? What knowledge is necessary in these environments, so that in the decades ahead we can look towards stable nation states around the world?
My conclusion was that the deficit of knowledge is in many areas which are not being offered in education, which are not being taught. Because what have been inherited are curricula of the past, reflections of the past, attitudes of the past, rather than looking forward and asking what future generations need to know. And that is the central question which needs to be asked, and on which an academy such as this can have such a massive impact.
Let me mention three areas. First of all, there is the nature of society in these countries. One of the characteristics of all these countries is that they have pluralist societies. And if pluralism is not part of the educational curriculum, the leaders and the peoples of these societies will always be at risk of conflict, because they are not accustomed to pluralism and they do not value it. People are not born valuing pluralism. Therefore pluralism is the sort of subject which needs to be part of education, from the youngest age onwards.
Another aspect is ethics: not ethics born of dogma, but ethics in civil society. Because when governments fail in these parts of the world, it is civil society which steps in to sustain development. And when ethics are not part of education, teaching and examinations; when they are not part of medicine and the quality of care; when they are not part of financial services, then civil society is undermined. Ethics in civil society is another aspect which is absolutely critical.
The third example is constitutionality. So many countries which I have visited have stumbled into difficulties in governance because the national constitutions were not designed and conceived to serve the profiles of those countries. And therefore, teaching in areas such as comparative government is another area which is absolutely critical.
If these are the subjects which are necessary today, what are the subjects which will be necessary tomorrow? Is the developing world going to continue in this deficit of knowledge? Or are we going to enable it to move forwards into new areas of knowledge? My conviction is that we have to help these countries move into new areas of knowledge. And therefore, I think of areas such as the space sciences, such as the neurosciences. There are so many new areas of inquiry; unless we make an effort to share them globally, vast populations around the world will continue in this knowledge deficit.
Portugal has an extraordinary history. It has been influencing the world for centuries. Your influence today is not limited to Europe. Your influence is massive through your presence in South America. A country like Brazil is a case study for many countries around the world. Brazil is dealing with new areas of knowledge in air transport – that is a new area of knowledge – competing with the best in the world in areas such as agriculture, the development of cash crops, and sugar at new levels of technology.
So the influence of Portugal and the capacity of Portugal to influence what is happening around the world is immense. And it is in this context that I want to thank you for electing me a member of the Academy and for the opportunity you have given me to encourage you to use your global influence, through your history, through your knowledge, through your contacts with the developing world, to bring to the rest of the world what is best in your knowledge.
May 16, 2009
Some Thoughts on the Lost Art of Reading Aloud
By VERLYN KLINKENBORG
Sometimes the best way to understand the present is to look at it from the past. Consider audio books. An enormous number of Americans read by listening these days — listening aloud, I call it. The technology for doing so is diverse and widespread, and so are the places people listen to audio books. But from the perspective of a reader in, say, the early 19th century, about the time of Jane Austen, there is something peculiar about it, even lonely.
In those days, literate families and friends read aloud to each other as a matter of habit. Books were still relatively scarce and expensive, and the routine electronic diversions we take for granted were, of course, nonexistent. If you had grown up listening to adults reading to each other regularly, the thought of all of those solitary 21st-century individuals hearkening to earbuds and car radios would seem isolating. It would also seem as though they were being trained only to listen to books and not to read aloud from them.
It’s part of a pattern. Instead of making music at home, we listen to recordings of professional musicians. When people talk about the books they’ve heard, they’re often talking about the quality of the readers, who are usually professional. The way we listen to books has been de-socialized, stripped of context, which has the solitary virtue of being extremely convenient.
But listening aloud, valuable as it is, isn’t the same as reading aloud. Both require a great deal of attention. Both are good ways to learn something important about the rhythms of language. But one of the most basic tests of comprehension is to ask someone to read aloud from a book. It reveals far more than whether the reader understands the words. It reveals how far into the words — and the pattern of the words — the reader really sees.
Reading aloud recaptures the physicality of words. To read with your lungs and diaphragm, with your tongue and lips, is very different than reading with your eyes alone. The language becomes a part of the body, which is why there is always a curious tenderness, almost an erotic quality, in those 18th- and 19th-century literary scenes where a book is being read aloud in mixed company. The words are not mere words. They are the breath and mind, perhaps even the soul, of the person who is reading.
No one understood this better than Jane Austen. One of the late turning points in “Mansfield Park” comes when Henry Crawford picks up a volume of Shakespeare, “which had the air of being very recently closed,” and begins to read aloud to the young Bertrams and their cousin, Fanny Price. Fanny discovers in Crawford’s reading “a variety of excellence beyond what she had ever met with.” And yet his ability to do every part “with equal beauty” is a clear sign to us, if not entirely to Fanny, of his superficiality.
I read aloud to my writing students, and when students read aloud to me I notice something odd. They are smart and literate, and most of them had parents who read to them as children. But when students read aloud at first, I notice that they are trying to read the meaning of the words. If the work is their own, they are usually trying to read the intention of the writer.
It’s as though they’re reading what the words represent rather than the words themselves. What gets lost is the inner voice of the prose, the life of the language. This is reflected in their writing, too, at first.
In one realm — poetry — reading aloud has never really died out. Take Robert Pinsky’s new book, “Essential Pleasures: A New Anthology of Poems to Read Aloud.” But I suspect there is no going back. You can easily make the argument that reading silently is an economic artifact, a sign of a new prosperity beginning in the early 19th century and a new cheapness in books. The same argument applies to listening to books on your iPhone. But what I would suggest is that our idea of reading is incomplete, impoverished, unless we are also taking the time to read aloud.
"Man came on earth uniquely endowed with individuality and free will. He was put here to evolve his intelligence, and thereby rediscover and express his true nature, the soul: a reflection of Spirit. He was to gradually develop his innate intelligence, not merely through books or lectures and sermons, but also through his own efforts to exercise his mind and improve the quality of his thoughts and actions."
What is your "Spiritual I.Q"? For years intelligence was understood in its most limited sense of book-learning; the accepted standard of smartness was how well one scored on the verbal and mathematical tests given in high school.
However, the reality of so many "smart" people failing and/or being unable to cope with life's challenges forced psychologists to expand the notion of intelligence to include additional abilities, such as interpersonal, emotional, and volitional skills.[1] Now Robert Emmons, Ph.D., a professor of psychology at the University of California, Davis, proposes that spirituality be considered as well.
In his book, The Psychology of Ultimate Concerns: Motivation and Spirituality in Personality, Dr. Emmons identifies five core characteristics of spiritual intelligence:
1) Ability to transcend the physical and material,
2) Ability to enter heightened states of spiritual consciousness,
3) Capacity to endow everyday activity with a sense of the sacred,
4) Use of spiritual resources on practical problems, and
5) Engaging in virtuous behavior (e.g., to show forgiveness, to express gratitude, to be humble, to display compassion, to exercise self-control).
To demonstrate how spiritually oriented lifestyles result in higher levels of cognitive function, Dr. Emmons considers two universally acknowledged spiritual qualities—humility and gratitude. Humility—defined not as low self-esteem but as the realistic appraisal of one's strengths and weaknesses, neither overestimating nor underestimating them— has been shown in research studies to improve problem-solving efficiency. Likewise with gratitude: Test groups practicing gratitude make more progress toward their goals than similar groups that do not practice it, suggesting, as in the case of humility, that the benefits of spiritual qualities extend beyond the domain of mood and well-being to those areas of life-functioning more associated with cognitive intelligence.
Dr. Emmons' book is filled to overflowing with research results showing that religious people tend to be happier, healthier, have more harmonious marriages, cope better with trauma, and experience fewer internal conflicts. This latter aspect is of particular interest to the author, who, in studying personal psychology, has long been aware of goal-setting's "Achilles heel"—its tendency to create internal conflicts. As with the person who is torn between getting ahead in the world and spending more time with his family, the holding of incompatible goals and desires is a major source of misery for many people—leading often to stress and illness. Religion deals with "ultimate concerns" that can arbitrate these conflicts, spiritualize lesser goals, and otherwise counteract the fragmenting effect of the many competing pressures in modern life.
Spiritual intelligence also helps us to understand religion better. Ordinarily, Dr. Emmons explains, one thinks of religion as something a person has. By focusing on behaviors, skills, and abilities, spiritual intelligence reminds us that religion can be a dynamic way of addressing the vital concerns of daily life. By investigating the "doing" side of religion, Dr. Emmons provides an invaluable first step in conceptualizing a spirituality that not only is, but does. "Religion and intelligence are two concepts that are not often uttered in the same breath," he writes. "Too often, a false dichotomy is set up between two extreme caricatures—one can either be a rational, logical, analytical, and skeptical thinker or a muddle-headed, touchy-feely, gullible spiritualist." His research demonstrates that "instead of forcing a choice between faith and reason, this way of thinking about spirituality recognizes that spiritual processing can contribute to effective cognitive functioning rather than precluding it."[2]
Of course, religion is more than just problem-solving. The value of wisdom, Dr. Emmons reminds us, is not merely in helping us live life, but, ultimately, in helping us rise above it. What spiritual intelligence does is provide us a helpful way of seeing past the myriad beliefs and myths of traditional religion to a universal spirituality accessible to all. He concludes: "If spiritual intelligence does indeed confer individual and societal advantages, if the world would be a better place if people were more 'spiritually intelligent,' the desirability and feasibility of strategic efforts to augment it ought to be investigated. Rather than forcing spirituality or religion on people because it is good for them or society, edifying people as to how spiritual and religious skills might lead to success in life may prove to be a more effective and lasting route to spiritual transformation. Just as educational programs have been developed to raise emotional intelligence, spiritual skills could similarly be acquired and cultivated."
[1] Emotional Intelligence by Daniel Goleman, reviewed in Self-Realization magazine, Summer 1998, demonstrated the importance of mood management and impulse control in predicting life success.
[2] One is reminded of an inimitable observation of Swami Sri Yukteswar, recorded by Paramahansa Yogananda in his Autobiography of a Yogi: "'Saintliness is not dumbness! Divine perceptions are not incapacitating!' he would say. 'The active expression of virtue gives rise to the keenest intelligence.'"
May 28, 2009
Would You Slap Your Father? If So, You’re a Liberal
By NICHOLAS D. KRISTOF
If you want to tell whether someone is conservative or liberal, what are a couple of completely nonpolitical questions that will give a good clue?
How’s this: Would you be willing to slap your father in the face, with his permission, as part of a comedy skit?
And, second: Does it disgust you to touch the faucet in a public restroom?
Studies suggest that conservatives are more often distressed by actions that seem disrespectful of authority, such as slapping Dad. Liberals don’t worry as long as Dad has given permission.
Likewise, conservatives are more likely than liberals to sense contamination or perceive disgust. People who would be disgusted to find that they had accidentally sipped from an acquaintance’s drink are more likely to identify as conservatives.
The upshot is that liberals and conservatives don’t just think differently, they also feel differently. This may even be a result, in part, of divergent neural responses.
This came up after I wrote a column earlier this year called “The Daily Me.” I argued that most of us employ the Internet not to seek the best information, but rather to select information that confirms our prejudices. To overcome that tendency, I argued, we should set aside time for a daily mental workout with an ideological sparring partner. Afterward, I heard from Jonathan Haidt, a psychology professor at the University of Virginia. “You got the problem right, but the prescription wrong,” he said.
Simply exposing people to counterarguments may not accomplish much, he said, and may inflame antagonisms.
A study by Diana Mutz of the University of Pennsylvania found that when people saw tight television shots of blowhards with whom they disagreed, they felt that the other side was even less legitimate than before.
The larger point is that liberals and conservatives often form judgments through flash intuitions that aren’t a result of a deliberative process. The crucial part of the brain for these judgments is the medial prefrontal cortex, which has more to do with moralizing than with rationality. If you damage your prefrontal cortex, your I.Q. may be unaffected, but you’ll have trouble harrumphing.
One of the main divides between left and right is the dependence on different moral values. For liberals, morality derives mostly from fairness and prevention of harm. For conservatives, morality also involves upholding authority and loyalty — and a heightened sensitivity to disgust.
Some evolutionary psychologists believe that disgust emerged as a protective mechanism against health risks, like feces, spoiled food or corpses. Later, many societies came to apply the same emotion to social “threats.” Humans appear to be the only species that registers disgust, which is why a dog will wag its tail in puzzlement when its horrified owner yanks it back from eating excrement.
Psychologists have developed a “disgust scale” based on how queasy people would be in 27 situations, such as stepping barefoot on an earthworm or smelling urine in a tunnel. Conservatives systematically register more disgust than liberals. (To see how you weigh factors in moral decisions, take the tests at www.yourmorals.org.)
It appears that we start with moral intuitions that our brains then find evidence to support. For example, one experiment involved hypnotizing subjects to expect a flash of disgust at the word “take.” They were then told about Dan, a student council president who “tries to take topics that appeal to both professors and students.”
The research subjects felt disgust but couldn’t find any good reason for it. So, in some cases, they concocted their own reasons, such as: “Dan is a popularity-seeking snob.”
So how do we discipline our brains to be more open-minded, more honest, more empirical? A start is to reach out to moderates on the other side — ideally eating meals with them, for that breaks down “us vs. them” battle lines that seem embedded in us. (In ancient times we divided into tribes; today, into political parties.) The Web site www.civilpolitics.org is an attempt to build this intuitive appreciation for the other side’s morality, even if it’s not our morality.
“Minds are very hard things to open, and the best way to open the mind is through the heart,” Professor Haidt says. “Our minds were not designed by evolution to discover the truth; they were designed to play social games.”
Thus persuasion may be most effective when built on human interactions. Gay rights were probably advanced largely by the public’s growing awareness of friends and family members who were gay.
A corollary is that the most potent way to win over opponents is to accept that they have legitimate concerns, for that triggers an instinct to reciprocate. As it happens, we have a brilliant exemplar of this style of rhetoric in politics right now — Barack Obama.
May 30, 2009
Some Thoughts on the Pleasures of Being a Re-Reader
By VERLYN KLINKENBORG
I’ve always admired my friends who are wide readers. A few even pride themselves on never reading a book a second time. I’ve been a wide reader at times. When I was much younger, I spent nearly a year in the old Reading Room of the British Museum, discovering in the book I was currently reading the title of the next one I would read.
But at heart, I’m a re-reader. The point of reading outward, widely, has always been to find the books I want to re-read and then to re-read them. In part, that’s an admission of defeat, an acknowledgement that no matter how long and how widely I read, I will only ever make my way through a tiny portion of the world’s literature. (The British Museum was a great place to learn that lesson.) And in part, it’s a concession to the limits of my memory. I forget a lot, which makes the pleasure of re-reading all the greater.
The love of repetition seems to be ingrained in children. And it is certainly ingrained in the way children learn to read — witness the joyous and maddening love of hearing that same bedtime book read aloud all over again, word for word, inflection for inflection. Childhood is an oasis of repetitive acts, so much so that there is something shocking about the first time a young reader reads a book only once and moves on to the next. There’s a hunger in that act but also a kind of forsaking, a glimpse of adulthood to come.
The work I chose in adulthood — to study literature — required the childish pleasure of re-reading. When I was in graduate school, once through Pope’s “Dunciad” or Berryman’s “The Dream Songs” was not going to cut it. A grasp of the poem was presumed to lie on the far side of many re-readings, none of which were really repetitions. The same is true of being a writer, which requires obsessive re-reading. But the real re-reading I mean is the savory re-reading, the books I have to be careful not to re-read too often so I can read them again with pleasure.
It’s a miscellaneous library, always shifting. It has included a book of the north woods: John J. Rowlands’s “Cache Lake Country,” which I have re-read annually for many years. It may still include Raymond Chandler, though I won’t know for sure till the next time I re-read him. It includes Michael Herr’s “Dispatches” and lots of A.J. Liebling and a surprising amount of George Eliot. It once included nearly all of Dickens, but that has been boiled down to “The Pickwick Papers” and “Great Expectations.” There are many more titles, of course. This is not a canon. This is a refuge.
Part of the fun of re-reading is that you are no longer bothered by the business of finding out what happens. Re-reading “Middlemarch,” for instance, or even “The Great Gatsby,” I’m able to pay attention to what’s really happening in the language itself — a pleasure surely as great as discovering who marries whom, and who dies and who does not.
The real secret of re-reading is simply this: It is impossible. The characters remain the same, and the words never change, but the reader always does. Pip is always there to be revisited, but you, the reader, are a little like the convict who surprises him in the graveyard — always a stranger.
I look at the books on my library shelves. They certainly seem dormant. But what if the characters are quietly rearranging themselves? What if Emma Woodhouse doesn’t learn from her mistakes? What if Tom Jones descends into a sodden life of poaching and outlawry? What if Eve resists Satan, remembering God’s injunction and Adam’s loving advice? I imagine all the characters bustling to get back into their places as they feel me taking the book down from the shelf. “Hurry,” they say, “he’ll expect to find us exactly where he left us, never mind how much his life has changed in the meantime.”
June 29, 2009
Journalism Rules Are Bent in News Coverage From Iran
By BRIAN STELTER
“Check the source” may be the first rule of journalism. But in the coverage of the protests in Iran this month, some news organizations have adopted a different stance: publish first, ask questions later. If you still don’t know the answer, ask your readers.
CNN showed scores of videos submitted by Iranians, most of them presumably from protesters who took to the streets to oppose Mahmoud Ahmadinejad’s re-election on June 12. The Web sites of The New York Times, The Huffington Post, The Guardian newspaper in London and others published minute-by-minute blogs with a mix of unverified videos, anonymous Twitter messages and traditional accounts from Tehran.
The blogs tend to run on a separate track from the news organizations’ more traditional reporting, intersecting when user videos and information can be confirmed. The combination amounts to the biggest embrace yet of a collaborative new style of news gathering — one that combines the contributions of ordinary citizens with the reports and analysis of journalists.
Many mainstream media sources, which have in the past been critical of the undifferentiated sources of information on the Web, had little choice but to throw open their doors in this case. As the protests against Mr. Ahmadinejad grew, the government sharply curtailed the foreign press. As visas expired, many journalists packed up, and the ones who stayed were barred from reporting on the streets.
In a news vacuum, amateur videos and eyewitness accounts became the de facto source for information. In fact, the symbol of the protests, the image of a young woman named Neda bleeding to death on a Tehran street, was filmed by two people holding camera phones.
“It’s incredible, the volume of stuff coming out” from Iran, said Matthew Weaver, who sounded exhausted Thursday evening after blogging for more than 10 days for The Guardian newspaper’s Web site.
When rallies and conflicts occur “first the tweets come, then the pictures, then the YouTube videos, then the wires,” he said. “It’s extraordinary.”
Most important, he said, what people are saying “at one point in the day is then confirmed by more conventional sources four or five hours later.”
CNN encourages viewers to upload pictures and observations to iReport.com, its Web site for citizen journalism. Every upload is posted automatically on iReport.com, but each is studied before being shown on television.
In the vetting process, CNN contacts the person who posted the material, asks questions about the content and tries to confirm its veracity. Lila King, the executive in charge of iReport, said the staff members try to “triangulate the details” of an event by corroborating stories with multiple iReport contributors in a given area. Farsi speakers at CNN sometimes listened intently to the sound from the protest videos, discerning the accents of Iranian cities and transcribing the chants and screams.
Because the videos and images are not taken by a CNN employee, the network cannot completely vouch for their authenticity. But without professionals at the scene — CNN’s remaining correspondent was pulled out last week after the government imposed prohibitive restrictions — they provide the all-important pictures to tell the story.
In an indication of how difficult the process can be, CNN had received 5,200 Iran-related submissions and had approved about 180 of them for use on television.
Iran is now the third biggest traffic driver to iReport.com, behind the United States and Canada. One month ago, Iran ranked No. 63 on the list of countries. Ms. King called Iran a “watershed moment” for citizen dispatches, and for the first time an iReport producer sits at the main CNN newsgathering desk.
Bill Mitchell, a senior leader at the Poynter Institute, a nonprofit school for journalists, said the extent of user involvement shown in the Iran coverage seems to be a new way of thinking about journalism.
“Instead of limiting ourselves to full-blown articles to be written by a journalist (professional or otherwise), the idea is to look closely at stories as they unfold and ask: is there a piece of this story I’m in a particularly good position to enhance or advance?” he said in an e-mail message.
“And it’s not just a question for journalists,” he added.
Nico Pitney, the senior news editor at The Huffington Post, started to aggregate Iran news on June 13, the day after the election. By the middle of last week, the blog — with several updates an hour during the day — had received more than 100,000 comments and five million page views.
Mr. Pitney said blogs like his produce a synthesis of professional reporting and reliable amateur material. Essentially, the news tips that reporters have always relied upon are now being aired in public.
In a recognition of the Web’s role in covering the protests, Mr. Pitney was invited by the White House to ask a question at a presidential press conference last week. He forwarded to President Obama an e-mailed question from an Iranian. “We’ve been seeing a lot of reports coming directly out of Iran,” the president said.
Even anonymous Internet users develop a reputation over time, said Robert Mackey, the editor of a blog called The Lede for The New York Times’s Web site, who tracked the election and protest for almost two weeks. Although there have been some erroneous claims on sites like Twitter, in general “there seems to be very little mischief-making,” Mr. Mackey said. “People generally want to help solve the puzzle.”
Readers repeatedly drew Mr. Mackey’s attention to tweets and photos of protests in the comments thread of the blog. Some even shared their memories of the geography of Tehran in an attempt to verify scenes in videos.
Over time, the impromptu Iranian reporters have honed their skills. Some put the date of a skirmish in the file descriptions they send. Others film street signs and landmarks. But the user uploads can sometimes be misleading. Last Wednesday, Mr. Mackey put a call out to readers to determine whether a video was actually new. A commenter pointed to a two-day-old YouTube version.
Cases like this show why the publication of tweets and Flickr photos can be awkward. Echoing others, Mr. Weaver of The Guardian’s blog said his manner of reporting had made some of his colleagues uncomfortable; he recalled one colleague who remarked, “Twitter? I won’t touch it. It’s all garbage.”
On a couple of occasions, The Guardian’s blog featured video clips that were later discovered to be days old. Mr. Weaver said readers of live blogs are “a bit more forgiving” of those incidents, in part because bloggers are transparent about what they do and do not know.
Television anchors were frequently put in the same position while covering Iran. Last Wednesday, the Fox News anchor Shepard Smith showed a YouTube video of police officials beating and dragging people.
“We do not know when or where this video was from,” Mr. Smith told viewers. “We do not even know if it was staged, although we have no reason to believe that.” All he knew for sure was that it was “recently uploaded to YouTube.” For news organizations that face reporting constraints, that has become a good enough starting point.
July 2, 2009
When Our Brains Short-Circuit
By NICHOLAS D. KRISTOF
Our political system sometimes produces such skewed results that it’s difficult not to blame bloviating politicians. But maybe the deeper problem lies in our brains.
Evidence is accumulating that the human brain systematically misjudges certain kinds of risks. In effect, evolution has programmed us to be alert for snakes and enemies with clubs, but we aren’t well prepared to respond to dangers that require forethought.
If you come across a garter snake, nearly all of your brain will light up with activity as you process the “threat.” Yet if somebody tells you that carbon emissions will eventually destroy Earth as we know it, only the small part of the brain that focuses on the future — a portion of the prefrontal cortex — will glimmer.
“We humans do strange things, perhaps because vestiges of our ancient brain still guide us in the modern world,” notes Paul Slovic, a psychology professor at the University of Oregon and author of a book on how our minds assess risks.
Consider America’s political response to these two recent challenges:
1. President Obama proposes moving some inmates from Guantánamo Bay, Cuba, to supermax prisons from which no one has ever escaped. This is the “enemy with club” threat that we have evolved to be alert to, so Democrats and Republicans alike erupt in outrage and kill the plan.
2. The climate warms, ice sheets melt and seas rise. The House scrounges a narrow majority to pass a feeble cap-and-trade system, but Senate passage is uncertain. The issue is complex, full of trade-offs and more cerebral than visceral — and so it doesn’t activate our warning systems.
“What’s important is the threats that were dominant in our evolutionary history,” notes Daniel Gilbert, a professor of psychology at Harvard University. In contrast, he says, the kinds of dangers that are most serious today — such as climate change — sneak in under the brain’s radar.
Professor Gilbert argues that the threats that get our attention tend to have four features. First, they are personalized and intentional. The human brain is highly evolved for social behavior (“that’s why we see faces in clouds, not clouds in faces,” says Mr. Gilbert), and, like gazelles, we are instinctively and obsessively on the lookout for predators and enemies.
Second, we respond to threats that we deem disgusting or immoral — characteristics more associated with sex, betrayal or spoiled food than with atmospheric chemistry.
“That’s why people are incensed about flag burning, or about what kind of sex people have in private, even though that doesn’t really affect the rest of us,” Professor Gilbert said. “Yet where we have a real threat to our well-being, like global warming, it doesn’t ring alarm bells.”
Third, threats get our attention when they are imminent, while our brain circuitry is often cavalier about the future. That’s why we are so bad at saving for retirement. Economists tear their hair out at a puzzlingly irrational behavior called hyperbolic discounting: people’s preference for money now rather than much larger payments later.
For example, in studies, most Americans prefer $50 now to $100 in six months, even though that represents a 100 percent return.
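Kristof leaves the arithmetic implicit, so a small sketch may help. The standard hyperbolic discount function in the behavioral-economics literature is Mazur's V = A / (1 + kD); everything below — the parameter values, the function names, the threshold — is an illustrative assumption, not something taken from the column or the studies it cites.

```python
# A rough sketch (illustrative, not from the column) of hyperbolic
# discounting, using Mazur's standard form V = A / (1 + k*D), where
# A is the delayed amount, D the delay, and k an "impulsivity" parameter.

def hyperbolic_value(amount, delay_months, k):
    """Subjective present value of a delayed reward under hyperbolic discounting."""
    return amount / (1 + k * delay_months)

def annualized_return(now, later, delay_months):
    """Compound annual return implied by trading `now` dollars for `later` dollars."""
    return (later / now) ** (12 / delay_months) - 1

# The column's example: $50 today versus $100 in six months.
# Preferring $50 now means 100 / (1 + 6k) < 50, i.e. k > 1/6.
for k in (0.05, 0.2, 0.5):
    v = hyperbolic_value(100, 6, k)
    choice = "take $50 now" if v < 50 else "wait for $100"
    print(f"k = {k:.2f}: $100 in six months feels like ${v:.2f} -> {choice}")

# What the impatient choice forgoes: a 100 percent return in half a year,
# about 300 percent compounded annually.
print(f"implied annualized return: {annualized_return(50, 100, 6):.0%}")
```

The loop shows the tipping point at k = 1/6: above it, the delayed $100 feels worth less than $50 in hand, which is why economists find the preference so puzzlingly irrational.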
Fourth, we’re far more sensitive to changes that are instantaneous than those that are gradual. We yawn at a slow melting of the glaciers, while if they shrank overnight we might take to the streets.
In short, we’re brilliantly programmed to act on the risks that confronted us in the Pleistocene Age. We’re less adept with 21st-century challenges.
At the University of Virginia, Professor Jonathan Haidt shows his Psychology 101 students how evolution has prepared us to fear some things: He asks how many students would be afraid to stand within 10 feet of a friend carrying a pet boa constrictor. Many hands go up, although almost none of the students have been bitten by a snake.
“The objects of our phobias, and the things that are actually dangerous to us, are almost unrelated in the modern world, but they were related in our ancient environment,” Mr. Haidt said. “We have no ‘preparedness’ to fear a gradual rise in the Earth’s temperature.”
This short-circuitry in our brains explains many of our policy priorities. We Americans spend nearly $700 billion a year on the military and less than $3 billion on the F.D.A., even though food poisoning kills more Americans than foreign armies and terrorists. We’re just lucky we don’t have a cabinet-level Department of Snake Extermination.
Still, all is not lost, particularly if we understand and acknowledge our neurological shortcomings — and try to compensate with rational analysis. When we work at it, we are indeed capable of foresight: If we can floss today to prevent tooth decay in later years, then perhaps we can also drive less to save the planet.
Ta’lim-ul-Islam: Quest for Knowledge – A poem by Alnoor Rajan Talwar
TA’LIM-UL-ISLAM* – QUEST FOR KNOWLEDGE
Is just another conversation
With what we do daily
There will always be
The search for answers
Who am I?
Who are we?
When did it all begin?
When will it all end?
What changes will we live to see?
Will this world ever mend?
Ta’lim came to provide
That can no longer hide
We constantly strive
For a better and more meaningful life
For knowledge and guidance that change with the time
For values and ideals that make life sublime
For community relations to last
Regardless of creed, color or caste
In the depths of yearning
In the shadows of grieving
In the agonies of living
When I have sought (to no avail)
For some assurances of peace
Ta’lim came to heal
My quest for knowledge and solace
My hunger for a higher wisdom
Me of bridges that link my past, present and future
Of faith that moves mountains
Of miracles and myths
And fables and facts
Of thriving Ismaili empires
& Dai’s preaching under blazing fires
Because of Ta’lim
I am now constantly seeking answers
My thirst may be quenched
But not quite
As I live
To fall and rise
And rise and shine
In my own way
With all my might
I can finally say
Without a fight
A better Muslim
A better Ismaili
And a better person
Who speaks the unspoken
And hears the unheard
And who is not afraid
To soar the winds
Like a bird
Thank you, Ta’lim-ul-Islam
Alnoor Rajan Talwar
*Ta’lim-ul-Islam, a religious education program in Canada, is a series of educational sessions delivered by scholars from across North America.
August 3, 2009
Abroad: At Louvre, Many Stop to Snap but Few Stay to Focus
By MICHAEL KIMMELMAN
PARIS — Spending an idle morning watching people look at art is hardly a scientific experiment, but it rekindles a perennial question: What exactly are we looking for when we roam as tourists around museums? As with so many things right in front of us, the answer may be no less useful for being familiar.
At the Louvre the other day, in the Pavillon des Sessions, two young women in flowered dresses meandered through the gallery. They paused and circled around a few sculptures. They took their time. They looked slowly.
The pavilion puts some 100 immaculate objects from outside Europe on permanent view in a ground floor suite of cool, silent galleries at one end of the museum. Feathered masks from Alaska, ancient bowls from the Philippines, Mayan stone portraits and the most amazing Zulu spoon carved from wood in the abstracted S-shape of a slender young woman take no back seat, aesthetically speaking, to the great Titians and Chardins upstairs.
The young women were unusual for stopping. Most of the museum’s visitors passed through the gallery oblivious.
A few game tourists glanced vainly in guidebooks or hopefully at wall labels, as if learning that one or another of these sculptures came from Papua New Guinea or Hawaii or the Archipelago of Santa Cruz, or that a work was three centuries old or maybe four might help them see what was, plain as day, just before them.
Almost nobody, over the course of that hour or two, paused before any object for as long as a full minute. Only a 17th-century wood sculpture of a copulating couple, from San Cristobal in the Solomon Islands, placed near an exit, caused several tourists to point, smile and snap a photo, but without really breaking stride.
Visiting museums has always been about self-improvement. Partly we seem to go to them to find something we already recognize, something that gives us our bearings: think of the scrum of tourists invariably gathered around the Mona Lisa. At one time a highly educated Westerner read perhaps 100 books, all of them closely. Today we read hundreds of books, or maybe none, but rarely any with the same intensity. Travelers who took the Grand Tour across Europe during the 18th century spent months and years learning languages, meeting politicians, philosophers and artists and bore sketchbooks in which to draw and paint — to record their memories and help them see better.
Cameras replaced sketching by the last century; convenience trumped engagement, the viewfinder afforded emotional distance and many people no longer felt the same urgency to look. It became possible to imagine that because a reproduction of an image was safely squirreled away in a camera or cell phone, or because it was eternally available on the Web, dawdling before an original was a waste of time, especially with so much ground to cover.
We could dream about covering lots of ground thanks to expanding collections and faster means of transportation. At the same time, the canon of art that provided guideposts to tell people where to go and what to look at was gradually dismantled. A core of shared values yielded to an equality among visual materials. This was good and necessary, up to a point. Millions of images came to compete for our attention. Liberated by this proliferation, Western culture was also set adrift in an ocean of passing stimulation, with no anchors to secure it.
So tourists now wander through museums, seeking to fulfill their lifetime’s art history requirement in a day, wondering whether it may now be the quantity of material they pass by rather than the quality of concentration they bring to what few things they choose to focus upon that determines whether they have “done” the Louvre. It’s self-improvement on the fly.
The art historian T. J. Clark, who during the 1970s and ’80s pioneered a kind of analysis that rejected old-school connoisseurship in favor of art in the context of social and political affairs, has lately written a book about devoting several months of his time to looking intently at two paintings by Poussin. Slow looking, like slow cooking, may yet become the new radical chic.
Until then we grapple with our impatience and cultural cornucopia. Recently, I bought a couple of sketchbooks to draw with my 10-year-old in St. Peter’s and elsewhere around Rome, just for the fun of it, not because we’re any good, but to help us look more slowly and carefully at what we found. Crowds occasionally gathered around us as if we were doing something totally strange and novel, as opposed to something normal, which sketching used to be. I almost hesitate to mention our sketching. It seems pretentious and old-fogeyish in a cultural moment when we can too easily feel uncomfortable and almost ashamed just to look hard.
Artists fortunately remind us that there’s in fact no single, correct way to look at any work of art, save for with an open mind and patience. If you have ever gone to a museum with a good artist you probably discovered that they don’t worry so much about what art history books or wall labels tell them is right or wrong, because they’re selfish consumers, freed to look by their own interests.
Back to those two young women at the Louvre: aspiring artists or merely curious, they didn’t plant themselves forever in front of the sculptures but they stopped just long enough to laugh and cluck and stare, and they skipped the wall labels until afterward.
They looked, in other words. And they seemed to have a very good time.
Leaving, they caught sight of a sculptured effigy from Papua New Guinea with a feathered nose, which appeared, by virtue of its wide eyes and open hands positioned on either side of its head, as if it were taunting them.
They thought for a moment. “Nyah-nyah,” they said in unison. Then blew him a raspberry.
August 11, 2009
Reviving the Lost Art of Naming the World
By CAROL KAESUK YOON
One spring when I was a graduate student, I would go each Monday down into the bowels of the entomology building. There I would meet Prof. Jack Franclemont, an elderly gentleman always with little dog in tow, to be tutored in the ordering and naming of life — the science of taxonomy.
Professor Franclemont, a famed moth specialist, was perfectly old school, wearing coat and tie to give the day’s lecture even though I was the only member of the audience. Quaintly distracted, he never quite got my name right, sometimes calling me Miss Loon or Miss Voon. After the talk, I would identify moths using a guide written in 1923, in silence or listening to stories of his dog’s latest antics. I enjoyed the meditative pleasure of those hours, despite the fact that as the lone (and not terribly proficient) student of an aging teacher, I could not help feeling that taxonomy might be dying, which, in fact, it is.
Despite the field’s now blatant modernity, with practitioners using DNA sequences, sophisticated evolutionary theory and supercomputers to order and name all of life, jobs for taxonomists continue to be in steady decline. The natural history collections crucial to the work are closeted or tossed.
Outside taxonomy, no one is much up in arms about this, but perhaps we should be, because the ordering and naming of life is no esoteric science. The past few decades have seen a stream of studies that show that sorting and naming the natural world is a universal, deep-seated and fundamental human activity, one we cannot afford to lose because it is essential to understanding the living world, and our place in it.
Anthropologists were the first to recognize that taxonomy might be more than the science officially founded by Carl Linnaeus, the Swedish botanist, in the 1700s. Studying how nonscientists order and name life, creating what are called folk taxonomies, anthropologists began to realize that when people across the globe were creating ordered groups and giving names to what lived around them, they followed highly stereotyped patterns, appearing unconsciously to follow a set of unwritten rules. Not that conformity to rules was at first obvious to anthropologists, who were instead understandably dazzled by the variety in folk taxonomies. The Ilongots, for example, a people of the Philippines, name gorgeous wild orchids after human body parts. There bloom the thighs, there fingernails, yonder elbows and thumbs. The Rofaifo people of New Guinea, excellent natural historians, classify the cassowary, a giant bird complete with requisite feathers and beak, as a mammal. In fact, there seemed, at first glance, to be little room even for agreement among people, let alone a set of universally followed rules. More recently, however, deep underlying similarities have begun to emerge.
Cecil Brown, an anthropologist at Northern Illinois University who has studied folk taxonomies in 188 languages, has found that people recognize the same basic categories repeatedly, including fish, birds, snakes, mammals, “wugs” (meaning worms and insects, or what we might call creepy-crawlies), trees, vines, herbs and bushes.
Dr. Brown’s finding would be considerably less interesting if these categories were clear-cut depictions of reality that must inevitably be recognized. But tree and bush are hardly that, since there is no way to define a tree versus a bush. The two categories grade insensibly into one another. Wugs, likewise, are neither an evolutionarily nor ecologically nor otherwise cohesive group. Still, people repeatedly recognize and name these oddities.
Likewise, people consistently use two-word epithets to designate specific organisms within a larger group of organisms, despite there being an infinitude of potentially more logical methods. The practice is so familiar that it is hard to notice. In English, among the oaks, we distinguish the pin oak; among bears, the grizzly bear. When Mayan Indians, familiar with the wild piglike creatures known as peccaries, encountered the Spaniards’ pigs, they dubbed them “village peccaries.” We use two-part names for ourselves as well: Sally Smith or Li Wen. Even scientists are bound by this practice, insisting on Latin binomials for species.
There appears to be such profound unconscious agreement that people will even concur on which exact words make the best names for particular organisms. Brent Berlin, an ethnobiologist at the University of Georgia, discovered this when he read 50 pairs of names, each consisting of one bird and one fish name, to a group of 100 undergraduates, and asked them to identify which was which. The names had been randomly chosen from the language of Peru’s Huambisa people, to which the students had had no previous exposure. With such a large sample size — there were 5,000 choices being made — the students should have scored 50 percent or very close to it if they were blindly guessing. Instead, they identified the bird and fish names correctly 58 percent of the time, significantly more often than expected for random guessing. Somehow they were often able to intuit the names’ birdiness or fishiness.
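A quick back-of-the-envelope check, sketched below in Python, shows why 58 percent is far outside what blind guessing would produce. It uses only the figures quoted above, under the simplifying assumption that all 5,000 choices were independent (they were not quite, since each student made 50 of them).

```python
import math

# Rough significance check for Berlin's bird-vs-fish experiment,
# using the figures quoted above: 5,000 choices, 58% correct,
# 50% expected under blind guessing. Treating every choice as an
# independent coin flip is a simplifying assumption.
n = 5000
p_chance = 0.50
observed = 0.58

# Standard error of a proportion under the null, then the z-score.
se = math.sqrt(p_chance * (1 - p_chance) / n)
z = (observed - p_chance) / se
print(f"standard error = {se:.4f}, z = {z:.1f}")
# Output: standard error = 0.0071, z = 11.3. A result more than eleven
# standard errors above chance is vanishingly unlikely to be luck alone.
```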
The most surprising evidence for the deep-seatedness of taxonomy comes from patients who have, through accident or disease, suffered traumas of the brain. Consider the case of the university student whom British researchers refer to simply as J.B.R. Doctors found that upon recovering from swelling of the brain caused by herpes, J.B.R. could no longer recognize living things.
He could still recognize nonliving objects, like a flashlight, a compass, a kettle or a canoe. But the young man was unable to recognize a kangaroo, a mushroom or a buttercup. He could not say what a parrot or even the unmistakable ostrich was. And J.B.R. is far from alone; doctors around the world have found patients with the same difficulty. Most recently, scientists studying these patients’ brains have reported repeatedly finding damage — a deadening of activity or actual lesions — in a region of the temporal lobe, leading some researchers to hypothesize that there might be a specific part of the brain that is devoted to the doing of taxonomy. As curious as they are, these patients and their woes would be of little relevance to our own lives, if they had merely lost some dispensable librarianlike ability to classify living things. As it turns out, their situation is much worse. These are people completely at sea. Without the power to order and name life, a person simply does not know how to live in the world, how to understand it. How to tell the carrot from the cat — which to grate and which to pet? They are utterly lost, anchorless in a strange and confusing world. Because to order and name life is to have a sense of the world around, and, as a result, what one’s place is in it.
Today few people are proficient in the ordering and naming of life. There are the dwindling professional taxonomists, and fast-declining peoples like the Tzeltal Maya of Mexico, among whom 2-year-olds can name more than 30 different plants and 4-year-olds nearly 100. Things were different once. In Linnaeus’s day, it was a matter of aristocratic pride to have a wonderful and wonderfully curated collection of wild organisms, both dead and alive. Darwin (who gained fame first as the world’s foremost barnacle taxonomist) might have expected any dinner-party conversation to turn taxonomic, after an afternoon of beetle-hunting or wildflower study. Most of us claim and enjoy no such expertise.
We are, all of us, abandoning taxonomy, the ordering and naming of life. We are willfully becoming poor J.B.R., losing the ability to order and name and therefore losing a connection to and a place in the living world.
No wonder so few of us can really see what is out there. Even when scads of insistent wildlife appear with a flourish right in front of us, and there is such life always — hawks migrating over the parking lot, great colorful moths banging up against the window at night — we barely seem to notice. We are so disconnected from the living world that we can live in the midst of a mass extinction, of the rapid invasion everywhere of new and noxious species, entirely unaware that anything is happening.
Happily, changing all this turns out to be easy. Just find an organism, any organism, small, large, gaudy, subtle — anywhere, and they are everywhere — and get a sense of it, its shape, color, size, feel, smell, sound. Give a nod to Professor Franclemont and meditate, luxuriate in its beetle-ness, its daffodility. Then find a name for it. Learn science’s name, one of countless folk names, or make up your own. To do so is to change everything, including yourself. Because once you start noticing organisms, once you have a name for particular beasts, birds and flowers, you can’t help seeing life and the order in it, just where it has always been, all around you.
Adapted from “Naming Nature: The Clash Between Instinct and Science” by Carol Kaesuk Yoon. Copyright 2009 by Carol Kaesuk Yoon. With permission of the publisher, W.W. Norton & Company, Inc.
August 15, 2009
Poetry in Motion
By DANNY HEITMAN
Baton Rouge, La.
It seems that we’ve done just about everything to get the American auto industry out of the doldrums. We’ve forced bankruptcies. We’ve exchanged cash for clunkers. But have we tried poetry?
The question is brought to mind by the story of Marianne Moore, the famous American writer, who served for a brief season as the Ford Motor Company’s unofficial poet laureate.
Moore, who died in 1972, was at the height of her literary powers in the autumn of 1955, when a letter arrived in her Brooklyn mailbox.
A Ford executive wrote that the company was launching “a rather important new series of cars,” but his team was stumped to think of a name for the latest product line. Could Moore, an icon of American letters, help them out?
Moore embraced the assignment with relish, not surprising for a poet who enjoyed — and whose writing was frequently inspired by — popular culture, whether it be baseball, boxing or bric-a-brac. The correspondence became a cultural fixture of its own after it was published in The New Yorker two years later.
Throughout the fall and winter of 1955, Moore’s steady stream of suggestions arrived at Ford: “the Ford Silver Sword,” “Intelligent Bullet,” “the Ford Fabergé,” “Mongoose Civique,” “Anticipator,” “Pastelogram,” “Astranaut” and, the highest flight of fancy, “Utopian Turtletop.”
Moore apparently had no qualms about enlisting her muse in the service of the automotive industry. She was also willing to embrace the risks of the marketplace, agreeing to be paid only if she came up with a winning name. As Moore’s biographer Charles Molesworth points out, she “had always enjoyed the language of advertisement, delighting in its inventiveness and ebullience, and even relating it to the poetics of praise.”
These days, poetry and commerce are rarely on such good speaking terms. Poetry doesn’t sell well, and poets almost never attain the celebrity that touched Moore, Robert Frost and Carl Sandburg half a century ago. If some Detroit executive got the bright idea to consult a poet for marketing advice today, one rather doubts he’d know whom to call.
It’s nice to think that the two groups — poets and carmakers — might find new relevance through collaboration, but history is not encouraging.
After much thought, Ford politely rejected all of Moore’s lyrical suggestions for its new car line. Instead, the company’s executives opted for a choice generated internally: the Edsel.