Concept of Knowledge Revisited

Discussion on R&R from all regions

Post by kmaherali »

May 2, 2008
Op-Ed Columnist
The Cognitive Age
By DAVID BROOKS

If you go into a good library, you will find thousands of books on globalization. Some will laud it. Some will warn about its dangers. But they’ll agree that globalization is the chief process driving our age. Our lives are being transformed by the increasing movement of goods, people and capital across borders.

The globalization paradigm has led, in the political arena, to a certain historical narrative: There were once nation-states like the U.S. and the European powers, whose economies could be secured within borders. But now capital flows freely. Technology has leveled the playing field. Competition is global and fierce.

New dynamos like India and China threaten American dominance thanks to their cheap labor and manipulated currencies. Now, everything is made abroad. American manufacturing is in decline. The rest of the economy is threatened.

Hillary Clinton summarized the narrative this week: “They came for the steel companies and nobody said anything. They came for the auto companies and nobody said anything. They came for the office companies, people who did white-collar service jobs, and no one said anything. And they came for the professional jobs that could be outsourced, and nobody said anything.”

The globalization paradigm has turned out to be very convenient for politicians. It allows them to blame foreigners for economic woes. It allows them to pretend that by rewriting trade deals, they can assuage economic anxiety. It allows them to treat economic and social change as a great mercantilist competition, with various teams competing for global supremacy, and with politicians starring as the commanding generals.

But there’s a problem with the way the globalization paradigm has evolved. It doesn’t really explain most of what is happening in the world.

Globalization is real and important. It’s just not the central force driving economic change. Some Americans have seen their jobs shipped overseas, but global competition has accounted for a small share of job creation and destruction over the past few decades. Capital does indeed flow around the world. But as Pankaj Ghemawat of the Harvard Business School has observed, 90 percent of fixed investment around the world is domestic. Companies open plants overseas, but that’s mainly so their production facilities can be close to local markets.

Nor is the globalization paradigm even accurate when applied to manufacturing. Far from fleeing to Asia, U.S. manufacturing output has risen over recent decades. As Thomas Duesterberg of Manufacturers Alliance/MAPI, a research firm, has pointed out, the U.S.’s share of global manufacturing output has actually increased slightly since 1980.

The chief force reshaping manufacturing is technological change (hastened by competition with other companies in Canada, Germany or down the street). Thanks to innovation, manufacturing productivity has doubled over two decades. Employers now require fewer but more highly skilled workers. Technological change affects China just as it does America. William Overholt of the RAND Corporation has noted that between 1994 and 2004 the Chinese shed 25 million manufacturing jobs, 10 times more than the U.S.

The central process driving this is not globalization. It’s the skills revolution. We’re moving into a more demanding cognitive age. In order to thrive, people are compelled to become better at absorbing, processing and combining information. This is happening in localized and globalized sectors, and it would be happening even if you tore up every free trade deal ever inked.

The globalization paradigm emphasizes the fact that information can now travel 15,000 miles in an instant. But the most important part of information’s journey is the last few inches — the space between a person’s eyes or ears and the various regions of the brain. Does the individual have the capacity to understand the information? Does he or she have the training to exploit it? Are there cultural assumptions that distort the way it is perceived?

The globalization paradigm leads people to see economic development as a form of foreign policy, as a grand competition between nations and civilizations. These abstractions, called “the Chinese” or “the Indians,” are doing this or that. But the cognitive age paradigm emphasizes psychology, culture and pedagogy — the specific processes that foster learning. It emphasizes that different societies are being stressed in similar ways by increased demands on human capital. If you understand that you are living at the beginning of a cognitive age, you’re focusing on the real source of prosperity and understand that your anxiety is not being caused by a foreigner.

It’s not that globalization and the skills revolution are contradictory processes. But which paradigm you embrace determines which facts and remedies you emphasize. Politicians, especially Democratic ones, have fallen in love with the globalization paradigm. It’s time to move beyond it.

Post by kmaherali »

May 27, 2008
Basics
Curriculum Designed to Unite Art and Science
By NATALIE ANGIER

Senator Barack Obama likes to joke that the battle for the Democratic presidential nomination has been going on so long, babies have been born, and they’re already walking and talking.

That’s nothing. The battle between the sciences and the humanities has been going on for so long, its early participants have stopped walking and talking, because they’re already dead.

It’s been some 50 years since the physicist-turned-novelist C.P. Snow delivered his famous “Two Cultures” lecture at the University of Cambridge, in which he decried the “gulf of mutual incomprehension,” the “hostility and dislike” that divided the world’s “natural scientists,” its chemists, engineers, physicists and biologists, from its “literary intellectuals,” a group that, by Snow’s reckoning, included pretty much everyone who wasn’t a scientist. His critique set off a frenzy of hand-wringing that continues to this day, particularly in the United States, as educators, policymakers and other observers bemoan the Balkanization of knowledge, the scientific illiteracy of the general public and the chronic academic turf wars that are all too easily lampooned.

Yet a few scholars of thick dermis and pep-rally vigor believe that the cultural chasm can be bridged and the sciences and the humanities united into a powerful new discipline that would apply the strengths of both mindsets, the quantitative and qualitative, to a wide array of problems. Among the most ambitious of these exercises in fusion thinking is a program under development at Binghamton University in New York called the New Humanities Initiative.

Jointly conceived by David Sloan Wilson, a professor of biology, and Leslie Heywood, a professor of English, the program is intended to build on some of the themes explored in Dr. Wilson’s evolutionary studies program, which has proved enormously popular with science and nonscience majors alike, and which he describes in the recently published “Evolution for Everyone.” In Dr. Wilson’s view, evolutionary biology is a discipline that, to be done right, demands a crossover approach, the capacity to think in narrative and abstract terms simultaneously, so why not use it as a template for emulsifying the two cultures generally?

“There are more similarities than differences between the humanities and the sciences, and some of the stereotypes have to be altered,” Dr. Wilson said. “Darwin, for example, established his entire evolutionary theory on the basis of his observations of natural history, and most of that information was qualitative, not quantitative.”

As he and Dr. Heywood envision the program, courses under the New Humanities rubric would be offered campuswide, in any number of departments, including history, literature, philosophy, sociology, law and business. The students would be introduced to basic scientific tools like statistics and experimental design and to liberal arts staples like the importance of analyzing specific texts or documents closely, identifying their animating ideas and comparing them with the texts of other times or other immortal minds.

One goal of the initiative is to demystify science by applying its traditional routines and parlance in nontraditional settings — graphing Jane Austen, as the title of an upcoming book felicitously puts it. “If you do statistics in the context of something you’re interested in and are good at, then it becomes an incremental as opposed to a saltational jump,” Dr. Wilson said. “You see that the mechanics are not so hard after all, and once you understand why you’re doing the statistics in the first place, it ends up being simple nuts and bolts stuff, nothing more.”

To illustrate how the New Humanities approach to scholarship might work, Dr. Heywood cited her own recent investigations into the complex symbolism of the wolf, a topic inspired by a pet of hers that was seven-eighths wolf. “He was completely different from a dog,” she said. “He was terrified of things in the human environment that dogs are perfectly at ease with, like the swishing sound of a jogging suit, or somebody wearing a hat, and he kept his reserve with people, even me.”

Dr. Heywood began studying the association between wolves and nature, and how people’s attitudes toward one might affect their regard for the other. “In the standard humanities approach, you compile and interpret images of wolves from folkloric history, and you analyze previously published texts about wolves,” and that’s pretty much it, Dr. Heywood said. Seeking a more full-bodied understanding, she delved into the scientific literature, studying wolf ecology, biology and evolution. She worked with Dr. Wilson and others to design a survey to gauge people’s responses to three images of a wolf: one of a classic beautiful wolf, another of a hunter holding a dead wolf, the third of a snarling, aggressive wolf.

It’s an implicit association test, designed to gauge subliminal attitudes by measuring latency of response between exposure to an image on a screen and the pressing of a button next to words like beautiful, frightening, good, wrong.

“These firsthand responses give me more to work with in understanding how people read wolves, as opposed to seeing things through other filters and published texts,” Dr. Heywood said.
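
For readers curious about the mechanics, the sketch below illustrates the latency-scoring idea behind an implicit association test of the kind described above: quicker image-word pairings are read as stronger subliminal associations. The reaction times, image names and word labels are invented for illustration and are not data from Dr. Heywood's survey.

```python
# Hypothetical sketch of implicit-association scoring: shorter mean reaction
# times for an image-word pairing are taken as evidence of a stronger
# subliminal association. All numbers below are fabricated examples.
from statistics import mean

trials = [
    {"image": "classic_wolf",  "word": "beautiful",   "rt_ms": 540},
    {"image": "classic_wolf",  "word": "frightening", "rt_ms": 720},
    {"image": "snarling_wolf", "word": "beautiful",   "rt_ms": 760},
    {"image": "snarling_wolf", "word": "frightening", "rt_ms": 530},
]

def mean_latency(image, word):
    """Average reaction time (ms) for pairing a given image with a given word."""
    return mean(t["rt_ms"] for t in trials if t["image"] == image and t["word"] == word)

# Faster pairings suggest stronger implicit associations.
print(mean_latency("classic_wolf", "beautiful"))     # 540
print(mean_latency("snarling_wolf", "frightening"))  # 530
```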

Combining some of her early survey results with the wealth of wolf imagery culled from cultures around the world, Dr. Heywood finds preliminary support for the provocative hypothesis that humans and wolves may have co-evolved.

“They were competing predators that occupied the same ecological niche as we did,” she said, “but it’s possible that we learned some of our social and hunting behaviors from them as well.” Hence, our deeply conflicted feelings toward wolves — as the nurturing mother to Romulus and Remus, as the vicious trickster disguised as Little Red Riding Hood’s grandmother.

In designing the New Humanities initiative, Dr. Wilson is determined to avoid romanticizing science or presenting it as the ultimate arbiter of meaning, as other would-be integrationists and ardent Darwinists have done.

“You can study music, dance, narrative storytelling and artmaking scientifically, and you can conclude that yes, they’re deeply biologically driven, they’re essential to our species, but there would still be something missing,” he said, “and that thing is an appreciation for the work itself, a true understanding of its meaning in its culture and context.”

Other researchers who have reviewed the program prospectus have expressed their enthusiasm, among them George Levine, an emeritus professor of English at Rutgers University, a distinguished scholar in residence at New York University and author of “Darwin Loves You.” Dr. Levine has criticized many recent attempts at so-called Literary Darwinism, the application of evolutionary psychology ideas to the analysis of great novels and plays. What it usually amounts to is reimagining Emma Bovary or Emma Woodhouse as a young, fecund female hunter-gatherer circa 200,000 B.C.

“When you maximize the importance of biological forces and minimize culture, you get something that doesn’t tell you a whole lot about the particularities of literature,” Dr. Levine said. “What you end up with, as far as I’m concerned, is banality.” Reading the New Humanities proposal, by contrast, “I was struck by how it absolutely refused the simple dichotomy,” he said.

“There is a kind of basic illiteracy on both sides,” he added, “and I find it a thrilling idea that people might be made to take pleasure in crossing the border.”

Post by kmaherali »

May 30, 2008
Op-Ed Contributor
Best Is the New Worst
By SUSAN JACOBY

PITY the poor word “elite,” which simply means “the best” as an adjective and “the best of a group” as a noun. What was once an accolade has turned poisonous in American public life over the past 40 years, as both the left and the right have twisted it into a code word meaning “not one of us.” But the newest and most ominous wrinkle in the denigration of all things elite is that the slur is being applied to knowledge itself.

Senator Hillary Clinton’s use of the phrase “elite opinion” to dismiss the near unanimous opposition of economists to her proposal for a gas tax holiday was a landmark in the use of elite to attack expertise supposedly beyond the comprehension of average Americans. One might as well say that there is no point in consulting musicians about music or ichthyologists about fish.

The assault on “elite” did not begin with politicians, although it does have political antecedents in sneers directed at “eggheads” during the anti-Communist crusades of the 1950s. The broader cultural perversion of its meaning dates from the late 1960s, when the academic left pinned the label on faculty members who resisted the establishment of separate departments for what were then called “minority studies.” In this case, two distinct faculty groups were tarred with elitism — those who wanted to incorporate black and women’s studies into the core curriculum, and those who thought that blacks and women had produced nothing worthy of study. Instead of elitist, the former group should have been described as “inclusionary” and the latter as “bigoted.”

The second stage of elite-bashing was conceived by the cultural and political right. Conservative intellectuals who rose to prominence during the Reagan administration managed the neat trick of reversing the ’60s usage of “elite” by applying it as a slur to the left alone. “Elite,” often rendered in the plural, became synonymous with “limousine liberals” who opposed supposedly normative American values. That the right-wing intellectual establishment also constituted a powerful elite was somehow obscured.

“Elite” and “elitist” do not, in a dictionary sense, mean the same thing. An elitist is someone who does believe in government by an elite few — an anti-democratic philosophy that has nothing to do with elite achievement. But the terms have become so conflated that Americans have come to consider both elite and elitist synonyms for snobbish.

All the older forms of elite-bashing have now devolved into a kind of aggressive denial of the threat to American democracy posed by public ignorance.

During the past few months, I have received hundreds of e-mail messages calling me an elitist for drawing attention to America’s knowledge deficit. One of the most memorable came from a man who objected to my citation of a statistic, from a 2006 National Geographic-Roper survey, indicating that nearly two-thirds of Americans age 18 to 24 cannot find Iraq on a map. “Why should I care whether my mechanic knows where Iraq is, as long as he knows how to fix my car?” the man asked.

But what could be more elitist than the idea that a mechanic cannot be expected to know the location of a country where thousands of Americans of his own generation are fighting and dying?

Another peculiar new use of “elitist” (often coupled with “Luddite”) is its application to any caveats about the Internet as a source of knowledge. After listening to one of my lectures, a college student told me that it was elitist to express alarm that one in four Americans, according to the National Constitution Center, cannot name any First Amendment rights or that 62 percent cannot name the three branches of government. “You don’t need to have that in your head,” the student said, “because you can just look it up on the Web.”

True, but how can an information-seeker know what to look for if he or she does not know that the Bill of Rights exists? There is no point-and-click formula for accumulating a body of knowledge needed to make sense of isolated facts.

It is past time to retire the sliming of elite knowledge and education from public discourse. Do we want mediocre schools or the best education for our children? If we need an operation, do we want an ordinary surgeon or the best, most elite surgeon available?

America was never imagined as a democracy of dumbness. The Declaration of Independence and the Constitution were written by an elite group of leaders, and although their dream was limited to white men, it held the seeds of a future in which anyone might aspire to the highest — let us say it out loud, elite — level of achievement.

Susan Jacoby is the author of “The Age of American Unreason.”

Post by kmaherali »

June 27, 2008
Op-Ed Contributor
Your Brain Lies to You
By SAM WANG and SANDRA AAMODT

FALSE beliefs are everywhere. Eighteen percent of Americans think the sun revolves around the earth, one poll has found. Thus it seems slightly less egregious that, according to another poll, 10 percent of us think that Senator Barack Obama, a Christian, is instead a Muslim. The Obama campaign has created a Web site to dispel misinformation. But this effort may be more difficult than it seems, thanks to the quirky way in which our brains store memories — and mislead us along the way.

The brain does not simply gather and stockpile information as a computer’s hard drive does. Facts are stored first in the hippocampus, a structure deep in the brain about the size and shape of a fat man’s curled pinkie finger. But the information does not rest there. Every time we recall it, our brain writes it down again, and during this re-storage, it is also reprocessed. In time, the fact is gradually transferred to the cerebral cortex and is separated from the context in which it was originally learned. For example, you know that the capital of California is Sacramento, but you probably don’t remember how you learned it.

This phenomenon, known as source amnesia, can also lead people to forget whether a statement is true. Even when a lie is presented with a disclaimer, people often later remember it as true.

With time, this misremembering only gets worse. A false statement from a noncredible source that is at first not believed can gain credibility during the months it takes to reprocess memories from short-term hippocampal storage to longer-term cortical storage. As the source is forgotten, the message and its implications gain strength. This could explain why, during the 2004 presidential campaign, it took some weeks for the Swift Boat Veterans for Truth campaign against Senator John Kerry to have an effect on his standing in the polls.

Even if they do not understand the neuroscience behind source amnesia, campaign strategists can exploit it to spread misinformation. They know that if their message is initially memorable, its impression will persist long after it is debunked. In repeating a falsehood, someone may back it up with an opening line like “I think I read somewhere” or even with a reference to a specific source.

In one study, a group of Stanford students was exposed repeatedly to an unsubstantiated claim taken from a Web site that Coca-Cola is an effective paint thinner. Students who read the statement five times were nearly one-third more likely than those who read it only twice to attribute it to Consumer Reports (rather than The National Enquirer, their other choice), giving it a gloss of credibility.

Adding to this innate tendency to mold information we recall is the way our brains fit facts into established mental frameworks. We tend to remember news that accords with our worldview, and discount statements that contradict it.

In another Stanford study, 48 students, half of whom said they favored capital punishment and half of whom said they opposed it, were presented with two pieces of evidence, one supporting and one contradicting the claim that capital punishment deters crime. Both groups were more convinced by the evidence that supported their initial position.

Psychologists have suggested that legends propagate by striking an emotional chord. In the same way, ideas can spread by emotional selection, rather than by their factual merits, encouraging the persistence of falsehoods about Coke — or about a presidential candidate.

Journalists and campaign workers may think they are acting to counter misinformation by pointing out that it is not true. But by repeating a false rumor, they may inadvertently make it stronger. In its concerted effort to “stop the smears,” the Obama campaign may want to keep this in mind. Rather than emphasize that Mr. Obama is not a Muslim, for instance, it may be more effective to stress that he embraced Christianity as a young man.

Consumers of news, for their part, are prone to selectively accept and remember statements that reinforce beliefs they already hold. In a replication of the study of students’ impressions of evidence about the death penalty, researchers found that even when subjects were given a specific instruction to be objective, they were still inclined to reject evidence that disagreed with their beliefs.

In the same study, however, when subjects were asked to imagine their reaction if the evidence had pointed to the opposite conclusion, they were more open-minded to information that contradicted their beliefs. Apparently, it pays for consumers of controversial news to take a moment and consider that the opposite interpretation may be true.

In 1919, Justice Oliver Wendell Holmes of the Supreme Court wrote that “the best test of truth is the power of the thought to get itself accepted in the competition of the market.” Holmes erroneously assumed that ideas are more likely to spread if they are honest. Our brains do not naturally obey this admirable dictum, but by better understanding the mechanisms of memory perhaps we can move closer to Holmes’s ideal.

Sam Wang, an associate professor of molecular biology and neuroscience at Princeton, and Sandra Aamodt, a former editor in chief of Nature Neuroscience, are the authors of “Welcome to Your Brain: Why You Lose Your Car Keys but Never Forget How to Drive and Other Puzzles of Everyday Life.”

Post by kmaherali »

Why newspapers beat the Internet for accuracy

Nigel Hannaford
Calgary Herald


Saturday, August 16, 2008


Thanks to the Internet, news of the murder and beheading of Greyhound passenger Tim McLean spread around the world in hours.

As a Canwest report details, it wasn't long before copies of the police scanner tape were online, along with allegations of drug use by bus passengers, cannibalism and various other things.

Looked at purely from the perspective of what technology can do, it was quite remarkable.

The only problem is that mixed with the facts was a small tissue of lies. For instance, one product of the blogosphere was a purported link between McLean and his accused killer. It was apparently the work of his grieving friends attempting, as a police spokesman put it, "to rationalize the irrational." Still, it was passed around as the latest revelation.

This is hardly the first time unreliable information has gone out over the Internet, of course. During the last U.S. presidential election, influential bloggers touted selected exit polls, fantasized a John Kerry win and, according to Bloomberg wires, reversed the U.S. stock market just before markets closed.

The McLean episode is therefore a cautionary tale about the Internet's amazing capability to distribute news. It reminds the reader that, for all the mainstream media's faults (its built-in biases, the lumbering pace at which it moves) and for all the admittedly spectacular successes that amateur news gatherers occasionally score, the mainstream media does at least scrutinize what comes in for general believability.

It's not a perfect filter. Mistakes get through. Sometimes we accept a government's word that a Middle East dictator is concealing weapons of mass destruction, or perversely refuse to accept that a war is being won, because victory doesn't fit the narrative we thought was unfolding.

But, the mainstream media is also accountable. Mistakes are corrected. In the end, hand-wringing mea culpas follow unwelcome but irrefutable revelations. News hounds may even have their noses so close to the ground that they fail to notice they're on a false trail, but the mainstream media's intention is always to be accurate, and balanced.

And importantly, now that the mainstream media has moved heavily into the Internet (check the Herald's webpage, for instance), readers can be sure those same checks and balances apply to news coverage in the digital age.

In short, there's much value to readers in the judgment of an experienced news team.

There is, however, a good-news side to the Internet.

Sure, there's a problem with bad information. But not for centuries have individuals had such a podium, for technology's wheel has turned full circle in terms of what it takes to publish information.

Reading up on how the Anglo-American tradition of free speech came about, I came across many examples of one-man proprietorships in the early days: The fellow who owned the Gutenberg press was also reporter, editor, publisher, printer, delivery boy and business manager.

No surprise, these people did what they did because they had a point to make.

Generally, it was a religious one. Parenthetically, one of the things least understood now is how large a blood-debt the secular right of free speech owes to the stubborn Protestant printers of the 16th and 17th centuries, who fought the authorities to present truth as they saw it.

But it took the later development of the 19th-century mass market, and the emergence of newspapers as profit-making businesses rather than proselytizing broadsheets, to spark today's aspiration to objectivity, balance and credibility.

This was when the machinery of production went beyond the physical capacity of one man to reach a mass audience. Meanwhile, political pressure made even the most ambitious press barons aware the appearance of telling both sides of the story was a commercial asset.

As the decades went by, there was less and less room for lone rangers.

But, the gift of the Internet and the PC is that once more, one person can do it all and this time, speak to millions. (In theory, anyway. Actually doing it is another matter.)

Yes, it means the reader seeking truth must be careful where he looks for it, and some of the McLean material shows just what kind of dreck there is.

Still, when Orwellian machinery of social control exists, it's good to know there are also ways for individuals to do an end run around it. If that means truth and error must temporarily coexist, well, 'twas ever thus.

And -- here comes the pitch -- it also shows why subscribing to a good newspaper such as the Herald is still worth the buck.

Getting it first is great.

Getting it right is what really matters. This we work very, very hard to accomplish.

nhannaford@theherald.canwest.com

© The Calgary Herald 2008

Post by kmaherali »

http://www.nytimes.com/2008/09/16/scien ... ?th&emc=th

September 16, 2008
Basics
Gut Instinct’s Surprising Role in Math
By NATALIE ANGIER

You are shopping in a busy supermarket and you’re ready to pay up and go home. You perform a quick visual sweep of the checkout options and immediately start ramming your cart through traffic toward an appealingly unpeopled line halfway across the store. As you wait in line and start reading nutrition labels, you can’t help but calculate that the 529 calories contained in a single slice of your Key lime cheesecake amounts to one-fourth of your recommended daily caloric allowance and will take you 90 minutes on the elliptical to burn off and you’d better just stick the thing behind this stack of Soap Opera Digests and hope a clerk finds it before it melts.

One shopping spree, two distinct number systems in play. Whenever we choose a shorter grocery line over a longer one, or a bustling restaurant over an unpopular one, we rally our approximate number system, an ancient and intuitive sense that we are born with and that we share with many other animals. Rats, pigeons, monkeys, babies — all can tell more from fewer, abundant from stingy. An approximate number sense is essential to brute survival: how else can a bird find the best patch of berries, or two baboons know better than to pick a fight with a gang of six?

When it comes to genuine computation, however, to seeing a self-important number like 529 and panicking when you divide it into 2,200, or realizing that, hey, it’s the square of 23! well, that calls for a very different number system, one that is specific, symbolic and highly abstract. By all evidence, scientists say, the capacity to do mathematics, to manipulate representations of numbers and explore the quantitative texture of our world is a uniquely human and very recent skill. People have been at it only for the last few millennia, it’s not universal to all cultures, and it takes years of education to master. Math-making seems the opposite of automatic, which is why scientists long thought it had nothing to do with our ancient, pre-verbal size-em-up ways.

Yet a host of new studies suggests that the two number systems, the bestial and celestial, may be profoundly related, an insight with potentially broad implications for math education.

One research team has found that how readily people rally their approximate number sense is linked over time to success in even the most advanced and abstruse mathematics courses. Other scientists have shown that preschool children are remarkably good at approximating the impact of adding to or subtracting from large groups of items but are poor at translating the approximate into the specific. Taken together, the new research suggests that math teachers might do well to emphasize the power of the ballpark figure, to focus less on arithmetic precision and more on general reckoning.

“When mathematicians and physicists are left alone in a room, one of the games they’ll play is called a Fermi problem, in which they try to figure out the approximate answer to an arbitrary problem,” said Rebecca Saxe, a cognitive neuroscientist at the Massachusetts Institute of Technology who is married to a physicist. “They’ll ask, how many piano tuners are there in Chicago, or what contribution to the ocean’s temperature do fish make, and they’ll try to come up with a plausible answer.”

“What this suggests to me,” she added, “is that the people whom we think of as being the most involved in the symbolic part of math intuitively know that they have to practice those other, nonsymbolic, approximating skills.”
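
As an aside on the Fermi-problem habit Dr. Saxe describes, the sketch below works through the classic piano-tuners estimate. Every input is an assumed round number rather than a sourced statistic; the point is the method of chaining plausible factors into a ballpark answer.

```python
# A minimal Fermi estimate of the number of piano tuners in Chicago.
# All inputs are rough assumptions chosen for illustration.
population = 3_000_000                     # assumed population of Chicago
people_per_household = 2.5                 # assumption
piano_ownership_rate = 1 / 20              # assume 1 in 20 households owns a piano
tunings_per_piano_per_year = 1             # assume one tuning a year
tunings_per_tuner_per_year = 2 * 5 * 50    # 2 a day, 5 days a week, 50 weeks

pianos = population / people_per_household * piano_ownership_rate
tuners = pianos * tunings_per_piano_per_year / tunings_per_tuner_per_year
print(f"Estimated pianos: {pianos:,.0f}")  # 60,000
print(f"Estimated tuners: {tuners:,.0f}")  # about 120: an order-of-magnitude answer
```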

This month in the journal Nature, Justin Halberda and Lisa Feigenson of Johns Hopkins University and Michele Mazzocco of the Kennedy Krieger Institute in Baltimore described their study of 64 14-year-olds who were tested at length on the discriminating power of their approximate number sense. The teenagers sat at a computer as a series of slides with varying numbers of yellow and blue dots flashed on a screen for 200 milliseconds each — barely as long as an eye blink. After each slide, the students pressed a button indicating whether they thought there had been more yellow dots or blue.

Given the antiquity and ubiquity of the nonverbal number sense, the researchers were impressed by how widely it varied in acuity. There were kids with fine powers of discrimination, able to distinguish ratios on the order of 9 blue dots for every 10 yellows, Dr. Feigenson said. “Others performed at a level comparable to a 9-month-old,” barely able to tell if five yellows outgunned three blues. Comparing the acuity scores with other test results that Dr. Mazzocco had collected from the students over the past 10 years, the researchers found a robust correlation between dot-spotting prowess at age 14 and strong performance on a raft of standardized math tests from kindergarten onward. “We can’t draw causal arrows one way or another,” Dr. Feigenson said, “but your evolutionarily endowed sense of approximation is related to how good you are at formal math.”
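
The toy simulation below sketches one way acuity in a dot-comparison task like this could be quantified. It uses a standard noisy-observer idealization (perceived counts corrupted by noise scaled by a "Weber fraction"), which is an assumption made for illustration, not the authors' actual analysis; the output simply shows accuracy falling as the yellow-to-blue ratio approaches 1:1.

```python
# Simulated observer for a brief yellow-vs-blue dot comparison task.
# Assumption: each dot count is perceived with Gaussian noise proportional
# to its size (Weber fraction w). Smaller w means sharper discrimination.
import random

def noisy_guess(n_yellow, n_blue, w=0.15):
    """Return True if the simulated observer judges 'more yellow'."""
    perceived_yellow = random.gauss(n_yellow, w * n_yellow)
    perceived_blue = random.gauss(n_blue, w * n_blue)
    return perceived_yellow > perceived_blue

def accuracy(n_yellow, n_blue, trials=10_000, w=0.15):
    """Fraction of trials on which the observer picks the truly larger set."""
    correct = sum(
        noisy_guess(n_yellow, n_blue, w) == (n_yellow > n_blue)
        for _ in range(trials)
    )
    return correct / trials

for n_yellow, n_blue in [(5, 3), (10, 9)]:   # an easy ratio vs. the hard 9:10 case
    print((n_yellow, n_blue), round(accuracy(n_yellow, n_blue), 2))
```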

The researchers caution that they have no idea yet how the two number systems interact. Brain imaging studies have traced the approximate number sense to a specific neural structure called the intraparietal sulcus, which also helps assess features like an object’s magnitude and distance. Symbolic math, by contrast, operates along a more widely distributed circuitry, activating many of the prefrontal regions of the brain that we associate with being human. Somewhere, local and global must be hooked up to a party line.

Other open questions include how malleable our inborn number sense may be, whether it can be improved with training, and whether those improvements would pay off in a greater appetite and aptitude for math. If children start training with the flashing dot game at age 4, will they be supernumerate by middle school?

Dr. Halberda, who happens to be Dr. Feigenson’s spouse, relishes the work’s philosophical implications. “What’s interesting and surprising in our results is that the same system we spend years trying to acquire in school, and that we use to send a man to the moon, and that has inspired the likes of Plato, Einstein and Stephen Hawking, has something in common with what a rat is doing when it’s out hunting for food,” he said. “I find that deeply moving.”

Behind every great leap of our computational mind lies the pitter-patter of rats’ feet, the little squeak of rodent kind.

Post by kmaherali »

Raising a reader lifts society

Calgary Herald


Friday, September 26, 2008


Poet Emily Dickinson wrote, "There is no frigate like a book, to take us lands away." Or to link us in our common humanity. Reading is one of life's greatest pleasures and there is no more lasting gift parents can give their children than a love of reading.

"This Traverse may the poorest take; Without oppress of Toll," Dickinson continues, and the truth in her lovely poem is in that line. Reading is something everyone can do and there are no barriers to its enjoyment -- not socioeconomic, demographic, racial, ethnic, religious or any other category by which people define themselves. All it takes is a book, a reader's imagination and the ability to read.

Canwest's Raise-a-Reader campaign, which focuses on improving literacy and helping children to become lifelong lovers of reading, has been such a dazzling success year after year because of that universality to which reading speaks. A quiet space at home or a packed C-Train during rush hour -- reading can be done anywhere, by anyone. It requires no expensive equipment and no journey except the one the mind takes. Just open to page one and enjoy.

Nor could anything be lovelier than the sight of a child, wide-eyed and engrossed in a story, whether read at bedtime by a loving parent, at school or day care, or at story-hour at the library. In this day of electronic games, nothing speaks to a child's mind the way a book can, engaging him or her on so many levels of richness that can never be duplicated by the most state-of-the-art video game. A favourite book roots itself in the child's mind in such a way that passages and scenes from it come back years later, enriching the adult life anew. What greater gift can there be than the gift of literacy?

© The Calgary Herald 2008

Post by kmaherali »

October 28, 2008
Op-Ed Columnist
The Behavioral Revolution
By DAVID BROOKS

Roughly speaking, there are four steps to every decision. First, you perceive a situation. Then you think of possible courses of action. Then you calculate which course is in your best interest. Then you take the action.

Over the past few centuries, public policy analysts have assumed that step three is the most important. Economic models and entire social science disciplines are premised on the assumption that people are mostly engaged in rationally calculating and maximizing their self-interest.

But during this financial crisis, that way of thinking has failed spectacularly. As Alan Greenspan noted in his Congressional testimony last week, he was “shocked” that markets did not work as anticipated. “I made a mistake in presuming that the self-interests of organizations, specifically banks and others, were such as that they were best capable of protecting their own shareholders and their equity in the firms.”

So perhaps this will be the moment when we alter our view of decision-making. Perhaps this will be the moment when we shift our focus from step three, rational calculation, to step one, perception.

Perceiving a situation seems, at first glimpse, like a remarkably simple operation. You just look and see what’s around. But the operation that seems most simple is actually the most complex; it’s just that most of the action takes place below the level of awareness. Looking at and perceiving the world is an active process of meaning-making that shapes and biases the rest of the decision-making chain.

Economists and psychologists have been exploring our perceptual biases for four decades now, with the work of Amos Tversky and Daniel Kahneman, and also with work by people like Richard Thaler, Robert Shiller, John Bargh and Dan Ariely.

My sense is that this financial crisis is going to amount to a coming-out party for behavioral economists and others who are bringing sophisticated psychology to the realm of public policy. At least these folks have plausible explanations for why so many people could have been so gigantically wrong about the risks they were taking.

Nassim Nicholas Taleb has been deeply influenced by this stream of research. Taleb not only has an explanation for what’s happening; he saw it coming. His popular books “Fooled by Randomness” and “The Black Swan” were broadsides at the risk-management models used in the financial world and beyond.

In “The Black Swan,” Taleb wrote, “The government-sponsored institution Fannie Mae, when I look at its risks, seems to be sitting on a barrel of dynamite, vulnerable to the slightest hiccup.” Globalization, he noted, “creates interlocking fragility.” He warned that while the growth of giant banks gives the appearance of stability, in reality, it raises the risk of a systemic collapse — “when one fails, they all fail.”

Taleb believes that our brains evolved to suit a world much simpler than the one we now face. His writing is idiosyncratic, but he does touch on many of the perceptual biases that distort our thinking: our tendency to see data that confirm our prejudices more vividly than data that contradict them; our tendency to overvalue recent events when anticipating future possibilities; our tendency to spin concurring facts into a single causal narrative; our tendency to applaud our own supposed skill in circumstances when we’ve actually benefited from dumb luck.

And looking at the financial crisis, it is easy to see dozens of errors of perception. Traders misperceived the possibility of rare events. They got caught in social contagions and reinforced each other’s risk assessments. They failed to perceive how tightly linked global networks can transform small events into big disasters.

Taleb is characteristically vituperative about the quantitative risk models, which try to model something that defies modelization. He subscribes to what he calls the tragic vision of humankind, which “believes in the existence of inherent limitations and flaws in the way we think and act and requires an acknowledgement of this fact as a basis for any individual and collective action.” If recent events don’t underline this worldview, nothing will.

If you start thinking about our faulty perceptions, the first thing you realize is that markets are not perfectly efficient, people are not always good guardians of their own self-interest and there might be limited circumstances when government could usefully slant the decision-making architecture (see “Nudge” by Thaler and Cass Sunstein for proposals). But the second thing you realize is that government officials are probably going to be even worse perceivers of reality than private business types. Their information feedback mechanism is more limited, and, being deeply politicized, they’re even more likely to filter inconvenient facts.

This meltdown is not just a financial event, but also a cultural one. It’s a big, whopping reminder that the human mind is continually trying to perceive things that aren’t true, and not perceiving them takes enormous effort.

Post by kmaherali »

November 30, 2008, 10:00 pm

‘Paradise Lost’ in Prose

Dennis Danielson, a distinguished Miltonist, has just published a translation of “Paradise Lost.” Into what language, you ask? Into English, is the answer.

Danielson is well aware that it might seem odd to translate a poem into the language in which it is already written. Dryden turned some of “Paradise Lost” into rhymed verse for a libretto while Milton was still alive; but that was an adaptation, not a translation. There are of course the Classic Comics and Cliff Notes precedents; but these are abridgments designed for the students who don’t have time to, or don’t want to, read the book. Danielson’s is a word-for-word translation, probably longer than the original since its prose unpacks a very dense poetry. The value of his edition, he says, is that it “invites more readers than ever before to enjoy the magnificent story — to experience the grandeur, heroism, pathos, beauty and grace of Milton’s inimitable work.”

Danielson borrows the word “inimitable” from John Wesley, who in 1763 was already articulating the justification for a prose translation of the poem. Wesley reports that in the competition for the title of world’s greatest poem, “the preference has generally been given by impartial judges to Milton’s ‘Paradise Lost,’” but, he laments, “this inimitable work amidst all its beauties is unintelligible to [an] abundance of readers.” Two hundred and fifty years later, Harold Bloom made the same observation. Ordinary readers, he said, now “require mediation to read ‘Paradise Lost’ with full appreciation.”

What features of the poem require mediation? Danielson’s answer is the “linguistic obscurity” from which he proposes to “free” the story so that today’s readers can read it “in their own language.” By their own language he doesn’t mean the language of some “with-it” slang, but a language less Latinate in its syntax and less archaic in its diction than the original (which was archaic and stylized when it was written). Milton’s language is not like Chaucer’s — a dialect modern readers must learn; it is our language structured into a syntax more convoluted than the syntax of ordinary speech, but less convoluted or cryptic than the syntax of modern poets like Hart Crane, Wallace Stevens and John Ashbery.

Like Milton, these poets do not make it easy for readers to move through a poem. Roadblocks, in the form of ambiguities, deliberate obscurities, shifting grammatical paths and recondite allusions, are everywhere and one is expected to stop and try to figure things out, make connections or come to terms with an inability to make connections.

The experience of reading poetry like this was well described by the great critic F.R. Leavis, who said of Milton (he did not mean this as a compliment) that his verse “calls pervasively for a kind of attention … toward itself.” That is, when reading the poetry one is not encouraged to see it as a window on some other, more real, world; it is its own world, and when it refers it refers to other parts of itself. Milton, Leavis said, displays “a capacity for words rather than a capacity for feeling through words.” The poetry is not mimetic in the usual sense of representing something prior to it; it creates the facts and significances to which you are continually asked to attend.

It is from this strenuous and often frustrating labor that Danielson wants to free the reader, who, once liberated, will be able to go with the flow and enjoy the pleasures of a powerful narrative. But that is not what Milton had in mind, as Donald Davie, another prominent critic, saw when he observed (again in complaint) that, rather than facilitating forward movement, Milton’s verse tends to “check narrative impetus” and to “provoke interesting and important speculative questions,” the consideration of which interrupts our progress.

Here is an example. When Adam decides to join Eve in sin and eat the apple, the poem says that he was “fondly overcome by female charm.” The word that asks you to pause is “fondly,” which means both foolishly and affectionately. The two meanings have different relationships to the action they characterize. If you do something foolishly, you have no excuse, and it’s a bit of a mystery as to why you did; if you do it prompted by affection and love, the wrongness of it may still be asserted, but something like an explanation or an excuse has at least been suggested.

The ambiguity plays into the poem-length concern with the question of just how culpable Adam and Eve are for the fall. (Given their faculties and emotions, were they capable of standing?) “Fondly” doesn’t resolve the question, but keeps it alive and adds to the work the reader must always be doing when negotiating this poem.

Here is Danielson’s translation of the line: “an infatuated fool overcome by a woman’s charms.” “Infatuated” isn’t right because it redoubles the accusation in “fool” rather than softening it. The judgment is sharp and clear, but it is a clearer judgment than Milton intended or provided. Something has been lost (although as Danielson points out, something is always lost in a translation).

Another example. At an earlier point, the epic narrator comments on the extent of mankind’s susceptibility to the blandishments of the fallen angels — “and devils to adore for deities.” The tone is one of incredulity; how could anyone be so stupid as to be unable to tell the difference? But the line’s assertion that as polar opposites devils and deities should be easily distinguishable is complicated by the fact that as words “devils” and “deities” are close together, beginning and ending with the same letter and sharing an “e” and an “i” in between. The equivalence suggested by sound (although denied by the sense) is reinforced by the mirror-structure of “adore for,” a phrase that separates devils from deities but in fact participates in the subliminal assertion of their likeness.

What, then, is the line saying? It is saying simultaneously that the difference between devils and deities is obvious and perspicuous and that the difference is hard to tell. This is one of those moments Davie has in mind when he talks about the tendency of Milton’s verse to go off the rails of narrative in order to raise speculative questions that have no definitive answer.

When Danielson comes to render “devils to adore for deities,” he turns it into a present participle: “worshiping devils themselves.” Absent are both the tone of scornful wonder the epic voice directs at the erring sinners and the undercutting of that scorn by the dance of vowels and consonants.

One more example. In line 2 of book I, the reference to the fruit of the forbidden tree is followed by “whose mortal taste/ Brought death into the world.” “Mortal,” from the Latin “mors,” means both fatal — there is no recovery from it — and bringing about the condition of mortality, the condition of being human, the taste of mortality. By eating of the forbidden tree, Adam and Eve become capable of death and therefore capable of having a beginning and an end and a middle filled up by successes, failures, losses and recoveries. To say that a “mortal taste” brought death into the world is to say something tautologous; but the tautology is profound when it reminds us of both the costs and the glories of being mortal. If no mortality, then no human struggles, no narrative, no story, no aspiration (in eternity there’s nowhere to go), no “Paradise Lost.”

Danielson translates “whose mortal taste” as “whose lethal taste,” which is accurate, avoids tautology (or at least suppresses it) and gets us into the next line cleanly and without fuss or provoked speculation. But fuss and bother and speculations provoked by etymological puzzles are what makes this verse go (or, rather, not go), and while the reader’s way may be smoothed by a user-friendly prose translation, smoothness is not what Milton is after; it is not a pleasure he wishes to provide.

I have no doubt that Danielson is aware of all of this. He is not making a mistake. He is making a choice. He knows as well as anyone how Milton’s poetry works, but it is his judgment (following Wesley and Bloom) that many modern readers will not take their Milton straight and require some unraveling of the knots before embarking on the journey.

I’m not sure he’s right (I’ve found students of all kinds responsive to the poetry once they give it half a chance), but whether he is or not, he has fashioned a powerful pedagogical tool that is a gift to any teacher of Milton whatever the level of instruction.

The edition is a parallel one — Milton’s original on the left hand page and Danielson’s prose rendering on the right. This means that you can ask students to take a passage and compare the effects and meanings produced by the two texts. You can ask students to compose their own translations and explain or defend the choices they made. You can ask students to look at prose translations in another language and think about the difference, if there is one, between translating into a foreign tongue and translating into a more user-friendly version of English. You can ask students to speculate on the nature of translation and on the relationship between translation and the perennial debate about whether there are linguistic universals.

In short, armed with just this edition which has no editorial apparatus (to have included one would have been to defeat Danielson’s purpose), you can teach a course in Milton and venture into some deep philosophical waters as well. A nice bargain in this holiday season.

[Editor's note: An earlier version of this article misquoted a phrase from Milton's "Paradise Lost." The phrase has been corrected to "and devils to adore for deities."]

http://fish.blogs.nytimes.com/2008/11/3 ... mode=print

Post by kmaherali »

December 16, 2008
Op-Ed Columnist
Lost in the Crowd
By DAVID BROOKS

All day long, you are affected by large forces. Genes influence your intelligence and willingness to take risks. Social dynamics unconsciously shape your choices. Instantaneous perceptions set off neural reactions in your head without you even being aware of them.

Over the past few years, scientists have made a series of exciting discoveries about how these deep patterns influence daily life. Nobody has done more to bring these discoveries to public attention than Malcolm Gladwell.

Gladwell’s important new book, “Outliers,” seems at first glance to be a description of exceptionally talented individuals. But in fact, it’s another book about deep patterns. Exceptionally successful people are not lone pioneers who created their own success, he argues. They are the lucky beneficiaries of social arrangements.

As Gladwell told Jason Zengerle of New York magazine: “The book’s saying, ‘Great people aren’t so great. Their own greatness is not the salient fact about them. It’s the kind of fortunate mix of opportunities they’ve been given.’ ”

Gladwell’s noncontroversial claim is that some people have more opportunities than other people. Bill Gates was lucky to go to a great private school with its own computer at the dawn of the information revolution. Gladwell’s more interesting claim is that social forces largely explain why some people work harder when presented with those opportunities.

Chinese people work hard because they grew up in a culture built around rice farming. Tending a rice paddy required working up to 3,000 hours a year, and it left a cultural legacy that prizes industriousness. Many upper-middle-class American kids are raised in an atmosphere of “concerted cultivation,” which inculcates a fanatical devotion to meritocratic striving.

In Gladwell’s account, individual traits play a smaller role in explaining success while social circumstances play a larger one. As he told Zengerle, “I am explicitly turning my back on, I think, these kind of empty models that say, you know, you can be whatever you want to be. Well, actually, you can’t be whatever you want to be. The world decides what you can and can’t be.”

As usual, Gladwell intelligently captures a larger tendency of thought — the growing appreciation of the power of cultural patterns, social contagions, memes. His book is being received by reviewers as a call to action for the Obama age. It could lead policy makers to finally reject policies built on the assumption that people are coldly rational utility-maximizing individuals. It could cause them to focus more on policies that foster relationships, social bonds and cultures of achievement.

Yet, I can’t help but feel that Gladwell and others who share his emphasis are getting swept away by the coolness of the new discoveries. They’ve lost sight of the point at which the influence of social forces ends and the influence of the self-initiating individual begins.

Most successful people begin with two beliefs: the future can be better than the present, and I have the power to make it so. They were often showered by good fortune, but relied at crucial moments upon achievements of individual will.

Most successful people also have a phenomenal ability to consciously focus their attention. We know from experiments with subjects as diverse as obsessive-compulsive disorder sufferers and Buddhist monks that people who can self-consciously focus attention have the power to rewire their brains.

Control of attention is the ultimate individual power. People who can do that are not prisoners of the stimuli around them. They can choose from the patterns in the world and lengthen their time horizons. This individual power leads to others. It leads to self-control, the ability to formulate strategies in order to resist impulses. If forced to choose, we would all rather our children be poor with self-control than rich without it.

It leads to resilience, the ability to persevere with an idea even when all the influences in the world say it can’t be done. A common story among entrepreneurs is that people told them they were too stupid to do something, and they set out to prove the jerks wrong.

It leads to creativity. Individuals who can focus attention have the ability to hold a subject or problem in their mind long enough to see it anew.

Gladwell’s social determinism is a useful corrective to the Homo economicus view of human nature. It’s also pleasantly egalitarian. The less successful are not less worthy, they’re just less lucky. But it slights the centrality of individual character and individual creativity. And it doesn’t fully explain the genuine greatness of humanity’s outliers. As the classical philosophers understood, examples of individual greatness inspire achievement more reliably than any other form of education. If Gladwell can reduce William Shakespeare to a mere product of social forces, I’ll buy 25 more copies of “Outliers” and give them away in Times Square.

Bob Herbert is off today.
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

December 27, 2008
Op-Ed Guest Columnist
Living the Off-Label Life
By JUDITH WARNER
What if you could just take a pill and all of a sudden remember to pay your bills on time? What if, thanks to modern neuroscience, you could, simultaneously, make New Year’s Eve plans, pay the mortgage, call the pediatrician, consolidate credit card debt and do your job — well — without forgetting dentist appointments or neglecting to pick up your children at school?

Would you do it? Tune out the distractions of our online, on-call, too-fast A.D.D.-ogenic world with focus and memory-enhancing medications like Ritalin or Adderall? Stay sharp as a knife — no matter how overworked and sleep-deprived — with a mental-alertness-boosting drug like the anti-narcolepsy medication Provigil?

I’ve always said no. Fantasy aside, I’ve always rejected the idea of using drugs meant for people with real neurological disorders to treat the pathologies of everyday life.

Most of us, viscerally, do. Cognitive enhancement — a practice typified by the widely reported abuse of psychostimulants by college students cramming for exams, and by the less reported but apparently growing use of mind-boosters like Provigil among in-the-know scientists and professors — goes against the grain of some of our most basic beliefs about fairness and meritocracy. It seems to many people to be unnatural, inhuman, hubristic, pure cheating.

That’s why when Henry Greely, director of Stanford Law School’s Center for Law and the Biosciences, published an article, with a host of co-authors, in the science journal Nature earlier this month suggesting that we ought to rethink our gut reactions and “accept the benefits of enhancement,” he was deluged with irate responses from readers.

“There were three kinds of e-mail reactions,” he told me in a phone interview last week. “‘How much crack are you smoking? How much money did your friends in pharma give you? How much crack did you get from your friends in pharma?’ ”

As Americans, our default setting on matters of psychotropic drugs — particularly when it comes to medicating those who are not very ill — tends to be, as the psychiatrist Gerald Klerman called it in 1972, something akin to “pharmacological Calvinism.” People should suffer and endure, the thinking goes, accept what hard work and their God-given abilities bring them and hope for no more.

But Greely and his Nature co-authors suggest that such arguments are outdated and intellectually dishonest. We enhance our brain function all the time, they say — by drinking coffee, by eating nutritious food, by getting an education, even by getting a good night’s sleep. Taking brain-enhancing drugs should be viewed as just another step along that continuum, one that’s “morally equivalent” to such “other, more familiar, enhancements,” they write.

Normal life, unlike sports competitions, they argue, isn’t a zero-sum game, where one person’s doped advantage necessarily brings another’s disadvantage. A surgeon whose mind is extra-sharp, a pilot who’s extra alert, a medical researcher whose memory is fine-tuned to make extraordinary connections, is able to work not just to his or her own benefit, but for that of countless numbers of people. “Cognitive enhancement,” they write, “unlike enhancement for sports competitions, could lead to substantive improvements in the world.”

I’m not convinced of that. I’m not sure that pushing for your personal best — all the time — is tantamount to truly being the best person you can be. I have long thought that a life so frenetic and fractured that it drives “neuro-normal” people to distraction, leaving them sleep-deprived and exhausted, demands — indeed, screams for — systemic change.

But now I do wonder: What if the excessive demands of life today are creating ever-larger categories of people who can’t reach their potential due to handicaps that in an easier time were just quirks? (Absent-minded professor-types were, for generations, typically men who didn’t need to be present — organized and on-time — for their kids.) Is it any fairer to saddle a child with a chronically overwhelmed parent than with one suffering from untreated depression?

And, furthermore, how much can most of us, on a truly meaningful scale, change our lives? At a time of widespread layoffs and job anxiety among those still employed, can anyone but the most fortunate afford to cut their hours to give themselves time to breathe? Can working parents really sacrifice on either side of the wage-earning/life-making equation? It’s disturbing to think that we just have to make do with the world we now live in. But to do otherwise is for most people an impossible luxury.

For some of us, saddled with brains ill-adapted to this era, and taxed with way too many demands and distractions, pharmacological Calvinism may now be a luxury, too.

****
December 27, 2008
Op-Ed Contributor
Why We’re Still Happy
By SONJA LYUBOMIRSKY
Riverside, Calif.

THESE days, bad news about the economy is everywhere.

So why aren’t we panicking? Why aren’t we spending our days dejected about the markets? How is it that we manage to remain mostly preoccupied with the quotidian tasks and concerns of life? Traffic, dinner, homework, deadlines, sharp words, flirtatious glances.

Because the news these days affects everyone.

Research in psychology and economics suggests that when only your salary is cut, or when only you make a foolish investment, or when only you lose your job, you become considerably less satisfied with your life. But when everyone from autoworkers to Wall Street financiers becomes worse off, your life satisfaction remains pretty much the same.

Indeed, humans are remarkably attuned to relative position and status. As the economists David Hemenway and Sara Solnick demonstrated in a study at Harvard, many people would prefer to receive an annual salary of $50,000 when others are making $25,000 than to earn $100,000 a year when others are making $200,000.

Similarly, Daniel Zizzo and Andrew Oswald, economists in Britain, conducted a study that showed that people would give up money if doing so would cause someone else to give up a slightly larger sum. That is, we will make ourselves poorer in order to make someone else poorer, too.

Findings like these reveal an all-too-human truth. We care more about social comparison, status and rank than about the absolute value of our bank accounts or reputations.

For example, Andrew Clark, an economist in France, has recently shown that being laid off hurts less if you live in a community with a high unemployment rate. What’s more, if you are unemployed, you will, on average, be happier if your spouse is unemployed, too.

So in a world in which just about all of us have seen our retirement savings and home values plummet, it’s no wonder that we all feel surprisingly O.K.

Sonja Lyubomirsky, a professor of psychology at the University of California, Riverside, is the author of “The How of Happiness: A Scientific Approach to Getting the Life You Want.”
prhedst
Posts: 11
Joined: Sun Apr 06, 2008 11:49 am

Post by prhedst »

I thought this was an interesting thread, so I would like to offer a summary of the problems facing contemporary epistemology (I went back through my old university lecture notes). Basically, the central problem is skepticism, which in its modern form traces back to Descartes.

I don't know how Ismaili philosophy would respond to this problem, but I do know that Descartes' arguments borrowed from Plato and Aristotle, whose philosophies were incorporated into Ismailism.

Anyway, here is food for thought for anyone who is interested.

Problems in Contemporary Epistemology

Skepticism: Holds that no one can know anything about the world around us.

Certainty Argument:
1. You know that O only if you are certain that O. (O is any ordinary proposition, e.g., "I have two hands.")
2. H shows that you cannot be certain that O. (H is a skeptical proposition.)
3. Therefore, you do not know that O.

Argument from Ignorance
1. You know that O only if you know that H is false (that you are not dreaming).
2. But you do not know that H is false (you don't know that you are not dreaming).
3. Therefore, you do not know that O.

This is a valid argument form (modus tollens):
If A, then B.
Not B.
Therefore, not A.
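
The validity of this form can even be machine-checked. Here is a minimal sketch in the Lean proof assistant (my own illustration, not part of the lecture notes); the proof goes through for any propositions A and B:

-- Modus tollens: from "if A then B" and "not B", conclude "not A".
-- In Lean, ¬A unfolds to A → False, so the proof feeds a hypothetical
-- proof of A through h1 and hands the resulting B to h2.
theorem modus_tollens (A B : Prop) (h1 : A → B) (h2 : ¬B) : ¬A :=
  fun ha => h2 (h1 ha)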

Contemporary epistemology generally centers on how to respond to the skeptic. But how did we end up in this predicament in the first place? It all goes back to René Descartes.

Descartes is the father of modern epistemology. His goal was twofold:
1. To find a secure foundation for all knowledge, and
2. To expose beliefs without foundation.

Foundationalism holds that:
1. There are some justified basic beliefs, and
2. All non-basic justified beliefs depend for their justification ultimately upon basic justified beliefs.

We can know some basic things (that we exist, that material things exist); other beliefs depend upon these for their justification.

He wrote during the Enlightenment and the scientific revolution, a period of growing skepticism and of questioning of the Church.

Meditation 1
His first Meditation gives the criterion for justification and knowledge: certainty. To know something, you must be certain. It is all or nothing.

How can we distinguish beliefs that are false from those we can be certain are true? Through doubt. And yet Descartes is not a skeptic. He wants to defeat skepticism, and to do this, paradoxical though it may sound, he casts doubt on every single belief. His method is basically to tear everything down and start anew.

Doubt Arguments:

Sense-Perception Argument
1. Beliefs based on sources of information that are sometimes mistaken are uncertain.
2. Sense-perception is a source of information on which beliefs are based, and it is sometimes mistaken.
3. Therefore, beliefs based on sense-perception are uncertain.

Dreaming Argument
1. Sensory experiences in dreams cannot be distinguished from sensory experiences while awake.
2. If so, I can never know for certain whether or not I am dreaming.
3. If so, I cannot know that my specific beliefs about the external world are not false.
4. Therefore, I can never know for certain anything specific about the external world on the basis of sensory experience.

Evil Genius Argument
1. It is possible that an omnipotent evil genius exists who deceives me.
2. If so, it is possible that all my beliefs are false.
3. Therefore, I cannot be certain of anything.

Meditation 2

This Meditation gives the Cogito argument ("I think, therefore I am"): anything that is thinking must exist.
1. I am thinking.
2. Therefore, I exist.

The purpose of the Meditations is to find a base of certain knowledge from which all other knowledge can then be extended (foundationalism). Descartes is advancing rationalism, the idea that reason is the ultimate source of knowledge, in contrast with empiricism, which holds that the senses, not reason, are the ultimate source of knowledge.
Fundamental to his advancement of rationalism is the Wax Argument:

1. All properties of the wax that we perceive with the senses change as the wax melts.
2. Whatever aspects of a thing disappear while the thing remains cannot be the essence of the thing. (This idea goes back to Plato and Aristotle: an essence is a necessary feature of something, without which it would not be what it is.)
3. Our senses do not grasp the essence of the wax.
4. The wax can be extended in ways that I cannot accurately imagine.
5. Yet the wax remains the same piece of wax as it melts.
6. Therefore, insofar as I grasp the essence of the wax, I grasp it through my mind and not through my senses or imagination.

7. What I know regarding the wax applies to everything external to me.
8. Therefore, in order to grasp the essence of any external body, you must know that your mind exists, but not vice versa.
9. Therefore, the mind is more easily known than the body.

What this argument demonstrates is that even in the case of material things, our most certain knowledge is derived from the intellect, not the senses or imagination. Knowledge of the mind comes first in the order of knowing, and from it knowledge of bodies is derived.

Justified True Belief (JTB) is the standard modern analysis of knowledge. It states that a subject S knows that P if and only if:
1. P is true (truth condition);
2. S believes that P (belief condition);
3. S is justified in believing that P (justification condition).

The Cartesian conception of knowledge is contrasted with justified true belief in the following manner (a rough formalization of both analyses is sketched below):

S knows that P if and only if:
1. P is true;
2. S believes that P without any possible doubt;
3. S has a justification confirming the truth of P.
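
To make the comparison concrete, here is a rough formalization of the two analyses in the Lean proof assistant (again my own sketch, not from the lecture notes). Agent, Believes, Justified and Certain are hypothetical placeholder names; nothing here attempts to define them:

-- Placeholder predicates for the belief, justification and
-- indubitability conditions; they are deliberately left undefined.
axiom Agent : Type
axiom Believes : Agent → Prop → Prop
axiom Justified : Agent → Prop → Prop
axiom Certain : Agent → Prop → Prop  -- "believes without any possible doubt"

-- Justified True Belief: P is true, S believes that P, and S is justified.
def KnowsJTB (S : Agent) (P : Prop) : Prop :=
  P ∧ Believes S P ∧ Justified S P

-- Cartesian conception: the belief must be indubitable, and the
-- justification must confirm the truth of P; that stronger demand is
-- represented here simply by the Certain condition.
def KnowsCartesian (S : Agent) (P : Prop) : Prop :=
  P ∧ Certain S P ∧ Justified S P

The contrast is visible in the definitions: the Cartesian analysis strengthens the belief condition from mere belief to indubitable belief, and it is exactly that demand for certainty that the skeptical arguments attack.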

So unlike justified true belief, the Cartesian conception requires absolute certainty. Yet how do we know P is true? What is the criterion of truth? What Descartes calls clear and distinct ideas. Where do these come from? God. Descartes is going to show that God exists.

Ontological Argument
1. I have an idea of a supremely perfect being.
2. Existence is a perfection.
3. It is part of the nature of a supremely perfect being that it exists.
4. Therefore, a supremely perfect being exists.

This is a purely a priori (rational) argument. It requires no empirical premises or sensory evidence.

Cosmological Argument (a style of argument classically associated with St. Thomas Aquinas)
It argues that God exists on the basis of an observable fact about the world.
1. I have an idea of God.
2. This idea must have a cause. (Drawing on the principle of sufficient reason, later associated with Leibniz, that everything has a cause.)
3. There cannot be any less reality (perfection) in the cause than in the effect. This is revealed by the light of nature.
4. If the cause of my idea were anything but God, then premise 3 would be false.
5. Therefore, God exists.

Light of Nature
The light of nature is a rational faculty, and whatever is revealed by it cannot be doubted.

Descartes has now proved, to his own satisfaction, that God exists and that God cannot deceive, because deception is a sign of imperfection.

1. Deception is an imperfection.
2. God is supremely perfect.
3. Therefore, God is not a deceiver.

If God were not good, He would be lacking something; God cannot lack anything, because He is supremely perfect.

Philosophers generally think something has gone wrong here, but they disagree about where the mistake lies.

Descartes needs God to be perfect in order for the rest of his philosophy to work. He needs clear and distinct ideas. How do we know they're true? Because God is perfect and not a deceiver. But how do you know it's the light of nature speaking rather than an evil deceiver?

Cartesian Circularity
Descartes' reasoning goes in a circle. He needs the existence of God to rule out the evil genius, and he also requires that clear and distinct ideas exist. And yet clear and distinct ideas are needed to demonstrate the existence of God. The light of nature is supposed to be self-evident, an a priori truth.

Skeptics hold that Descartes used a dubious theology to secure certainty. Their principal argument is that we do not know that we are not dreaming, being deceived, or brains hooked up to vats; therefore knowledge of the external world is impossible.
prhedst
Posts: 11
Joined: Sun Apr 06, 2008 11:49 am

Post by prhedst »

The skeptical arguments stem from the rejection of Descartes' proof of the existence of God. The skeptics leave that proof aside and adopt the method of universal doubt that Descartes employed.

My own view, first of all, is that God can never be proved through reason, inasmuch as God is trans-rational (importantly, not sub-rational). You can have a profound experience of God, and that experience is something that "passeth understanding." Miracles likewise do not follow the rules of logic, which is partly why they are miracles. But the experience is universal: whether you are Rumi (Muslim), Meister Eckhart (Christian) or Ramakrishna (Hindu), the essence of your experience will be the same.

And yet, how do you prove it within the criteria the skeptic demands? You cannot, because the skeptic is asking for proof of something that belongs to another domain. It is like asking someone to give you the temperature of music: the question does not apply to that domain.

But if we can accept Descartes' idea of God on its own grounds, then how does his epistemology hold up? This, of course, is something that is never brought up in epistemology class.
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

March 19, 2009
Op-Ed Columnist
The Daily Me
By NICHOLAS D. KRISTOF

Some of the obituaries these days aren’t in the newspapers but are for the newspapers. The Seattle Post-Intelligencer is the latest to pass away, save for a remnant that will exist only in cyberspace, and the public is increasingly seeking its news not from mainstream television networks or ink-on-dead-trees but from grazing online.

When we go online, each of us is our own editor, our own gatekeeper. We select the kind of news and opinions that we care most about.

Nicholas Negroponte of M.I.T. has called this emerging news product The Daily Me. And if that’s the trend, God save us from ourselves.

That’s because there’s pretty good evidence that we generally don’t truly want good information — but rather information that confirms our prejudices. We may believe intellectually in the clash of opinions, but in practice we like to embed ourselves in the reassuring womb of an echo chamber.

One classic study sent mailings to Republicans and Democrats, offering them various kinds of political research, ostensibly from a neutral source. Both groups were most eager to receive intelligent arguments that strongly corroborated their pre-existing views.

There was also modest interest in receiving manifestly silly arguments for the other party’s views (we feel good when we can caricature the other guys as dunces). But there was little interest in encountering solid arguments that might undermine one’s own position.

That general finding has been replicated repeatedly, as the essayist and author Farhad Manjoo noted in his terrific book last year: “True Enough: Learning to Live in a Post-Fact Society.”

Let me get one thing out of the way: I’m sometimes guilty myself of selective truth-seeking on the Web. The blog I turn to for insight into Middle East news is often Professor Juan Cole’s, because he’s smart, well-informed and sensible — in other words, I often agree with his take. I’m less likely to peruse the blog of Daniel Pipes, another Middle East expert who is smart and well-informed — but who strikes me as less sensible, partly because I often disagree with him.

The effect of The Daily Me would be to insulate us further in our own hermetically sealed political chambers. One of last year’s more fascinating books was Bill Bishop’s “The Big Sort: Why the Clustering of Like-Minded America is Tearing Us Apart.” He argues that Americans increasingly are segregating themselves into communities, clubs and churches where they are surrounded by people who think the way they do.

Almost half of Americans now live in counties that vote in landslides either for Democrats or for Republicans, he said. In the 1960s and 1970s, in similarly competitive national elections, only about one-third lived in landslide counties.

“The nation grows more politically segregated — and the benefit that ought to come with having a variety of opinions is lost to the righteousness that is the special entitlement of homogeneous groups,” Mr. Bishop writes.

One 12-nation study found Americans the least likely to discuss politics with people of different views, and this was particularly true of the well educated. High school dropouts had the most diverse group of discussion-mates, while college graduates managed to shelter themselves from uncomfortable perspectives.

The result is polarization and intolerance. Cass Sunstein, a Harvard law professor now working for President Obama, has conducted research showing that when liberals or conservatives discuss issues such as affirmative action or climate change with like-minded people, their views quickly become more homogeneous and more extreme than before the discussion. For example, some liberals in one study initially worried that action on climate change might hurt the poor, while some conservatives were sympathetic to affirmative action. But after discussing the issue with like-minded people for only 15 minutes, liberals became more liberal and conservatives more conservative.

The decline of traditional news media will accelerate the rise of The Daily Me, and we’ll be irritated less by what we read and find our wisdom confirmed more often. The danger is that this self-selected “news” acts as a narcotic, lulling us into a self-confident stupor through which we will perceive in blacks and whites a world that typically unfolds in grays.

So what’s the solution? Tax breaks for liberals who watch Bill O’Reilly or conservatives who watch Keith Olbermann? No, until President Obama brings us universal health care, we can’t risk the surge in heart attacks.

So perhaps the only way forward is for each of us to struggle on our own to work out intellectually with sparring partners whose views we deplore. Think of it as a daily mental workout analogous to a trip to the gym; if you don’t work up a sweat, it doesn’t count.

Now excuse me while I go and read The Wall Street Journal’s editorial page.

http://www.nytimes.com/2009/03/19/opini ... nted=print
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

March 26, 2009
Op-Ed Columnist
Learning How to Think
By NICHOLAS D. KRISTOF
Ever wonder how financial experts could lead the world over the economic cliff?

One explanation is that so-called experts turn out to be, in many situations, a stunningly poor source of expertise. There’s evidence that what matters in making a sound forecast or decision isn’t so much knowledge or experience as good judgment — or, to be more precise, the way a person’s mind works.

More on that in a moment. First, let’s acknowledge that even very smart people allow themselves to be buffaloed by an apparent “expert” on occasion.

The best example of the awe that an “expert” inspires is the “Dr. Fox effect.” It’s named for a pioneering series of psychology experiments in which an actor was paid to give a meaningless presentation to professional educators.

The actor was introduced as “Dr. Myron L. Fox” (no such real person existed) and was described as an eminent authority on the application of mathematics to human behavior. He then delivered a lecture on “mathematical game theory as applied to physician education” — except that by design it had no point and was completely devoid of substance. However, it was warmly delivered and full of jokes and interesting neologisms.

Afterward, those in attendance were given questionnaires and asked to rate “Dr. Fox.” They were mostly impressed. “Excellent presentation, enjoyed listening,” wrote one. Another protested: “Too intellectual a presentation.”

A different study illustrated the genuflection to “experts” another way. It found that a president who goes on television to make a case moves public opinion only negligibly, by less than a percentage point. But experts who are trotted out on television can move public opinion by more than 3 percentage points, because they seem to be reliable or impartial authorities.

But do experts actually get it right themselves?

The expert on experts is Philip Tetlock, a professor at the University of California, Berkeley. His 2005 book, “Expert Political Judgment,” is based on two decades of tracking some 82,000 predictions by 284 experts. The experts’ forecasts were tracked both on the subjects of their specialties and on subjects that they knew little about.

The result? The predictions of experts were, on average, only a tiny bit better than random guesses — the equivalent of a chimpanzee throwing darts at a board.

“It made virtually no difference whether participants had doctorates, whether they were economists, political scientists, journalists or historians, whether they had policy experience or access to classified information, or whether they had logged many or few years of experience,” Mr. Tetlock wrote.

Indeed, the only consistent predictor was fame — and it was an inverse relationship. The more famous experts did worse than unknown ones. That had to do with a fault in the media. Talent bookers for television shows and reporters tended to call up experts who provided strong, coherent points of view, who saw things in blacks and whites. People who shouted — like, yes, Jim Cramer!

Mr. Tetlock called experts such as these the “hedgehogs,” after a famous distinction by the late Sir Isaiah Berlin (my favorite philosopher) between hedgehogs and foxes. Hedgehogs tend to have a focused worldview, an ideological leaning, strong convictions; foxes are more cautious, more centrist, more likely to adjust their views, more pragmatic, more prone to self-doubt, more inclined to see complexity and nuance. And it turns out that while foxes don’t give great sound-bites, they are far more likely to get things right.

This was the distinction that mattered most among the forecasters, not whether they had expertise. Over all, the foxes did significantly better, both in areas they knew well and in areas they didn’t.

Other studies have confirmed the general sense that expertise is overrated. In one experiment, clinical psychologists did no better than their secretaries in their diagnoses. In another, a white rat in a maze repeatedly beat groups of Yale undergraduates in understanding the optimal way to get food dropped in the maze. The students overanalyzed and saw patterns that didn’t exist, so they were beaten by the rodent.

The marketplace of ideas for now doesn’t clear out bad pundits and bad ideas partly because there’s no accountability. We trumpet our successes and ignore failures — or else attempt to explain that the failure doesn’t count because the situation changed or that we were basically right but the timing was off.

For example, I boast about having warned in 2002 and 2003 that Iraq would be a violent mess after we invaded. But I tend to make excuses for my own incorrect forecast in early 2007 that the troop “surge” would fail.

So what about a system to evaluate us prognosticators? Professor Tetlock suggests that various foundations might try to create a “trans-ideological Consumer Reports for punditry,” monitoring and evaluating the records of various experts and pundits as a public service. I agree: Hold us accountable!

http://www.nytimes.com/2009/03/26/opini ... nted=print
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

April 7, 2009
Op-Ed Columnist
The End of Philosophy
By DAVID BROOKS

Socrates talked. The assumption behind his approach to philosophy, and the approaches of millions of people since, is that moral thinking is mostly a matter of reason and deliberation: Think through moral problems. Find a just principle. Apply it.

One problem with this kind of approach to morality, as Michael Gazzaniga writes in his 2008 book, “Human,” is that “it has been hard to find any correlation between moral reasoning and proactive moral behavior, such as helping other people. In fact, in most studies, none has been found.”

Today, many psychologists, cognitive scientists and even philosophers embrace a different view of morality. In this view, moral thinking is more like aesthetics. As we look around the world, we are constantly evaluating what we see. Seeing and evaluating are not two separate processes. They are linked and basically simultaneous.

As Steven Quartz of the California Institute of Technology said during a recent discussion of ethics sponsored by the John Templeton Foundation, “Our brain is computing value at every fraction of a second. Everything that we look at, we form an implicit preference. Some of those make it into our awareness; some of them remain at the level of our unconscious, but ... what our brain is for, what our brain has evolved for, is to find what is of value in our environment.”

Think of what happens when you put a new food into your mouth. You don’t have to decide if it’s disgusting. You just know. You don’t have to decide if a landscape is beautiful. You just know.

Moral judgments are like that. They are rapid intuitive decisions and involve the emotion-processing parts of the brain. Most of us make snap moral judgments about what feels fair or not, or what feels good or not. We start doing this when we are babies, before we have language. And even as adults, we often can’t explain to ourselves why something feels wrong.

In other words, reasoning comes later and is often guided by the emotions that preceded it. Or as Jonathan Haidt of the University of Virginia memorably wrote, “The emotions are, in fact, in charge of the temple of morality, and ... moral reasoning is really just a servant masquerading as a high priest.”

The question then becomes: What shapes moral emotions in the first place? The answer has long been evolution, but in recent years there’s an increasing appreciation that evolution isn’t just about competition. It’s also about cooperation within groups. Like bees, humans have long lived or died based on their ability to divide labor, help each other and stand together in the face of common threats. Many of our moral emotions and intuitions reflect that history. We don’t just care about our individual rights, or even the rights of other individuals. We also care about loyalty, respect, traditions, religions. We are all the descendants of successful cooperators.

The first nice thing about this evolutionary approach to morality is that it emphasizes the social nature of moral intuition. People are not discrete units coolly formulating moral arguments. They link themselves together into communities and networks of mutual influence.

The second nice thing is that it entails a warmer view of human nature. Evolution is always about competition, but for humans, as Darwin speculated, competition among groups has turned us into pretty cooperative, empathetic and altruistic creatures — at least within our families, groups and sometimes nations.

The third nice thing is that it explains the haphazard way most of us lead our lives without destroying dignity and choice. Moral intuitions have primacy, Haidt argues, but they are not dictators. There are times, often the most important moments in our lives, when in fact we do use reason to override moral intuitions, and often those reasons — along with new intuitions — come from our friends.

The rise and now dominance of this emotional approach to morality is an epochal change. It challenges all sorts of traditions. It challenges the bookish way philosophy is conceived by most people. It challenges the Talmudic tradition, with its hyper-rational scrutiny of texts. It challenges the new atheists, who see themselves involved in a war of reason against faith and who have an unwarranted faith in the power of pure reason and in the purity of their own reasoning.

Finally, it should also challenge the very scientists who study morality. They’re good at explaining how people make judgments about harm and fairness, but they still struggle to explain the feelings of awe, transcendence, patriotism, joy and self-sacrifice, which are not ancillary to most people’s moral experiences, but central. The evolutionary approach also leads many scientists to neglect the concept of individual responsibility and makes it hard for them to appreciate that most people struggle toward goodness, not as a means, but as an end in itself.
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

April 27, 2009
Op-Ed Contributor
End the University as We Know It
By MARK C. TAYLOR

GRADUATE education is the Detroit of higher learning. Most graduate programs in American universities produce a product for which there is no market (candidates for teaching positions that do not exist) and develop skills for which there is diminishing demand (research in subfields within subfields and publication in journals read by no one other than a few like-minded colleagues), all at a rapidly rising cost (sometimes well over $100,000 in student loans).

Widespread hiring freezes and layoffs have brought these problems into sharp relief now. But our graduate system has been in crisis for decades, and the seeds of this crisis go as far back as the formation of modern universities. Kant, in his 1798 work “The Conflict of the Faculties,” wrote that universities should “handle the entire content of learning by mass production, so to speak, by a division of labor, so that for every branch of the sciences there would be a public teacher or professor appointed as its trustee.”

Unfortunately this mass-production university model has led to separation where there ought to be collaboration and to ever-increasing specialization. In my own religion department, for example, we have 10 faculty members, working in eight subfields, with little overlap. And as departments fragment, research and publication become more and more about less and less. Each academic becomes the trustee not of a branch of the sciences, but of limited knowledge that all too often is irrelevant for genuinely important problems. A colleague recently boasted to me that his best student was doing his dissertation on how the medieval theologian Duns Scotus used citations.

The emphasis on narrow scholarship also encourages an educational system that has become a process of cloning. Faculty members cultivate those students whose futures they envision as identical to their own pasts, even though their tenures will stand in the way of these students having futures as full professors.

The dirty secret of higher education is that without underpaid graduate students to help in laboratories and with teaching, universities couldn’t conduct research or even instruct their growing undergraduate populations. That’s one of the main reasons we still encourage people to enroll in doctoral programs. It is simply cheaper to provide graduate students with modest stipends and adjuncts with as little as $5,000 a course — with no benefits — than it is to hire full-time professors.

In other words, young people enroll in graduate programs, work hard for subsistence pay and assume huge debt burdens, all because of the illusory promise of faculty appointments. But their economical presence, coupled with the intransigence of tenure, ensures that there will always be too many candidates for too few openings.

The other obstacle to change is that colleges and universities are self-regulating or, in academic parlance, governed by peer review. While trustees and administrations theoretically have some oversight responsibility, in practice, departments operate independently. To complicate matters further, once a faculty member has been granted tenure he is functionally autonomous. Many academics who cry out for the regulation of financial markets vehemently oppose it in their own departments.

If American higher education is to thrive in the 21st century, colleges and universities, like Wall Street and Detroit, must be rigorously regulated and completely restructured. The long process to make higher learning more agile, adaptive and imaginative can begin with six major steps:

1. Restructure the curriculum, beginning with graduate programs and proceeding as quickly as possible to undergraduate programs. The division-of-labor model of separate departments is obsolete and must be replaced with a curriculum structured like a web or complex adaptive network. Responsible teaching and scholarship must become cross-disciplinary and cross-cultural.

Just a few weeks ago, I attended a meeting of political scientists who had gathered to discuss why international relations theory had never considered the role of religion in society. Given the state of the world today, this is a significant oversight. There can be no adequate understanding of the most important issues we face when disciplines are cloistered from one another and operate on their own premises.

It would be far more effective to bring together people working on questions of religion, politics, history, economics, anthropology, sociology, literature, art and philosophy to engage in comparative analysis of common problems. As the curriculum is restructured, fields of inquiry and methods of investigation will be transformed.

2. Abolish permanent departments, even for undergraduate education, and create problem-focused programs. These constantly evolving programs would have sunset clauses, and every seven years each one should be evaluated and either abolished, continued or significantly changed. It is possible to imagine a broad range of topics around which such zones of inquiry could be organized: Mind, Body, Law, Information, Networks, Language, Space, Time, Media, Money, Life and Water.

Consider, for example, a Water program. In the coming decades, water will become a more pressing problem than oil, and the quantity, quality and distribution of water will pose significant scientific, technological and ecological difficulties as well as serious political and economic challenges. These vexing practical problems cannot be adequately addressed without also considering important philosophical, religious and ethical issues. After all, beliefs shape practices as much as practices shape beliefs.

A Water program would bring together people in the humanities, arts, social and natural sciences with representatives from professional schools like medicine, law, business, engineering, social work, theology and architecture. Through the intersection of multiple perspectives and approaches, new theoretical insights will develop and unexpected practical solutions will emerge.

3. Increase collaboration among institutions. All institutions do not need to do all things and technology makes it possible for schools to form partnerships to share students and faculty. Institutions will be able to expand while contracting. Let one college have a strong department in French, for example, and the other a strong department in German; through teleconferencing and the Internet both subjects can be taught at both places with half the staff. With these tools, I have already team-taught semester-long seminars in real time at the Universities of Helsinki and Melbourne.

4. Transform the traditional dissertation. In the arts and humanities, where looming cutbacks will be most devastating, there is no longer a market for books modeled on the medieval dissertation, with more footnotes than text. As financial pressures on university presses continue to mount, publication of dissertations, and with it scholarly certification, is almost impossible. (The average university press print run of a dissertation that has been converted into a book is less than 500, and sales are usually considerably lower.) For many years, I have taught undergraduate courses in which students do not write traditional papers but develop analytic treatments in formats from hypertext and Web sites to films and video games. Graduate students should likewise be encouraged to produce “theses” in alternative formats.

5. Expand the range of professional options for graduate students. Most graduate students will never hold the kind of job for which they are being trained. It is, therefore, necessary to help them prepare for work in fields other than higher education. The exposure to new approaches and different cultures and the consideration of real-life issues will prepare students for jobs at businesses and nonprofit organizations. Moreover, the knowledge and skills they will cultivate in the new universities will enable them to adapt to a constantly changing world.

6. Impose mandatory retirement and abolish tenure. Initially intended to protect academic freedom, tenure has resulted in institutions with little turnover and professors impervious to change. After all, once tenure has been granted, there is no leverage to encourage a professor to continue to develop professionally or to require him or her to assume responsibilities like administration and student advising. Tenure should be replaced with seven-year contracts, which, like the programs in which faculty teach, can be terminated or renewed. This policy would enable colleges and universities to reward researchers, scholars and teachers who continue to evolve and remain productive while also making room for young people with new ideas and skills.

For many years, I have told students, “Do not do what I do; rather, take whatever I have to offer and do with it what I could never imagine doing and then come back and tell me about it.” My hope is that colleges and universities will be shaken out of their complacency and will open academia to a future we cannot conceive.

Mark C. Taylor, the chairman of the religion department at Columbia, is the author of the forthcoming “Field Notes From Elsewhere: Reflections on Dying and Living.”

http://www.nytimes.com/2009/04/27/opini ... nted=print
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

May 1, 2009
Op-Ed Columnist
Genius: The Modern View
By DAVID BROOKS

Some people live in romantic ages. They tend to believe that genius is the product of a divine spark. They believe that there have been, throughout the ages, certain paragons of greatness — Dante, Mozart, Einstein — whose talents far exceeded normal comprehension, who had an other-worldly access to transcendent truth, and who are best approached with reverential awe.

We, of course, live in a scientific age, and modern research pierces hocus-pocus. In the view that is now dominant, even Mozart’s early abilities were not the product of some innate spiritual gift. His early compositions were nothing special. They were pastiches of other people’s work. Mozart was a good musician at an early age, but he would not stand out among today’s top child-performers.

What Mozart had, we now believe, was the same thing Tiger Woods had — the ability to focus for long periods of time and a father intent on improving his skills. Mozart played a lot of piano at a very young age, so he got his 10,000 hours of practice in early and then he built from there.

The latest research suggests a more prosaic, democratic, even puritanical view of the world. The key factor separating geniuses from the merely accomplished is not a divine spark. It’s not I.Q., a generally bad predictor of success, even in realms like chess. Instead, it’s deliberate practice. Top performers spend more hours (many more hours) rigorously practicing their craft.

The recent research has been conducted by people like K. Anders Ericsson, the late Benjamin Bloom and others. It’s been summarized in two enjoyable new books: “The Talent Code” by Daniel Coyle; and “Talent Is Overrated” by Geoff Colvin.

If you wanted to picture how a typical genius might develop, you’d take a girl who possessed a slightly above average verbal ability. It wouldn’t have to be a big talent, just enough so that she might gain some sense of distinction. Then you would want her to meet, say, a novelist, who coincidentally shared some similar biographical traits. Maybe the writer was from the same town, had the same ethnic background, or shared the same birthday — anything to create a sense of affinity.

This contact would give the girl a vision of her future self. It would, Coyle emphasizes, give her a glimpse of an enchanted circle she might someday join. It would also help if one of her parents died when she was 12, infusing her with a profound sense of insecurity and fueling a desperate need for success.

Armed with this ambition, she would read novels and literary biographies without end. This would give her a core knowledge of her field. She’d be able to chunk Victorian novelists into one group, Magical Realists in another group and Renaissance poets into another. This ability to place information into patterns, or chunks, vastly improves memory skills. She’d be able to see new writing in deeper ways and quickly perceive its inner workings.

Then she would practice writing. Her practice would be slow, painstaking and error-focused. According to Colvin, Ben Franklin would take essays from The Spectator magazine and translate them into verse. Then he’d translate his verse back into prose and examine, sentence by sentence, where his essay was inferior to The Spectator’s original.

Coyle describes a tennis academy in Russia where they enact rallies without a ball. The aim is to focus meticulously on technique. (Try to slow down your golf swing so it takes 90 seconds to finish. See how many errors you detect.)

By practicing in this way, performers delay the automatizing process. The mind wants to turn deliberate, newly learned skills into unconscious, automatically performed skills. But the mind is sloppy and will settle for good enough. By practicing slowly, by breaking skills down into tiny parts and repeating, the strenuous student forces the brain to internalize a better pattern of performance.

Then our young writer would find a mentor who would provide a constant stream of feedback, viewing her performance from the outside, correcting the smallest errors, pushing her to take on tougher challenges. By now she is redoing problems — how do I get characters into a room — dozens and dozens of times. She is ingraining habits of thought she can call upon in order to understand or solve future problems.

The primary trait she possesses is not some mysterious genius. It’s the ability to develop a deliberate, strenuous and boring practice routine.

Coyle and Colvin describe dozens of experiments fleshing out this process. This research takes some of the magic out of great achievement. But it underlines a fact that is often neglected. Public discussion is smitten by genetics and what we’re “hard-wired” to do. And it’s true that genes place a leash on our capacities. But the brain is also phenomenally plastic. We construct ourselves through behavior. As Coyle observes, it’s not who you are, it’s what you do.

http://www.nytimes.com/2009/05/01/opini ... nted=print
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Sayings of MHI are an important source of knowledge in our tariqah. The following are links to YouTube videos of MHI's sayings, set to nice music. Enjoy!

http://ismailimail.wordpress.com/2009/0 ... iv-quotes/

Part 2

http://www.youtube.com/watch?v=kzo0xNvq17E
Last edited by kmaherali on Sun May 24, 2009 11:16 am, edited 1 time in total.
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

In his remarks at the Academy of Sciences in Lisbon, Portugal, on 08 May 2009, MHI highlighted areas of knowledge that are important today and for the future, and that need to be developed and shared across the world.

Remarks by His Highness the Aga Khan at the Academy of Sciences, in Lisbon, Portugal
08 May 2009


Professor Oliveira,
Minister of National Defence, Mr Nuno Severiano Teixeira,
Minister of Culture, Mr Pinto Ribeiro,
Apostolic Nuncio, Rino Passigato,
Excellencies,
Members of the Academy,
Distinguished guests,
Ladies and Gentlemen

It is an immense honour for me to be here today and to have been admitted to the Academy of Sciences of Lisbon. It reminds me of the day on which I was given an honorary doctorate from the University of Evora. And I do not want to let this occasion pass, without recollecting that very, very special day.

You here in the Academy are guardians of old knowledge and developers of new knowledge. And I thought I would share with you today, very briefly, some of the reflections that have occurred to me since I have completed my journeys in the developing world during my Golden Jubilee.

I visited numerous countries in Africa, Asia and the Middle East and I came into contact with men and women who were intelligent, mature, responsible and who were seeking to build nation states – nation states which would be autonomous, which would be well governed, whose economies would be competent, but these builders were seeking to build on the basis of an enormous knowledge deficit. These men and women in public office simply did not have access to the demography of men and women who are sufficiently educated to be able to man the institutions of state.

And I have come away with another question stemming from my point that there is a deficit of knowledge. The key question is: a deficit of what knowledge? What knowledge is necessary in these environments, so that in the decades ahead we can look towards stable nation states around the world?

My conclusion was that the deficit of knowledge is in many areas which are not being offered in education, which are not being taught. Because what have been inherited are curricula of the past, reflections of the past, attitudes of the past, rather than looking forwards, asking what do future generations need to know. And that is the central question which needs to be asked, and on which an academy such as this can have such a massive impact.

Let me mention three areas. First of all, there is the nature of society in these countries. One of the characteristics of all these countries is that they have pluralist societies. And if pluralism is not part of the educational curriculum, the leaders and the peoples of these societies will always be at risk of conflict, because they are not accustomed to pluralism and they do not value it. People are not born valuing pluralism. Therefore pluralism is the sort of subject which needs to be part of education, from the youngest age onwards.

Another aspect is ethics. But not ethics born of dogma, but ethics in civil society. Because when governments fail in these parts of the world, it is civil society which steps in to sustain development. And when ethics are not part of education, teaching, examinations; when they are not part of medicine, the quality of care; when they are not part of financial services, then civil society is undermined. Ethics in civil society is another aspect which is absolutely critical.

The third example is constitutionality. So many countries which I have visited have stumbled into, run into difficulties in governance, because the national constitutions were not designed and conceived to serve the profiles of those countries. And therefore, teaching in areas such as comparative government is another area which is absolutely critical.

If these are the subjects which are necessary today, what are the subjects which will be necessary tomorrow? Is the developing world going to continue in this deficit of knowledge? Or are we going to enable it to move forwards in to new areas of knowledge? My conviction is that we have to help these countries move into new areas of knowledge. And therefore, I think of areas such as the space sciences, such as the neurosciences. There are so many new areas of inquiry which, unless we make an effort to share globally, we will continue to have vast populations around the world who will continue in this knowledge deficit.

Portugal has an extraordinary history. It has been influencing the world for centuries. Your influence today is not limited to Europe. Your influence is massive through your presence in South America. A country like Brazil is a case study for many countries around the world. Brazil is dealing with new areas of knowledge in air transport – that is a new area of knowledge – competing with the best in the world in areas such as agriculture, the development of cash crops, and sugar at new levels of technology.

So the influence of Portugal and the capacity of Portugal to influence what is happening around the world is immense. And it is in this context that I want to thank you for electing me a member of the Academy and for the opportunity you have given me to encourage you to use your global influence, through your history, through your knowledge, through your contacts with the developing world, to bring to the rest of the world what is best in your knowledge.

Thank you.

http://www.akdn.org/speeches_detail.asp?ID=741
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

May 16, 2009
Editorial Observer
Some Thoughts on the Lost Art of Reading Aloud
By VERLYN KLINKENBORG

Sometimes the best way to understand the present is to look at it from the past. Consider audio books. An enormous number of Americans read by listening these days — listening aloud, I call it. The technology for doing so is diverse and widespread, and so are the places people listen to audio books. But from the perspective of a reader in, say, the early 19th century, about the time of Jane Austen, there is something peculiar about it, even lonely.

In those days, literate families and friends read aloud to each other as a matter of habit. Books were still relatively scarce and expensive, and the routine electronic diversions we take for granted were, of course, nonexistent. If you had grown up listening to adults reading to each other regularly, the thought of all of those solitary 21st-century individuals hearkening to earbuds and car radios would seem isolating. It would also seem as though they were being trained only to listen to books and not to read aloud from them.

It’s part of a pattern. Instead of making music at home, we listen to recordings of professional musicians. When people talk about the books they’ve heard, they’re often talking about the quality of the readers, who are usually professional. The way we listen to books has been de-socialized, stripped of context, which has the solitary virtue of being extremely convenient.

But listening aloud, valuable as it is, isn’t the same as reading aloud. Both require a great deal of attention. Both are good ways to learn something important about the rhythms of language. But one of the most basic tests of comprehension is to ask someone to read aloud from a book. It reveals far more than whether the reader understands the words. It reveals how far into the words — and the pattern of the words — the reader really sees.

Reading aloud recaptures the physicality of words. To read with your lungs and diaphragm, with your tongue and lips, is very different than reading with your eyes alone. The language becomes a part of the body, which is why there is always a curious tenderness, almost an erotic quality, in those 18th- and 19th-century literary scenes where a book is being read aloud in mixed company. The words are not mere words. They are the breath and mind, perhaps even the soul, of the person who is reading.

No one understood this better than Jane Austen. One of the late turning points in “Mansfield Park” comes when Henry Crawford picks up a volume of Shakespeare, “which had the air of being very recently closed,” and begins to read aloud to the young Bertrams and their cousin, Fanny Price. Fanny discovers in Crawford’s reading “a variety of excellence beyond what she had ever met with.” And yet his ability to do every part “with equal beauty” is a clear sign to us, if not entirely to Fanny, of his superficiality.

I read aloud to my writing students, and when students read aloud to me I notice something odd. They are smart and literate, and most of them had parents who read to them as children. But when students read aloud at first, I notice that they are trying to read the meaning of the words. If the work is their own, they are usually trying to read the intention of the writer.

It’s as though they’re reading what the words represent rather than the words themselves. What gets lost is the inner voice of the prose, the life of the language. This is reflected in their writing, too, at first.

In one realm — poetry — reading aloud has never really died out. Take Robert Pinsky’s new book, “Essential Pleasures: A New Anthology of Poems to Read Aloud.” But I suspect there is no going back. You can easily make the argument that reading silently is an economic artifact, a sign of a new prosperity beginning in the early 19th century and a new cheapness in books. The same argument applies to listening to books on your iPhone. But what I would suggest is that our idea of reading is incomplete, impoverished, unless we are also taking the time to read aloud.

http://www.nytimes.com/2009/05/16/opini ... nted=print
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

SPIRITUAL INTELLIGENCE

Why it's "smart" to live the good life

By Lawrence Martin

"Man came on earth uniquely endowed with individuality and free will. He was put here to evolve his intelligence, and thereby rediscover and express his true nature, the soul: a reflection of Spirit. He was to gradually develop his innate intelligence, not merely through books or lectures and sermons, but also through his own efforts to exercise his mind and improve the quality of his thoughts and actions."
—Paramahansa Yogananda


What is your "Spiritual I.Q."? For years intelligence was understood in its most limited sense of book-learning; the accepted standard of smartness was how well one scored on the verbal and mathematical tests given in high school.

However, the reality of so many "smart" people failing or being unable to cope with life's challenges forced psychologists to expand the notion of intelligence to include additional abilities, such as interpersonal, emotional, and volitional skills.1 Now University of California (Davis) professor of psychology Robert Emmons, Ph.D., proposes that spirituality be considered as well.

In his book, The Psychology of Ultimate Concerns: Motivation and Spirituality in Personality, Dr. Emmons identifies five core characteristics of spiritual intelligence:

1) Ability to transcend the physical and material,
2) Ability to enter heightened states of spiritual consciousness,
3) Capacity to endow everyday activity with a sense of the sacred,
4) Use of spiritual resources on practical problems, and
5) Engaging in virtuous behavior (e.g., to show forgiveness, to express gratitude, to be humble, to display compassion, to exercise self-control).

To demonstrate how spiritually oriented lifestyles result in higher levels of cognitive function, Dr. Emmons considers two universally acknowledged spiritual qualities—humility and gratitude. Humility—defined not as low self-esteem but as the realistic appraisal of one's strengths and weaknesses, neither overestimating nor underestimating them— has been shown in research studies to improve problem-solving efficiency. Likewise with gratitude: Test groups practicing gratitude make more progress toward their goals than similar groups that do not practice it, suggesting, as in the case of humility, that the benefits of spiritual qualities extend beyond the domain of mood and well-being to those areas of life-functioning more associated with cognitive intelligence.

Dr. Emmons' book is filled to overflowing with research results showing that religious people tend to be happier and healthier, have more harmonious marriages, cope better with trauma, and experience fewer internal conflicts. This last point is of particular interest to the author, who, in studying personal psychology, has long been aware of goal-setting's "Achilles heel"—its tendency to create internal conflicts. As with the person torn between getting ahead in the world and spending more time with his family, holding incompatible goals and desires is a major source of misery for many people, often leading to stress and illness. Religion deals with "ultimate concerns" that can arbitrate these conflicts, spiritualize lesser goals, and otherwise counteract the fragmenting effect of the many competing pressures in modern life.

Spiritual intelligence also helps us to understand religion better. Ordinarily, Dr. Emmons explains, one thinks of religion as something a person has. By focusing on behaviors, skills, and abilities, spiritual intelligence reminds us that religion can be a dynamic way of addressing the vital concerns of daily life. By investigating the "doing" side of religion, Dr. Emmons provides an invaluable first step in conceptualizing a spirituality that not only is, but does. "Religion and intelligence are two concepts that are not often uttered in the same breath," he writes. "Too often, a false dichotomy is set up between two extreme caricatures—one can either be a rational, logical, analytical, and skeptical thinker or a muddle-headed, touchy-feely, gullible spiritualist." His research demonstrates that "instead of forcing a choice between faith and reason, this way of thinking about spirituality recognizes that spiritual processing can contribute to effective cognitive functioning rather than precluding it."2

Of course, religion is more than just problem-solving. The value of wisdom, Dr. Emmons reminds us, is not merely in helping us live life, but, ultimately, in helping us rise above it. What spiritual intelligence does is provide us a helpful way of seeing past the myriad beliefs and myths of traditional religion to a universal spirituality accessible to all. He concludes: "If spiritual intelligence does indeed confer individual and societal advantages, if the world would be a better place if people were more 'spiritually intelligent,' the desirability and feasibility of strategic efforts to augment it ought to be investigated. Rather than forcing spirituality or religion on people because it is good for them or society, edifying people as to how spiritual and religious skills might lead to success in life may prove to be a more effective and lasting route to spiritual transformation. Just as educational programs have been developed to raise emotional intelligence, spiritual skills could similarly be acquired and cultivated."


1. Emotional Intelligence by Daniel Goleman, reviewed in Self-Realization magazine, Summer 1998, demonstrated the importance of mood management and impulse control in predicting life success.

2. One is reminded of an inimitable observation of Swami Sri Yukteswar, recorded by Paramahansa Yogananda in his Autobiography of a Yogi: "'Saintliness is not dumbness! Divine perceptions are not incapacitating!' he would say. 'The active expression of virtue gives rise to the keenest intelligence.'"

Published in SRF Magazine Summer 2000.
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

May 28, 2009
Op-Ed Columnist
Would You Slap Your Father? If So, You’re a Liberal
By NICHOLAS D. KRISTOF

If you want to tell whether someone is conservative or liberal, what are a couple of completely nonpolitical questions that will give a good clue?

How’s this: Would you be willing to slap your father in the face, with his permission, as part of a comedy skit?

And, second: Does it disgust you to touch the faucet in a public restroom?

Studies suggest that conservatives are more often distressed by actions that seem disrespectful of authority, such as slapping Dad. Liberals don’t worry as long as Dad has given permission.

Likewise, conservatives are more likely than liberals to sense contamination or perceive disgust. People who would be disgusted to find that they had accidentally sipped from an acquaintance’s drink are more likely to identify as conservatives.

The upshot is that liberals and conservatives don’t just think differently, they also feel differently. This may even be a result, in part, of divergent neural responses.

This came up after I wrote a column earlier this year called “The Daily Me.” I argued that most of us employ the Internet not to seek the best information, but rather to select information that confirms our prejudices. To overcome that tendency, I argued, we should set aside time for a daily mental workout with an ideological sparring partner. Afterward, I heard from Jonathan Haidt, a psychology professor at the University of Virginia. “You got the problem right, but the prescription wrong,” he said.

Simply exposing people to counterarguments may not accomplish much, he said, and may inflame antagonisms.

A study by Diana Mutz of the University of Pennsylvania found that when people saw tight television shots of blowhards with whom they disagreed, they felt that the other side was even less legitimate than before.

The larger point is that liberals and conservatives often form judgments through flash intuitions that aren’t a result of a deliberative process. The crucial part of the brain for these judgments is the medial prefrontal cortex, which has more to do with moralizing than with rationality. If you damage your prefrontal cortex, your I.Q. may be unaffected, but you’ll have trouble harrumphing.

One of the main divides between left and right is the dependence on different moral values. For liberals, morality derives mostly from fairness and prevention of harm. For conservatives, morality also involves upholding authority and loyalty — and a heightened sensitivity to disgust.

Some evolutionary psychologists believe that disgust emerged as a protective mechanism against health risks, like feces, spoiled food or corpses. Later, many societies came to apply the same emotion to social “threats.” Humans appear to be the only species that registers disgust, which is why a dog will wag its tail in puzzlement when its horrified owner yanks it back from eating excrement.

Psychologists have developed a “disgust scale” based on how queasy people would be in 27 situations, such as stepping barefoot on an earthworm or smelling urine in a tunnel. Conservatives systematically register more disgust than liberals. (To see how you weigh factors in moral decisions, take the tests at www.yourmorals.org.)

It appears that we start with moral intuitions that our brains then find evidence to support. For example, one experiment involved hypnotizing subjects to expect a flash of disgust at the word “take.” They were then told about Dan, a student council president who “tries to take topics that appeal to both professors and students.”

The research subjects felt disgust but couldn’t find any good reason for it. So, in some cases, they concocted their own reasons, such as: “Dan is a popularity-seeking snob.”

So how do we discipline our brains to be more open-minded, more honest, more empirical? A start is to reach out to moderates on the other side — ideally eating meals with them, for that breaks down “us vs. them” battle lines that seem embedded in us. (In ancient times we divided into tribes; today, into political parties.) The Web site www.civilpolitics.org is an attempt to build this intuitive appreciation for the other side’s morality, even if it’s not our morality.

“Minds are very hard things to open, and the best way to open the mind is through the heart,” Professor Haidt says. “Our minds were not designed by evolution to discover the truth; they were designed to play social games.”

Thus persuasion may be most effective when built on human interactions. Gay rights were probably advanced largely by the public’s growing awareness of friends and family members who were gay.

A corollary is that the most potent way to win over opponents is to accept that they have legitimate concerns, for that triggers an instinct to reciprocate. As it happens, we have a brilliant exemplar of this style of rhetoric in politics right now — Barack Obama.

http://www.nytimes.com/2009/05/28/opini ... nted=print
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

May 30, 2009
Editorial Observer
Some Thoughts on the Pleasures of Being a Re-Reader
By VERLYN KLINKENBORG

I’ve always admired my friends who are wide readers. A few even pride themselves on never reading a book a second time. I’ve been a wide reader at times. When I was much younger, I spent nearly a year in the old Reading Room of the British Museum, discovering in the book I was currently reading the title of the next I would read.

But at heart, I’m a re-reader. The point of reading outward, widely, has always been to find the books I want to re-read and then to re-read them. In part, that’s an admission of defeat, an acknowledgement that no matter how long and how widely I read, I will only ever make my way through a tiny portion of the world’s literature. (The British Museum was a great place to learn that lesson.) And in part, it’s a concession to the limits of my memory. I forget a lot, which makes the pleasure of re-reading all the greater.

The love of repetition seems to be ingrained in children. And it is certainly ingrained in the way children learn to read — witness the joyous and maddening love of hearing that same bedtime book read aloud all over again, word for word, inflection for inflection. Childhood is an oasis of repetitive acts, so much so that there is something shocking about the first time a young reader reads a book only once and moves on to the next. There’s a hunger in that act but also a kind of forsaking, a glimpse of adulthood to come.

The work I chose in adulthood — to study literature — required the childish pleasure of re-reading. When I was in graduate school, once through Pope’s “Dunciad” or Berryman’s “The Dream Songs” was not going to cut it. A grasp of the poem was presumed to lie on the far side of many re-readings, none of which were really repetitions. The same is true of being a writer, which requires obsessive re-reading. But the real re-reading I mean is the savory re-reading, the books I have to be careful not to re-read too often so I can read them again with pleasure.

It’s a miscellaneous library, always shifting. It has included a book of the north woods: John J. Rowlands’s “Cache Lake Country,” which I have re-read annually for many years. It may still include Raymond Chandler, though I won’t know for sure till the next time I re-read him. It includes Michael Herr’s “Dispatches” and lots of A.J. Liebling and a surprising amount of George Eliot. It once included nearly all of Dickens, but that has been boiled down to “The Pickwick Papers” and “Great Expectations.” There are many more titles, of course. This is not a canon. This is a refuge.

Part of the fun of re-reading is that you are no longer bothered by the business of finding out what happens. Re-reading “Middlemarch,” for instance, or even “The Great Gatsby,” I’m able to pay attention to what’s really happening in the language itself — a pleasure surely as great as discovering who marries whom, and who dies and who does not.

The real secret of re-reading is simply this: It is impossible. The characters remain the same, and the words never change, but the reader always does. Pip is always there to be revisited, but you, the reader, are a little like the convict who surprises him in the graveyard — always a stranger.

I look at the books on my library shelves. They certainly seem dormant. But what if the characters are quietly rearranging themselves? What if Emma Woodhouse doesn’t learn from her mistakes? What if Tom Jones descends into a sodden life of poaching and outlawry? What if Eve resists Satan, remembering God’s injunction and Adam’s loving advice? I imagine all the characters bustling to get back into their places as they feel me taking the book down from the shelf. “Hurry,” they say, “he’ll expect to find us exactly where he left us, never mind how much his life has changed in the meantime.”

http://www.nytimes.com/2009/05/30/opini ... nted=print
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

June 29, 2009
Journalism Rules Are Bent in News Coverage From Iran
By BRIAN STELTER

“Check the source” may be the first rule of journalism. But in the coverage of the protests in Iran this month, some news organizations have adopted a different stance: publish first, ask questions later. If you still don’t know the answer, ask your readers.

CNN showed scores of videos submitted by Iranians, most of them presumably from protesters who took to the streets to oppose Mahmoud Ahmadinejad’s re-election on June 12. The Web sites of The New York Times, The Huffington Post, The Guardian newspaper in London and others published minute-by-minute blogs with a mix of unverified videos, anonymous Twitter messages and traditional accounts from Tehran.

The blogs tend to run on a separate track from the news organizations’ more traditional reporting, intersecting when user videos and information can be confirmed. The combination amounts to the biggest embrace yet of a collaborative new style of news gathering — one that combines the contributions of ordinary citizens with the reports and analysis of journalists.

Many mainstream media sources, which have in the past been critical of the undifferentiated sources of information on the Web, had little choice but to throw open their doors in this case. As the protests against Mr. Ahmadinejad grew, the government sharply curtailed the foreign press. As visas expired, many journalists packed up, and the ones who stayed were barred from reporting on the streets.

In a news vacuum, amateur videos and eyewitness accounts became the de facto source for information. In fact, the symbol of the protests, the image of a young woman named Neda bleeding to death on a Tehran street, was filmed by two people holding camera phones.

“It’s incredible, the volume of stuff coming out” from Iran, said Matthew Weaver, who sounded exhausted Thursday evening after blogging for more than 10 days for The Guardian newspaper’s Web site.

When rallies and conflicts occur “first the tweets come, then the pictures, then the YouTube videos, then the wires,” he said. “It’s extraordinary.”

Most important, he said, what people are saying “at one point in the day is then confirmed by more conventional sources four or five hours later.”

CNN encourages viewers to upload pictures and observations to iReport.com, its Web site for citizen journalism. Every upload is posted automatically on iReport.com, but each is studied before being shown on television.

In the vetting process, CNN contacts the person who posted the material, asks questions about the content and tries to confirm its veracity. Lila King, the executive in charge of iReport, said the staff members try to “triangulate the details” of an event by corroborating stories with multiple iReport contributors in a given area. Farsi speakers at CNN sometimes listened intently to the sound from the protest videos, discerning the accents of Iranian cities and transcribing the chants and screams.

Because the videos and images are not taken by a CNN employee, the network cannot completely vouch for their authenticity. But without professionals at the scene — CNN’s remaining correspondent was pulled out last week after the government imposed prohibitive restrictions — they provide the all-important pictures to tell the story.

In an indication of how difficult the process can be, CNN had received 5,200 Iran-related submissions and had approved about 180 of them for use on television.

Iran is now the third biggest traffic driver to iReport.com, behind the United States and Canada. One month ago, Iran ranked No. 63 on the list of countries. Ms. King called Iran a “watershed moment” for citizen dispatches, and for the first time an iReport producer sits at the main CNN newsgathering desk.

Bill Mitchell, a senior leader at the Poynter Institute, a nonprofit school for journalists, said the extent of user involvement shown in the Iran coverage seems to be a new way of thinking about journalism.

“Instead of limiting ourselves to full-blown articles to be written by a journalist (professional or otherwise), the idea is to look closely at stories as they unfold and ask: is there a piece of this story I’m in a particularly good position to enhance or advance?” he said in an e-mail message.

“And it’s not just a question for journalists,” he added.

Nico Pitney, the senior news editor at The Huffington Post, started to aggregate Iran news on June 13, the day after the election. By the middle of last week, the blog — with several updates an hour during the day — had received more than 100,000 comments and five million page views.

Mr. Pitney said blogs like his produce a synthesis of professional reporting and reliable amateur material. Essentially, the news tips that reporters have always relied upon are now being aired in public.

In a recognition of the Web’s role in covering the protests, Mr. Pitney was invited by the White House to ask a question at a presidential press conference last week. He forwarded to President Obama an e-mailed question from an Iranian. “We’ve been seeing a lot of reports coming directly out of Iran,” the president said.

Even anonymous Internet users develop a reputation over time, said Robert Mackey, the editor of a blog called The Lede for The New York Times’s Web site, who tracked the election and protest for almost two weeks. Although there have been some erroneous claims on sites like Twitter, in general “there seems to be very little mischief-making,” Mr. Mackey said. “People generally want to help solve the puzzle.”

Readers repeatedly drew Mr. Mackey’s attention to tweets and photos of protests in the comments thread of the blog. Some even shared their memories of the geography of Tehran in an attempt to verify scenes in videos.

Over time, the impromptu Iranian reporters have honed their skills. Some put the date of a skirmish in the file descriptions they send. Others film street signs and landmarks. But the user uploads can sometimes be misleading. Last Wednesday, Mr. Mackey put a call out to readers to determine whether a video was actually new. A commenter pointed to a two-day-old YouTube version.

Cases like this show why the publication of tweets and Flickr photos can be awkward. Echoing others, Mr. Weaver of The Guardian’s blog said his manner of reporting had made some of his colleagues uncomfortable; he recalled one colleague who remarked, “Twitter? I won’t touch it. It’s all garbage.”

On a couple of occasions, The Guardian’s blog featured video clips that were later discovered to be days old. Mr. Weaver said readers of live blogs are “a bit more forgiving” of those incidents, in part because bloggers are transparent about what they do and do not know.

Television anchors were frequently put in the same position while covering Iran. Last Wednesday, the Fox News anchor Shepard Smith showed a YouTube video of police officials beating and dragging people.

“We do not know when or where this video was from,” Mr. Smith told viewers. “We do not even know if it was staged, although we have no reason to believe that.” All he knew for sure was that it was “recently uploaded to YouTube.” For news organizations that face reporting constraints, that has become a good enough starting point.

http://www.nytimes.com/2009/06/29/busin ... nted=print
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

July 2, 2009
Op-Ed Columnist
When Our Brains Short-Circuit
By NICHOLAS D. KRISTOF

Our political system sometimes produces such skewed results that it’s difficult not to blame bloviating politicians. But maybe the deeper problem lies in our brains.

Evidence is accumulating that the human brain systematically misjudges certain kinds of risks. In effect, evolution has programmed us to be alert for snakes and enemies with clubs, but we aren’t well prepared to respond to dangers that require forethought.

If you come across a garter snake, nearly all of your brain will light up with activity as you process the “threat.” Yet if somebody tells you that carbon emissions will eventually destroy Earth as we know it, only the small part of the brain that focuses on the future — a portion of the prefrontal cortex — will glimmer.

“We humans do strange things, perhaps because vestiges of our ancient brain still guide us in the modern world,” notes Paul Slovic, a psychology professor at the University of Oregon and author of a book on how our minds assess risks.

Consider America’s political response to these two recent challenges:

1. President Obama proposes moving some inmates from Guantánamo Bay, Cuba, to supermax prisons from which no one has ever escaped. This is the “enemy with club” threat that we have evolved to be alert to, so Democrats and Republicans alike erupt in outrage and kill the plan.

2. The climate warms, ice sheets melt and seas rise. The House scrounges a narrow majority to pass a feeble cap-and-trade system, but Senate passage is uncertain. The issue is complex, full of trade-offs and more cerebral than visceral — and so it doesn’t activate our warning systems.

“What’s important is the threats that were dominant in our evolutionary history,” notes Daniel Gilbert, a professor of psychology at Harvard University. In contrast, he says, the kinds of dangers that are most serious today — such as climate change — sneak in under the brain’s radar.

Professor Gilbert argues that the threats that get our attention tend to have four features. First, they are personalized and intentional. The human brain is highly evolved for social behavior (“that’s why we see faces in clouds, not clouds in faces,” says Mr. Gilbert), and, like gazelles, we are instinctively and obsessively on the lookout for predators and enemies.

Second, we respond to threats that we deem disgusting or immoral — characteristics more associated with sex, betrayal or spoiled food than with atmospheric chemistry.

“That’s why people are incensed about flag burning, or about what kind of sex people have in private, even though that doesn’t really affect the rest of us,” Professor Gilbert said. “Yet where we have a real threat to our well-being, like global warming, it doesn’t ring alarm bells.”

Third, threats get our attention when they are imminent, while our brain circuitry is often cavalier about the future. That’s why we are so bad at saving for retirement. Economists tear their hair out at a puzzlingly irrational behavior called hyperbolic discounting: people’s preference for money now rather than much larger payments later.

For example, in studies, most Americans prefer $50 now to $100 in six months, even though that represents a 100 percent return.
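To see how sharp that preference is in numbers, here is a minimal sketch (my own illustration, not from the column) contrasting textbook exponential discounting with the hyperbolic form economists use to describe it; the discount parameters are assumptions chosen only to show why an immediate $50 can feel equivalent to $100 six months away.

    import math

    def exponential_value(amount, years, annual_rate=0.10):
        # Textbook discounting: value decays at a constant rate over time.
        return amount * math.exp(-annual_rate * years)

    def hyperbolic_value(amount, years, k=2.0):
        # Hyperbolic discounting: the near future is discounted very steeply.
        return amount / (1.0 + k * years)

    delayed_payment = 100.0  # dollars offered in six months
    delay = 0.5              # years

    print(f"exponential view of $100 in 6 months: ${exponential_value(delayed_payment, delay):.2f}")
    print(f"hyperbolic view of $100 in 6 months:  ${hyperbolic_value(delayed_payment, delay):.2f}")
    # With k = 2 the delayed $100 is worth only $50 today, so taking $50 now
    # no longer looks irrational to the chooser, even though waiting doubles the money.

Under the exponential rule the delayed $100 is still worth about $95 today; only the steeply bending hyperbolic curve reproduces the preference described above.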

Fourth, we’re far more sensitive to changes that are instantaneous than those that are gradual. We yawn at a slow melting of the glaciers, while if they shrank overnight we might take to the streets.

In short, we’re brilliantly programmed to act on the risks that confronted us in the Pleistocene Age. We’re less adept with 21st-century challenges.

At the University of Virginia, Professor Jonathan Haidt shows his Psychology 101 students how evolution has prepared us to fear some things: He asks how many students would be afraid to stand within 10 feet of a friend carrying a pet boa constrictor. Many hands go up, although almost none of the students have been bitten by a snake.

“The objects of our phobias, and the things that are actually dangerous to us, are almost unrelated in the modern world, but they were related in our ancient environment,” Mr. Haidt said. “We have no ‘preparedness’ to fear a gradual rise in the Earth’s temperature.”

This short-circuitry in our brains explains many of our policy priorities. We Americans spend nearly $700 billion a year on the military and less than $3 billion on the F.D.A., even though food poisoning kills more Americans than foreign armies and terrorists. We’re just lucky we don’t have a cabinet-level Department of Snake Extermination.

Still, all is not lost, particularly if we understand and acknowledge our neurological shortcomings — and try to compensate with rational analysis. When we work at it, we are indeed capable of foresight: If we can floss today to prevent tooth decay in later years, then perhaps we can also drive less to save the planet.

http://www.nytimes.com/2009/07/02/opini ... nted=print
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Ta’lim-ul-Islam: Quest for Knowledge – A poem by Alnoor Rajan Talwar

TA’LIM-UL-ISLAM* – QUEST FOR KNOWLEDGE

Knowledge
Is just another conversation
With Life
With what we do daily
Rituals
Culture
Tradition
Prayer
Interpretation
Language
Misinterpretation
There will always be
The search for answers
Who am I?
Who are we?
When did it all begin?
When will it all end?
What changes will we live to see?
Will this world ever mend?
Ta’lim came to provide
Answers
That can no longer hide
We constantly strive
For a better and more meaningful life
For knowledge and guidance that change with the time
For values and ideals that make life sublime
For community relations to last
Regardless of creed, color or cast
In the depths of yearning
In the shadows of grieving
In the agonies of living
When I have sought (to no avail)
For some assurances of peace
Ta’lim came to heal
My soul
Fulfilling
My quest for knowledge and solace
Feeding
My hunger for a higher wisdom
Teaching
Me of bridges that link my past, present and future
Of faith that moves mountains
Of miracles and myths
And fables and facts
Of thriving Ismaili empires
& Dai’s preaching under blazing fires
Because of Ta’lim
I am now constantly seeking answers
My thirst may be quenched
But not quite
As I live
To fall and rise
And rise and shine
In my own way
With all my might
I can finally say
Without a fight
I AM
I am
A better Muslim
A better Ismaili
And a better person
Who speaks the unspoken
And hears the unheard
And who is not afraid
To soar the winds
Like a bird
Thank you, Ta’lim-ul-Islam

Alnoor Rajan Talwar
*Ta’lim-ul-Islam, a religious education program in Canada, is a series of educational sessions delivered by scholars from across North America.

http://feedproxy.google.com/~r/IsmailiM ... E6xO2MYzk/
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

August 3, 2009
Abroad
At Louvre, Many Stop to Snap but Few Stay to Focus
By MICHAEL KIMMELMAN

PARIS — Spending an idle morning watching people look at art is hardly a scientific experiment, but it rekindles a perennial question: What exactly are we looking for when we roam as tourists around museums? As with so many things right in front of us, the answer may be no less useful for being familiar.

At the Louvre the other day, in the Pavillon des Sessions, two young women in flowered dresses meandered through the gallery. They paused and circled around a few sculptures. They took their time. They looked slowly.

The pavilion puts some 100 immaculate objects from outside Europe on permanent view in a ground floor suite of cool, silent galleries at one end of the museum. Feathered masks from Alaska, ancient bowls from the Philippines, Mayan stone portraits and the most amazing Zulu spoon carved from wood in the abstracted S-shape of a slender young woman take no back seat, aesthetically speaking, to the great Titians and Chardins upstairs.

The young women were unusual for stopping. Most of the museum’s visitors passed through the gallery oblivious.

A few game tourists glanced vainly in guidebooks or hopefully at wall labels, as if learning that one or another of these sculptures came from Papua New Guinea or Hawaii or the Archipelago of Santa Cruz, or that a work was three centuries old or maybe four might help them see what was, plain as day, just before them.

Almost nobody, over the course of that hour or two, paused before any object for as long as a full minute. Only a 17th-century wood sculpture of a copulating couple, from San Cristobal in the Solomon Islands, placed near an exit, caused several tourists to point, smile and snap a photo, but without really breaking stride.

Visiting museums has always been about self-improvement. Partly we seem to go to them to find something we already recognize, something that gives us our bearings: think of the scrum of tourists invariably gathered around the Mona Lisa. At one time a highly educated Westerner read perhaps 100 books, all of them closely. Today we read hundreds of books, or maybe none, but rarely any with the same intensity. Travelers who took the Grand Tour across Europe during the 18th century spent months and years learning languages, meeting politicians, philosophers and artists and bore sketchbooks in which to draw and paint — to record their memories and help them see better.

Cameras replaced sketching by the last century; convenience trumped engagement, the viewfinder afforded emotional distance and many people no longer felt the same urgency to look. It became possible to imagine that because a reproduction of an image was safely squirreled away in a camera or cell phone, or because it was eternally available on the Web, dawdling before an original was a waste of time, especially with so much ground to cover.

We could dream about covering lots of ground thanks to expanding collections and faster means of transportation. At the same time, the canon of art that provided guideposts to tell people where to go and what to look at was gradually dismantled. A core of shared values yielded to an equality among visual materials. This was good and necessary, up to a point. Millions of images came to compete for our attention. Liberated by this proliferation, Western culture was also set adrift in an ocean of passing stimulation, with no anchors to secure it.

So tourists now wander through museums, seeking to fulfill their lifetime’s art history requirement in a day, wondering whether it may now be the quantity of material they pass by rather than the quality of concentration they bring to what few things they choose to focus upon that determines whether they have “done” the Louvre. It’s self-improvement on the fly.

The art historian T. J. Clark, who during the 1970s and ’80s pioneered a kind of analysis that rejected old-school connoisseurship in favor of art in the context of social and political affairs, has lately written a book about devoting several months of his time to looking intently at two paintings by Poussin. Slow looking, like slow cooking, may yet become the new radical chic.

Until then we grapple with our impatience and cultural cornucopia. Recently, I bought a couple of sketchbooks to draw with my 10-year-old in St. Peter’s and elsewhere around Rome, just for the fun of it, not because we’re any good, but to help us look more slowly and carefully at what we found. Crowds occasionally gathered around us as if we were doing something totally strange and novel, as opposed to something normal, which sketching used to be. I almost hesitate to mention our sketching. It seems pretentious and old-fogeyish in a cultural moment when we can too easily feel uncomfortable and almost ashamed just to look hard.

Artists fortunately remind us that there’s in fact no single, correct way to look at any work of art, save for with an open mind and patience. If you have ever gone to a museum with a good artist you probably discovered that they don’t worry so much about what art history books or wall labels tell them is right or wrong, because they’re selfish consumers, freed to look by their own interests.

Back to those two young women at the Louvre: aspiring artists or merely curious, they didn’t plant themselves forever in front of the sculptures but they stopped just long enough to laugh and cluck and stare, and they skipped the wall labels until afterward.

They looked, in other words. And they seemed to have a very good time.

Leaving, they caught sight of a sculptured effigy from Papua New Guinea with a feathered nose, which appeared, by virtue of its wide eyes and open hands positioned on either side of its head, as if it were taunting them.

They thought for a moment. “Nyah-nyah,” they said in unison. Then blew him a raspberry.

http://www.nytimes.com/2009/08/03/arts/ ... &th&emc=th
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

August 11, 2009
Reviving the Lost Art of Naming the World
By CAROL KAESUK YOON

One spring when I was a graduate student, I would go each Monday down into the bowels of the entomology building. There I would meet Prof. Jack Franclemont, an elderly gentleman, always with his little dog in tow, to be tutored in the ordering and naming of life — the science of taxonomy.

Professor Franclemont, a famed moth specialist, was perfectly old school, wearing coat and tie to give the day’s lecture even though I was the only member of the audience. Quaintly distracted, he never quite got my name right, sometimes calling me Miss Loon or Miss Voon. After the talk, I would identify moths using a guide written in 1923, in silence or listening to stories of his dog’s latest antics. I enjoyed the meditative pleasure of those hours, despite the fact that as the lone (and not terribly proficient) student of an aging teacher, I could not help feeling that taxonomy might be dying, which, in fact, it is.

Despite the field’s now blatant modernity, with practitioners using DNA sequences, sophisticated evolutionary theory and supercomputers to order and name all of life, jobs for taxonomists continue to be in steady decline. The natural history collections crucial to the work are closeted or tossed.

Outside taxonomy, no one is much up in arms about this, but perhaps we should be, because the ordering and naming of life is no esoteric science. The past few decades have seen a stream of studies that show that sorting and naming the natural world is a universal, deep-seated and fundamental human activity, one we cannot afford to lose because it is essential to understanding the living world, and our place in it.

Anthropologists were the first to recognize that taxonomy might be more than the science officially founded by Carl Linnaeus, the Swedish botanist, in the 1700s. Studying how nonscientists order and name life, creating what are called folk taxonomies, anthropologists began to realize that when people across the globe were creating ordered groups and giving names to what lived around them, they followed highly stereotyped patterns, appearing unconsciously to follow a set of unwritten rules.

Not that conformity to rules was at first obvious to anthropologists, who were instead understandably dazzled by the variety in folk taxonomies. The Ilongots, for example, a people of the Philippines, name gorgeous wild orchids after human body parts. There bloom the thighs, there fingernails, yonder elbows and thumbs. The Rofaifo people of New Guinea, excellent natural historians, classify the cassowary, a giant bird complete with requisite feathers and beak, as a mammal. In fact, there seemed, at first glance, to be little room even for agreement among people, let alone a set of universally followed rules. More recently, however, deep underlying similarities have begun to become apparent.

Cecil Brown, an anthropologist at Northern Illinois University who has studied folk taxonomies in 188 languages, has found that people recognize the same basic categories repeatedly, including fish, birds, snakes, mammals, “wugs” (meaning worms and insects, or what we might call creepy-crawlies), trees, vines, herbs and bushes.

Dr. Brown’s finding would be considerably less interesting if these categories were clear-cut depictions of reality that must inevitably be recognized. But tree and bush are hardly that, since there is no way to define a tree versus a bush. The two categories grade insensibly into one another. Wugs, likewise, are neither an evolutionarily nor ecologically nor otherwise cohesive group. Still, people repeatedly recognize and name these oddities.

Likewise, people consistently use two-word epithets to designate specific organisms within a larger group of organisms, despite there being an infinitude of potentially more logical methods. It is so familiar that it is hard to notice. In English, among the oaks, we distinguish the pin oak, among bears, grizzly bears. When Mayan Indians, familiar with the wild piglike creature known as peccaries, encountered Spaniards’ pigs, they dubbed them “village peccaries.” We use two-part names for ourselves as well: Sally Smith or Li Wen. Even scientists are bound by this practice, insisting on Latin binomials for species.

There appears to be such profound unconscious agreement that people will even concur on which exact words make the best names for particular organisms. Brent Berlin, an ethnobiologist at the University of Georgia, discovered this when he read 50 pairs of names, each consisting of one bird and one fish name, to a group of 100 undergraduates, and asked them to identify which was which. The names had been randomly chosen from the language of Peru’s Huambisa people, to which the students had had no previous exposure. With such a large sample size — there were 5,000 choices being made — the students should have scored 50 percent or very close to it if they were blindly guessing. Instead, they identified the bird and fish names correctly 58 percent of the time, significantly more often than expected for random guessing. Somehow they were often able to intuit the names’ birdiness or fishiness.
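For readers who want to see why 58 percent is not a near-miss of chance, here is a minimal back-of-the-envelope check (my own sketch, not part of Dr. Berlin's published analysis). It treats the 5,000 choices as independent coin flips, a simplification of the real study design, and asks how far the reported result sits from pure guessing.

    import math

    n_choices = 100 * 50               # 100 students x 50 bird/fish pairs
    correct = round(0.58 * n_choices)  # roughly 58 percent correct, as reported
    chance = 0.5                       # expected hit rate for blind guessing

    # Normal approximation to the binomial: distance from chance in standard deviations.
    std_dev = math.sqrt(n_choices * chance * (1 - chance))
    z = (correct - n_choices * chance) / std_dev
    p_one_sided = 0.5 * math.erfc(z / math.sqrt(2))

    print(f"{correct} of {n_choices} correct is {z:.1f} standard deviations above chance")
    print(f"one-sided p-value under pure guessing: {p_one_sided:.1e}")
    # Roughly eleven standard deviations above chance: far beyond anything
    # random guessing would plausibly produce.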

The most surprising evidence for the deep-seatedness of taxonomy comes from patients who have, through accident or disease, suffered traumas of the brain. Consider the case of the university student whom British researchers refer to simply as J.B.R. Doctors found that upon recovering from swelling of the brain caused by herpes, J.B.R. could no longer recognize living things.

He could still recognize nonliving objects, like a flashlight, a compass, a kettle or a canoe. But the young man was unable to recognize a kangaroo, a mushroom or a buttercup. He could not say what a parrot or even the unmistakable ostrich was. And J.B.R. is far from alone; doctors around the world have found patients with the same difficulty. Most recently, scientists studying these patients’ brains have reported repeatedly finding damage — a deadening of activity or actual lesions — in a region of the temporal lobe, leading some researchers to hypothesize that there might be a specific part of the brain that is devoted to the doing of taxonomy.

As curious as they are, these patients and their woes would be of little relevance to our own lives, if they had merely lost some dispensable librarianlike ability to classify living things. As it turns out, their situation is much worse. These are people completely at sea. Without the power to order and name life, a person simply does not know how to live in the world, how to understand it. How to tell the carrot from the cat — which to grate and which to pet? They are utterly lost, anchorless in a strange and confusing world. Because to order and name life is to have a sense of the world around, and, as a result, what one’s place is in it.

Today few people are proficient in the ordering and naming of life. There are the dwindling professional taxonomists, and fast-declining peoples like the Tzeltal Maya of Mexico, among whom a 2-year-old can name more than 30 different plants and whose 4-year-olds can recognize nearly 100. Things were different once. In Linnaeus’s day, it was a matter of aristocratic pride to have a wonderful and wonderfully curated collection of wild organisms, both dead and alive. Darwin (who gained fame first as the world’s foremost barnacle taxonomist) might have expected any dinner-party conversation to turn taxonomic, after an afternoon of beetle-hunting or wildflower study. Most of us claim and enjoy no such expertise.

We are, all of us, abandoning taxonomy, the ordering and naming of life. We are willfully becoming poor J.B.R., losing the ability to order and name and therefore losing a connection to and a place in the living world.

No wonder so few of us can really see what is out there. Even when scads of insistent wildlife appear with a flourish right in front of us, and there is such life always — hawks migrating over the parking lot, great colorful moths banging up against the window at night — we barely seem to notice. We are so disconnected from the living world that we can live in the midst of a mass extinction, of the rapid invasion everywhere of new and noxious species, entirely unaware that anything is happening.

Happily, changing all this turns out to be easy. Just find an organism, any organism, small, large, gaudy, subtle — anywhere, and they are everywhere — and get a sense of it, its shape, color, size, feel, smell, sound. Give a nod to Professor Franclemont and meditate, luxuriate in its beetle-ness, its daffodility. Then find a name for it. Learn science’s name, one of countless folk names, or make up your own. To do so is to change everything, including yourself. Because once you start noticing organisms, once you have a name for particular beasts, birds and flowers, you can’t help seeing life and the order in it, just where it has always been, all around you.

Adapted from “Naming Nature: The Clash Between Instinct and Science” by Carol Kaesuk Yoon. Copyright 2009 by Carol Kaesuk Yoon. With permission of the publisher, W.W. Norton & Company, Inc.

http://www.nytimes.com/2009/08/11/scien ... nted=print
kmaherali
Posts: 25168
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Op-Ed Contributor
Poetry in Motion
By DANNY HEITMAN
Published: August 15, 2009
Baton Rouge, La.

IT seems that we’ve done just about everything to get the American auto industry out of the doldrums. We’ve forced bankruptcies. We’ve exchanged cash for clunkers. But have we tried poetry?

The question is brought to mind by the story of Marianne Moore, the famous American writer, who served for a brief season as the Ford Motor Company’s unofficial poet laureate.

Moore, who died in 1972, was at the height of her literary powers in the autumn of 1955, when a letter arrived in her Brooklyn mailbox.

A Ford executive wrote that the company was launching “a rather important new series of cars,” but his team was stumped to think of a name for the latest product line. Could Moore, an icon of American letters, help them out?

Moore embraced the assignment with relish, not surprising for a poet who enjoyed — and whose writing was frequently inspired by — popular culture, whether it be baseball, boxing or bric-a-brac. The correspondence became a cultural fixture of its own after it was published in The New Yorker two years later.

Throughout the fall and winter of 1955, Moore’s steady stream of suggestions arrived at Ford: “the Ford Silver Sword,” “Intelligent Bullet,” “the Ford Fabergé,” “Mongoose Civique,” “Anticipator,” “Pastelogram,” “Astranaut” and, the highest flight of fancy, “Utopian Turtletop.”

Moore apparently had no qualms about enlisting her muse in the service of the automotive industry. She was also willing to embrace the risks of the marketplace, agreeing to be paid only if she came up with a winning name. As Moore’s biographer Charles Molesworth points out, she “had always enjoyed the language of advertisement, delighting in its inventiveness and ebullience, and even relating it to the poetics of praise.”

These days, poetry and commerce are rarely on such good speaking terms. Poetry doesn’t sell well, and poets almost never attain the celebrity that touched Moore, Robert Frost and Carl Sandburg half a century ago. If some Detroit executive got the bright idea to consult a poet for marketing advice today, one rather doubts he’d know whom to call.

It’s nice to think that the two groups — poets and carmakers — might find new relevance through collaboration, but history is not encouraging.

After much thought, Ford Motors politely rejected all of Moore’s lyrical suggestions for its new car line. Instead, the company’s executives opted for a choice generated internally: the Edsel.

Danny Heitman, a columnist for The Baton Rouge Advocate, is the author of “A Summer of Birds: John James Audubon at Oakley House.”
http://www.nytimes.com/2009/08/16/opini ... ?th&emc=th