Column: From Newton to Einstein to Chen, science is writing new laws all the time.
Laws of physics have their limits. Newton’s law of gravity is good enough to guide astronauts to the Moon. But it took Einstein’s more sophisticated gravity law to design the GPS system that guides you through unfamiliar streets. Now a research team has measured the long-anticipated breakdown of another great physical law – Max Planck’s law of thermal radiation.
It happens at very small separations between two objects, such as the space between the recording head and the hard disk in your computer. Knowing the rate of thermal-radiation exchange between head and disk is a key element in designing hard-disk data-recording systems, in which the recording head tends to heat up.
MIT physicist Gang Chen calls this “a very important issue for magnetic storage.”
Thus the basic physics that Professor Chen, his student Sheng Shen at MIT, and partner Arvind Narayaswamy at Columbia University are pursuing has immediate relevance in our everyday world. Already, it shows that the thermal exchange at very small separations between two surfaces can be 1,000 times greater than Planck’s law predicts. That’s something hard disk designers need to know.
This research illustrates a fundamental truth about science: Nature knows nothing of the scientific laws we cook up. Nature does its own thing. We observe what it does. When we see regularities in natural phenomena we encode our observations in what we call “natural laws.” These reflect what we know about nature at any given time. They allow us to predict cause-and-effect relationships. But they can break down when we try to use them in situations where their underlying assumptions don’t apply.
Newton’s gravity law assumes an attractive force between two or more bodies, acting in a time that gravity does not affect. It handles the orbits of planets and spacecraft very well. Einstein’s gravity law assumes no gravitational force. Instead, it assumes that the mass of a body, such as the Sun, distorts time and space. A planet moving through space that is only slightly distorted by the mass of the Sun travels a course very close to what Newton’s theory predicts. Newton’s theory breaks down in situations where space-time distortion is important. That includes GPS navigation, which depends on very precise timing of signals between satellites and ground equipment. Time runs slightly faster at satellite altitude. Thus GPS engineers need to know when to abandon Newton and follow Einstein.
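The size of the Newton-versus-Einstein discrepancy for GPS can be sketched with a back-of-envelope calculation. The constants and orbital radius below are standard textbook values, not figures from this column; the result, a net clock gain of roughly 38 microseconds per day at satellite altitude, is the order of the correction GPS engineers must build in.

```python
import math

# Assumed standard values (not from the column):
G   = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M   = 5.972e24    # Earth mass, kg
c   = 2.998e8     # speed of light, m/s
R   = 6.371e6     # Earth radius, m
r   = 2.657e7     # GPS orbital radius (~20,200 km altitude), m
day = 86400.0     # seconds per day

v = math.sqrt(G * M / r)   # circular orbital speed, ~3.9 km/s

# Special relativity: the moving satellite clock runs SLOW (~7 us/day).
sr_per_day = (v**2 / (2 * c**2)) * day

# General relativity: weaker gravity at altitude makes it run FAST (~46 us/day).
gr_per_day = (G * M / c**2) * (1/R - 1/r) * day

net = gr_per_day - sr_per_day   # net gain, ~38 microseconds per day
print(f"net clock drift: {net * 1e6:.1f} microseconds per day")
```

Left uncorrected, a drift of tens of microseconds a day would translate into kilometers of position error, since light covers about 300 meters per microsecond.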
When Planck published his thermal-radiation theory 109 years ago, he warned that it might break down when two physical objects are very close to each other. Physicists have been looking for that breakdown as nanotechnology has developed ever-smaller physical systems. In Nano Letters this month, Chen and his colleagues describe how they measured heat transfer across gaps as small as 10 nanometers (10 billionths of a meter). That’s comparable to the 6-to-7-nanometer gap between a recording head and a hard disk. Mr. Shen notes that engineers can now know that Planck’s law heat-flow predictions are “not a fundamental limitation” as they try to achieve higher energy densities and higher efficiencies in nano devices.
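To see what a thousandfold breakdown means in practice, one can compute the far-field ceiling that Planck’s law (via the Stefan-Boltzmann relation) sets between two ideal black surfaces. The 400 K and 300 K temperatures below are hypothetical illustrations, not figures from the paper; only the factor-of-1,000 enhancement comes from the article.

```python
# Far-field radiative heat flux between two ideal black plates,
# the ceiling that Planck-law physics predicts at large separations.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(t_hot, t_cold):
    """Net far-field flux between two ideal black surfaces (W/m^2)."""
    return SIGMA * (t_hot**4 - t_cold**4)

# Hypothetical example temperatures: a 400 K head facing a 300 K disk.
far = blackbody_flux(400.0, 300.0)   # roughly 1 kW per square meter
print(f"far-field limit: {far:.0f} W/m^2")
print(f"at ~10 nm gaps, transfer can be ~1,000x larger: ~{1000 * far:.0f} W/m^2")
```

Real surfaces are not perfectly black, so actual far-field numbers would be lower; the point is only the scale of the gap between Planck’s prediction and the near-field measurement.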
Physical law is always a work in progress. When what we call law today eventually breaks down, new opportunities for scientific and engineering progress usually appear.
August 29, 2009
Freud’s Adirondack Vacation
By LEON HOFFMAN
SIGMUND Freud arrived in Hoboken, N.J., 100 years ago today on his first and only visit to the United States. He came to lecture on psychoanalysis and to receive an honorary degree from Clark University, in Worcester, Mass. It was, he said, “an honorable call,” a mark of his academic success. Freud was then 53 and had been practicing for 23 years.
At the time, most doctors here and in Europe still considered mental illness to be caused by “degeneration” of the brain. They assumed that there was little to be done for it beyond physical treatments like diet, exercise, drugs, rest and massage. But a growing awareness that the mind could influence bodily functions was giving rise to debates about the nature of the unconscious mind.
G. Stanley Hall, the president of Clark and the first person to earn a doctorate in psychology from Harvard, invited American scientists to hear Freud’s ideas about the unconscious roots of mental illness. William James, the philosopher and psychologist, was among those who attended, as were other prominent academics, like Adolf Meyer, who would become perhaps the most important psychiatric educator in the first half of the 20th century, and Franz Boas, the father of American anthropology. Emma Goldman, the noted radical, who was also there, remarked, “Among the array of professors, looking stiff and important in their caps and gowns, Sigmund Freud, in ordinary attire, unassuming, almost shrinking, stood out like a giant among Pygmies.”
Speaking in German and without notes, Freud delivered five lectures covering the basic principles of psychoanalysis: hysteria and the psychoanalytic method, the idea that mental illness could arise from a person’s early experience, the importance of dreams and unconscious mental activity, infantile sexuality and the nature of transference.
When Freud learned that James would attend only one day, he chose that day to speak on the interpretation of dreams and the power of the unconscious. After the lecture, the two men spent more than an hour alone together. James would later express ambivalence about Freud’s ideas. “They can’t fail to throw light on human nature,” he wrote, “but I confess that he made on me personally the impression of a man obsessed with fixed ideas.”
While accounts of Freud’s visit have inevitably focused on this conversation with James, a less-known encounter with another prominent American scientist would become far more significant — for the two men and for the future of psychoanalysis in the United States. This person was James Jackson Putnam, a professor of neurology at Harvard and a leader of a growing movement to professionalize psychotherapy in the United States. Putnam and many other scientifically minded people were trying to counteract the growing influence of spiritual healers, who had been trying to treat the mentally ill with religious and mystical approaches. He had recently attended the first medical conference on psychotherapy, in New Haven.
After listening to Freud at Clark, Putnam invited him and the other psychoanalysts who had traveled with him to the United States — Carl Jung (who also lectured and received an honorary degree at Clark) and Sandor Ferenczi — to spend a few days at the Putnam family camp in the Adirondacks, after the group visited Niagara Falls. Freud marveled at Putnam Camp, “where we had an opportunity of being acquainted with the utter wilderness of such an American landscape.” In several days of hiking and feasting, Putnam and Freud cemented a strong bond.
It was, Freud would later write, “the most important personal relationship which arose from the meeting at Worcester.” Putnam lent his stature to Freud’s ideas, promoting the psychoanalytic approach as a way to reach those patients who had been considered incurable. “There are obvious limits to its usefulness,” Putnam wrote in 1910, “but nevertheless it strikes deeper than any other method now known to psychiatry, and reaches some of these very cases to which the terms degenerative and incurable have been applied, forcing us to recast our conception of these states.”
Talk therapy offered a message of hope, in contrast to the pessimism that came with theories of hereditary illness and degeneration.
Looking back on his trip a few years later, Freud wrote that it had been encouraging: “In Europe I felt as though I were despised; but over there I found myself received by the foremost men as an equal.”
Putnam would go on to become the first president of the American Psychoanalytic Association, in 1911. And psychoanalytic ideas would fairly rapidly become part and parcel of American culture and psychiatric education. Freudian terms like transference, the unconscious and the Oedipus complex entered the lexicon. And mental-health practitioners embarked on in-depth studies of their patients’ idiosyncratic life stories from childhood on. Thanks in large measure to Putnam’s work, psychoanalysis would become — and remain for 100 years — an ingrained and respected approach to treating mental illness of all kinds.
Leon Hoffman, a psychiatrist, is a co-director of the Pacella Parent Child Center of the New York Psychoanalytic Society and Institute.
“A work of art is an expressive and communicative medium of feelings and thought.” Pierre Francastel, 1950.
What is the role of art and architecture in society? How can one learn more about history through art and architecture?
These apparently simple questions encapsulate a series of complex responses that could easily fill several volumes. Since antiquity, they have been catalysts for the development of philosophical, aesthetic, societal, and architectural theories. A helpful strategy here is to touch upon the work of 20th-century thinkers who have addressed some of the issues the questions raise, and then to examine in more detail the examples of the Alhambra and the Generalife.
In his seminal work, Peinture et Société, Pierre Francastel examines early Renaissance and modern works of art and demonstrates that artists act as transmitters from one state of civilisation to another. Francastel’s pioneering work is among those that have brought attention to the societal role and importance of works of art, as well as to their relevance as tools for the writing of history.
I believe that architecture, which is primarily associated with the basic notions of shelter and functionality, possesses other characteristics and operates at many other levels; unfortunately these are often relegated to oblivion. Examples include architecture’s experiential impact on all our senses and its symbolic possibilities. These categories account for the production of extraordinary works throughout the ages. It is through the symbolic and sensorial criteria that we can discuss architecture's expressive quality, hence its relevance as art.
For instance, the Alhambra and the Generalife, which are extraordinary palatial complexes composed of buildings and gardens, are undoubtedly among the extant architectural wonders of the world from the medieval period. The Alhambra, which signifies “The Red” in Arabic (al-hamra), took most of its present form in the early 14th century during the rule of Ibn Nasrid, the founder of the Nasrid Dynasty. The castle eventually became a strategically fortified and sumptuous city palace on a hill overlooking the city of Granada. Outside the Alhambra is the Generalife, which is derived from the Arabic words Jennat al-Arif - meaning, interestingly, ‘garden of the architect’. With elegantly laid out gardens, the Generalife is another palace from the 14th century, which functioned as the summer retreat for the Nasrid court.
I will focus here on the uses of water that underscore the design of many of the indoor and outdoor spaces in the Alhambra and the Generalife, and by extension, of Islamic architecture. I will not refer to the well-known and mere ecological and functional aspects of water such as its essential quality to sustain life, hence its availability through sources and cisterns, its use as a means of transport (waterways), and the infrastructure to deliver it (aqueducts) or other pragmatic uses. Rather, I will refer to its experiential and symbolic qualities created by such properties as reflection, transparency, sound, taste and its tactility. The anonymous architects and artists of the Nasrid complexes used these qualities in a masterly way to underscore the expressive and symbolic spatial content of gardens and buildings, i.e., to transform them into true works of art.
Consider the famous Patio de los Arrayanes (Court of the Myrtles) in the Alhambra. This rectangular court, which is simultaneously indoor and outdoor, contains a water basin, which adds coolness to the space during the warm months. Functionally, it limits and defines circulation to the edges of the court. At the same time, the basin acts as a gigantic mirror which reflects inversed images of the facades at both ends of the court. I would suggest that the intentions of creating a deliberate reflection of the buildings refers to the symbolic and pervasive notion of reversibility found in Islam. Think of the relationships between paradise-earth (Gardens), the idea of the cosmological tree of life growing upside down in paradise, and the concept of praying to and from the Ka‘ba in Mecca.
At one end of the basin, a discreet and beautifully designed fountain feeds it with a continuous yet subtle flow of water, which does not disturb its mirroring quality. Introducing the soothing sound of bubbling water, the fountain also becomes a contemplation anchor, which visually lures visitors. Vegetation, a tacit symbol of the life-giving power of water, appears as well in the court through the planting of geometrically pruned myrtle hedges, which gives the place its name and visual scale. In some of the walls, the unusual ornamental patterns of the alicatados (tiles) evoke the idea of order, flow and movement, and add colour as well to the court. The skilful combinations of all these elements ultimately produce a powerful expressive space, which will continue to inspire artists, architects and poets.
The Alhambra and the Generalife are in the company of other single works such as the Parthenon in Athens and the Pantheon in Rome, other complexes such as the Cordoba mosque and many Eastern and Western gardens, or entire cities such as Toledo, Venice and Florence. All these works - art objects - possess unusual expressive qualities, which clearly corroborate Francastel’s aphorism at the beginning of this essay.
Perhaps what I have stated above may be reinforced by introducing some of the concepts that Material Culture addresses. A novel inter-disciplinary domain, Material Culture may be considered as yet another possible approach to discuss the issues raised at the beginning of this essay. Material Culture constitutes a type of intersection of many disciplines such as history, art history, anthropology, folklore, and the history of science and technology. As such, it is concerned with the psychological role, the meaning, the experiential impact that all physical objects have on humans in a particular culture. It also refers to the range of manufactured objects or artefacts that are typical of a culture and form an essential part of its identity.
Professor Daniel Waugh, a faculty member at the University of Washington and a well-known scholar in the field, illustrates the point: “Material objects include items with physical substance. They are primarily shaped or produced by human action, though objects created by nature can also play an important role in the history of human societies. For example, a coin is the product of human action. An animal horn is not, but it takes on meaning for humans if used as a drinking cup or a decorative or ritual object ... The physical existence of a religious image in a dark cave as a 'work of art' provides evidence of the piety of an artist or a sponsor.”
Among the artefacts which Material Culture studies, architecture and art objects - whether paintings, sculptures, calligraphy, musical scores and many other similar art works - play an important role in the making and the understanding of culture. They constitute a fundamental repository that informs the writing of history and ultimately supports the idea that without their works of art, without the possibility of their expressive channels, human societies cease to exist as such.
Ricardo L. Castro is an Associate Professor of Architecture at McGill University, Montreal, Quebec
Pierre Francastel, Peinture et Société (Lyon: Audin ed.), p. ii. (My translation.) Unfortunately this important work by the renowned French sociologist, critic, and historian awaits translation into English.
September 20, 2009
The Holy Grail of the Unconscious
By SARA CORBETT
This is a story about a nearly 100-year-old book, bound in red leather, which has spent the last quarter century secreted away in a bank vault in Switzerland. The book is big and heavy and its spine is etched with gold letters that say “Liber Novus,” which is Latin for “New Book.” Its pages are made from thick cream-colored parchment and filled with paintings of otherworldly creatures and handwritten dialogues with gods and devils. If you didn’t know the book’s vintage, you might confuse it for a lost medieval tome.
And yet between the book’s heavy covers, a very modern story unfolds. It goes as follows: Man skids into midlife and loses his soul. Man goes looking for soul. After a lot of instructive hardship and adventure — taking place entirely in his head — he finds it again.
Some people feel that nobody should read the book, and some feel that everybody should read it. The truth is, nobody really knows. Most of what has been said about the book — what it is, what it means — is the product of guesswork, because from the time it was begun in 1914 in a smallish town in Switzerland, it seems that only about two dozen people have managed to read or even have much of a look at it.
Of those who did see it, at least one person, an educated Englishwoman who was allowed to read some of the book in the 1920s, thought it held infinite wisdom — “There are people in my country who would read it from cover to cover without stopping to breathe scarcely,” she wrote — while another, a well-known literary type who glimpsed it shortly after, deemed it both fascinating and worrisome, concluding that it was the work of a psychotic.
So for the better part of the past century, despite the fact that it is thought to be the pivotal work of one of the era’s great thinkers, the book has existed mostly just as a rumor, cosseted behind the skeins of its own legend — revered and puzzled over only from a great distance.
Which is why one rainy November night in 2007, I boarded a flight in Boston and rode the clouds until I woke up in Zurich, pulling up to the airport gate at about the same hour that the main branch of the United Bank of Switzerland, located on the city’s swanky Bahnhofstrasse, across from Tommy Hilfiger and close to Cartier, was opening its doors for the day. A change was under way: the book, which had spent the past 23 years locked inside a safe deposit box in one of the bank’s underground vaults, was just then being wrapped in black cloth and loaded into a discreet-looking padded suitcase on wheels. It was then rolled past the guards, out into the sunlight and clear, cold air, where it was loaded into a waiting car and whisked away.
THIS COULD SOUND, I realize, like the start of a spy novel or a Hollywood bank caper, but it is rather a story about genius and madness, as well as possession and obsession, with one object — this old, unusual book — skating among those things. Also, there are a lot of Jungians involved, a species of thinkers who subscribe to the theories of Carl Jung, the Swiss psychiatrist and author of the big red leather book. And Jungians, almost by definition, tend to get enthused anytime something previously hidden reveals itself, when whatever’s been underground finally makes it to the surface.
Carl Jung founded the field of analytical psychology and, along with Sigmund Freud, was responsible for popularizing the idea that a person’s interior life merited not just attention but dedicated exploration — a notion that has since propelled tens of millions of people into psychotherapy. Freud, who started as Jung’s mentor and later became his rival, generally viewed the unconscious mind as a warehouse for repressed desires, which could then be codified and pathologized and treated. Jung, over time, came to see the psyche as an inherently more spiritual and fluid place, an ocean that could be fished for enlightenment and healing.
Whether or not he would have wanted it this way, Jung — who regarded himself as a scientist — is today remembered more as a countercultural icon, a proponent of spirituality outside religion and the ultimate champion of dreamers and seekers everywhere, which has earned him both posthumous respect and posthumous ridicule. Jung’s ideas laid the foundation for the widely used Myers-Briggs personality test and influenced the creation of Alcoholics Anonymous. His central tenets — the existence of a collective unconscious and the power of archetypes — have seeped into the larger domain of New Age thinking while remaining more at the fringes of mainstream psychology.
A big man with wire-rimmed glasses, a booming laugh and a penchant for the experimental, Jung was interested in the psychological aspects of séances, of astrology, of witchcraft. He could be jocular and also impatient. He was a dynamic speaker, an empathic listener. He had a famously magnetic appeal with women. Working at Zurich’s Burghölzli psychiatric hospital, Jung listened intently to the ravings of schizophrenics, believing they held clues to both personal and universal truths. At home, in his spare time, he pored over Dante, Goethe, Swedenborg and Nietzsche. He began to study mythology and world cultures, applying what he learned to the live feed from the unconscious — claiming that dreams offered a rich and symbolic narrative coming from the depths of the psyche. Somewhere along the way, he started to view the human soul — not just the mind and the body — as requiring specific care and development, an idea that pushed him into a province long occupied by poets and priests but not so much by medical doctors and empirical scientists.
October 13, 2009
The Young and the Neuro
By DAVID BROOKS
When you go to an academic conference you expect to see some geeks, gravitas and graying professors giving lectures. But the people who showed up at the Social and Affective Neuroscience Society’s conference in Lower Manhattan last weekend were so damned young, hip and attractive. The leading figures at this conference were in their 30s, and most of the work was done by people in their 20s. When you spoke with them, you felt yourself near the beginning of something long and important.
In 2001, an Internet search of the phrase “social cognitive neuroscience” yielded 53 hits. Now you get more than a million on Google. Young scholars have been drawn to this field from psychology, economics, political science and beyond in the hopes that by looking into the brain they can help settle some old arguments about how people interact.
These people study the way biology, in the form of genes, influences behavior. But they’re also trying to understand the complementary process of how social behavior changes biology. Matthew Lieberman of U.C.L.A. is doing research into what happens in the brain when people are persuaded by an argument.
Keely Muscatell, one of his doctoral students, and others presented a study in which they showed people from various social strata some images of menacing faces. People whose parents had low social status exhibited more activation in the amygdala (the busy little part of the brain involved in fear and emotion) than people from high-status families.
Reem Yahya and a team from the University of Haifa studied Arabs and Jews while showing them images of hands and feet in painful situations. The two cultures perceived pain differently. The Arabs perceived higher levels of pain over all while the Jews were more sensitive to pain suffered by members of a group other than their own.
Mina Cikara of Princeton and others scanned the brains of Yankee and Red Sox fans as they watched baseball highlights. Neither reacted much to an Orioles-Blue Jays game, but when they saw their own team doing well, brain regions called the ventral striatum and nucleus accumbens were activated. This is a look at how tribal dominance struggles get processed inside.
Jonathan B. Freeman of Tufts and others peered into the reward centers of the brain such as the caudate nucleus. They found that among Americans, that region was likely to be activated by dominant behavior, whereas among Japanese, it was more likely to be activated by subordinate behavior — the same region rewarding different patterns of behavior depending on culture.
All of these studies are baby steps in a long conversation, and young academics are properly circumspect about drawing broad conclusions. But eventually their work could give us a clearer picture of what we mean by fuzzy words like ‘culture.’ It could also fill a hole in our understanding of ourselves. Economists, political scientists and policy makers treat humans as ultrarational creatures because they can’t define and systematize the emotions. This work is getting us closer to that.
The work demonstrates that we are awash in social signals, and any social science that treats individuals as discrete decision-making creatures is nonsense. But it also suggests that even though most of our reactions are fast and automatic, we still have free will and control.
Many of the studies presented here concerned the way we divide people by in-group and out-group categories in as little as 170 milliseconds. The anterior cingulate cortices in American and Chinese brains activate when people see members of their own group endure pain, but they do so at much lower levels when they see members of another group enduring it. These effects may form the basis of prejudice.
But a study by Saaid A. Mendoza and David M. Amodio of New York University showed that if you give people a strategy, such as reminding them to be racially fair, it is possible to counteract those perceptions. People feel disgust toward dehumanized groups, but a study by Claire Hoogendoorn, Elizabeth Phelps and others at N.Y.U. suggests it is possible to lower disgust and the accompanying insula activity through cognitive behavioral therapy.
In other words, consciousness is too slow to see what happens inside, but it is possible to change the lenses through which we unconsciously construe the world.
Since I’m not an academic, I’m free to speculate that this work will someday give us new categories, which will replace misleading categories like ‘emotion’ and ‘reason.’ I suspect that the work will take us beyond the obsession with I.Q. and other conscious capacities and give us a firmer understanding of motivation, equilibrium, sensitivity and other unconscious capacities.
The hard sciences are interpenetrating the social sciences. This isn’t dehumanizing. It shines attention on the things poets have traditionally cared about: the power of human attachments. It may even help policy wonks someday see people as they really are.
October 13, 2009
The Collider, the Particle and a Theory About Fate
By DENNIS OVERBYE
More than a year after an explosion of sparks, soot and frigid helium shut it down, the world’s biggest and most expensive physics experiment, known as the Large Hadron Collider, is poised to start up again. In December, if all goes well, protons will start smashing together in an underground racetrack outside Geneva in a search for forces and particles that reigned during the first trillionth of a second of the Big Bang.
Then it will be time to test one of the most bizarre and revolutionary theories in science. I’m not talking about extra dimensions of space-time, dark matter or even black holes that eat the Earth. No, I’m talking about the notion that the troubled collider is being sabotaged by its own future. A pair of otherwise distinguished physicists have suggested that the hypothesized Higgs boson, which physicists hope to produce with the collider, might be so abhorrent to nature that its creation would ripple backward through time and stop the collider before it could make one, like a time traveler who goes back in time to kill his grandfather.
Holger Bech Nielsen, of the Niels Bohr Institute in Copenhagen, and Masao Ninomiya of the Yukawa Institute for Theoretical Physics in Kyoto, Japan, put this idea forward in a series of papers with titles like “Test of Effect From Future in Large Hadron Collider: a Proposal” and “Search for Future Influence From LHC,” posted on the physics Web site arXiv.org in the last year and a half.
According to the so-called Standard Model that rules almost all physics, the Higgs is responsible for imbuing other elementary particles with mass.
“It must be our prediction that all Higgs producing machines shall have bad luck,” Dr. Nielsen said in an e-mail message. In an unpublished essay, Dr. Nielsen said of the theory, “Well, one could even almost say that we have a model for God.” It is their guess, he went on, “that He rather hates Higgs particles, and attempts to avoid them.”
This malign influence from the future, they argue, could explain why the United States Superconducting Supercollider, also designed to find the Higgs, was canceled in 1993 after billions of dollars had already been spent, an event so unlikely that Dr. Nielsen calls it an “anti-miracle.”
You might think that the appearance of this theory is further proof that people have had ample time — perhaps too much time — to think about what will come out of the collider, which has been 15 years and $9 billion in the making.
The collider was built by CERN, the European Organization for Nuclear Research, to accelerate protons to energies of seven trillion electron volts around an 18-mile underground racetrack and then crash them together into primordial fireballs.
For the record, as of the middle of September, CERN engineers hope to begin to collide protons at the so-called injection energy of 450 billion electron volts in December and then ramp up the energy until the protons have 3.5 trillion electron volts of energy apiece and then, after a short Christmas break, real physics can begin.
Dr. Nielsen and Dr. Ninomiya started laying out their case for doom in the spring of 2008. It was later that fall, of course, after the CERN collider was turned on, that a connection between two magnets vaporized, shutting down the collider for more than a year.
Dr. Nielsen called that “a funny thing that could make us to believe in the theory of ours.”
He agreed that skepticism would be in order. After all, most big science projects, including the Hubble Space Telescope, have gone through a period of seeming to be jinxed. At CERN, the beat goes on: Last weekend the French police arrested a particle physicist who works on one of the collider experiments, on suspicion of conspiracy with a North African wing of Al Qaeda.
Dr. Nielsen and Dr. Ninomiya have proposed a kind of test: that CERN engage in a game of chance, a “card-drawing” exercise using perhaps a random-number generator, in order to discern bad luck from the future. If the outcome was sufficiently unlikely, say drawing the one spade in a deck with 100 million hearts, the machine would either not run at all, or only at low energies unlikely to find the Higgs.
Sure, it’s crazy, and CERN should not and is not about to mortgage its investment to a coin toss. The theory was greeted on some blogs with comparisons to Harry Potter. But craziness has a fine history in a physics that talks routinely about cats being dead and alive at the same time and about anti-gravity puffing out the universe.
As Niels Bohr, Dr. Nielsen’s late countryman and one of the founders of quantum theory, once told a colleague: “We are all agreed that your theory is crazy. The question that divides us is whether it is crazy enough to have a chance of being correct.”
Dr. Nielsen is well-qualified in this tradition. He is known in physics as one of the founders of string theory and a deep and original thinker, “one of those extremely smart people that is willing to chase crazy ideas pretty far,” in the words of Sean Carroll, a Caltech physicist and author of a coming book about time, “From Eternity to Here.”
Another of Dr. Nielsen’s projects is an effort to show how the universe as we know it, with all its apparent regularity, could arise from pure randomness, a subject he calls “random dynamics.”
Dr. Nielsen admits that his and Dr. Ninomiya’s new theory smacks of time travel, a longtime interest, which has become a respectable research subject in recent years. While it is a paradox to go back in time and kill your grandfather, physicists agree there is no paradox if you go back in time and save him from being hit by a bus. In the case of the Higgs and the collider, it is as if something is going back in time to keep the universe from being hit by a bus, although just why the Higgs would be a catastrophe is not clear. If we knew, presumably, we wouldn’t be trying to make one.
We always assume that the past influences the future. But that is not necessarily true in the physics of Newton or Einstein. According to physicists, all you really need to know, mathematically, to describe what happens to an apple or the 100 billion galaxies of the universe over all time are the laws that describe how things change and a statement of where things start. The latter are the so-called boundary conditions — the apple five feet over your head, or the Big Bang.
The equations work just as well, Dr. Nielsen and others point out, if the boundary conditions specify a condition in the future (the apple on your head) instead of in the past, as long as the fundamental laws of physics are reversible, which most physicists believe they are.
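That claim can be made concrete with a toy calculation (a sketch of my own, not anything from the paper): integrate a falling apple forward with a time-reversible scheme, then flip the final velocity and integrate again. Because the update rule, like Newton’s laws, runs equally well in either direction, the future endpoint pins down the past exactly:

```python
def verlet_fall(x: float, v: float, g: float, dt: float, steps: int):
    """Integrate x'' = -g with the time-reversible velocity Verlet scheme."""
    for _ in range(steps):
        v_half = v - 0.5 * g * dt   # half kick
        x += v_half * dt            # drift
        v = v_half - 0.5 * g * dt   # half kick
    return x, v

# Forward: the apple starts 5 feet overhead, at rest.
x1, v1 = verlet_fall(5.0, 0.0, 9.8, 1e-3, 500)

# Backward: flip the final velocity and take the same number of steps.
# This recovers the initial height and (negated) initial velocity.
x0, v0 = verlet_fall(x1, -v1, 9.8, 1e-3, 500)
```

Specifying where the apple ends up, plus the reversible law, is just as good a "boundary condition" as specifying where it starts.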
“For those of us who believe in physics,” Einstein once wrote to a friend, “this separation between past, present and future is only an illusion.”
In Kurt Vonnegut’s novel “The Sirens of Titan,” all of human history turns out to be reduced to delivering a piece of metal roughly the size and shape of a beer-can opener to an alien marooned on Saturn’s moon Titan so he can repair his spaceship and go home.
Whether the collider has such a noble or humble fate — or any fate at all — remains to be seen. As a Red Sox fan my entire adult life, I feel I know something about jinxes.
October 20, 2009
Where the Wild Things Are
By DAVID BROOKS
In Homer’s poetry, every hero has a trait. Achilles is angry. Odysseus is cunning. And so was born one picture of character and conduct.
In this view, what you might call the philosopher’s view, each of us has certain ingrained character traits. An honest person will be honest most of the time. A compassionate person will be compassionate.
These traits, as they say, go all the way down. They shape who we are, what we choose to do and whom we befriend. Our job is to find out what traits of character we need to become virtuous.
But, as Kwame Anthony Appiah, a Princeton philosopher, notes in his book “Experiments in Ethics,” this philosopher’s view of morality is now being challenged by a psychologist’s view. According to the psychologist’s view, individuals don’t have one thing called character.
The psychologists say this because a century’s worth of experiments suggests that people’s actual behavior is not driven by permanent traits that apply from one context to another. Students who are routinely dishonest at home are not routinely dishonest at school. People who are courageous at work can be cowardly at church. People who behave kindly on a sunny day may behave callously the next day when it is cloudy and they are feeling glum. Behavior does not exhibit what the psychologists call “cross-situational stability.”
The psychologists thus tend to gravitate toward a different view of conduct. In this view, people don’t have one permanent thing called character. We each have a multiplicity of tendencies inside, which are activated by this or that context. As Paul Bloom of Yale put it in an essay for The Atlantic last year, we are a community of competing selves. These different selves “are continually popping in and out of existence. They have different desires, and they fight for control — bargaining with, deceiving, and plotting against one another.”
The philosopher’s view is shaped like a funnel. At the bottom, there is a narrow thing called character. And at the top, the wide ways it expresses itself. The psychologist’s view is shaped like an upside-down funnel. At the bottom, there is a wide variety of unconscious tendencies that get aroused by different situations. At the top, there is the narrow story we tell about ourselves to give coherence to life.
The difference is easy to recognize on the movie screen. Most movies embrace the character version. The hero is good and conquers evil. Spike Jonze’s new movie adaptation of “Where the Wild Things Are” illuminates the psychological version.
At the beginning of the movie, young Max is torn by warring impulses he cannot control or understand. Part of him loves and depends upon his mother. But part of him rages against her.
In the midst of turmoil, Max falls into a primitive, mythical realm with a community of Wild Things. The Wild Things contain and re-enact different pieces of his inner frenzy. One of them feels unimportant. One throws a tantrum because his love has been betrayed. They embody his different tendencies.
Many critics have noted that, in the movie version, the Wild Things are needlessly morose and whiny. But in one important way, the movie is better than the book. In the book, Max effortlessly controls the Wild Things by taming them with “the magic trick of staring into all their yellow eyes without blinking once.”
In the movie, Max wants to control the Wild Things. The Wild Things in turn want to be controlled. They want him to build a utopia for them where they won’t feel pain. But in the movie, Max fails as king. He lacks the power to control his Wild Things. The Wild Things come to recognize that he isn’t really a king, and maybe there are no such things as kings.
In the philosopher’s picture, the good life is won through direct assault. Heroes use reason to separate virtue from vice. Then they use willpower to conquer weakness, fear, selfishness and the dark passions lurking inside. Once they achieve virtue they do virtuous things.
In the psychologist’s version, the good life is won indirectly. People have only vague intuitions about the instincts and impulses that have been implanted in them by evolution, culture and upbringing. There is no easy way to command all the wild things jostling inside.
But it is possible to achieve momentary harmony through creative work. Max has all his Wild Things at peace when he is immersed in building a fort or when he is giving another his complete attention. This isn’t the good life through heroic self-analysis but through mundane, self-forgetting effort, and through everyday routines.
Appiah believes these two views of conduct are in conversation, not conflict. But it does seem we’re in one of those periods when words like character fall into dispute and change their meaning.
October 20, 2009
For Decades, Puzzling People With Mathematics
By JOHN TIERNEY
For today’s mathematical puzzle, assume that in the year 1956 there was a children’s magazine in New York named after a giant egg, Humpty Dumpty, who purportedly served as its chief editor.
Mr. Dumpty was assisted by a human editor named Martin Gardner, who prepared “activity features” and wrote a monthly short story about the adventures of the child egg, Humpty Dumpty Jr. Another duty of Mr. Gardner’s was to write a monthly poem of moral advice from Humpty Sr. to Humpty Jr.
At that point, Mr. Gardner was 42 and had never taken a math course beyond high school. He had struggled with calculus and considered himself poor at solving basic mathematical puzzles, let alone creating them. But when the publisher of Scientific American asked him if there might be enough material for a monthly column on “recreational mathematics,” a term that sounded even more oxymoronic in 1956 than it does today, Mr. Gardner took a gamble.
He quit his job with Humpty Dumpty.
On Wednesday, Mr. Gardner will celebrate his 95th birthday with the publication of another book — his second book of essays and mathematical puzzles to be published just this year. With more than 70 books to his name, he is the world’s best-known recreational mathematician, and has probably introduced more people to the joys of math than anyone in history.
How is this possible?
Actually, there are two separate puzzles here. One is how Mr. Gardner, who still works every day at his old typewriter, has managed for so long to confound and entertain his readers. The other is why so many of us have never been able to resist this kind of puzzle. Why, when we hear about the guy trying to ferry a wolf and a goat and a head of cabbage across the river in a small boat, do we feel compelled to solve his transportation problem?
It never occurred to me that math could be fun until the day in grade school that my father gave me a book of 19th-century puzzles assembled by Mr. Gardner — the same puzzles, as it happened, that Mr. Gardner’s father had used to hook him during his school days. The algebra and geometry were sugar-coated with elaborate stories and wonderful illustrations of giraffe races, pool-hall squabbles, burglaries and scheming carnival barkers. (Go to nytimes.com/tierneylab for some examples.)
The puzzles didn’t turn Mr. Gardner into a professional mathematician — he majored in philosophy at the University of Chicago — but he remained a passionate amateur through his first jobs in public relations and journalism. After learning of mathematicians’ new fascination with folding certain pieces of paper into different shapes, he sold an article about these “flexagons” to Scientific American, and that led to his monthly “Mathematical Games” column, which he wrote for the next quarter-century.
Mr. Gardner prepared for the new monthly column by scouring Manhattan’s second-hand bookstores for math puzzles and games. In another line of work, that would constitute plagiarism, but among puzzle makers it has long been the norm: a good puzzle is forever.
For instance, that puzzle about ferrying the wolf, the goat and the cabbage was included in a puzzle collection prepared for the emperor Charlemagne 12 centuries ago — and it was presumably borrowed by Charlemagne’s puzzlist. The row-boat problem has been passed down in cultures around the world in versions featuring guards and prisoners, jealous spouses, missionaries, cannibals and assorted carnivores.
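For the curious, the wolf-goat-cabbage problem falls to a few lines of breadth-first search. The state encoding below is my own; the classic answer is seven crossings:

```python
from collections import deque

EVERYONE = frozenset({"farmer", "wolf", "goat", "cabbage"})

def safe(bank):
    """A bank is safe if the farmer is there, or no forbidden pair is left alone."""
    if "farmer" in bank:
        return True
    return not ({"wolf", "goat"} <= bank or {"goat", "cabbage"} <= bank)

def neighbors(left):
    """All states reachable in one crossing; `left` is the set on the start bank."""
    here = left if "farmer" in left else EVERYONE - left
    for cargo in [None] + sorted(here - {"farmer"}):
        movers = {"farmer"} | ({cargo} if cargo else set())
        new_left = left - movers if "farmer" in left else left | movers
        if safe(new_left) and safe(EVERYONE - new_left):
            yield frozenset(new_left)

def solve():
    """Breadth-first search from everyone-on-the-left to everyone-on-the-right."""
    start, goal = EVERYONE, frozenset()
    parent = {start: None}
    queue = deque([start])
    while queue:
        state = queue.popleft()
        if state == goal:  # reconstruct the path back to the start
            path = []
            while state is not None:
                path.append(state)
                state = parent[state]
            return path[::-1]
        for nxt in neighbors(state):
            if nxt not in parent:
                parent[nxt] = state
                queue.append(nxt)
    return None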
“The number of puzzles I’ve invented you can count on your fingers,” Mr. Gardner says. Through his hundreds of columns and dozens of books, he always credited others for the material and insisted that he wasn’t even a good mathematician.
“I don’t think I ever wrote a column that required calculus,” he says. “The big secret of my success as a columnist was that I didn’t know much about math.
“I had to struggle to get everything clear before I wrote a column, so that meant I could write it in a way that people could understand.”
After he gave up the column in 1981, Mr. Gardner kept turning out essays and books, and his reputation among mathematicians, puzzlists and magicians just kept growing. Since 1994, they have been convening in Atlanta every two years to swap puzzles and ideas at an event called the G4G: the Gathering for Gardner.
“Many have tried to emulate him; no one has succeeded,” says Ronald Graham, a mathematician at the University of California, San Diego. “Martin has turned thousands of children into mathematicians, and thousands of mathematicians into children.”
Mr. Gardner says he has been gratified to see more and more teachers incorporating puzzles into the math curriculum. The pleasure of puzzle-solving, as he sees it, is a happy byproduct of evolution.
“Consider a cow,” he says. “A cow doesn’t have the problem-solving skill of a chimpanzee, which has discovered how to get termites out of the ground by putting a stick into a hole.
“Evolution has developed the brain’s ability to solve puzzles, and at the same time has produced in our brain a pleasure of solving problems.”
Mr. Gardner’s favorite puzzles are the ones that require a sudden insight. That aha! moment can come in any kind of puzzle, but there’s a special pleasure when the insight is mathematical — and therefore eternal, as Mr. Gardner sees it. In his new book, “When You Were a Tadpole and I Was a Fish,” he explains why he is an “unashamed Platonist” when it comes to mathematics.
“If all sentient beings in the universe disappeared,” he writes, “there would remain a sense in which mathematical objects and theorems would continue to exist even though there would be no one around to write or talk about them. Huge prime numbers would continue to be prime even if no one had proved them prime.”
I share his mathematical Platonism, and I think that is ultimately the explanation for the appeal of the puzzles. They may superficially involve row boats or pool halls or giraffes, but they’re really about transcendent numbers and theorems.
When you figure out the answer, you know you’ve found something that is indisputably true anywhere, anytime. For a brief moment, the universe makes perfect sense.
Correction: An earlier version of this column incorrectly said that Martin Gardner was 37 in 1956.
November 10, 2009
The Rush to Therapy
By DAVID BROOKS
We’re all born late. We’re born into history that is well under way. We’re born into cultures, nations and languages that we didn’t choose. On top of that, we’re born with certain brain chemicals and genetic predispositions that we can’t control. We’re thrust into social conditions that we detest. Often, we react in ways we regret even while we’re doing them.
But unlike the other animals, people do have a drive to seek coherence and meaning. We have a need to tell ourselves stories that explain it all. We use these stories to supply the metaphysics, without which life seems pointless and empty.
Among all the things we don’t control, we do have some control over our stories. We do have a conscious say in selecting the narrative we will use to make sense of the world. Individual responsibility is contained in the act of selecting and constantly revising the master narrative we tell about ourselves.
The stories we select help us, in turn, to interpret the world. They guide us to pay attention to certain things and ignore other things. They lead us to see certain things as sacred and other things as disgusting. They are the frameworks that shape our desires and goals. So while story selection may seem vague and intellectual, it’s actually very powerful. The most important power we have is the power to help select the lens through which we see reality.
Most people select stories that lead toward cooperation and goodness. But over the past few decades a malevolent narrative has emerged.
That narrative has emerged on the fringes of the Muslim world. It is a narrative that sees human history as a war between Islam on the one side and Christianity and Judaism on the other. This narrative causes its adherents to shrink their circle of concern. They don’t see others as fully human. They come to believe others can be blamelessly murdered and that, in fact, it is admirable to do so.
This narrative is embraced by a small minority. But it has caused incredible amounts of suffering within the Muslim world, in Israel, in the U.S. and elsewhere. With their suicide bombings and terrorist acts, adherents to this narrative have made themselves central to global politics. They are the ones who go into crowded rooms, shout “Allahu akbar,” or “God is great,” and then start murdering.
When Maj. Nidal Malik Hasan did that in Fort Hood, Tex., last week, many Americans had an understandable and, in some ways, admirable reaction. They didn’t want the horror to become a pretext for anti-Muslim bigotry.
So immediately the coverage took on a certain cast. The possibility of Islamic extremism was immediately played down. This was an isolated personal breakdown, not an ideological assault, many people emphasized.
Major Hasan was portrayed as a disturbed individual who was under a lot of stress. We learned about pre-traumatic stress syndrome, and secondary stress disorder, which one gets from hearing about other people’s stress. We heard the theory (unlikely in retrospect) that Hasan was so traumatized by the thought of going into a combat zone that he decided to take a gun and create one of his own.
A shroud of political correctness settled over the conversation. Hasan was portrayed as a victim of society, a poor soul who was pushed over the edge by prejudice and unhappiness.
There was a national rush to therapy. Hasan was a loner who had trouble finding a wife and socializing with his neighbors.
This response was understandable. It’s important to tamp down vengeful hatreds in moments of passion. But it was also patronizing. Public commentators assumed the air of kindergarten teachers who had to protect their children from thinking certain impermissible and intolerant thoughts. If public commentary wasn’t carefully policed, the assumption seemed to be, then the great mass of unwashed yahoos in Middle America would go off on a racist rampage.
Worse, it absolved Hasan — before the real evidence was in — of his responsibility. He didn’t have the choice to be lonely or unhappy. But he did have a choice over what story to build out of those circumstances. And evidence is now mounting to suggest he chose the extremist War on Islam narrative that so often leads to murderous results.
The conversation in the first few days after the massacre was well intentioned, but it suggested a willful flight from reality. It ignored the fact that the war narrative of the struggle against Islam is the central feature of American foreign policy. It ignored the fact that this narrative can be embraced by a self-radicalizing individual in the U.S. as much as by groups in Tehran, Gaza or Kandahar.
It denied, before the evidence was in, the possibility of evil. It sought to reduce a heinous act to social maladjustment. It wasn’t the reaction of a morally or politically serious nation.
November 27, 2009
The Other Education
By DAVID BROOKS
Like many of you, I went to elementary school, high school and college. I took such and such classes, earned such and such grades, and amassed such and such degrees.
But on the night of Feb. 2, 1975, I turned on WMMR in Philadelphia and became mesmerized by a concert the radio station was broadcasting. The concert was by a group I’d never heard of — Bruce Springsteen and the E Street Band. Thus began a part of my second education.
We don’t usually think of this second education. For reasons having to do with the peculiarities of our civilization, we pay a great deal of attention to our scholastic educations, which are formal and supervised, and we devote much less public thought to our emotional educations, which are unsupervised and haphazard. This is odd, since our emotional educations are much more important to our long-term happiness and the quality of our lives.
In any case, over the next few decades Springsteen would become one of the professors in my second education. In album after album he assigned a new course in my emotional curriculum.
This second education doesn’t work the way the scholastic education works. In a normal schoolroom, information walks through the front door and announces itself by light of day. It’s direct. The teacher describes the material to be covered, and then everybody works through it.
The knowledge transmitted in an emotional education, on the other hand, comes indirectly, seeping through the cracks of the windowpanes, from under the floorboards and through the vents. It’s generally a byproduct of the search for pleasure, and the learning is indirect and unconscious.
From that first night in the winter of 1975, I wanted the thrill that Springsteen was offering. His manager, Jon Landau, says that each style of music elicits its own set of responses. Rock, when done right, is jolting and exhilarating.
Once I got a taste of that emotional uplift, I was hooked. The uplifting experiences alone were bound to open the mind for learning.
I followed Springsteen into his world. Once again, it wasn’t the explicit characters that mattered most. Springsteen sings about teenage couples out on a desperate lark, workers struggling as the mills close down, and drifters on the wrong side of the law. These stories don’t directly touch my life, and as far as I know he’s never written a song about a middle-age pundit who interviews politicians by day and makes mind-numbingly repetitive school lunches at night.
What mattered most, as with any artist, were the assumptions behind the stories. His tales take place in a distinct universe, a distinct map of reality. In Springsteen’s universe, life’s “losers” always retain their dignity. Their choices have immense moral consequences, and are seen on an epic and anthemic scale.
There are certain prominent neighborhoods on his map — one called defeat, another called exaltation, another called nostalgia. Certain emotional chords — stoicism, for one — are common, while others are absent. “There is no sarcasm in his writing,” Landau says, “and not a lot of irony.”
I find I can’t really describe what this landscape feels like, especially in newspaper prose. But I do believe his narrative tone, the mental map, has worked its way into my head, influencing the way I organize the buzzing confusion of reality, shaping the unconscious categories through which I perceive events. Just as being from New York or rural Georgia gives you a perspective from which to see the world, so spending time in Springsteen’s universe inculcates its own preconscious viewpoint.
Then there is the man himself. Like other parts of the emotional education, it is hard to bring the knowledge to consciousness, but I do think important lessons are communicated by that embarrassed half-giggle he falls into when talking about himself. I do think a message is conveyed in the way he continually situates himself within a tradition — de-emphasizing his own individual contributions, stressing instead the R&B groups, the gospel and folk singers whose work comes out through him.
I’m not claiming my second education has been exemplary or advanced. I’m describing it because I have only become aware of it retrospectively, and society pays too much attention to the first education and not enough to the second.
In fact, we each assemble our own emotional faculties from many teachers: artists, friends, family and teams. Each refines and develops the inner instrument with a million strings.
Last week, my kids attended their first Springsteen concert in Baltimore. At one point, I looked over at my 15-year-old daughter. She had her hands clapped to her cheeks and a look of slack-jawed, joyous astonishment on her face. She couldn’t believe what she was seeing — 10,000 people in a state of utter abandon, with Springsteen surrendering himself to them in the center of the arena.
December 29, 2009
Essay: The Joy of Physics Isn’t in the Results, but in the Search Itself
By DENNIS OVERBYE
I was asked recently what the Large Hadron Collider, the giant particle accelerator outside Geneva, is good for. After $10 billion and 15 years, the machine is ready to begin operations early next year, banging together protons in an effort to recreate the conditions of the Big Bang. Sure, there are new particles and abstract symmetries in the offing for those few who speak the language of quantum field theory. But what about the rest of us?
The classic answer was allegedly given long ago by Michael Faraday, who, when asked what good was electricity, told a government minister that he didn’t know but that “one day you will tax it.”
Not being fast enough on my feet, I rattled off the usual suspects. Among the spinoffs from particle physics, besides a robust academic research community, are the Web, which was invented as a tool for physicists to better communicate at CERN — the European Organization for Nuclear Research, builders of the new collider — and many modern medical imaging methods like M.R.I.’s and PET scans.
These tests sound innocuous, even miraculous: noninvasive and mostly painless explorations of personal inner space. But their use does involve an encounter with forces that sound like they came from the twilight zone. When my wife, Nancy, had a scan known as a SPECT last fall, for what seems to have been a false alarm, she had to be injected with a radioactive tracer. That meant she had to sleep in another room for a couple of days and was forbidden to hug our daughter.
The “P” in PET scan, after all, stands for positron, as in the particles that are opposites to the friendly workhorse, the electron, which is to say antimatter, the weird stuff of science-fiction dreams.
I don’t know if anyone ever asked Paul Dirac, the British physicist who predicted the existence of antimatter, whether it would ever be good for anything. Some people are now saying that the overuse of scanning devices has helped bankrupt the health care system. Indeed, when I saw the bill for Nancy’s scan, I almost fainted, but when I saw how little of it we ourselves had to pay, I felt like ordering up Champagne.
But better medical devices are not why we build these machines that eat a small city’s worth of electricity to bang together protons and recreate the fires of the Big Bang. Better diagnoses are not why young scientists spend the best years of their lives welding and soldering and pulling cable through underground caverns inside detectors the size of New York apartment buildings to capture and record those holy fires.
They want to know where we all came from, and so do I. In a drawer at home I have a family tree my brother made as a school project long ago tracing our ancestry back several hundred years in Norway, but it’s not enough. Whatever happened in the Big Bang, whatever laws are briefly reincarnated in the unholy proton fires at CERN, not only made galaxies and planets possible, but it also made us possible. How atoms could achieve such a thing is a story mostly untold but worth revering. The Earth’s biosphere is the most complicated manifestation of the laws of nature that we know of.
Like an only child dreaming of lost siblings, we dream of finding other Earths, other creatures and civilizations out in space, or even other universes. We all want to find out that we are cosmic Anastasias and that there is a secret that connects us, that lays bare the essential unity of physical phenomena.
And so we try, sometimes against great odds. The year that is now ending began with some areas of science in ruins. One section of the Large Hadron Collider looked like a train wreck, with several-ton magnets lying about smashed after an electrical connection between them vaporized only nine days after a showy inauguration.
The Hubble Space Telescope was limping about in orbit with only one of its cameras working.
But here is the scorecard at the end of the year: in December, the newly refurbished collider produced a million proton collisions, including 50,000 at the record energy of 1.2 trillion electron volts per proton, before going silent for the holidays. CERN is on track to run it next year at three times that energy.
The Hubble telescope, after one last astronaut servicing visit, reached to within spitting distance of the Big Bang and recorded images of the most distant galaxies yet observed, which existed some 600 million or 700 million years after the putative beginning of time.
Not to mention the rapidly expanding universe of extrasolar planets. In my view from the cosmic bleachers, the pot is bubbling for discovery. We all got a hint of just how crazy that might be in the new age of the Internet on Dec. 17, when physicists around the world found themselves glued to a Webcast of the results from an experiment called the Cryogenic Dark Matter Search. Rumors had swept the blogs and other outposts of scientific commentary that the experimenters were going to announce that they had finally detected the ethereal and mysterious dark matter particles, which, astronomers say, make up a quarter of the universe.
In the end, the result was frustratingly vague and inconclusive.
“We want it to be true — we so want to have a clue about dark matter,” Maria Spiropulu, a Caltech physicist working at CERN wrote to me the night of the Webcast.
“And it is not easy,” Dr. Spiropulu said. “The experiments are not easy and the analysis is not easy. This is a tough, tough ride over all.”
Although we might well solve part of the dark matter conundrum in the coming years, the larger mystery winds out in front of us like a train snaking into the fog.
We may never know where we came from. We will probably never find that cosmic connection to our lost royalty. Someday I will visit Norway and look up those ancestors. They died not knowing the fate of the universe, and so will I, but maybe that’s all right.
Steven Weinberg, a University of Texas physicist and Nobel Prize winner, once wrote in his 1977 book “The First Three Minutes”: “The more the universe seems comprehensible, the more it also seems pointless.” Dr. Weinberg has been explaining that statement ever since. He went on to say that it is by how we live and love and, yes, do science, that the universe warms up and acquires meaning.
As the dark matter fever was rising a few weeks ago, I called Vera Rubin, the astronomer at the department of terrestrial magnetism of the Carnegie Institution of Washington, who helped make dark matter a cosmic issue by showing that galaxies rotate too fast for the gravity of their luminous components to keep them together.
But Dr. Rubin, who likes to stick to the facts, refused to be excited. “I don’t know if we have dark matter or have to nudge Newton’s Laws or what.
“I’m sorry I know so little; I’m sorry we all know so little. But that’s kind of the fun, isn’t it?”
January 5, 2010
Books on Science: A Guide to the Cosmos, in Words and Images
By DENNIS OVERBYE
In the universe there is always room for another surprise. Or two. Or a trillion.
Take the Witch Head Nebula, for example — a puffy purplish trail of gas in the constellation Eridanus. When a picture of it is turned on its side, the nebula looks just like, well, a witch, complete with a pointy chin and peaked hat, ready to jump on a broomstick or offer an apple to Snow White.
In 30 years of covering astronomy, I had never heard of the Witch Head Nebula until I came across a haunting two-page spread showing it snaking across an inky black star-speckled background in “Far Out: A Space-Time Chronicle,” an exquisite picture guide to the universe by Michael Benson, a photographer, journalist and filmmaker, and obviously a longtime space buff.
Actually “exquisite” does not really do justice to the aesthetic and literary merits of the book, published in the fall. I live in New York, so most of the cosmos is invisible to me, but even when I lived under the black crystalline and — at this time of year — head-ringingly cold skies of the Catskills, I could see only so far. If you don’t have your own Hubble Space Telescope, this book is the next best thing.
Mr. Benson has scoured images from the world’s observatories, including the Hubble, to fashion a step-by-step tour of the cosmos, outward from fantastical clusters and nebulae a few hundred light-years away to soft red dots of primordial galaxies peppering the wall of the sky billions of light-years beyond the stars, almost to the Big Bang.
The result is an art book befitting its Abrams imprint. Here are stars packed like golden sand, gas combed in delicate blue threads, piled into burgundy thunderheads and carved into sinuous rilles and ribbons, and galaxies clotted with star clusters dancing like spiders on the ceiling.
Mr. Benson has reprocessed many of the images to give them colors truer to physical reality. For example, in the NASA version of the Hubble’s “Pillars of Creation,” showing fingers of gas and dust in the Eagle Nebula boiling away to reveal new stars, the “pillars” are brown and the radiation burning them away is green; Mr. Benson has turned it into a composition in shades of red, including burgundy, the actual color of the ionized hydrogen that makes the nebula.
You can sit and look through this book for hours and never be bored by the shapes, colors and textures into which cosmic creation can arrange itself, or you can actually read the accompanying learned essays. Mr. Benson’s prose is up to its visual surroundings, no mean feat.
“The enlarging mirrors of our telescopes,” he writes, “comprise material forged at the centers of the same generation of stars they now record.”
One set of essays relates what was going on in the sky to what was going on back on Earth. The Witch Head, for example, is about 700 light-years from here, which means its soft smoky light has been traveling to us since the early part of the 14th century. It is a milestone for, among other things, the bubonic plague, the first stirrings of the Renaissance in Italy and the foundation of the Ming dynasty in China.
The Heart Nebula, another new acquaintance, in Cassiopeia right next to the Soul Nebula, is 7,500 light-years away. Its image dates to the time of the first proto-writing in China and the first wine, in Persia, and when the Mediterranean burst its banks in biblical fashion and flooded the Black Sea.
The journey outward ends in those distant blurry galaxies on the doorstep of the Big Bang. Or is it the beginning?
“Eternity,” Mr. Benson quotes William Blake as saying in an epigraph, “is in love with the productions of time.” Well, aren’t we all?
January 12, 2010
Deciphering the Chatter of Monkeys and Chimps
By NICHOLAS WADE
Walking through the Tai forest of Ivory Coast, Klaus Zuberbühler could hear the calls of the Diana monkeys, but the babble held no meaning for him.
That was in 1990. Today, after nearly 20 years of studying animal communication, he can translate the forest’s sounds. This call means a Diana monkey has seen a leopard. That one means it has sighted another predator, the crowned eagle. “In our experience time and again, it’s a humbling experience to realize there is so much more information being passed in ways which hadn’t been noticed before,” said Dr. Zuberbühler, a psychologist at the University of St. Andrews in Scotland.
Do apes and monkeys have a secret language that has not yet been decrypted? And if so, will it resolve the mystery of how the human faculty for language evolved? Biologists have approached the issue in two ways, by trying to teach human language to chimpanzees and other species, and by listening to animals in the wild.
The first approach has been propelled by people’s intense desire — perhaps reinforced by childhood exposure to the loquacious animals in cartoons — to communicate with other species. Scientists have invested enormous effort in teaching chimpanzees language, whether in the form of speech or signs. A New York Times reporter who understands sign language, Boyce Rensberger, was able in 1974 to conduct what may be the first newspaper interview with another species when he conversed with Lucy, a signing chimp. She invited him up her tree, a proposal he declined, said Mr. Rensberger, who is now at M.I.T.
But with a few exceptions, teaching animals human language has proved to be a dead end. They should speak, perhaps, but they do not. They can communicate very expressively — think how definitely dogs can make their desires known — but they do not link symbolic sounds together in sentences or have anything close to language.
Better insights have come from listening to the sounds made by animals in the wild. Vervet monkeys were found in 1980 to have specific alarm calls for their most serious predators. If the calls were recorded and played back to them, the monkeys would respond appropriately. They jumped into bushes on hearing the leopard call, scanned the ground at the snake call, and looked up when played the eagle call.
It is tempting to think of the vervet calls as words for “leopard,” “snake” or “eagle,” but that is not really so. The vervets do not combine the calls with other sounds to make new meanings. They do not modulate them, so far as is known, to convey that a leopard is 10, or 100, feet away. Their alarm calls seem less like words and more like a person saying “Ouch!” — a vocal representation of an inner mental state rather than an attempt to convey exact information.
But the calls do have specific meaning, which is a start. And the biologists who analyzed the vervet calls, Robert Seyfarth and Dorothy Cheney of the University of Pennsylvania, detected another significant element in primates’ communication when they moved on to study baboons. Baboons are very sensitive to who stands where in their society’s hierarchy. If played a recording of a superior baboon threatening an inferior, and the latter screaming in terror, baboons will pay no attention — this is business as usual in baboon affairs. But when researchers concoct a recording in which an inferior’s threat grunt precedes a superior’s scream, baboons will look in amazement toward the loudspeaker broadcasting this apparent revolution in their social order.
Baboons evidently recognize the order in which two sounds are heard, and attach different meanings to each sequence. They and other species thus seem much closer to people in their understanding of sound sequences than in their production of them. “The ability to think in sentences does not lead them to speak in sentences,” Drs. Seyfarth and Cheney wrote in their book “Baboon Metaphysics.”
Some species may be able to produce sounds in ways that are a step or two closer to human language. Dr. Zuberbühler reported last month that Campbell’s monkeys, which live in the forests of the Ivory Coast, can vary individual calls by adding suffixes, just as a speaker of English changes a verb’s present tense to past by adding an “-ed.”
The Campbell’s monkeys give a “krak” alarm call when they see a leopard. But adding an “-oo” changes it to a generic warning of predators. One context for the krak-oo sound is when they hear the leopard alarm calls of another species, the Diana monkey. The Campbell’s monkeys would evidently make good reporters since they distinguish between leopards they have observed directly (krak) and those they have heard others observe (krak-oo).
Even more remarkably, the Campbell’s monkeys can combine two calls to generate a third with a different meaning. The males have a “Boom boom” call, which means “I’m here, come to me.” When booms are followed by a series of krak-oos, the meaning is quite different, Dr. Zuberbühler says. The sequence means “Timber! Falling tree!”
Dr. Zuberbühler has observed a similar achievement among putty-nosed monkeys that combine their “pyow” call (warning of a leopard) with their “hack” call (warning of a crowned eagle) into a sequence that means “Let’s get out of here in a real hurry.”
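The call systems described above amount to a tiny lexicon of sound sequences. Purely as an illustration, and with the caveat that the sequence grammar here is a deliberate simplification rather than a transcription of Dr. Zuberbühler’s data, the idea can be sketched as a lookup table:

```python
# Illustrative sketch only: a toy lookup of the call sequences described
# in the article. Real monkey communication is far less clean than a
# dictionary, and the "boom boom" sequence actually involves a series of
# krak-oos, simplified here to one.

CALL_MEANINGS = {
    ("krak",): "leopard seen directly",
    ("krak-oo",): "generic predator warning",
    ("boom", "boom"): "I'm here, come to me",
    ("boom", "boom", "krak-oo"): "Timber! Falling tree!",
    ("pyow", "hack"): "Let's get out of here in a real hurry",
}

def interpret(*calls):
    """Return the meaning of a call sequence, or note it is unknown."""
    return CALL_MEANINGS.get(tuple(calls), "unknown sequence")

print(interpret("krak"))                     # leopard seen directly
print(interpret("boom", "boom", "krak-oo"))  # Timber! Falling tree!
```

Note that, as with the baboon experiments, order matters: the same calls in a different sequence fall through to “unknown sequence.”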
Apes have larger brains than monkeys and might be expected to produce more calls. But if there is an elaborate code of chimpanzee communication, their human cousins have not yet cracked it. Chimps make a food call that seems to have a lot of variation, perhaps depending on the perceived quality of the food. How many different meanings can the call assume? “You would need the animals themselves to decide how many meaningful calls they can discriminate,” Dr. Zuberbühler said. Such a project, he estimates, could take a lifetime of research.
Monkeys and apes possess many of the faculties that underlie language. They hear and interpret sequences of sounds much like people do. They have good control over their vocal tract and could produce much the same range of sounds as humans. But they cannot bring it all together.
This is particularly surprising because language is so useful to a social species. Once the infrastructure of language is in place, as is almost the case with monkeys and apes, the faculty might be expected to develop very quickly by evolutionary standards. Yet monkeys have been around for 30 million years without saying a single sentence. Chimps, too, have nothing resembling language, though they shared a common ancestor with humans just five million years ago. What is it that has kept all other primates locked in the prison of their own thoughts?
Drs. Seyfarth and Cheney believe that one reason may be that they lack a “theory of mind,” the recognition that others have thoughts. Since a baboon does not know or worry about what another baboon knows, it has no urge to share its knowledge. Dr. Zuberbühler stresses an intention to communicate as the missing factor. Children from the youngest ages have a great desire to share information with others, even though they gain no immediate benefit in doing so. Not so with other primates.
“In principle, a chimp could produce all the sounds a human produces, but they don’t do so because there has been no evolutionary pressure in this direction,” Dr. Zuberbühler said. “There is nothing to talk about for a chimp because he has no interest in talking about it.” At some point in human evolution, on the other hand, people developed the desire to share thoughts, Dr. Zuberbühler notes. Luckily for them, all the underlying systems of perceiving and producing sounds were already in place as part of the primate heritage, and natural selection had only to find a way of connecting these systems with thought.
Yet it is this step that seems the most mysterious of all. Marc D. Hauser, an expert on animal communication at Harvard, sees the uninhibited interaction between different neural systems as critical to the development of language. “For whatever reason, maybe accident, our brains are promiscuous in a way that animal brains are not, and once this emerges it’s explosive,” he said.
In animal brains, by contrast, each neural system seems to be locked in place and cannot interact freely with others. “Chimps have tons to say but can’t say it,” Dr. Hauser said. Chimpanzees can read each other’s goals and intentions, and do lots of political strategizing, for which language would be very useful. But the neural systems that compute these complex social interactions have not been married to language.
Dr. Hauser is trying to find out whether animals can appreciate some of the critical aspects of language, even if they cannot produce it. He and Ansgar Endress reported last year that cotton-top tamarins can distinguish a word added in front of another word from the same word added at the end. This may seem like the syntactical ability to recognize a suffix or prefix, but Dr. Hauser thinks it is just the ability to recognize when one thing comes before another and has little to do with real syntax.
“I’m becoming pessimistic,” he said of the efforts to explore whether animals have a form of language. “I conclude that the methods we have are just impoverished and won’t get us to where we want to be as far as demonstrating anything like semantics or syntax.”
Yet, as is evident from Dr. Zuberbühler’s research, there are many seemingly meaningless sounds in the forest that convey information in ways perhaps akin to language.
January 26, 2010
Physicists’ Dreams and Worries in Era of the Big Collider
By DENNIS OVERBYE
A few dozen scientists got together in Los Angeles for the weekend recently to talk about their craziest hopes and dreams for the universe.
At least that was the idea.
“I want to set out the questions for the next nine decades,” Maria Spiropulu said on the eve of the conference, called the Physics of the Universe Summit. She was hoping that the meeting, organized with the help of Joseph D. Lykken of the Fermi National Accelerator Laboratory and Gordon Kane of the University of Michigan, would replicate the success of a speech by the mathematician David Hilbert, who in 1900 laid out an agenda of 23 math questions to be solved in the 20th century.
Dr. Spiropulu is a professor at the California Institute of Technology and a senior scientist at CERN, outside Geneva. Next month, CERN’s Large Hadron Collider, the most powerful particle accelerator ever built, will begin colliding protons and generating sparks of primordial fire in an effort to recreate conditions that ruled the universe in the first trillionth of a second of time.
Physicists have been speculating for 30 years what they will see. Now it is almost Christmas morning.
Organized into “duels” of world views, round tables and “diatribes and polemics,” the conference was billed as a place where the physicists could let down their hair about what might come, avoid “groupthink” and “be daring (even at the expense of being wrong),” according to Dr. Spiropulu’s e-mailed instructions. “Tell us what is bugging you and what is inspiring you,” she added.
Adding to the air of looseness, the participants were housed in a Hollywood hotel known long ago as the “Riot Hyatt,” for the antics of rock stars who stayed there.
The eclectic cast included Larry Page, a co-founder of Google, who was handing out new Google phones to his friends; Elon Musk, the PayPal electric-car entrepreneur, who hosted the first day of the meeting at his SpaceX factory, where he is building rockets to ferry supplies and, perhaps, astronauts to the space station; and the filmmaker Jesse Dylan, who showed a new film about the collider. One afternoon, the magician David Blaine was sitting around the SpaceX cafeteria doing card tricks for the physicists.
This group proved to be at least as good at worrying as dreaming.
“We’re confused,” Dr. Lykken explained, “and we’re probably going to be confused for a long time.”
The first speaker of the day was Lisa Randall, a Harvard theorist who began her talk by quoting Galileo to the effect that physics progressed more by working on small problems than by talking about grand ones — an issue that she is taking on in a new book about science and the collider.
And so Dr. Randall emphasized the challenges ahead. Physicists have high expectations and elegant theories about what they will find, she said, but once they start looking in detail at these theories, “they’re not that pretty.”
For example, a major hope is some explanation for why gravity is so weak compared with the other forces of nature. How is it that a refrigerator magnet can hold itself up against the pull of the entire Earth? One popular solution is a hypothesized feature of nature known as supersymmetry, which would cause certain mathematical discrepancies in the calculations to cancel out, as well as produce a plethora of previously undiscovered particles — known collectively as wimps, for weakly interacting massive particles — and presumably a passel of Nobel prizes.
In what physicists call the “wimp miracle,” supersymmetry could also explain the mysterious dark matter that astronomers say makes up 25 percent of the universe. But no single supersymmetrical particle quite fits the bill all by itself, Dr. Randall reported, without some additional fiddling with its parameters.
Moreover, she added, it is worrying that supersymmetric effects have not already shown up as small deviations from the predictions of present-day physics, known as the Standard Model.
“A lot of stuff doesn’t happen,” Dr. Randall said. “We would have expected to see clues by now, but we haven’t.”
These are exciting times, she concluded, but the answers physicists seek might not come quickly or easily. They should prepare for surprises and trouble.
“I can’t help it,” Dr. Randall said. “I’m a worrier.”
Dr. Randall was followed by Dr. Kane, a self-proclaimed optimist who did try to provoke by claiming that physics was on the verge of seeing “the bottom of the iceberg.” The collider would soon discover supersymmetry, he said, allowing physicists to zero in on an explanation of almost everything about the physical world, or at least particle physics.
But he and other speakers were scolded for not being bold enough in the subsequent round-table discussion.
Where, asked Michael Turner of the University of Chicago, were the big ideas? The passion? Where, for that matter, was the universe? Dr. Kane’s hypothesized breakthrough did not include an explanation for the so-called dark energy that seems to be speeding up the expansion of the universe.
Dr. Kane grumbled that the proposed solutions to dark energy did not affect particle physics.
The worrying continued. Lawrence Krauss, a cosmologist from Arizona State, said that most theories were wrong.
“We get the notions they are right because we keep talking about them,” he said. Not only are most theories wrong, he said, but most data are also wrong — at first — subject to glaring uncertainties. The recent history of physics, he said, is full of promising discoveries that disappeared because they could not be repeated.
And so it went.
Maurizio Pierini, a young CERN physicist, pointed out that the tests for new physics were mostly designed to discover supersymmetry. “What if it’s not supersymmetry?” he asked.
Another assumption physicists have taken for granted — that dark matter is a simple particle rather than an entire spectrum of dark behaviors — might not be true, they were told. “Does nature really love simplicity?” Aaron Pierce of the University of Michigan asked.
Neal Weiner of New York University, who has suggested the existence of forces as well as particles on the dark side, said that until recently ideas about dark matter were driven by ideas about particle theory rather than data.
“Ultimately we learn that perhaps it has very little to do with us at all,” Dr. Weiner said. “Who knows what we will find in the dark sector?”
At one point, Mark Wise, a theoretical physicist at Caltech, felt compelled to remind the audience that this was not a depressing time for physics, listing the collider and other new experiments on heaven and on earth. “You cannot call this a depressing time,” he said.
Dr. Randall immediately chimed in. “I agree it’s a good time,” she said. “We’ll make progress by thinking about these little problems.”
On the second day, the discussion continued in an auditorium at Caltech and concluded with a showing of Mr. Dylan’s film and a history talk by Lyn Evans, the CERN scientist who has supervised the building of the Large Hadron Collider through its ups and downs over 15 years, including a disastrous explosion after it first started up in 2008.
Dr. Evans, looking relaxed, said: “It’s a beautiful machine. Now let the adventure of discovery begin.”
Dr. Spiropulu said it had already begun. Her detector, she said, recorded 50,000 proton collisions during the testing of the collider in December, recapitulating much of 20th-century particle physics.
Now it is the 21st century, Dr. Spiropulu said, and “all that has been discussed these last few days will be needed immediately.”
Correction: January 25, 2010
An earlier version of this article misstated the affiliation of Aaron Pierce, a physicist. He is with the University of Michigan, not the California Institute of Technology.
February 16, 2010
In Brookhaven Collider, Scientists Briefly Break a Law of Nature
By DENNIS OVERBYE
Physicists said Monday that they had whacked a tiny region of space with enough energy to briefly distort the laws of physics, providing the first laboratory demonstration of the kind of process that scientists suspect has shaped cosmic history.
The blow was delivered in the Relativistic Heavy Ion Collider, or RHIC, at the Brookhaven National Laboratory on Long Island, where, since 2000, physicists have been accelerating gold nuclei around a 2.4-mile underground ring to 99.995 percent of the speed of light and then colliding them in an effort to melt protons and neutrons and free their constituents — quarks and gluons. The goal has been a state of matter called a quark-gluon plasma, which theorists believe existed when the universe was only a microsecond old.
The departure from normal physics manifested itself in the apparent ability of the briefly freed quarks to tell right from left. That breaks one of the fundamental laws of nature, known as parity, which requires that the laws of physics remain unchanged if we view nature in a mirror.
This happened in bubbles smaller than the nucleus of an atom, which lasted only a billionth of a billionth of a billionth of a second. But in these bubbles were “hints of profound physics,” in the words of Steven Vigdor, associate director for nuclear and particle physics at Brookhaven. Very similar symmetry-breaking bubbles, at an earlier period in the universe, are believed to have been responsible for breaking the balance between matter and its opposite antimatter and leaving the universe with a preponderance of matter.
“We now have a hook” into how these processes occur, Dr. Vigdor said, adding in an e-mail message, “IF the interpretation of the RHIC results turns out to be correct.” Other physicists said the results were an important window into the complicated dynamics of quarks, which goes by the somewhat whimsical name of quantum chromodynamics.
Frank Wilczek, a physicist at the Massachusetts Institute of Technology who won the Nobel Prize for work on the theory of quarks, called the new results “interesting and surprising,” and said understanding them would help understand the behavior of quarks in unusual circumstances.
“It is comparable, I suppose, to understanding better how galaxies form, or astrophysical black holes,” he said.
The Brookhaven scientists and their colleagues discussed their latest results from RHIC in talks and a news conference at a meeting of the American Physical Society Monday in Washington, and in a pair of papers submitted to Physical Review Letters. “This is a view of what the world was like at 2 microseconds,” said Jack Sandweiss of Yale, a member of the Brookhaven team, calling it “a seething cauldron.”
Among other things, the group announced it had succeeded in measuring the temperature of the quark-gluon plasma as 4 trillion degrees Celsius, “by far the hottest matter ever made,” Dr. Vigdor said. That is 250,000 times hotter than the center of the Sun and well above the temperature at which theorists calculate that protons and neutrons should melt, but the quark-gluon plasma does not act the way theorists had predicted.
Instead of behaving like a perfect gas, in which every quark goes its own way independent of the others, the plasma seemed to act like a liquid. “It was a very big surprise,” Dr. Vigdor said, when it was discovered in 2005. Since then, however, theorists have revisited their calculations and found that the quark soup can be either a liquid or a gas, depending on the temperature, he explained. “This is not your father’s quark-gluon plasma,” said Barbara V. Jacak, of the State University of New York at Stony Brook, speaking for the team that made the new measurements.
It is now thought that the plasma would have to be a million times more energetic to become a perfect gas. That is beyond the reach of any conceivable laboratory experiment, but the experiments colliding lead nuclei in the Large Hadron Collider outside Geneva next winter should reach energies high enough to see some evolution from a liquid to a gas.
Parity, the idea that the laws of physics are the same when left and right are switched, as in a mirror reflection, is one of the most fundamental symmetries of space-time as we know it. Physicists were surprised to discover in 1956, however, that parity is not obeyed by all the laws of nature after all. The universe is slightly lopsided in this regard. The so-called weak force, which governs some radioactive decays, seems to be left-handed, causing neutrinos, the ghostlike elementary particles that are governed by that force, to spin clockwise, when viewed oncoming, but never counterclockwise.
Under normal conditions, the laws of quark behavior observe the principle of mirror symmetry, but Dmitri Kharzeev of Brookhaven, a longtime student of symmetry changes in the universe, had suggested in 1998 that those laws might change under the very abnormal conditions in the RHIC fireball. Conditions in that fireball are such that a cube with sides about one quarter the thickness of a human hair could contain the total amount of energy consumed in the United States in a year.
All this energy, he said, could put a twist in the gluon force fields, which give quarks their marching orders. There can be left-hand twists and right-hand twists, he explained, resulting in space within each little bubble getting a local direction.
What makes the violation of mirror symmetry observable in the collider is the combination of this corkscrewed space with a magnetic field, produced by the charged gold ions blasting at one another. The quarks were then drawn one way or the other along the magnetic field, depending on their electrical charges.
The magnetic fields produced by the collisions are the most intense ever observed, roughly 100 million billion gauss, Dr. Sandweiss said.
Because the directions of the magnetic field and of the corkscrew effect can be different in every bubble, the presumed parity violations can only be studied statistically, averaged over 14 million bubble events. In each of them, the mirror symmetry could be broken in a different direction, Dr. Sandweiss explained, but the effect would always be the same, with positive quarks going one way and negative ones the other. That is what was recorded in RHIC’s STAR detector (STAR being short for Solenoidal Tracker at RHIC) by Dr. Sandweiss and his colleagues. Dr. Sandweiss cautioned that it was still possible that some other effect could be mimicking the parity violation, and he held off publication of the results for a year, trying unsuccessfully to find one. So they decided, he said, that it was worthy of discussion.
One test of the result, he said, would be to run RHIC at a lower energy and see if the effect went away when there was not enough oomph in the beam to distort space-time. The idea of parity might seem like a very abstract and mathematical concept, but it affects our chemistry and biology. It is not only neutrinos that are skewed. So are many of the molecules of life, including proteins, which are left-handed, and sugars, which are right-handed.
The chirality, or handedness, of molecules prevents certain reactions from taking place in chemistry and biophysics, Dr. Sandweiss noted, and affects what we can digest.
Physicists suspect that the left-handedness of neutrinos might have contributed to the most lopsided feature of the universe of all, the fact that it is composed of matter and not antimatter, even though the present-day laws do not discriminate. The amount of parity violation that physicists have measured in experiments, however, is not enough to explain how the universe got so unbalanced today. We like symmetry, Dr. Kharzeev, of Brookhaven, noted, but if the symmetry between matter and antimatter had not been broken long ago, “the universe would be a very desolate place.”
The new measurement from the quark plasma does not explain the antimatter problem either, Dr. Sandweiss said, but it helps show how departures from symmetry can appear in bubbles like the ones in RHIC in the course of cosmic evolution. Scientists think that the laws of physics went through a series of changes, or “phase transitions,” like water freezing to ice, as the universe cooled during the stupendously hot early moments of the Big Bang. Symmetry-violating bubbles like those of RHIC are more likely to form during these cosmic changeovers. “If you learn more about it from this experiment, we could then illuminate the process that gives rise to these bubbles,” Dr. Sandweiss said.
Dr. Vigdor said: “A lot of physics sounds like science fiction. There is a lot of speculation on what happened in the early universe. The amazing thing is that we have this chance to test any of this.”
Steven Strogatz on math, from basic to baffling.
If you were an avid television watcher in the 1980s, you may remember a clever show called “Moonlighting.” Known for its snappy dialogue and the romantic chemistry between its co-stars, it featured Cybill Shepherd and Bruce Willis as a couple of wisecracking private detectives named Maddie Hayes and David Addison. While investigating one particularly tough case, David asks a coroner’s assistant for his best guess about possible suspects. “Beats me,” says the assistant. “But you know what I don’t understand?” To which David replies, “Logarithms?” Then, reacting to Maddie’s look: “What? You understood those?”
That pretty well sums up how many people feel about logarithms. Their peculiar name is just part of their image problem. Most folks never use them again after high school, at least not consciously, and are oblivious to the logarithms hiding behind the scenes of their daily lives.
The same is true of many of the other functions discussed in algebra II and pre-calculus. Power functions, exponential functions — what was the point of all that? My goal in this week’s column is to help you appreciate the function of all those functions, even if you never have occasion to press their buttons on your calculator.
A mathematician needs functions for the same reason that a builder needs hammers and drills. Tools transform things. So do functions. In fact, mathematicians often refer to them as “transformations” because of this. But instead of wood or steel, functions pound away on numbers and shapes and, sometimes, even on other functions.
To show you what I mean, let’s plot the graph of the equation y = 4 − x².
You may remember how this sort of activity goes: you draw a picture of the xy plane with the x-axis running horizontally and the y-axis vertically. Then for each x you compute the corresponding y and plot them together as a single point in the xy plane. For example, when x is 1, the equation says y equals 4 minus 1 squared, which is 4 minus 1, or 3. So (x,y) = (1, 3) is a point on the graph. After calculating and plotting a few more points, the following picture emerges.
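The point-by-point recipe just described is easy to mechanize. Here is a minimal sketch; the range of x values tabulated is an arbitrary choice:

```python
# Compute points on the graph of y = 4 - x**2, as described above.

def f(x):
    return 4 - x**2

# Tabulate (x, y) pairs for a handful of x values.
points = [(x, f(x)) for x in range(-3, 4)]
print(points)  # includes (1, 3), since 4 - 1**2 = 3
```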
The droopy shape of the curve is due to the action of mathematical pliers. In the equation for y, the function that transforms x into x² behaves a lot like the common tool for bending and pulling things. When it’s applied to every point on a piece of the x-axis (which you could visualize as a straight piece of wire), the pliers bend and elongate that piece into the downward-curving arch shown above.
And what role does the 4 play in the equation y = 4 − x²? It acts like a nail for hanging a picture on a wall. It lifts the bent wire arch up by 4 units. Since it raises all points by the same amount, it’s known as a “constant function.”
This example illustrates the dual nature of functions. On the one hand, they’re tools: the x² bends the piece of the x-axis and the 4 lifts it. On the other hand, they’re building blocks: the 4 and the −x² can be regarded as component parts of a more complicated function, 4 − x², just as wires, batteries and transistors are component parts of a radio.
Once you start to look at things this way, you’ll notice functions everywhere. The arching curve above — technically known as a “parabola” — is the signature of the squaring function x² operating behind the scenes. Look for it when you’re taking a sip from a water fountain or watching a basketball arc toward the hoop. And if you ever have a few minutes to spare on a layover in Detroit’s International Airport, be sure to stop by the Delta terminal to enjoy the world’s most breathtaking parabolas at play.
Parabolas and constants are associated with a wider class of functions — “power functions” of the form xⁿ, in which a variable x is raised to a fixed power n. For a parabola, n = 2; for a constant, n = 0.
Changing the value of n yields other handy tools. For example, raising x to the first power (n = 1) gives a function that works like a ramp, a steady incline of growth or decay. It’s called a “linear function” because its xy graph is a line. If you leave a bucket out in a steady rain, the water collecting at the bottom rises linearly in time.
Another useful tool is the inverse square function 1/x², corresponding to the case n = −2. It’s good for describing how waves and forces attenuate as they spread out in three dimensions — for instance, how a sound softens as it moves away from its source.
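A small sketch can put this whole family of power functions side by side; the sample x values below are arbitrary:

```python
# Power functions x**n for the exponents n discussed above:
# n = 0 gives a constant, n = 1 a linear ramp, n = 2 a parabola,
# and n = -2 the inverse square law.

def power(x, n):
    return x ** n

for n in (0, 1, 2, -2):
    print("n =", n, [power(x, n) for x in (1, 2, 4)])

# The inverse-square behavior: doubling x cuts the value to a quarter.
assert power(2, -2) == power(1, -2) / 4
```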
Power functions like these are the building blocks that scientists and engineers use to describe growth and decay in their mildest forms.
But when you need mathematical dynamite, it’s time to unpack the exponential functions. They describe all sorts of explosive growth, from nuclear chain reactions to the proliferation of bacteria in a Petri dish. The most familiar example is the function 10ˣ, in which 10 is raised to the power x. Make sure not to confuse this with the earlier power functions. Here the exponent (the power x) is a variable, and the base (the number 10) is a constant — whereas in a power function like x², it’s the other way around. This switch makes a huge difference. Exponential growth is almost unimaginably rapid.
That’s why it’s so hard to fold a piece of paper in half more than 7 or 8 times. Each folding approximately doubles the thickness of the wad, causing it to grow exponentially. Meanwhile, the wad’s length shrinks in half every time, and thus decreases exponentially fast. For a standard sheet of notebook paper, after 7 folds the wad becomes thicker than it is long, so it can’t be folded again. It’s not a matter of the folder’s strength; for a sheet to be considered legitimately folded n times, the resulting wad is required to have 2ⁿ layers in a straight line, and this can’t happen if the wad is thicker than it is long.
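The bottleneck is easy to simulate. A minimal sketch, assuming a sheet 0.05 mm thick and 279 mm long (both figures are illustrative stand-ins for a sheet of notebook paper, not measurements from the column):

```python
def max_folds(thickness_mm, length_mm):
    """Fold in half while the wad is still longer than it is thick:
    thickness doubles (exponential growth), length halves (exponential decay)."""
    folds = 0
    while thickness_mm < length_mm:
        thickness_mm *= 2
        length_mm /= 2
        folds += 1
    return folds

print(max_folds(0.05, 279))  # -> 7 for these assumed dimensions
```

Because thickness grows and length shrinks exponentially, their ratio grows by a factor of 4 per fold, so the game ends after only a handful of folds no matter how strong you are.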
The challenge was thought to be impossible until Britney Gallivan, then a junior in high school, solved it in 2002. She began by deriving a formula, L = (πT/6)(2ⁿ + 4)(2ⁿ – 1), that predicted the maximum number of times, n, that paper of a given thickness T and length L could be folded in one direction. Notice the forbidding presence of the exponential function 2ⁿ in two places — once to account for the doubling of the wad’s thickness at each fold, and another time to account for the halving of its length.
Using her formula, Britney concluded that she would need to use a special roll of toilet paper nearly three quarters of a mile long. In January 2002, she went to a shopping mall in her hometown of Pomona, Calif., and unrolled the paper. Seven hours later, and with the help of her parents, she smashed the world record by folding the paper in half 12 times!
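Gallivan’s formula can be evaluated directly. A Python sketch; the 0.1 mm paper thickness here is an assumed value for illustration, not Britney’s actual measurement:

```python
from math import pi

def required_length(n, thickness):
    """Minimum paper length (same units as thickness) needed to fold a
    sheet in half n times in one direction, via Gallivan's formula
    L = (pi*T/6) * (2**n + 4) * (2**n - 1)."""
    return (pi * thickness / 6) * (2**n + 4) * (2**n - 1)

# With an assumed thickness of 0.1 mm (0.0001 m), 12 folds demand
# roughly 880 meters of paper -- on the order of half a mile.
print(round(required_length(12, 0.0001)))  # -> 879 (meters)
```

The 2ⁿ terms multiply together, so the required length grows roughly like 4ⁿ: each extra fold quadruples the paper you need.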
In theory, exponential growth is also supposed to grace your bank account. If your money grows at an annual interest rate of r, after one year it will be worth (1 + r) times more; after two years, (1 + r) squared; and after x years, (1 + r)ˣ times more than your initial deposit. Thus the miracle of compounding that we so often hear about is just exponential growth in action.
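Compounding is the exponential function (1 + r)ˣ in disguise. A quick sketch with made-up numbers:

```python
def balance(principal, rate, years):
    """Value of a deposit growing at annual interest rate `rate`,
    compounded once a year: principal * (1 + rate)**years."""
    return principal * (1 + rate)**years

# $1,000 at 5% interest more than quadruples in 30 years.
print(round(balance(1000, 0.05, 30), 2))  # -> 4321.94
```

The base (1 + r) is fixed while the exponent x varies, which is exactly the signature of exponential rather than power-law growth.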
Which brings us back to logarithms. We need them because it’s always useful to have tools that can undo one another. Just as every office worker needs both a stapler and a staple remover, every mathematician needs exponential functions and logarithms. They’re “inverses.” This means that if you type a number x into your calculator, and then punch the 10ˣ button followed by the log x button, you’ll get back to the number you started with.
Logarithms are compressors. They’re ideal for taking numbers that vary over a wide range and squeezing them together so they become more manageable. For instance, 100 and 100 million differ a million-fold, a gulf that most of us find incomprehensible. But their logarithms differ only fourfold (they are 2 and 8, because 100 = 10² and 100 million = 10⁸). In conversation, we all use a crude version of logarithmic shorthand when we refer to any salary between $100,000 and $999,999 as being “six figures.” That “six” is roughly the logarithm of these salaries, whose logarithms in fact span the range from 5 to 6.
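Both properties, undoing and compressing, take one line each to demonstrate. An illustrative sketch:

```python
from math import log10

# Inverse property: log10 undoes 10**x.
x = 3.7
print(log10(10**x))        # recovers x (up to floating-point error)

# Compression: a million-fold gap shrinks to a gap of 6.
print(log10(100))          # -> 2.0
print(log10(100_000_000))  # -> 8.0

# A "six figure" salary: log10 of anything from 100,000 up
# to 999,999 lands between 5 and 6.
print(log10(100_000))      # -> 5.0
```

The same compression is why decibels, earthquake magnitudes and pH are all logarithmic scales: they tame enormous ranges into small, familiar numbers.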
As impressive as all these functions may be, a mathematician’s toolbox can only do so much — which is why I still haven’t assembled my Ikea bookcases.
1. The excerpt from “Moonlighting” is from the episode “In God We Strongly Suspect.” It originally aired on Feb. 11, 1986, during the show’s second season.
2. Will Hoffman and Derek Paul Boyle have filmed an intriguing video of the parabolas all around us in the everyday world (along with their exponential cousins, curves called “catenaries,” so-named for the shape of hanging chains). Full disclosure: the filmmakers say this video was inspired by a story I told on an episode of RadioLab.
3. For simplicity, I’ve referred to expressions like x² as functions, though to be more precise I should speak of “the function that maps x into x².” I hope this sort of abbreviation won’t cause confusion, since we’ve all seen it on calculator buttons.
4. For the story of Britney Gallivan’s adventures in paper folding, see: Gallivan, B. C. “How to Fold Paper in Half Twelve Times: An ‘Impossible Challenge’ Solved and Explained.” Pomona, CA: Historical Society of Pomona Valley, 2002. For a journalist’s account, aimed at children, see Ivars Peterson, “Champion paper-folder,” Muse (July/August 2004), p. 33. The Mythbusters have also attempted to replicate Britney’s experiment on their television show.
5. For evidence that our innate number sense is logarithmic, see: Stanislas Dehaene, Véronique Izard, Elizabeth Spelke, and Pierre Pica, “Log or linear? Distinct intuitions of the number scale in Western and Amazonian indigene cultures,” Science, Vol. 320 (2008), p. 1217. Popular accounts of this study are available at ScienceDaily and in this episode of RadioLab.
Thanks to David Field, Paul Ginsparg, Jon Kleinberg, Andy Ruina and Carole Schiffman for their comments and suggestions; Diane Hopkins, Cindy Klauss and Brian Madsen for their help in finding and obtaining the “Moonlighting” clip; and Margaret Nelson, for preparing the illustration.
April 3, 2010
The End of History (Books)
By MARC ARONSON
TODAY, Apple’s iPad goes on sale, and many see this as a Gutenberg moment, with digital multimedia moving one step closer toward replacing old-fashioned books.
Speaking as an author and editor of illustrated nonfiction, I agree that important change is afoot, but not in the way most people see it. In order for electronic books to live up to their billing, we have to fix a system that is broken: getting permission to use copyrighted material in new work. Either we change the way we deal with copyrights — or works of nonfiction in a multimedia world will become ever more dull and disappointing.
The hope of nonfiction is to connect readers to something outside the book: the past, a discovery, a social issue. To do this, authors need to draw on pre-existing words and images.
Unless we nonfiction writers are lucky and hit a public-domain mother lode, we have to pay for the right to use just about anything — from a single line of a song to any part of a poem; from the vast archives of the world’s art (now managed by gimlet-eyed venture capitalists) to the historical images that serve as profit centers for museums and academic libraries.
The amount we pay depends on where and how the material is used. In fact, the very first question a rights holder asks is “What are you going to do with my baby?” Which countries do you plan to sell in? What languages? Over what period of time? How large will the image be in your book?
Given that permission costs are already out of control for old-fashioned print, it’s fair to expect that they will rise even higher with e-books. After all, digital books will be in print forever (we assume); they can be downloaded, copied, shared and maybe even translated. We’ve all heard about the multimedia potential of the iPad, but how much will writers be charged for film clips and audio? Rights holders will demand a hefty premium for use in digital books — if they make their materials available in that format at all.
Seeing the clouds on the horizon, publishers painstakingly remove photos and even text extracts from print books as they are converted to e-books. So instead of providing a dazzling future, the e-world is forcing nonfiction to become drier, blander and denser.
Still, this logjam between technological potential and copyright hell could turn into a great opportunity — if it leads to a new model for how permission costs are calculated in e-books and even in print.
For e-books, the new model would look something like this: Instead of paying permission fees upfront based on estimated print runs, book creators would pay based on a periodic accounting of downloads. Right now, fees are laid out on a set schedule whose minimum rates are often higher than a modest book can support. The costs may be fine for textbooks or advertisers, but they punish individual authors. Since publishers can’t afford to fully cover permissions fees for print books, and cannot yet predict what they will earn from e-books, the writer has to choose between taking a loss on permissions fees or short-changing readers on content.
But if rights holders were compensated for actual downloads, there would be a perfect fit. The better a book did, the more the original rights holder would be paid. The challenge of this model is accurate accounting — but in the age of iTunes micropayments surely someone can figure out a way.
Before we even get to downloads, though, we need to fix the problem for print books. As a starting point, authors and publishers — perhaps through a joint committee of the Authors Guild and the Association of American Publishers — should create a grid of standard rates for images and text extracts, keyed to print runs and prices.
Since authors and publishers have stakes on both sides of this issue, they ought to be able to come up with suggested fees that would allow creators to set reasonable budgets, and compel rights holders to conform to industry norms.
A good starting point might be a suggested scale based on the total number of images used in a book; an image that was one one-hundredth of a story would cost less than an image that was a tenth of it. Such a plan would encourage authors to use more art, which is precisely what we all want.
If rights remain as tightly controlled and as expensive as they are now, nonfiction will be the province of the entirely new or the overly familiar. Dazzling books with newly created art, text and multimedia will far outnumber works filled with historical materials. Only a few well-heeled companies will have the wherewithal to create gee-whiz multimedia book-like products that require permissions, and these projects will most likely focus on highly popular subjects. History’s outsiders and untold stories will be left behind.
We treat copyrights as individual possessions, jewels that exist entirely by themselves. I’m obviously sympathetic to that point of view. But source material also takes on another life when it’s repurposed. It becomes part of the flow, the narration, the interweaving of text and art in books and e-books. It’s essential that we take this into account as we re-imagine permissions in a digital age.
When we have a new model for permissions, we will have new media. Then all of us — authors, readers, new-media innovators, rights holders — will really see the stories that words and images can tell.
Marc Aronson is the author, most recently, of “If Stones Could Speak: Unlocking the Secrets of Stonehenge.”