TECHNOLOGY AND DEVELOPMENT

Current issues, news and ethics
kmaherali
Posts: 25105
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

http://www.nytimes.com/2007/10/02/scien ... nted=print
October 2, 2007
A Conversation With Hany Farid
Proving That Seeing Shouldn’t Always Be Believing
By CLAUDIA DREIFUS

HANOVER, N.H. — As Hany Farid sat in his office here at Dartmouth College on a recent morning, he fiddled with his laptop and cracked disconcerting little jokes.

“Don’t ever send me a photograph of yourself,” said Dr. Farid, head of the Image Science Laboratory at Dartmouth. “I’ll do the most terrible things to it.”

Dr. Farid, a 41-year-old engineer, is a founder of a subdiscipline within computer science: digital forensics. Most days, he spends his time transforming ordinary images into ones with drastic new meanings. Click, goes his mouse. Courtney Love has joined Grandpa at the family barbecue. Click. Click. Elvis Presley is on Dartmouth’s board of trustees.

The purpose of all this manipulation is to discover how computerized forgeries are made. Intelligence agencies, news organizations and scientific journals employ Dr. Farid’s consulting services when they need to authenticate images. Dr. Farid sells a software package, “Q,” to clients so they, too, can become digital detectives.

An edited version of two hours’ worth of conversation follows.

Q. Let’s start with some definitions. What exactly is digital forensics?

A. It’s a new field. It didn’t exist five years ago. We look at digital media — images, audio and video — and we try to ascertain whether or not they’ve been manipulated. We use mathematical and computational techniques to detect alterations in them.

In society today, we’re now seeing doctored images regularly. If tabloids can’t obtain a photo of Brad Pitt and Angelina Jolie walking together on a beach, they’ll make up a composite from two pictures. Star actually did that. And it’s happening in the courts, politics and scientific journals, too. As a result, we now live in an age when the once-held belief that photographs were the definitive record of events is gone.

Actually, photographic forgeries aren’t new. People have doctored images since the beginning of photography. But the techniques needed to do that during the Civil War, when Mathew Brady made composites, were extremely difficult and time consuming. In today’s world, anyone with a digital camera, a PC, Photoshop and an hour’s worth of time can make fairly compelling digital forgeries.

Q. Why do scientists need to know about this?

A. Because not long ago, researchers from South Korea had to retract papers published in Science because the photographs used to prove that human stem cells had been cloned were effectively Photoshop-cloned, and not laboratory-cloned. There have been other recent cases, too. And today, in science, more and more, photographs are the data. The Federal Office of Research Integrity has said that in 1990, less than 3 percent of allegations of fraud they investigated involved contested images. By 2001, that number was 26 percent. And last year, it was 44.1 percent.

Mike Rossner of The Journal of Cell Biology estimates that 20 percent of the manuscripts he accepts contain at least one figure that has to be remade because of inappropriate image manipulation. He means that the images are not accurate reflections of the original data. Rossner estimates that about 1 percent of the papers have some piece of image data that is downright fraudulent.

Q. Where does he get his figures from?

A. Mike has a full-time person who looks at every image supporting accepted manuscripts. Other biologists tell me anecdotally that many images in journals are regularly touched up to improve contrast or to remove little imperfections. The journals are, in essence, doing the same things fashion magazines do. Some of it is legitimate. In other cases, they are crossing the line.

Q. Are there policy changes that you think scientists should be considering?

A. I think it’s very hard to define inappropriate manipulation. Sometimes you can change 30 percent of the pixels in an image and it won’t fundamentally change anything. At other times, you can change 5 percent of the pixels and it radically changes meaning. I’m not a purist. I think there’s room for cropping, adjusting, contrast enhancement, but I want to know what was done. I think journal editors need to see the unadulterated, unretouched original images.

No. 2, the scientific community as a whole needs to come out with a well-thought-out policy on what is and isn’t acceptable when it comes to altering photographs. And this is something that must be refined, updated and changed as the technology changes. The journals are probably going to have to hire more staff. That will slow down the publication pipeline somewhat. But the cost of these scandals is too high. They undermine the public’s faith in science.

Q. You make software to detect forgeries. How do you design your programs?

A. I think like a forger. I spend a lot of time in Photoshop making digital forgeries to learn the tools and techniques a forger uses. We’ll make a composite photograph of two people and ask, “How do you manipulate this photograph to make it compelling?” By working backwards, we learn the forger’s techniques and how to detect them.

For instance, when looking at composites of two people, we’ve discovered that one of the hardest things for a forger to match is the lighting. So we’ve developed a way of measuring whether the lighting is consistent within various parts of the image. Lately, I’ve become obsessed with eyes. In a person’s eyes, one sees a slight reflection of the light in the room. So I’ve developed a technique that can take that little image of the reflection of light and tell us where the light was while you were being photographed. Does that match what we see in the image?

We also look at numbers. The pixels of a digital image are represented on a computer by numbers. Once you’ve altered an image, the numbers change. So we can analyze those pixel values for traces of manipulation.
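As a rough illustration of the pixel-level idea described above (a minimal sketch only; this is not Dr. Farid's software, and real forensic tools use far more robust statistics), one simple check flags regions that were cloned within an image: identical pixel blocks appearing at two different locations.

```python
# Toy clone-region detector: hash every fixed-size pixel block and
# report collisions. Copy-paste edits leave exact duplicate blocks
# behind; real tools must also tolerate recompression noise, which
# this exact-match sketch does not.

def find_cloned_blocks(image, block=2):
    """Return pairs of top-left (row, col) coords whose block x block
    pixel patches are identical. `image` is a 2-D list of ints."""
    h, w = len(image), len(image[0])
    seen = {}      # patch -> first location it was seen at
    clones = []
    for y in range(h - block + 1):
        for x in range(w - block + 1):
            patch = tuple(
                tuple(image[y + dy][x + dx] for dx in range(block))
                for dy in range(block)
            )
            if patch in seen:
                clones.append((seen[patch], (y, x)))
            else:
                seen[patch] = (y, x)
    return clones

if __name__ == "__main__":
    # A tiny "image" in which the 2x2 patch at (0, 0) was pasted at (2, 2).
    img = [
        [10, 20, 3, 4],
        [30, 40, 5, 6],
        [7, 8, 10, 20],
        [9, 1, 30, 40],
    ]
    print(find_cloned_blocks(img))  # [((0, 0), (2, 2))]
```

On real photographs the patches would be much larger and compared with tolerance for noise; exact matching, as here, only catches the crudest edits, but it shows why altered pixel values leave detectable traces.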

Q. You consult regularly in legal cases. How is your work used in the courts?

A. I’ve consulted for the F.B.I., which sometimes uses images in prosecutions. They make surveillance tapes. At a trial, the defense might argue that the F.B.I. doctored the images. So how do you prove they weren’t doctored? That’s my job.

I’ve also been an expert witness in several child pornography cases. The Supreme Court in 2002 ruled that computer-generated child porn is protected under the First Amendment. So now in these cases, defense lawyers will sometimes argue that the images aren’t real. So far, I have only testified on the side of the prosecution. But I’ve been approached by defendants several times and I’ve told them, “I’ll work on your case, but I’m going to testify to whatever I find.” And in every situation, the defense lawyers said, “No, thank you.” In my opinion, that’s because they knew the photographs were not computer generated.

Q. What’s been the most interesting use of your software?

A. I sold a copy of it to a Canadian company that runs a bounty fishing contest. People send in photographs of fish they’ve caught. My program can check if the fish in the picture has been enlarged. We can prove whether or not the fish was really “THIS big!”

Post by kmaherali »

http://www.nytimes.com/2007/10/09/scien ... ref=slogin

October 9, 2007
3 Win Nobel in Medicine for Gene Technology
By LAWRENCE K. ALTMAN

Two Americans and a Briton won the 2007 Nobel Prize in medicine yesterday for developing the immensely powerful “knockout” technology, which allows scientists to create animal models of human disease in mice.

The winners, who will share the $1.54 million prize, are Mario R. Capecchi, 70, of the University of Utah in Salt Lake City; Oliver Smithies, 82, of the University of North Carolina in Chapel Hill; and Sir Martin J. Evans, 66, of Cardiff University in Wales.

Other scientists are applying their technology, also known as gene targeting, in a variety of ways, from basic research to the development of new therapies, said the Nobel Committee from the Karolinska Institute in Stockholm that selected the winners.

The knockout technique provided researchers with a superb new tool for finding out what any given gene does. It allows them to genetically engineer a strain of mice with the gene missing, or knocked out, then watch to see what the mice can no longer do.

After the first decoding of the mouse and human genomes in 2001 yielded thousands of new genes of unknown function, knockout mice became a prime source of information for making sense of these novel genes.

Most human genes can also be studied in this way through their counterpart genes in the mouse. Mice have been likened to pocket-size humans, because they have the same organs and their genes are about 95 percent identical in sequence. Scientists have developed more than 500 mouse models of human ailments, including those affecting the heart and central nervous system, as well as diabetes, cancer and cystic fibrosis.
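The "about 95 percent identical in sequence" figure is simply a count of matching positions between aligned sequences. A toy calculation (the fragments below are invented and assumed pre-aligned; real comparisons must align the sequences first):

```python
# Percent identity: the fraction of aligned positions at which two
# DNA sequences share the same base, expressed as a percentage.

def percent_identity(seq_a, seq_b):
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be pre-aligned to equal length")
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

if __name__ == "__main__":
    # Hypothetical aligned fragments differing at one of 20 positions.
    human = "ATGGCCCTGTGGATGCGCCT"
    mouse = "ATGGCCCTGTGGATACGCCT"
    print(percent_identity(human, mouse))  # 19/20 positions match -> 95.0
```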

Scientists can now use the technology to create genetic mutations that can be activated at specific time points, or in specific cells or organs, both during development and in the adult animal, the Nobel citation said.

Gene-targeting technology can knock out single genes to study development of the embryo, aging and normal physiology. So far, more than 10,000 mouse genes, or about half of those in the mammalian genome, have been knocked out, the committee said.

Researchers can also make conditional knockouts, mice in which a gene of interest can be inactivated in a specific tissue or part of the brain, at any stage in life. Another important variation is to tag a normal gene with a so-called reporter gene that causes a visible color change in all cells where the normal gene is switched on.

Knockout mice are so important in medical research that thousands of strains are kept available in institutions like the Jackson Laboratory in Bar Harbor, Me.

“The technique is revolutionary and has completely changed the way we use the mouse to study the function of genes,” said Dr. Richard P. Woychik, the lab’s director. “When people come across a novel human gene, one of the first things they think about is knocking it out in a mouse.”

The three laureates, who are friends but work independently, also shared a Lasker Award in 2001. They began their work in the 1980s, and the first reports that the technology could generate gene-targeted mice were published in 1989. The reports involved a rare inherited human disease, the Lesch-Nyhan syndrome, in which lack of an enzyme causes fits of self-mutilation.

The prize was particularly rewarding for Dr. Capecchi, who said he lived as a street urchin in Italy during World War II and later had to prove his scientific peers wrong after they rejected his initial grant application to the National Institutes of Health in 1980, saying his project was not feasible.

Dr. Capecchi’s mother, the daughter of an American, had lived in a luxurious villa in Florence and had become a Bohemian poet, writing against Fascism and Nazism. She refused to marry his father, an Italian Air Force officer with whom she had had a love affair.

When young Mario was not yet 4, the Gestapo came to their home in Tyrol, in the Italian Alps, to take his mother to the Dachau concentration camp — an event he said he remembered vividly.

Because she knew her time of freedom was limited, she had sold all her possessions and given the proceeds to an Italian farming family, with whom Mario lived for about a year. When the money ran out, the family sent him on his way. He said he wandered south, moving from town to town as his cover was exposed. He wandered, usually alone, but sometimes in small gangs, begging and stealing, sleeping in the streets, occasionally in an orphanage.

At the war’s end, the malnourished boy was put in a hospital for a year. During that time his mother, who had survived Dachau, searched hospitals and orphanages for him. A week after she found him — on his birthday — they were on a boat to join her brother in the home of a Quaker family in Pennsylvania.

The family put Mario in the third grade, where as a means of communication his teachers told him to draw murals. As he did, he slowly learned English. Because of the street smarts he developed in Italy, he became a class leader and the boy who beat up the bullies.

He went on to study political science at Antioch College, alternating periods of work and studies. Then he went to the Massachusetts Institute of Technology and Harvard, where he worked in the laboratory of James Watson, the Nobel Prize-winning co-discoverer of the structure of DNA.

When he decided to leave the Harvard faculty in 1973 because members of the department did not get along, he said, and did not recruit sufficient younger scientists, Dr. Capecchi went to Utah. Colleagues told him, he said, that he was “nuts” to leave Harvard’s Ivy League splendor. But Dr. Capecchi said Dr. Watson told him he could do good science anywhere.

Dr. Capecchi said the main advantage was that he could work on long-term projects more easily in Utah than at Harvard, where there was a push to get results quickly.

Dr. Capecchi said that when he reapplied to the N.I.H. in 1984 for the grant it had rejected in 1980, he was told, “We are glad you didn’t follow our advice.”

After learning he had become a Nobel Prize winner, Dr. Smithies told Agence France-Presse that “it’s actually a rather peaceful feeling of culmination of a life of science.”

Dr. Smithies has credited his interest in science to his boyhood love for radios and telescopes, and for a comic-strip inventor whom he wanted to emulate. He earned a scholarship to Oxford, then dropped out of medical school to study chemistry before moving to the University of Wisconsin. Because of a visa problem, Dr. Smithies worked in Toronto for about seven years before returning to Wisconsin. He became a geneticist and moved to the University of North Carolina 19 years ago.

Dr. Smithies is a licensed airplane pilot and is fond of gliding.

Dr. Evans had planned to have an “ordinary day” off work cleaning his daughter’s home in Cambridge, England, where he was visiting when he learned he won the prize. It was “a boyhood dream come true,” Dr. Evans told Agence France-Presse.

Like Dr. Capecchi, Dr. Evans said his scientific career was an upward struggle. In an interview with the Lasker Foundation, Dr. Evans said recognition was important to him because he often was a lone scientist who cried out against the consensus. In applying for grants, he said he was told many of his ideas were premature and could not be done.

“Then five years later,” he said, “I find everyone is doing the same thing.”

Nicholas Wade contributed reporting.

Post by kmaherali »

Ancestry too complex for DNA test
Report says web tests can't link you to Genghis Khan

Margaret Munro
CanWest News Service

Friday, October 19, 2007

Ancestral DNA tests can often be misleading and can have a profound impact on people's lives, warn scientists concerned about the popular "recreational" gene tests.

The tests cannot, as one Canadian company claims, "discover" someone's relation to Mongolian emperor Genghis Khan, says Deborah Bolnick, a genetic anthropologist at the University of Texas and lead author of a report on the tests.

Nor can they determine with certainty the origin of someone's ancestors or their race, says the report to be published today in the journal Science.

The tests, which almost half a million consumers have purchased for between $100 and $900 per test, claim to be able to trace people's roots to everything from African tribes to European palaces to Aboriginal bands.

"Using revolutionary DNA analysis, find out how you are related to Marie Antoinette -- one of the most illustrious women in European history," offers Genetrack Biolabs of Vancouver, one of Canada's largest gene testing companies. Genetrack says it can also "discover" your relation to notorious outlaw Jesse James or Genghis Khan and "find out where your ancestors came from, their ethnic background, and how they have scattered throughout the world today."

U.S. companies go even further.

"One simple test can identify your family's country of origin," says African Ancestry. It will even print up a certificate of ancestry to hang on the wall. Another firm, DNAPrint Genomics, says it can determine if someone's genetic heritage is Native American, East Asian, sub-Saharan African or Indo-European.

Bolnick and her colleagues say the tests -- and company websites featuring tribes of people in elaborate face paint and traditional clothing -- are often misleading. And they reinforce the controversial idea that race is genetically determined.

"There is no clear-cut connection between an individual's DNA and his or her racial or ethnic affiliation," say the researchers.

The report goes on to say ancestry tests "cannot pinpoint the place of origin or social affiliation of even one ancestor with exact certainty."

Some of the tests may be a harmless way for some people to spend their money, Bolnick said in an interview from Quebec City, where she is attending a Genome Canada conference. But she and her colleagues say the tests can have a "profound" impact on some lives. "Many people are taking the tests to get answers about their identity and instead end up with an identity crisis," says Bolnick.

Better standards and guidelines are needed to address the growing use of the tests, which might change how people report their race or ethnicity on government forms, college or job applications, Bolnick says.

The tests, promoted with the help of stars like Oprah Winfrey who have offered up DNA samples, have proliferated in recent years. People typically send in cheek swabs for testing and get the result in the mail a few weeks later. Winfrey was linked to a tribe in Liberia.

Genetic ancestry testing usually compares someone's DNA to databases of samples from different geographic regions. Bolnick says these tests tend to ignore the way people have moved around over the eons and that many DNA sequences are found in different populations. This is why tests can incorrectly conclude someone has Native American roots when their ancestors actually lived in Asia, she says.
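The comparison described here can be sketched as a likelihood calculation: for each reference population, multiply together the frequencies of the customer's variants and pick the population with the highest product. Everything below (the marker names, the frequencies, the 0.01 floor) is invented for illustration; commercial tests use thousands of markers and more sophisticated models, and the shared-variant problem the researchers raise is exactly why such calls can be wrong.

```python
# Toy ancestry call: score each reference population by the product of
# the frequencies of the customer's variants in that population, then
# pick the highest-scoring one. Because many variants are shared across
# populations, a confident-looking "winner" can still be misleading.

def most_likely_population(genotype, freq_db):
    """genotype: {marker: variant};
    freq_db: {population: {marker: {variant: frequency}}}."""
    scores = {}
    for pop, marker_freqs in freq_db.items():
        likelihood = 1.0
        for marker, variant in genotype.items():
            # Unseen variants get a small floor so a single marker
            # cannot zero out an entire population.
            likelihood *= marker_freqs.get(marker, {}).get(variant, 0.01)
        scores[pop] = likelihood
    return max(scores, key=scores.get), scores

if __name__ == "__main__":
    freq_db = {
        "pop_A": {"m1": {"T": 0.9, "C": 0.1}, "m2": {"G": 0.8, "A": 0.2}},
        "pop_B": {"m1": {"T": 0.3, "C": 0.7}, "m2": {"G": 0.4, "A": 0.6}},
    }
    best, scores = most_likely_population({"m1": "T", "m2": "G"}, freq_db)
    print(best)  # pop_A wins (roughly 0.72 vs 0.12)
```

Note that pop_B still has a nonzero score: the same variants occur in both populations, just at different frequencies, which is the core of the report's objection to claims of "exact certainty."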

June Wong, an official with Genetrack Biolabs in Vancouver, agrees some ancestry tests are misleading and oversold. She is particularly troubled by the tests that claim to link individuals to specific tribes and regions.

Wong says customers need to be aware of the tests' limitations, which are often alluded to on company websites.

Post by kmaherali »

Canada joins effort to re-create Big Bang

Kent Spencer
CanWest News Service


Sunday, October 21, 2007


It seems fitting that Canada's contribution to an international experiment into the universe's origins would look like a giant hockey puck.

Simon Fraser University physics professor Michel Vetterli said Saturday the 50-country effort will try to determine what makes the universe tick.

"I've always been fascinated by taking radios apart," said Vetterli. "This is the ultimate -- taking the universe apart."

Canada's particle detector is made of copper, is five metres in diameter and two metres deep, and is shaped "like a big hockey puck with a hole in the middle," said Vetterli.

It is designed to catch particles smashed together in a $3.5-billion European particle accelerator, which will be switched on in 2008.

The conditions simulated will be close to those that existed during the formation of the universe, sometimes called the Big Bang.

It is hoped analysis will show what the particles are made of and, by extension, how the universe works.

"If you want to understand a car, you study its pieces. If you want to understand matter, you study its pieces," said Vetterli. "It's very exciting. We're expecting to learn an enormous amount."

Scientists don't know what discoveries they will make, but they are shooting high.

Vetterli isn't making predictions, but he said the particle accelerator is the way forward to the kinds of future technology we can only dream about now -- like the fictional matter-teleportation technology featured in the Star Trek TV series.

"We will never 'Beam me up, Scotty' if we don't do this," he said.

Once the puck is in Geneva, Switzerland, it will be set in place in the huge European accelerator. The accelerator shoots tiny invisible particles at close to the speed of light and smashes them together so scientists can study the collisions.

The particles are the same ones that pass unnoticed through a person's body every day as cosmic rays.

The accelerator tunnel, about the size of a subway tunnel, is being assembled in a giant circle stretching 27 kilometres underground.

An ultra-powerful IBM supercomputer at the TRIUMF nuclear research centre in Vancouver -- the second part of Canada's 14-year, $100-million contribution to the project -- will analyze the collision data.

The computer is housed in several closet-sized boxes that pack enormous computing power into a very small space.

After the particles are rammed together, the reactions can be studied with the help of detectors like the hockey puck to determine their makeup.

"We want to know why mass exists. At the moment, all we have are theories," said Vetterli.

Post by kmaherali »

http://www.nytimes.com/2007/10/26/opini ... ?th&emc=th

October 26, 2007
Op-Ed Columnist
The Outsourced Brain
By DAVID BROOKS

The gurus seek bliss amidst mountaintop solitude and serenity in the meditative trance, but I, grasshopper, have achieved the oneness with the universe that is known as pure externalization.

I have melded my mind with the heavens, communed with the universal consciousness, and experienced the inner calm that externalization brings, and it all started because I bought a car with a G.P.S.

Like many men, I quickly established a romantic attachment to my G.P.S. I found comfort in her tranquil and slightly Anglophilic voice. I felt warm and safe following her thin blue line. More than once I experienced her mercy, for each of my transgressions would be greeted by nothing worse than a gentle, “Make a U-turn if possible.”

After a few weeks, it occurred to me that I could no longer get anywhere without her. Any trip slightly out of the ordinary had me typing the address into her system and then blissfully following her satellite-fed commands. I found that I was quickly shedding all vestiges of geographic knowledge.

It was unnerving at first, but then a relief. Since the dawn of humanity, people have had to worry about how to get from here to there. Precious brainpower has been used storing directions, and memorizing turns. I myself have been trapped at dinner parties at which conversation was devoted exclusively to the topic of commuter routes.

My G.P.S. goddess liberated me from this drudgery. She enabled me to externalize geographic information from my own brain to a satellite brain, and you know how it felt? It felt like nirvana.

Through that experience I discovered the Sacred Order of the External Mind. I realized I could outsource those mental tasks I didn’t want to perform. Life is a math problem, and I had a calculator.

Until that moment, I had thought that the magic of the information age was that it allowed us to know more, but then I realized the magic of the information age is that it allows us to know less. It provides us with external cognitive servants — silicon memory systems, collaborative online filters, consumer preference algorithms and networked knowledge. We can burden these servants and liberate ourselves.

Musical taste? I have externalized it. Now I just log on to iTunes and it tells me what I like.

I click on its recommendations, sample 30 seconds of each song, and download the ones that appeal. I look on my iPod playlist and realize I’ve never heard of most of the artists I listen to. I was once one of those people with developed opinions about the Ramones, but now I’ve shed all that knowledge and blindly submit to a mishmash of anonymous groups like the Reindeer Section — a disturbing number of which seem to have had their music featured on the soundtrack of “The O.C.”

Memory? I’ve externalized it. I am one of those baby boomers who are making this the “It’s on the Tip of My Tongue Decade.” But now I no longer need to have a memory, for I have Google, Yahoo and Wikipedia. Now if I need to know some fact about the world, I tap a few keys and reap the blessings of the external mind.

Personal information? I’ve externalized it. I’m no longer clear on where I end and my BlackBerry begins. When I want to look up my passwords or contact my friends I just hit a name on my directory. I read in a piece by Clive Thompson in Wired that a third of the people under 30 can’t remember their own phone number. Their smartphones are smart, so they don’t need to be. Today’s young people are forgoing memory before they even have a chance to lose it.

Now, you may wonder if in the process of outsourcing my thinking I am losing my individuality. Not so. My preferences are more narrow and individualistic than ever. It’s merely my autonomy that I’m losing.

I have relinquished control over my decisions to the universal mind. I have fused with the knowledge of the cybersphere, and entered the bliss of a higher metaphysic. As John Steinbeck nearly wrote, a fella ain’t got a mind of his own, just a little piece of the big mind — one mind that belongs to everybody. Then it don’t matter, Ma. I’ll be everywhere, around in the dark. Wherever there is a network, I’ll be there. Wherever there’s a TiVo machine making a sitcom recommendation based on past preferences, I’ll be there. Wherever there’s a Times reader selecting articles based on the most e-mailed list, I’ll be there. I’ll be in the way Amazon links purchasing Dostoyevsky to purchasing garden furniture. And when memes are spreading, and humiliation videos are shared on Facebook — I’ll be there, too.

I am one with the external mind. Om.

Post by kmaherali »

October 31, 2007
Op-Ed Columnist
If I.T. Merged With E.T.
By THOMAS L. FRIEDMAN
Ethakota, India

Well, here’s something you don’t see every day. I was visiting an Indian village 350 miles east of Hyderabad and got to watch a very elderly Indian man undergo an EKG in a remote clinic, while a heart specialist, hundreds of miles away in Bangalore, watched via satellite TV and dispensed a diagnosis. This kind of telemedicine is the I.T. revolution at its best. But what struck me most was that just underneath the TV screen, powering the whole endeavor, were 16 car batteries — the E.T., energy technology, revolution, at its worst.

Some 250 million Indians today have cellphones. Many of them are people who make just $2 or $3 a day. More and more are getting access to computers and the Internet, even in villages. But only 85 percent of Indian villages are electrified — and that is being generous, since many still don’t have reliable 24/7 quality power.

If only ... If only we could make a breakthrough in clean, distributed power — an E.T. revolution — it could drive the I.T. revolution into every forgotten corner of the world to create jobs, light up schools and tap the innovative prowess of rural populations, like India’s 700 million villagers. There is a green Edison growing up out here — if only we can give them the light to learn.

To appreciate that potential, look at how much is being done with just car batteries, backup diesel generators and India’s creaky rural electricity grid. I traveled to a cluster of villages with a team from the Byrraju Foundation — a truly impressive nonprofit set up by B. Ramalinga Raju and his family. Raju and his brother Rama are co-founders of one of India’s leading outsourcing companies, Satyam Computer Services. The Hyderabad-based brothers wanted to give back to their country, but they wanted it to be a hand up, not a hand out.

So besides funding health clinics and computer-filled primary schools in villages in their home state of Andhra Pradesh, they tried something new: outsourcing their outsourcing to villages.

Here in Ethakota, amid the banana and palm groves, 120 college-educated villagers, trained in computers and English by Satyam and connected to the world by wireless networks, are processing data for a British publisher and selling services for an Indian phone company. They run two eight-hour shifts, but could run three — if only the electricity didn’t go off for six hours a day!

Talking to the workers at the Ethakota data center — one of three Byrraju has set up — you can see what a merger of I.T. and E.T. could do: enable so many more Indians to live local and act global.

Suresh Varma, 30, one of the data managers, was working for a U.S. oil company in Hyderabad and actually decided to move back to the village where his parents came from. “I have a much higher quality of life here than in an urban area anywhere in India,” he said. “The city is concrete. You spend most of your time in traffic, just getting from one place to another. Here you walk to work. Here I am in touch with what is happening in the cities, but at the same time I don’t miss out on my professional aspirations. ... It is like moving from a Silicon Valley to a real valley.”

Unlike in the city, where outsourcing workers come and go, “in the village, nobody gives up these jobs,” said Verghese Jacob, who heads the Byrraju Foundation, which plans to gradually hand over ownership of the data centers to the villagers. “They are very innovative and positive, and because some of them had never worked on a computer before, their respect for the opportunity is so much more than for a city child who takes it for granted.”

When the world starts getting wired and electrified, you never know who you’ll bump into. In the village of Podagatlapalli, I met Sha Yu, a 22-year-old Chinese graduate of Beijing’s Renmin University and a Byrraju volunteer, teaching rural Indian high school students how to produce their own newspaper on a computer.

“I felt in China people don’t know so much about India, so I thought I want to come and see what is happening here,” she explained. “In rural India, communication is not that developed, so I started a newspaper for the high school. If I can learn something from here, and bring it back, I can give some ideas to the Chinese government. If this rural area can be empowered, it would be an amazing thing for the world.”

Amazing indeed. India’s strained megacities, like Mumbai and Calcutta, can’t keep growing. Mr. Jacob estimates that just one of his rural outsourcing centers creates the equivalent employment and salaries of 400 acres of farm land.

India, in other words, could actually mint more land in the countryside, but it can’t do it off car batteries. It will take a real energy revolution. If only ...

Post by kmaherali »

Hologram technology takes to runway, leaving models behind

Reuters


Friday, November 02, 2007


NEW YORK - Finally, coming to New York, a fashion show devoid of skinny models and serious faces -- in fact, the models don't even exist.

U.S. discount retailer Target Corp., known for its innovative marketing, is staging a "model-less" fashion show in Manhattan next week that will feature holograms strutting down a runway in its merchandise instead of size-zero models.

The images, which will appear to be three-dimensional, will show clothes by designers like Isaac Mizrahi and Liz Lange sashaying across a virtual runway.

"This is the first time a fashion show will be completely produced with hologram technology, without models, without a runway and easily accessible to all fashion fans," Target senior vice-president Trish Adams said in a statement.

The show will take place Tuesday and Wednesday in Vanderbilt Hall at Grand Central Terminal.

It will show clothes and accessories from Target's men's, women's, bridal and maternity collections.

© The Calgary Herald 2007

Post by kmaherali »

November 11, 2007
The DNA Age
In DNA Era, New Worries About Prejudice
By AMY HARMON

When scientists first decoded the human genome in 2000, they were quick to portray it as proof of humankind’s remarkable similarity. The DNA of any two people, they emphasized, is at least 99 percent identical.

But new research is exploring the remaining fraction to explain differences between people of different continental origins.

Scientists, for instance, have recently identified small changes in DNA that account for the pale skin of Europeans, the tendency of Asians to sweat less and West Africans’ resistance to certain diseases.

At the same time, genetic information is slipping out of the laboratory and into everyday life, carrying with it the inescapable message that people of different races have different DNA. Ancestry tests tell customers what percentage of their genes are from Asia, Europe, Africa and the Americas. The heart-disease drug BiDil is marketed exclusively to African-Americans, who seem genetically predisposed to respond to it. Jews are offered prenatal tests for genetic disorders rarely found in other ethnic groups.

Such developments are providing some of the first tangible benefits of the genetic revolution. Yet some social critics fear they may also be giving long-discredited racial prejudices a new potency. The notion that race is more than skin deep, they fear, could undermine principles of equal treatment and opportunity that have relied on the presumption that we are all fundamentally equal.

“We are living through an era of the ascendance of biology, and we have to be very careful,” said Henry Louis Gates Jr., director of the W. E. B. Du Bois Institute for African and African American Research at Harvard University. “We will all be walking a fine line between using biology and allowing it to be abused.”

Certain superficial traits like skin pigmentation have long been presumed to be genetic. But the ability to pinpoint their DNA source makes the link between genes and race more palpable. And on mainstream blogs, in college classrooms and among the growing community of ancestry test-takers, it is prompting the question of whether more profound differences may also be attributed to DNA.

Nonscientists are already beginning to stitch together highly speculative conclusions about the historically charged subject of race and intelligence from the new biological data. Last month, a blogger in Manhattan described a recently published study that linked several snippets of DNA to high I.Q. An online genetic database used by medical researchers, he told readers, showed that two of the snippets were found more often in Europeans and Asians than in Africans.

No matter that the link between I.Q. and those particular bits of DNA was unconfirmed, or that other high I.Q. snippets are more common in Africans, or that hundreds or thousands of others may also affect intelligence, or that their combined influence might be dwarfed by environmental factors. Just the existence of such genetic differences between races, proclaimed the author of the Half Sigma blog, a 40-year-old software developer, means “the egalitarian theory,” that all races are equal, “is proven false.”

Though few of the bits of human genetic code that vary between individuals have yet been tied to physical or behavioral traits, scientists have found that roughly 10 percent of them are more common in certain continental groups and can be used to distinguish people of different races. They say that studying the differences, which arose during the tens of thousands of years that human populations evolved on separate continents after their ancestors dispersed from humanity’s birthplace in East Africa, is crucial to mapping the genetic basis for disease.

But many geneticists, wary of fueling discrimination and worried that speaking openly about race could endanger support for their research, are loath to discuss the social implications of their findings. Still, some acknowledge that as their data and methods are extended to nonmedical traits, the field is at what one leading researcher recently called “a very delicate time, and a dangerous time.”

“There are clear differences between people of different continental ancestries,” said Marcus W. Feldman, a professor of biological sciences at Stanford University. “It’s not there yet for things like I.Q., but I can see it coming. And it has the potential to spark a new era of racism if we do not start explaining it better.”

Dr. Feldman said any finding on intelligence was likely to be exceedingly hard to pin down. But given that some may emerge, he said he wanted to create “ready response teams” of geneticists to put such socially fraught discoveries in perspective.

The authority that DNA has earned through its use in freeing falsely convicted inmates, preventing disease and reconstructing family ties leads people to wrongly elevate genetics over other explanations for differences between groups.

“I’ve spent the last 10 years of my life researching how much genetic variability there is between populations,” said Dr. David Altshuler, director of the Program in Medical and Population Genetics at the Broad Institute in Cambridge, Mass. “But living in America, it is so clear that the economic and social and educational differences have so much more influence than genes. People just somehow fixate on genetics, even if the influence is very small.”

But on the Half Sigma blog and elsewhere, the conversation is already flashing forward to what might happen if genetically encoded racial differences in socially desirable — or undesirable — traits are identified.

“If I were to believe the ‘facts’ in this post, what should I do?” one reader responded on Half Sigma. “Should I advocate discrimination against blacks because they are less smart? Should I not hire them to my company because odds are I could find a smarter white person? Stop trying to prove that one group of people are genetically inferior to your group. Just stop.”

Renata McGriff, 52, a health care consultant who had been encouraging black clients to volunteer genetic information to scientists, said she and other African-Americans have lately been discussing “opting out of genetic research until it’s clear we’re not going to use science to validate prejudices.”

“I don’t want the children in my family to be born thinking they are less than someone else based on their DNA,” added Ms. McGriff, of Manhattan.

Such discussions are among thousands that followed the geneticist James D. Watson’s assertion last month that Africans are innately less intelligent than other races. Dr. Watson, a Nobel Prize winner, subsequently apologized and quit his post at the Cold Spring Harbor Laboratory on Long Island.

But the incident has added to uneasiness about whether society is prepared to handle the consequences of science that may eventually reveal appreciable differences between races in the genes that influence socially important traits.

New genetic information, some liberal critics say, could become the latest rallying point for a conservative political camp that objects to social policies like affirmative action, as happened with “The Bell Curve,” the controversial 1994 book that examined the relationship between race and I.Q.

Yet even some self-described liberals argue that accepting that there may be genetic differences between races is important in preparing to address them politically.

“Let’s say the genetic data says we’ll have to spend two times as much for every black child to close the achievement gap,” said Jason Malloy, 28, an artist in Madison, Wis., who wrote a defense of Dr. Watson for the widely read science blog Gene Expression. Society, he said, would need to consider how individuals “can be given educational and occupational opportunities that work best for their unique talents and limitations.”

Others hope that the genetic data may overturn preconceived notions of racial superiority by, for example, showing that Africans are innately more intelligent than other groups. But either way, the increased outpouring of conversation on the normally taboo subject of race and genetics has prompted some to suggest that innate differences should be accepted but, at some level, ignored.

“Regardless of any such genetic variation, it is our moral duty to treat all as equal before God and before the law,” Perry Clark, 44, wrote on a New York Times blog. It is not necessary, argued Dr. Clark, a retired neonatologist in Leawood, Kan., who is white, to maintain the pretense that inborn racial differences do not exist.

“When was the last time a nonblack sprinter won the Olympic 100 meters?” he asked.

“To say that such differences aren’t real,” Dr. Clark later said in an interview, “is to stick your head in the sand and go blah blah blah blah blah until the band marches by.”

Race, many sociologists and anthropologists have argued for decades, is a social invention historically used to justify prejudice and persecution. But when Samuel M. Richards gave his students at Pennsylvania State University genetic ancestry tests to establish the imprecision of socially constructed racial categories, he found the exercise reinforced them instead.

One white-skinned student, told she was 9 percent West African, went to a Kwanzaa celebration, for instance, but would not dream of going to an Asian cultural event because her DNA did not match, Dr. Richards said. Preconceived notions of race seemed all the more authentic when quantified by DNA.

“Before, it was, ‘I’m white because I have white skin and grew up in white culture,’ ” Dr. Richards said. “Now it’s, ‘I really know I’m white, so white is this big neon sign hanging over my head.’ It’s like, oh, no, come on. That wasn’t the point.”

Other related articles on DNA can be linked at:
http://www.nytimes.com/2007/11/11/us/11 ... ?th&emc=th

Post by kmaherali »

102 years on, Einstein's 'time dilation' verified

Tom Spears
CanWest News Service


Monday, November 12, 2007


Chalk up another for Albert Einstein. A Canadian-led physics experiment supports his theory that time slows for objects that travel very fast. And it only took the physics world 102 years to show it.

Einstein was just a youngster when his special theory of relativity predicted something he couldn't actually test: the idea called "time dilation," which says time moves more slowly as you approach the speed of light.

That was in 1905. (His more famous theory, general relativity, came a decade later). Science fiction has been using this weird look at time ever since.

In the science journal Nature Physics, Gerald Gwinner of the University of Manitoba physics department shows how he tested the idea in a lab in Germany, using atoms that speed along at more than 10,000 kilometres per second. That was enough to make the atoms experience time more slowly.

The international team of physicists used a particle accelerator to make the most accurate measurements so far of time dilation, pinpointing precisely how much time slows.

Researchers verified Einstein's theory as far back as 1938, to within a margin of error of one per cent, but Gwinner's group took it to a new level.

"We have an accuracy now of 10 to the minus seventh, or 10,000 times that."
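
The size of the effect at that speed can be checked with special relativity's time-dilation (Lorentz) factor. A minimal sketch, using only the 10,000 km/s speed quoted in the article:

```python
import math

C = 299_792_458.0  # speed of light in metres per second

def lorentz_gamma(v: float) -> float:
    """Time-dilation factor: a clock moving at speed v ticks 1/gamma as fast."""
    beta = v / C
    return 1.0 / math.sqrt(1.0 - beta * beta)

# The article's atoms moved at more than 10,000 km/s:
gamma = lorentz_gamma(1.0e7)
print(f"gamma = {gamma:.9f}")                    # a hair above 1
print(f"fractional slowdown = {gamma - 1:.2e}")  # roughly 5.6e-4
```

At that speed a clock runs slow by only about 0.06 per cent, which is why verifying the prediction to the precision the experiment achieved is notable.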

© The Calgary Herald 2007

Post by kmaherali »

November 16, 2007
Editorial
A Stem Cell Achievement

Scientists in Oregon have successfully cloned monkey embryos from the skin cells of an adult monkey and then extracted stem cells from them — the first time this feat has been achieved in any animal other than mice. The same method might also work in humans, bringing researchers closer to the goal of creating embryonic stem cells to study diseases and ultimately treat them with new drugs or stem cell therapies.

Scientists had previously cloned embryos of other species, but they had never been able to clone a primate — let alone extract primate stem cells. Now researchers at the Oregon Health and Science University have succeeded. They created embryos that were genetically identical to an adult monkey, extracted stem cells from them and grew the stem cells into heart cells and nerve cells that could, theoretically, be used to replace damaged cells.

The most immediate application will be to study diseases in monkeys that closely mimic human diseases, analyze how they develop in a laboratory dish and find ways to treat them. This research is likely to be welcomed by both sides of the battles over stem cells.

Using this technique to make human stem cells will still face daunting technical hurdles. The researchers used more than 300 monkey eggs to produce just one normal stem cell line. The success rate is bound to improve, but it may be difficult to find enough human egg donors to allow the research to move forward rapidly.

The Oregon achievement ought to galvanize Congress to expand the array of embryonic stem cell research that can receive federal financing. This research should not be hobbled by the opposition of President Bush and his backers among religious conservatives, who have severely limited the availability of federal funds. The potential benefits for human health are far too important.

Post by kmaherali »

November 17, 2007
The DNA Age
My Genome, Myself: Seeking Clues in DNA
By AMY HARMON

The exploration of the human genome has long been relegated to elite scientists in research laboratories. But that is about to change. An infant industry is capitalizing on the plunging cost of genetic testing technology to offer any individual unprecedented — and unmediated — entree to their own DNA.

For as little as $1,000 and a saliva sample, customers will be able to learn what is known so far about how the billions of bits in their biological code shape who they are. Three companies have already announced plans to market such services, one yesterday.

Offered the chance to be among the early testers, I agreed, but not without reservations. What if I learned I was likely to die young? Or that I might have passed on a rogue gene to my daughter? And more pragmatically, what if an insurance company or an employer used such information against me in the future?

But three weeks later, I was already somewhat addicted to the daily communion with my genes. (Recurring note to self: was this addiction genetic?)

For example, my hands hurt the other day. So naturally, I checked my DNA.

Was this the first sign that I had inherited the arthritis that gnarled my paternal grandmother’s hard-working fingers? Logging onto my account at 23andMe, the start-up company that is now my genetic custodian, I typed my search into the “Genome Explorer” and hit return. I was, in essence, Googling my own DNA.

I had spent hours every day doing just that as new studies linking bits of DNA to diseases and aspects of appearance, temperament and behavior came out on an almost daily basis. At times, surfing my genome induced the same shock of recognition that comes when accidentally catching a glimpse of oneself in the mirror.

I had refused to drink milk growing up. Now, it turns out my DNA is devoid of the mutation that eases the digestion of milk after infancy, which became common in Europeans after the domestication of cows.

But it could also make me question my presumptions about myself. Apparently I lack the predisposition for good verbal memory, although I had always prided myself on my ability to recall quotations. Should I be recording more of my interviews? No, I decided; I remember what people say. DNA is not definitive.

I don’t like brussels sprouts. Who knew it was genetic? But I have the snippet of DNA that gives me the ability to taste a compound that makes many vegetables taste bitter. I differ from people who are blind to bitter taste — who actually like brussels sprouts — by a single spelling change in our four-letter genetic alphabet: somewhere on human chromosome 7, I have a G where they have a C.

It is one of roughly 10 million tiny differences, known as single nucleotide polymorphisms, or SNPs (pronounced “snips”) scattered across the 23 pairs of human chromosomes from which 23andMe takes its name. The company generated a list of my “genotypes” — AC’s, CC’s, CT’s and so forth, based on which versions of every SNP I have on my collection of chromosome pairs.

For instance, I tragically lack the predisposition to eat fatty foods and not gain weight. But people who, like me, are GG at the SNP known to geneticists as rs3751812 are 6.3 pounds lighter, on average, than the AA’s. Thanks, rs3751812!
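
Reports like these are, at bottom, lookups from a genotype to a published effect size. A toy sketch of that idea (the rs IDs and pound figures are the ones quoted above; the table and code are illustrative only, not 23andMe's actual system):

```python
# Toy genotype-to-effect lookup. The rs IDs and pound figures come
# from the article; this data structure is an illustration, not
# 23andMe's actual pipeline.
WEIGHT_EFFECTS = {
    ("rs3751812", "GG"): -6.3,   # pounds relative to AA carriers
    ("rs6602024", "GG"): -10.0,  # from an early, unconfirmed finding
}

def reported_weight_offset(genotypes: dict) -> float:
    """Sum the reported effects for every (SNP, genotype) pair present."""
    return sum(
        WEIGHT_EFFECTS.get((snp, gt), 0.0) for snp, gt in genotypes.items()
    )

me = {"rs3751812": "GG", "rs6602024": "GG"}
print(reported_weight_offset(me))  # about -16.3 pounds
```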

And if an early finding is to be believed, my GG at rs6602024 means that I am an additional 10 pounds lighter than those whose genetic Boggle served up a different spelling. Good news, except that now I have only my slothful ways to blame for my inability to fit into my old jeans.

And although there is great controversy about the role that genes play in shaping intelligence, it was hard to resist looking up the SNPs that have been linked — however tenuously — to I.Q. Three went in my favor, three against. But I found hope in a study that appeared last week describing a SNP strongly linked with an increase in the I.Q. of breast-fed babies.

Babies with the CC or CG form of the SNP apparently benefit from a fatty acid found only in breast milk, while those with the GG form do not. My CC genotype meant that I had been eligible for the 6-point I.Q. boost when my mother breast-fed me. And because, by the laws of genetics, my daughter had to have inherited one of my C’s, she, too, would see the benefit of my having nursed her. Now where did I put those preschool applications?

I was not always so comfortable in my own genome. Before I spit into the vial, I called several major insurance companies to see if I was hurting my chances of getting coverage. They said no, but that is now, when almost no one has such information about their genetic make-up. In five years, if companies like 23andMe are at all successful, many more people presumably would. And isn’t an individual’s relative risk of disease precisely what insurance companies want to know?

Last month, alone in a room at 23andMe’s headquarters in Mountain View, Calif., with my password for the first time, I wavered (genetic?) and walked down the hall to get lunch.

Once I looked at my results, I could never turn back. I had prepared for the worst of what I could learn this day. But what if something even worse came along tomorrow?

Some health care providers argue that the public is unprepared for such information and that it is irresponsible to provide it without an expert to help put it in context. And at times, as I worked up the courage to check on my risks of breast cancer and Alzheimer’s, I could see their point.

One of the companies that plans to market personal DNA information, Navigenics, intends to provide a phone consultation with a genetic counselor along with the results. Its service would cost $2,500 and would initially provide data on 20 health conditions.

DeCODE Genetics and 23andMe will offer referrals. Although what they can tell you is limited right now, all three companies are hoping that people will be drawn by the prospect of instant updates on what is expected to be a flood of new findings.

I knew I would never be able to pass up the chance to fill in more pieces of my genetic puzzle.

But I had decided not to submit my daughter’s DNA for testing — at least not yet — because I didn’t want to regard anything about her as predestined. If she wants to play the piano, who cares if she lacks perfect pitch? If she wants to run the 100-meter dash, who cares if she lacks the sprinting gene? And did I really want to know — did she really want to know someday — what genes she got from which parent and which grandparent?

I, however, am not age 3. Whatever was lurking in my genes had been there my entire life. Not looking would be like rejecting some fundamental part of myself.

Compelled to know (genetic?), I breezed through the warning screens on the site. There would be no definitive information, I read, and new discoveries might reverse whatever I was told. Even if I learned that my risk for developing a disease was high, there might well be nothing to do about it, and, besides, I should not regard this as a medical diagnosis. “If, after considering these points, you still wish to view your results,” the screen read, “click here.”

I clicked.

Like other testers of 23andMe’s service, my first impulse was to look up the bits of genetic code associated with the diseases that scare me the most.

But in the bar charts that showed good genes in green and bad ones in red, I found a perverse sense of accomplishment. My risk of breast cancer was no higher than average, as was my chance of developing Alzheimer’s. I was 23 percent less likely to get Type 2 diabetes than most people. And my chance of being paralyzed by multiple sclerosis, almost nil. I was three times more likely than the average person to get Crohn’s disease, but my odds were still less than one in a hundred.
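
The Crohn's figure shows why relative and absolute risk must be kept apart: tripling a small baseline still yields a small number. A quick illustration, with the baseline rate assumed purely for the arithmetic (the article does not give it):

```python
# Relative risk only matters against the baseline rate. The baseline
# below is an assumed round number for illustration; the article gives
# only the "three times" multiplier and the "less than one in a
# hundred" result.
baseline_risk = 0.003  # assumed ~0.3% average risk (not from the article)
relative_risk = 3.0    # "three times more likely than the average person"

absolute_risk = baseline_risk * relative_risk
print(f"absolute risk = {absolute_risk:.1%}")  # still under 1 in 100
```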

I was in remarkably good genetic health, and I hadn’t even been to the gym in months!

Still, just studying my DNA had made me more acutely aware of the basic health risks we all face. I renounced my midafternoon M&M’s.

And then I opened my “Gene Journal” for heart disease to find that I was 23 percent more likely than average to have a heart attack. “Healthy lifestyle choices play a major role in preventing the blockages that lead to heart attacks,” it informed me.

Thanks, Gene Journal. Yet somehow even this banal advice resonated when the warning came from my own DNA.

Back in New York, I headed to the gym despite a looming story deadline and my daughter’s still-unfinished preschool applications. At least I had more time. I had discovered a SNP that likely increased my life span.

But in what I have come to accept as the genomic law of averages, I soon found that I might well be sight impaired during those extra years. According to the five SNPs for macular degeneration I fed into the “Genome Explorer,” I was nearly 100 times more likely to develop the disease than someone with the most favorable A-C-G-T combination.
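
A combined figure like "nearly 100 times" typically comes from several SNPs together, under the common simplifying assumption that independent relative risks multiply. A sketch with hypothetical per-SNP values chosen only to show the arithmetic (the article reports no per-SNP numbers):

```python
from math import prod

# Hypothetical per-SNP relative risks, invented purely to illustrate
# the arithmetic; the article gives only the combined figure.
snp_relative_risks = [2.5, 3.0, 2.0, 2.2, 3.0]

combined = prod(snp_relative_risks)
print(f"combined relative risk: about {combined:.0f}x")  # about 99x
```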

And unlike the standard eat-right-and-exercise advice for heart health, there was not much I could do about it. Still, I found the knowledge of my potential future strangely comforting, even when it was not one I would wish for. At least my prospects for nimble fingers in old age were looking brighter. I didn’t have the bad form of that arthritis SNP.

Maybe I was just typing too much.

Post by kmaherali »

http://www.nytimes.com/2007/11/20/scien ... ?th&emc=th

November 20, 2007
Through Genetics, Tapping a Tree’s Potential as a Source of Energy
By ANDREW POLLACK

It might be true that “only God can make a tree,” as the poet Joyce Kilmer wrote. But genetic engineers can fundamentally redesign them.

Aiming to turn trees into new energy sources, scientists are using a controversial genetic engineering process to change the composition of the wood. A major goal is to reduce the amount of lignin, a chemical compound that interferes with efforts to turn the tree’s cellulose into biofuels like ethanol.

Vincent L. Chiang, co-director of the forest biotechnology group at North Carolina State University, has developed transgenic trees with as little as half the lignin of their natural counterparts. “I think the transgenic tree with low lignin will contribute significantly to energy needs,” he said.

Environmentalists say such work can be risky, because lignin provides trees with structural stiffness and resistance to pests. Even some scientists working on altering wood composition acknowledge that reducing lignin too much could lead to wobbly, vulnerable trees.

“Nature would have selected for lower-lignin trees if they could survive,” said Shawn Mansfield, associate professor of wood science at the University of British Columbia.

People working in the field also acknowledge that they will face resistance from others who see trees as majestic symbols of pristine nature that should not be genetically altered like corn and soybeans.

“The general public is not going to look at trees at this point as a row crop,” said Susan McCord, executive director of the Institute of Forest Biotechnology in Raleigh, N.C. “The same is true of foresters. The people who go into that work, they love trees. They view them very differently than a row of corn.”

Ethanol is mainly made from the starch in corn kernels. To increase the supply to make a dent in the nation’s energy picture, scientists are looking at using cellulose, a component of the cell wall in plants.

Proponents of using trees for this say they are good sources of cellulose and are also good at absorbing carbon dioxide, helping to fight global warming. Also, trees can be cut as needed rather than having to be harvested at a given time each year like a crop.

But the cellulose is covered by lignin, another component of the cell wall, making it difficult for enzymes to reach the cellulose and break it down into simple sugars that can be converted to ethanol. Pulp and paper companies break down lignin using acids and steam. Ethanol producers would have to do the same.

Trees that have less lignin might reduce or eliminate these steps. That could save at least 10 cents a gallon in ethanol costs, said Michael Ladisch, director of the Laboratory of Renewable Resources Engineering at Purdue.

Scientists understand the steps in creating lignin and can make lower-lignin trees by blocking one of them. One way is to put in a reverse copy of a gene that codes for an enzyme in lignin formation. The reverse copy silences that gene and reduces production of that enzyme.
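
The "reverse copy" used for silencing is the reverse complement of the gene's sequence, so its transcript can base-pair with the normal messenger RNA and block it. A minimal sketch of the sequence operation, using a made-up DNA fragment:

```python
# Computing a reverse complement, the sequence inserted in antisense
# gene silencing. Uppercase A/C/G/T input assumed.
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq: str) -> str:
    """Reverse-complement a DNA string: complement each base, then reverse."""
    return seq.translate(COMPLEMENT)[::-1]

# Made-up fragment standing in for a lignin-pathway enzyme gene:
print(reverse_complement("ATGGCGTTAC"))  # GTAACGCCAT
```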

Dr. Chiang said a 50 percent reduction in lignin appeared to be the maximum achievable, adding, “The tree doesn’t allow you to go further.”

The new focus on biofuels has brought a renewed interest in tree biotechnology, and new money for it, from the Energy Department. The field has been languishing because of technical challenges, costs, environmental concerns and financial problems in the forest products industry.

The revival has dismayed critics like Anne Petermann, a leader of the Stop Genetically Engineered Trees Campaign. She said energy concerns were being used “as a really great opportunity to sell this controversial technology to the public.”

Just one company in the United States is known to be pursuing genetic engineering of forest trees vigorously. The company, ArborGen, is small but has some big backers, being jointly owned by three forest products companies: International Paper, MeadWestvaco and Rubicon, which is based in New Zealand.

ArborGen, based in Summerville, S.C., is developing a low-lignin eucalyptus that it hopes to sell in South America, where the fast-growing trees are already used for pulp and paper. For the United States, the company is developing a eucalyptus genetically engineered to survive cold snaps, allowing the trees to be grown more widely.

“In the next 5 to 10 years, you’ll be seeing transgenic trees on the market,” said Maud Hinchee, the chief technology officer at ArborGen.

Two genetically engineered trees are approved by the Agriculture Department, both for crops: papaya trees resistant to the ringspot virus, and plum trees resistant to plum pox virus.

The only known approval of a genetically engineered forest tree has come in China, where insect-resistant poplars have been widely planted.

Genetically modifying forest trees raises questions beyond those of crops. Trees can establish themselves in the wild, while corn would have trouble surviving without a farmer’s tender care.

A biologist, Claire Williams, said the wind could carry pollen from some trees like pines hundreds of miles, making it difficult to prevent a trait like reduced lignin from spreading to wild trees.

Dr. Williams, who works for the State Department but was interviewed while she was working at Duke, said the long life spans of trees made it “almost impossible to evaluate the long-term consequences of transgenic trees.”

Loblolly pine, the main tree the forest industry grows in the Southeast, takes 25 years to go from seed to harvest.

Critics also say transgenic trees would usually be grown on plantations, which, they say, lack the beauty and wildlife of natural forests.

Supporters of transgenic tree research say that because of the long time it takes to grow trees, conventional breeding is difficult.

“The only way to domesticate trees is through genetic engineering,” said Richard Meilan, associate professor of molecular tree physiology at Purdue. He said plantations of fast-growing trees for energy production would reduce the need to cut trees in natural forests. “Let’s domesticate those trees and grow them as commodities and not sacrifice our wild forests,” Dr. Meilan said.

The low-lignin trees, some experts say, have not been tested enough under real field conditions. “To mess with physiology like this, you really need to get out of the laboratory,” said Steven H. Strauss, a professor of forest science at Oregon State University who has conducted field tests of transgenic trees.

The one big field trial of low-lignin trees, conducted over four years in Britain and France, found that they appeared to grow normally and were not more vulnerable to insects, according to a paper published by the investigators in Nature Biotechnology in 2002.

And Jeffrey F. Pedersen, a research geneticist for the Agriculture Department in Lincoln, Neb., found that sorghum with reduced lignin was actually more resistant to a particular fungus than similar varieties with normal levels. He said arresting lignin production could lead to a buildup in the plant of chemical lignin precursors that also have pathogen-fighting properties.

Dr. Chiang of North Carolina State said his trees appeared normal, at least in the greenhouse. He has found that trees that produce less lignin might produce more cellulose, making them even more useful in producing ethanol, pulp or paper without reducing tree strength.

Some field tests are under way outside the United States, Dr. Chiang said, by corporate sponsors of his research who do not want to be identified.

Dr. Hinchee said ArborGen was aiming to reduce lignin 10 percent to 20 percent, to be on the safe side. “It’s not to our advantage to have a tree that’s weak in some other way,” she said.

Rather than reduce lignin, Purdue researchers, working under a $1.4 million three-year grant from the Energy Department, are trying to alter it.

Lignin can be made of two types of alcohols, said Clint Chapple, a biochemist who is working on the project with Professors Meilan and Ladisch. Pulp and paper companies know that one type is easier to remove. By boosting or inactivating various genes, the scientists plan to create trees with different mixes of the two alcohols and test how easy it is to make ethanol.

Dr. Meilan said that after determining an optimal composition, the team hoped to find such trees in the wild that could be reproduced, eliminating the need for genetic engineering.

But it is not certain that can be done. “I believe in the end,” he said, “we will have to rely on genetically engineered trees for our energy plantations.”
Post by kmaherali »

There is a related pictorial representation linked at:
http://www.nytimes.com/2007/11/21/scien ... ref=slogin

November 21, 2007
Scientists Bypass Need for Embryo to Get Stem Cells
By GINA KOLATA
Two teams of scientists reported yesterday that they had turned human skin cells into what appear to be embryonic stem cells without having to make or destroy an embryo — a feat that could quell the ethical debate troubling the field.

All they had to do, the scientists said, was add four genes. The genes reprogrammed the chromosomes of the skin cells, making the cells into blank slates that should be able to turn into any of the 220 cell types of the human body, be it heart, brain, blood or bone. Until now, the only way to get such human universal cells was to pluck them from a human embryo several days after fertilization, destroying the embryo in the process.

The need to destroy embryos has made stem cell research one of the most divisive issues in American politics, pitting President Bush against prominent Republicans like Nancy Reagan, and patient advocates who hoped that stem cells could cure diseases like Alzheimer’s. The new studies could defuse the issue as a presidential election nears.

The reprogrammed skin cells may yet prove to have subtle differences from embryonic stem cells that come directly from human embryos, and the new method includes potentially risky steps, like introducing a cancer gene. But stem cell researchers say they are confident that it will not take long to perfect the method and that today’s drawbacks will prove to be temporary.

Researchers and ethicists not involved in the findings say the work, conducted by independent teams from Japan and Wisconsin, should reshape the stem cell field. At some time in the near future, they said, today’s debate over whether it is morally acceptable to create and destroy human embryos to obtain stem cells should be moot.

“Everyone was waiting for this day to come,” said the Rev. Tadeusz Pacholczyk, director of education at the National Catholic Bioethics Center. “You should have a solution here that will address the moral objections that have been percolating for years,” he added.

The White House said that Mr. Bush was “very pleased” about the new findings, adding that “By avoiding techniques that destroy life, while vigorously supporting alternative approaches, President Bush is encouraging scientific advancement within ethical boundaries.”

The new method sidesteps other ethical quandaries, creating stem cells that genetically match the donor without having to resort to cloning or the requisite donation of women’s eggs. Genetically matched cells would not be rejected by the immune system if used as replacement tissues for patients. Even more important, scientists say, is that genetically matched cells from patients would enable them to study complex diseases, like Alzheimer’s, in the laboratory.

Until now, the only way most scientists thought such patient-specific stem cells could be made would be to create embryos that were clones of that person and extract their stem cells. Just last week, scientists in Oregon reported that they did this with monkeys, but the prospect of doing such experiments in humans has been ethically fraught.

But with the new method, human cloning for stem cell research, like the creation of human embryos to extract stem cells, may be unnecessary. The new cells in theory might be turned into an embryo, but not by simply implanting them in a womb.

“It really is amazing,” said Dr. Leonard Zon, director of the stem cell program at Children’s Hospital Boston at Harvard Medical School.

And, said Dr. Douglas A. Melton, co-director of the Stem Cell Institute at Harvard University, it is “ethically uncomplicated.”

For all the hopes invested in it over the last decade, embryonic stem cell research has moved slowly, with no cures or major therapeutic discoveries in sight.

The new work could allow the field to vault significant problems, including the shortage of human embryonic stem cells and restrictions on federal financing for such research. Even when scientists have other sources of financing, they report that it is expensive and difficult to find women who will provide eggs for such research.

The new discovery is being published online today in Cell, in a paper by Shinya Yamanaka of Kyoto University and the Gladstone Institute of Cardiovascular Disease in San Francisco, and in Science, in a paper by James A. Thomson and his colleagues at the University of Wisconsin. Dr. Thomson’s work received some federal money.

While both groups used just four genes to reprogram human skin cells, two of the genes used differed from group to group. All the genes in question, though, act in a similar way — they are master regulator genes whose role is to turn other genes on or off.

The reprogrammed cells, the scientists report, appear to behave very much like human embryonic stem cells; they have been named “induced pluripotent stem cells,” meaning cells that can change into many different types.

“By any means we test them they are the same as embryonic stem cells,” Dr. Thomson said.

He and Dr. Yamanaka caution, though, that they still must confirm that the reprogrammed human skin cells really are the same as stem cells they get from embryos. And while those studies are under way, Dr. Thomson and others say, it would be premature to abandon research with stem cells taken from human embryos.

Another caveat is that, so far, scientists use a type of virus, a retrovirus, to insert the genes into the cells’ chromosomes. Retroviruses slip genes into chromosomes at random, sometimes causing mutations that can make normal cells turn into cancers.

One gene used by the Japanese scientists actually is a cancer gene.

The cancer risk means that the resulting stem cells would not be suitable for replacement cells or tissues for patients with diseases, like diabetes, in which their own cells die. But they would be ideal for the sort of studies that many researchers say are the real promise of this endeavor — studying the causes and treatments of complex diseases.

For example, researchers could make stem cells from a person with a disease like Alzheimer’s and turn the stem cells into nerve cells in a petri dish. Then they might learn what goes awry in the brain and how to prevent or treat the disease.

But even the retrovirus drawback may be temporary, scientists say. Dr. Yamanaka and several other researchers are trying to get the same effect by adding chemicals or using more benign viruses to get the genes into cells. They say they are starting to see success.

“Anyone who is going to suggest that this is just a sideshow and that it won’t work is wrong,” Dr. Melton predicted.

The new discovery was preceded by work in mice. Last year, Dr. Yamanaka published a paper showing that he could add four genes to mouse cells and turn them into mouse embryonic stem cells.

He even completed the ultimate test to show that the resulting stem cells could become any type of mouse cell. He used them to create new mice. Twenty percent of those mice, though, developed cancer, illustrating the risk of using retroviruses and a cancer gene to make cells for replacement parts.

Scientists were electrified by the reprogramming discovery, Dr. Melton said. “Once it worked, I hit my forehead and said, ‘It’s so obvious,’” he said. “But it’s not obvious until it’s done.”

The work set off an international race to repeat the work with human cells.

“Dozens, if not hundreds of labs, have been attempting to do this,” said Dr. George Daley, associate director of the stem cell program at Children’s Hospital.

Ever since the birth of Dolly the sheep in 1996, scientists knew that adult cells could, in theory, turn into embryonic stem cells. But they had no idea how to do it without cloning, the way Dolly was created.

With cloning, researchers put an adult cell’s chromosomes into an unfertilized egg whose genetic material was removed. The egg, by some mysterious process, then does all the work. It reprograms the adult cell’s chromosomes, bringing them back to the state they were in just after the egg was fertilized. A few days later, a ball of stem cells emerges in the embryo, and every cell of the embryo, including its stem cells, is an exact genetic match of the adult.

The abiding questions, though, were: How did the egg reprogram the adult cell’s chromosomes? Would it be possible to reprogram an adult cell without using an egg?

About four years ago, Dr. Yamanaka and Dr. Thomson independently hit upon the same idea. They would search for genes that are being used in an embryonic stem cell that are not being used in an adult cell. Then they would see if those genes would reprogram an adult cell.

Dr. Yamanaka worked with mouse cells, and Dr. Thomson worked with human cells from foreskins.

The researchers found more than 1,000 candidate genes. So both groups took educated guesses, trying to whittle down the genes to the few dozen they thought might be the crucial ones and then asking whether any combinations of those genes could turn a skin cell into a stem cell.

“The number of factors could have been 1 or 10 or 100 or more,” Dr. Yamanaka said in a telephone interview from his laboratory in Japan.

If many genes had been required, the experiments would have failed, Dr. Thomson said, because it would have been impossible to test all the gene combinations.
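The combinatorial arithmetic behind that point can be sketched in a few lines of Python. The pool sizes below are illustrative assumptions, not figures from this article: published accounts of Yamanaka's mouse screen describe a short-list of roughly two dozen candidates, set against the ~1,000-gene starting pool mentioned above.

```python
from math import comb

def n_subsets(pool: int, k: int) -> int:
    """Number of distinct k-gene combinations drawable from a candidate pool."""
    return comb(pool, k)

# A short-list of ~24 candidates, tested 4 at a time, is laborious but feasible:
print(n_subsets(24, 4))              # 10626 possible assays

# Requiring, say, 10 genes from the full ~1,000 candidates would have made
# exhaustive testing hopeless:
print(f"{n_subsets(1000, 10):.2e}")  # on the order of 10**23 combinations
```

The asymmetry is the whole story: shrinking the pool by a factor of 40 and the required set to a handful turns an astronomically large search into a brute-force experiment a single lab can run.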

As soon as Dr. Yamanaka saw that the mouse experiments succeeded, he began trying the same brute force method in human skin cells that he had ordered from a commercial laboratory. Some were face cells from a 36-year-old white woman and others were connective tissue cells from joints of a 69-year-old white man.

Dr. Yamanaka said he thought it would take a few years to find the right genes and the right conditions to make the human experiments work. Feeling the hot breath of competitors on his neck, he was in his laboratory every day for 12 to 14 hours a day, he said.

A few months later, he succeeded.

“We did work very hard,” Dr. Yamanaka said. “But we were very surprised.”
Post by kmaherali »

Afghans bitten hard by cellphone bug

Kelly Cryderman
From CanWest News Service


Sunday, November 25, 2007


In a country where roads are often impassable, travel is fraught with danger and recent history recalls many Afghans taking the long road to Pakistan just to make a call, the mobile phone is king.

Afghanistan's cellphone networks may be new and terribly unreliable, but they're spreading like wildfire across the country, aiding everyone from women entrepreneurs, to criminal gangs operating in the desert, to regular Afghans who previously couldn't call their relatives.

"It is due to this public call office that I am supporting my family," said Kandahar City resident Qudratullah, 24, who operates a tiny kiosk called a PCO where the many Afghans who can't afford cellphones can pay to make calls.

"I want to be a teacher or a businessman," said Qudratullah, who is able to pay for classes that would put him in Grade 10 in Canada and who, like many Afghans, has only one name.

Across the courtyard from Qudratullah's wooden shack is foodstuff shopkeeper Mohammed Anwer Zarif, who said just a few years ago he had to travel to Kabul, Herat or Pakistan to place his product orders. He said there was no other reliable way to communicate.

Now he can just call his suppliers when he needs a new shipment. "Then quickly they send the stuff," Zarif said.

The telecommunications industry was close to non-existent before the Taliban were overthrown in 2001. But there's room for tremendous growth now: Few land lines exist in Afghanistan and just four million of its 32 million inhabitants are mobile subscribers. "It's right at the heart of our investment promotion," said Omar Zakhilwal, president and CEO of the Afghanistan Investment Support Agency, which licenses and promotes businesses across the country.

There are many challenges, including that many Afghans live in areas without regular electricity to charge their phones. Fuel for the generators that run the cellphone towers is expensive, as is hiring security.

Still, a company called Etisalat became the fourth service provider in August and there's room for even more competition, Zakhilwal said.

The cellphone industry is growing, he said, because it realizes a quick profit relative to other businesses. Security, moreover, isn't as big a concern as in other sectors. Criminal elements or the Taliban, who regularly battle Canadian Forces in Kandahar province, know they need the cellphone towers that are springing up. "The insurgents in the south, in any part of the country, absolutely rely on the services of the mobile phone," Zakhilwal said. "It's a benefit for everyone, friends or foes."

Large parts of the country, where the central government has little or no control, are still racked with periodic violence. Many foreign companies are leery about doing business in Afghanistan.

But not so much for the mobile industry, Zakhilwal said. Five years ago, there was no investment in the sector, but next year it's poised to hit $1 billion US, he said.

It appears cellphones are also helping women who, six years after the fall of the Taliban, are still seen in public far less often than men.

kcryderman@theherald.canwest.com

***
The incredible human nose

Reese Halter
Calgary Herald


Sunday, November 25, 2007



CREDIT: Herald Archive, Reuters
Smell was central to human evolution over the last seven million years, protecting us from rotten foods and poisons.

The human nose is miraculous. It is a complex organ of smell. In fact, one per cent of human genes are devoted to olfaction; smell was central to our evolution over the past seven million years.

The main role of smell is to protect humans from decaying foods and poisons. Foods that are indigestible tend to smell woody or musky and are made up of large molecules. Edible foods, on the other hand, have low molecular weights, which can be processed by our digestive enzymes.

The primary role of scent is not about sex.

Interestingly, the exact way the nose works -- that is, the way we smell -- is not fully understood. Scientists know there are receptor neurons continually shooting axons, or neural connecting wires, up to the olfactory bulb -- the brain's control centre for smell.

There are at least 1,000 different receptors in the human nose that are able to distinguish between some 10,000 different molecules.

There are about 112 known chemical elements. The human nose can detect most scent molecules because they are made up of carbon, hydrogen, oxygen, nitrogen and sulfur.

Humans sneeze in order to expel viruses and other particles trying to get inside us.

When we breathe through our nose, we tend to breathe through only one side of it for a while, then for a while through the other side. The erectile tissue lining our nasal cavity works only one passage at a time. This is called the nasal cycle.

Smell information from the right side of the nasal passage is sent to the left side of the brain and vice versa. Incidentally, when you breathe from the left nasal passage your verbal skills increase significantly.

Women have a better sense of smell than men and it lasts longer. As humans age we experience a loss of smell. Epileptics show considerable smell loss. And loss of smell is one of the first symptoms of the onset of Parkinson's disease.

The loss of the sense of smell is known as anosmia.

An atom is made up of a cloud of electrons frenetically orbiting a hard core of protons and neutrons. All molecules pulse with vibrations, which arise from the electron bonds holding them together. In essence, molecules act like musical instruments: each emits its own unique set of vibration notes.

Until about 10 years ago, scientists believed that the human nose was able to discern the scent of a molecule based solely from its shape.

In the late 1990s, a biophysicist named Dr. Luca Turin put forward a theory on the way the human nose functions -- it caused quite a buzz in the science community.

Turin used his obsession with perfume to redefine how the human nose functions. He devised a scale of molecular vibration ranging from 0 to 4,000 -- the number of times an electron bounces back and forth as it bonds one atom to another. Within just a few vibrations, a molecule can smell of shoe leather, rose tea or shrimp shells.

Turin proposed that as the molecules enter the nose, electrons are met by microscopic electrical protein receptors, embedded in nasal flesh, and that the receptor holds the molecule and reads its vibration, not its shape.

Years of experimentation, tens of thousands of smells later and about a decade since the theory was first presented, it's finally starting to be accepted.

The scents of Tide laundry detergent, Clorox bleach, Calvin Klein, Chanel, L'Oreal and Miyake all come from six huge corporations: International Flavors and Fragrances, Givaudan Roure, Quest International, Firmenich, Takasago, and Haarmann & Reimer. It is a $20-billion-a-year industry.

Armies of organic chemists create between 500 and 2,000 different new molecules at each company per year. About 20 interesting and strong molecules per company are used each year for a host of new products.

To me there is nothing as remarkable as natural scents of wild forests. My favourite pungent scent comes from the peppermint leaves of colossal montane Eucalyptus delegatensis or woollybutts of the Australian Victorian Alps.

Reese Halter is a conservation biologist. He can be reached through www.DrReese.com

© The Calgary Herald 2007
Post by kmaherali »

November 27, 2007
After Stem-Cell Breakthrough, the Work Begins
By ANDREW POLLACK

If stem cell researchers were oil prospectors, it could be said that they struck a gusher last week. But to realize those potentially boundless riches, they must now figure out how to build refineries, pipelines and gas stations.

Biologists were electrified on Tuesday, when scientists in Japan and Wisconsin reported that they could turn human skin cells into cells that behave like embryonic stem cells, able to grow indefinitely and to potentially turn into any type of tissue in the body.

The discovery, if it holds up, would decisively solve the raw material problem. It should provide an unlimited supply of stem cells without the ethically controversial embryo destruction and the restrictions on federal financing that have impeded work on human embryonic cells.

But scientists still face the challenge of taking that abundant raw material and turning it into useful medical treatments, like replacement tissue for damaged hearts and brains. And that challenge will be roughly as daunting for the new cells as it has been for the embryonic stem cells.

“Even though we have this nice new source of cells, it doesn’t solve all the downstream problems of getting them into the body in useful form,” said James A. Thomson of the University of Wisconsin, who led one of the teams that developed the stem cell substitutes. Dr. Thomson was also the first to isolate human embryonic stem cells, about a decade ago.

Still, the new discovery should accelerate progress — if only because with the ethical issues seemingly out of the way, more scientists and money will be drawn to the field.

There are two ways that stem cells can lead to treatments for diseases. Making replacement tissues for ailing organs is the direct way. But many scientists say the biggest impact of the new cells will be on the indirect way: using the cells to learn about diseases and then applying that knowledge to develop conventional drugs.

Using the new technique, scientists could take a skin cell from a person with a certain disease and generate stem cells. Those cells could then be turned into other cells, allowing the scientists to look at neurons from a person with Alzheimer’s disease, say, or heart cells from a person with heart failure. And a pharmaceutical company might get an early read on a new Alzheimer’s drug by trying it out on the newly created neurons.

“You cannot really go to a patient and say, ‘I want to study your brain,’” said Dr. Lorenz Studer, who works on neural stem cells at the Memorial Sloan-Kettering Cancer Center. “For the first time it gets us access to these cells.”

Some scientists have been trying to make disease-specific embryonic cells by creating a cloned embryo of a person with the disease. But that effort requires women to undergo sometimes risky treatments to donate their eggs.

Some diseased cells, like those from a tumor biopsy, are already available for study, but those are from a person already sick. The new approach would allow scientists to watch the disease as it developed and potentially design drugs not just to treat it but to prevent it.

“This is a whole new way of thinking about how we might investigate human disease,” said Kenneth S. Zaret, program leader for cell and developmental biology at the Fox Chase Cancer Center in Philadelphia.

Just this month, Israeli scientists reported in the journal Cell Stem Cell that they had created stem cell lines from embryos donated by families with a history of fragile X syndrome, a disease that leads to mental retardation and is caused by the silencing of a particular gene. Studying the stem cells, they got a better understanding of when and how this silencing occurred.

Still, it is not yet clear how useful this new approach will be. Will a neuron from an Alzheimer’s patient have to sit in a petri dish for 70 years before it becomes diseased? Or, as is the case with some diseases, will the neurons have to interact with other types of cells?

Moreover, scientists already have many tools to figure out causes of disease — imaging systems that can peer into cells, knockout mice, genome studies. But it is not always easy to translate knowledge about a disease into a treatment. And even if it were, it still takes years of testing in animals and people before a drug can reach the market.

The gene responsible for Huntington’s disease was discovered in 1993, but there is still no cure. And the decoding of the human genome, contrary to some early expectations, has not led to a burst of new drugs, at least not yet.

When it comes to the direct approach, creating replacement cells and tissues for transplants, there are many challenges for both cells. Scientists do not envision transplanting embryonic stem cells themselves, either the real ones or the new close imitations, because they could turn into tumors inside the body.

So the idea is to differentiate the stem cells into specific types of cells. Scientists have made progress in creating some cell types, like the dopamine-producing neurons that might treat Parkinson’s disease. Other cell types are proving more difficult, like insulin-producing islet cells to treat diabetes.

The transplanted cells must be very pure, because any remnants of the original stem cells might turn into tumors, said Dr. Steven A. Goldman, a neurologist at the University of Rochester. He and colleagues implanted dopamine-producing neurons derived from human embryonic stem cells into mice with Parkinson’s disease. While their symptoms improved, they all got brain tumors.

Another challenge is to get the cells to hook up correctly with what is already in the body. Scientists at the Karolinska Institute in Sweden injected neural stem cells into rats with spinal cord injuries. The rats’ motor ability improved, but the implant prompted nerve growth in a way that made even a slight touch painful.

Despite the challenges, two biotechnology companies hope next year to begin the first clinical trials of therapies derived from human embryonic stem cells. Geron plans to test a type of neural cell as a treatment for spinal cord injuries, and Advanced Cell Technology wants to plant retinal epithelium cells into the eye to treat retina diseases.

The new cells have a big strike against them. They were made by inserting four genes into skin cells, causing the cells to revert to a blank slate.

But the viruses used to carry the genes into the cells incorporate themselves into the cells’ DNA at random places, potentially causing mutations and cancers. And one of the genes used by the Japanese team is known to cause cancer.

The Food and Drug Administration “would never allow us to use those virally modified cells in patients,” said Dr. Robert Lanza, the chief scientific officer of Advanced Cell Technology.

Scientists are exploring ways to reprogram the skin cells without those viruses. But any genetically engineered cell is likely to face scrutiny from the drug agency.

On the other hand, the new cells have one advantage over the embryonic cells. Stem cells can be derived from a patient’s own skin cells, so tissue made from those stem cells would not be rejected by the immune system. Trying to do that with embryonic stem cells would require cloning.

Another possible advantage could be fewer intellectual property restrictions. Some scientists working with embryonic stem cells say their work has been encumbered by the requirement to get a license from the patent holder, the University of Wisconsin.

Wisconsin is applying for patent protection on the new technique but does not intend to require academic scientists to get a license.

“They can do it in their own lab,” said Carl E. Gulbrandsen, the managing director of the Wisconsin Alumni Research Foundation, the university’s patenting arm. “They don’t have to tell me about it, and I don’t really have to know.”

Despite all the remaining challenges, scientists say there is no denying the magnitude of the advance made last week. “It’s exciting, it’s seminal, it’s major — quite frankly I think it’s potentially Nobel-level,” said Dr. Kenneth R. Chien, director of the cardiovascular disease program at the Harvard Stem Cell Institute. “But there’s still a lot more work to do.”
Post by kmaherali »

December 4, 2007
Findings
In the Future, Smart People Will Let Cars Take Control

By JOHN TIERNEY
As the baby boomers cruise into their golden years, I have good news for them — and for everyone else in danger of being run over by these aging drivers. The boomers will not be driving like Mr. Magoo. An electronic chauffeur will conduct them on expressways, drop them at the mall entrance and then go park their cars.

If you doubt this prediction, I don’t blame you. The self-driving car ranks right up there with the personal hovercraft as the futurist vision that never comes true. In 1969, Disney unveiled Herbie the Love Bug; in 1940, Popular Mechanics promised a car that would chauffeur you across America in a single day to visit Aunt Lillian.

At the 1939 World’s Fair, the crowds at the General Motors Futurama exhibit saw traffic speeding 100 miles per hour thanks to electronic help. “Safe distance between cars is maintained by automatic radio control,” a voice explained as visitors looked down on the vast diorama of the World of Tomorrow, complete with hangars for dirigibles and landing decks for autogyros.

“Does it seem strange? Unbelievable?” the announcer intoned. “Remember, this is the world of 1960!”

O.K., so they were a little off on the date. But today, finally, those electronically spaced cars are on the highway. You can buy cars with “adaptive cruise-control” that automatically slow down if the radar or laser detects you tailgating. Your car can warn you when you stray across lane markings, and these kinds of sensors are already being used experimentally in cars that drive themselves.

These smart cars still have their bugs, but engineers have made amazing progress over the past several years. In 2004, when the Defense Advanced Research Projects Agency held its first Grand Challenge for driverless cars, none made it more than seven miles. At Darpa’s next Grand Challenge, in 2005, five cars made it 132 miles to the finish. And then, last month, six cars completed a 60-mile course that was the grandest challenge yet, because they had to deal with traffic along the way.

These empty cars drove themselves around an Air Force base in Southern California, finding parking spots, obeying stop signs, idling in traffic, yielding to other cars at intersections and merging into traffic at 30 m.p.h. There was one accident and a few near misses, but the cars’ engineers are so buoyed by the results that they’re hoping the next competition will be a high-speed race on a Grand Prix course.

“Within five years, it’s totally feasible to build an autonomous car that will work reliably in several limited domains,” says Sebastian Thrun, a computer scientist at Stanford and head of its racing team, which won the 2005 Darpa competition and finished second in last month’s. In five years he expects a car that could take over simple chores like breezing along an expressway, inching along in stop-and-go traffic, or parking in the lot at a mall or airport after dropping off the driver. In 20 years, Dr. Thrun figures half of new cars sold will offer drivers the option of turning over these chores to a computer, but he acknowledges that’s just an educated guess. While he doesn’t doubt cars will be able to drive themselves, he’s not sure how many humans will let them.

Some people won’t ever want to yield control; others will worry that the first smart cars will be like the early versions of Windows. There will be many, many car-computer jokes involving the word “crash.”

But cars, unlike humans, will keep getting smarter. They will learn from their mistakes. They will not get distracted by cellphone calls. They will not drive drunk. Smart cars will never be infallible, but they don’t have to be. They just have to be better than the drivers who now cause more than 90 percent of traffic accidents and kill a million of their fellow humans per year. Smart cars might make their debut in special lanes where each car has to enter through a checkpoint (at highway speeds) to make sure its systems are working properly. Drivers would be enticed with another promise of smart cars: no traffic jams.

When a freeway filled with human drivers is operating at full capacity, Dr. Thrun notes, the cars actually occupy less than 10 percent of the road’s surface area. The rest is empty space between cars. Smart cars could be grouped more closely together, doubling or tripling the road’s capacity, as engineers have demonstrated by running a platoon of driverless Buicks, spaced just 15 feet apart, at 65 m.p.h. down Interstate 15 near San Diego.
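Dr. Thrun's capacity point is easy to sanity-check with a back-of-envelope calculation. The sketch below assumes a 15-foot car length and a 2-second human following distance; neither figure comes from the article.

```python
FT_PER_MILE = 5280

def lane_capacity(speed_mph: float, gap_ft: float, car_len_ft: float = 15.0) -> float:
    """Vehicles per lane per hour at a given speed and bumper-to-bumper gap."""
    feet_per_hour = speed_mph * FT_PER_MILE
    feet_per_vehicle = gap_ft + car_len_ft   # road consumed by each car plus its gap
    return feet_per_hour / feet_per_vehicle

# Platooned smart cars: 15-foot gaps at 65 m.p.h., as in the I-15 demonstration.
smart = lane_capacity(65, gap_ft=15)

# Human drivers: roughly a 2-second following distance at 65 m.p.h.
two_second_gap = 65 * FT_PER_MILE / 3600 * 2   # about 191 feet
human = lane_capacity(65, gap_ft=two_second_gap)

print(round(smart))          # 11440 vehicles per lane-hour
print(round(human))          # 1669
print(round(smart / human))  # roughly a 7x gain under these assumptions
```

The idealized ratio comes out higher than the article's "doubling or tripling," which is what you would expect: real traffic would mix platooned and unplatooned vehicles and leave larger buffers between platoons.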

When that experiment was done, in 1997, it seemed an impractical idea because the cars were guided by magnets embedded in the road, and it was hard to imagine building such “smart roads” across the country. But since then, cars have gotten so much smarter that they can navigate old-fashioned dumb roads.

In the near future, guided not just by G.P.S. satellites but by high-precision internal maps and inertial sensors, they’ll know their position so precisely that they won’t even need lane markings for guidance. They’ll communicate with other smart cars on the road, enabling a swarm of closely spaced cars to move in unison (and react more quickly to problems than human drivers could). A road system filled with these cars wouldn’t even need traffic lights — the cars could just talk among themselves.

If, as Moore’s Law holds, computing power keeps doubling every couple of years, human drivers will soon be outclassed by computers, just as chess players were. The only question will be how long it takes humans to adapt to these new chauffeurs. Some experts think smart cars won’t become common before 2050, but I’d bet on it happening sooner.
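As a rough illustration of what that doubling assumption implies -- the 2-year doubling period and the 2007-to-2030 span are taken from the column's own framing, not a forecast:

```python
def growth_factor(years: float, doubling_years: float = 2.0) -> float:
    """Compound growth under a fixed doubling time, Moore's-law style."""
    return 2 ** (years / doubling_years)

# From the column's 2007 vantage point to its 2030 milestone: 23 years,
# or 11.5 doublings of computing power.
print(round(growth_factor(23)))   # 2896, i.e. a ~2,900-fold increase
```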

And even if humans stubbornly cling to the steering wheel, they could still end up sharing the road with smart cars. By around 2030, according to some believers in Moore’s Law, there will be computers more powerful than the human brain, leading to the emergence of superintelligent “post-humans.” If these beings do appear, I have no doubt how they’ll get around. They’d never be stupid enough to get in a car driven by a member of Mr. Magoo’s species.
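The doubling arithmetic behind projections like these is simple to check. A minimal sketch, taking the article's 2007 vantage point and a two-year doubling period as the assumed inputs:

```python
# Moore's-Law growth factor: performance doubles once every `period` years.
def growth_factor(start_year: int, end_year: int, period: float = 2.0) -> float:
    """Return the cumulative doubling factor between two years."""
    return 2 ** ((end_year - start_year) / period)

# From the article's 2007 vantage point to the 2030 "post-human" horizon:
factor = growth_factor(2007, 2030)
print(f"Projected growth by 2030: ~{factor:,.0f}x")   # 2^11.5 ≈ 2,896x
```

Eleven and a half doublings yield a factor of nearly 3,000, which is the kind of compounding that makes believers in Moore's Law comfortable with such forecasts; whether raw computing power translates into driving skill is, of course, the open question.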

Interesting photos and links at:
http://www.nytimes.com/2007/12/04/scien ... ?th&emc=th

*****

December 4, 2007
One Last Ride to the Hubble
By DENNIS OVERBYE
GREENBELT, Md. — It’s the last roundup for the People’s Telescope.

Next August, after 20 years of hype, disappointment, blunders, triumphs and peerless glittering vistas of space and time, and four years after NASA decided to leave the Hubble Space Telescope to die in orbit, setting off public and Congressional outrage, a group of astronauts will ride to the telescope aboard the space shuttle Atlantis with wrenches in hand.

That, at least, is the plan.

“It’s been a roller coaster ride from hell,” Preston Burch, the space telescope’s project manager, said in his office here at the Goddard Space Flight Center of the controversy and uncertainty.

In a nearby building, the Hubble’s astronaut knights — dressed as if for surgery, in white gowns, hoods and masks — swarmed through a giant clean room to kick the tires, so to speak, of new instruments destined for the Hubble and to try out techniques and tools under the watchful eye of the Goddard engineers. They practiced sliding a new Wide Field Camera 3, suspended in air like a magician’s grand piano, in and out of its slot on a replica of the telescope that is mechanically and electrically exact down to the tape around the doors. “We have to train their minds and bodies,” said Michael Weiss, the deputy project manager of Hubble, adding that when the astronauts see the real telescope in orbit, “they say they’ve seen it before.”

Spacewalking astronauts have refurbished the Hubble four times in the last two decades, but the trip planned for August, almost everybody agrees, really will be the last service call. The shuttles are scheduled to stop flying in 2010, and without periodic maintenance, the telescope’s gyroscopes and batteries are expected to die within about five years.

Astronauts, engineers and scientists here say they are resolved to pull off the most spectacular rejuvenation of the telescope yet, one, they say, that will leave it operating at the apex of its abilities well into the next decade so that it can go out in a blaze of glory.

“It will be a brand new telescope, practically,” said Matt Mountain, director of the Space Telescope Science Institute on the Johns Hopkins campus in Baltimore. He added, “We want to return crackerjack science we can be proud of.”

The last visit, Dr. Mountain explained, is unique. “You don’t have to do routine maintenance,” he said. “It’s like a car you’re only going to keep another 20,000 miles. You don’t buy new tires.”

Engineers and project managers are busy mapping out five days of spacewalks.

If all goes well — never a given 350 miles above Earth — the astronauts will install a new camera and spectrograph and change out all the gyroscopes that keep it properly pointed and the batteries that keep it running. They are also planning to repair a broken spectrograph and the Hubble’s workhorse, the Advanced Camera for Surveys, which had a severe short-circuit last winter and was pronounced at the time probably beyond repair.

Dramatic turnabouts have characterized the history of the Hubble telescope, which was hailed before its launching in April 1990 as the greatest advance in astronomy since Galileo first turned a telescope on the heavens.

In space, the Hubble would be able to discern details blurred by the turbulent murky atmosphere. But its 94-inch diameter mirror turned out to have been polished to the wrong shape, leaving it with what astronomers call a spherical aberration. The Hubble became branded as a “technoturkey.”

In 1993, astronauts fitted the telescope with corrective lenses (at the cost of removing one of its five main instruments, a photometer), and the cosmos snapped into razorlike focus.

Three more visits by astronauts kept the Hubble running and, by replacing old instruments, actually made it more powerful. Along the way, the astronauts graduated from yanking equipment fitted with large astronaut-friendly handles to operating on instruments never meant to be repaired by people wearing the equivalent of boxing gloves in space.

In 2002, after an infrared camera named Nicmos unexpectedly ran out of coolant, the astronauts attached a mechanical refrigerator to run coolant through its pipes. A year later, the Hubble’s astronomers used the rejuvenated camera along with the advanced survey camera to record the deepest telescopic views ever obtained of the universe. The images captured galaxies as they existed a few hundred million years after the beginning of time.

“When you have an instrument that reaches so far beyond what you’ve ever had before, you make discoveries that nobody ever thought of before,” said John Grunsfeld, who will be the payload commander on the Atlantis mission. “And we see things that nobody ever saw before. As a result, you know, Hubble became not just an observatory, but an icon for all of science. And Hubble has become part of our culture.”

That status did not come cheaply.

Edward Weiler, director of the Goddard center and formerly associate administrator for science at NASA, estimated that over the years the Hubble had cost $9 billion. “There are few people, especially Americans, who won’t say it was worth it,” he said.

All this seemed doomed to a premature end after the shuttle Columbia disaster in 2003 that killed its crew of seven. Sean O’Keefe, who was then the NASA administrator, declared that a shuttle flight to the telescope was too risky because, unlike the space station, it offered no safe haven if anything went wrong with the shuttle. The public was appalled. Schoolchildren even offered to send their pennies to NASA to keep the telescope going.

Some astronomers and engineers challenged the reasoning of Mr. O’Keefe, whose background was in public administration, not engineering. Others in the space science community, noting that the science budget was being squeezed by President Bush’s Moon-Mars initiative, suggested that it was time to move on and that the Hubble repair money might be better spent on other science projects.

“Everybody could see where he was coming from,” David Leckrone of Goddard, the Hubble’s project scientist, said, referring to Mr. O’Keefe’s distress about the Columbia and a mandate for increased emphasis on safety. But, he added, “It seemed so un-NASA-like. We would never have sent anybody to the Moon if we were so risk averse.”

“I thought we were dead,” Dr. Leckrone said. “As long as he was administrator, it stuck.”

In February 2005, however, Mr. O’Keefe resigned to become chancellor at Louisiana State University in Baton Rouge. His successor, Michael Griffin, who has a Ph.D. in aerospace engineering, instituted a rigorous risk analysis, culminating in a two-day meeting of experts that concluded it was no riskier to fly to the telescope than to go to the space station. In fall 2006, after the shuttles had begun flying again, Dr. Griffin approved the Hubble mission to a standing ovation from scientists and engineers.

“We all agree the risks are acceptable,” Dr. Leckrone said. “Griffin led us through that process with a good deal of intellectual vigor. He didn’t fake it.”

As a backup, NASA will have the shuttle Endeavour, which is scheduled for a September mission to the space station, prepped for a quick launching if a rescue is needed.

In the meantime, engineers, challenged by Mr. O’Keefe to keep the Hubble going as long as possible, learned to run it on a kind of austerity program, using two gyroscopes to keep the telescope pointed instead of the usual three (one for each dimension in space). They also learned how to preserve the batteries, which derive power from solar panels in the sunlit part of each orbit and provide electricity in the dark part. As a result, the batteries, which degraded rapidly for years, are now actually slightly stronger than before, the engineers say, and the Hubble has a healthy gyroscope in reserve in case one fails.

“If it weren’t for two-gyro science,” Mr. Weiss, the deputy project manager, said, “the next gyro failure would take us out of science.”

Besides Dr. Grunsfeld, who has been to Hubble twice, the crew includes Cmdr. Scott Altman, who led a Hubble mission in 2002; the pilot, Gregory Johnson; and the mission specialists, Andrew J. Feustel, Megan McArthur, Col. Mike T. Good and Michael J. Massimino, who also worked on the Hubble in 2002 and performed two spacewalks.

The new wide-field camera was designed to extend the Hubble’s vision into the ultraviolet wavelengths characteristic of the hottest stars and into the longer infrared wavelengths characteristic of cool stars, complementing the abilities of the advanced survey camera. It will replace the wide-field planetary camera 2, which has been in the telescope since 1993 and has been its only visible-light camera for the last year.

When the old camera is slid out, perhaps as early as the first spacewalk, it will be “a heart-stopping moment,” Dr. Mountain said.

Dr. Grunsfeld’s crew will install another new instrument, the Cosmic Origins Spectrograph, into the slot now occupied by an old corrective optics package known as Costar that is no longer needed.

The instruments installed on the Hubble since the 1993 repair were built taking the mirror’s aberration into account. The new spectrograph is also designed to be sensitive to invisible ultraviolet light. Astronomers hope to use it to map a so-called “cosmic web,” stretching through intergalactic space, in which two-thirds of atoms in the universe are thought to be drifting and hiding.

Those tasks will be the easier parts.

One of the bigger challenges of the mission will be surgery on the Space Telescope Imaging Spectrograph, which can take pictures of things and break down their light to analyze their composition. The spectrograph had an electrical failure in 2004. To get inside the spectrograph, 111 screws that were never meant to be removed in space have to be unscrewed and kept from floating off. The plan is to clamp a plate over them beforehand and unscrew them through tiny holes.

No such option exists for the Advanced Camera, the choice for 70 percent of Hubble’s prospective users and the chief dark-energy-hunting instrument on or off the planet. It suffered a huge short-circuit in its power supply last winter.

In a task that could be spread over two spacewalks, the astronauts will clamp a new power supply to the outside of the camera. From there, according to ground tests, power can be fed back inside to the other parts of the camera through existing wires, unless they were damaged in the short-circuit.

In one additional piece of business, the astronauts will attach a grapple fixture to the bottom of the telescope so that a robot spacecraft could grab it and attach a rocket module in the future. The rocket would then drop the telescope into the ocean.

But that time is not yet. The telescope’s orbit will be stable through 2024, according to recent calculations.

All of this work could, in principle, be performed in the allotted five days of spacewalks. In that case, when the Atlantis pulls away and human eyes glimpse the Hubble for the last time in person, the telescope would have its full complement of instruments to dissect the light from the cosmos for the first time since 1993.

Running down a list of subjects like planets around other stars, dark energy and the structure of the universe, Dr. Leckrone called the telescope a toolkit for discovery. Noting that any astronomer in the world could propose to use it, he said: “A lot of brain power comes to Hubble. It’s mouthwatering to think of what they will do with it.”

Asked whether the astronomers were tempted to run the rejuvenated instrument frugally to prolong its life beyond its anticipated 2013 demise, Dr. Mountain said the idea was to go for broke.

“We don’t want to trade science for false longevity,” he said.

There are interesting multimedia features and links at:
http://www.nytimes.com/2007/12/04/scien ... ref=slogin
Post by kmaherali »

Real and virtual worlds

Better together

Dec 6th 2007
From The Economist print edition


The internet, supposedly a new realm, is most useful when coupled to the real world

Illustration by David Simonds

IN THE early days of the internet, the idea that it represented an entirely new and separate realm, distinct from the real world, was seized upon by both advocates and critics of the new technology. Advocates liked the idea that the virtual world was a placeless datasphere, liberated from constraints and restrictions of the real world, and an opportunity for a fresh start. This view was expressed most clearly in the “Declaration of the Independence of Cyberspace” issued by John Perry Barlow, an internet activist, in February 1996. “Governments of the industrial world, you weary giants of flesh and steel, I come from cyberspace, the new home of mind,” he thundered. “Cyberspace does not lie within your borders. Our world is different. We are creating a world that all may enter without privilege or prejudice accorded by race, economic power, military force, or station of birth.”

Where Mr Barlow and other cyber-Utopians found the separation between the real and virtual worlds exciting, however, critics regarded it as a cause for concern. They worried that people were spending too much time online, communing with people they had never even met in person in chat rooms, virtual game worlds and, more recently, on social-networking sites such as MySpace and Facebook. A study carried out by the Stanford Institute for the Quantitative Study of Society in 2000, for example, found that heavy internet users spent less time talking to friends and family, and warned that the internet could be “the ultimate isolating technology”.

Both groups were wrong, of course. The internet has not turned out to be a thing apart. Unpleasant aspects of the real world, such as taxes, censorship, crime and fraud are now features of the virtual world, too (see Technology Quarterly, in this issue). Gamers who make real money selling swords, gold and other items in virtual game worlds may now find that the tax man wants to know about it. Designers of virtual objects in Second Life, an online virtual world, are resorting to real-world lawsuits in order to protect their intellectual property. And several countries have managed to impose physical borders on the internet to enforce local laws, from censorship in China to France's ban on the sale of Nazi memorabilia.

Mind meld

At the same time, however, some of the most exciting uses of the internet rely on coupling it with the real world. Social networking allows people to stay in touch with their friends online, and plan social activities in the real world. The distinction between online and offline chatter ceases to matter. Or consider Google Earth, which puts satellite images of the whole world on your desktop and allows users to link online data with specific physical locations. The next step is to call up information about your surroundings using mobile devices—something that is starting to become possible. Beyond that, “augmented reality” technology blends virtual objects seamlessly into views of the real world, making it possible to compare real buildings with their virtual blueprints, or tag real-world locations with virtual messages.

All these approaches treat the internet as an overlay or an adjunct to the physical world, not a separate space. Rather than seeing the real and virtual realms as distinct and conflicting, in short, it makes sense to see them as complementary and connected. The resulting fusion is not what the Utopians or the critics foresaw, but it suits the rest of us just fine.
Post by kmaherali »

New stem cells used in treatment

Julie Steenhuysen
Reuters


Friday, December 07, 2007


Using a new type of stem cells made from ordinary skin cells, U.S. researchers said Thursday they treated mice with sickle cell anemia, proving in principle that such cells could be used as a therapy.

U.S. and Japanese researchers last month reported they had reprogrammed human skin cells into behaving like embryonic stem cells, the body's master cells. They call the cells induced pluripotent stem cells, or iPS cells for short.

The Japanese team had previously done the reprogramming work in mouse skin cells.

A team at the Whitehead Institute for Biomedical Research in Cambridge, Mass., has now used the new cells to treat mice engineered to have sickle cell anemia, a disease of the blood caused by a defect in a single gene.

"This is the first evaluation of these cells for therapy," said Dr. Jacob Hanna, who worked on the study.

"The field has been working for years on strategies to generate customized stem cells," he added in an interview.

Creating stem cell therapies from a person's own cells would make them genetically identical, eliminating the need for immune suppression or donor matching, Hanna said.

"Now, with the breakthrough of this new method for generating stem cell-like cells, can we try to substitute a diseased tissue in a living animal?"

Hanna and colleagues working in Rudolf Jaenisch's lab at Whitehead Institute took skin cells from diseased mice and inserted four genes that reprogram the cells into becoming iPS cells.

“We call it the magic four factor,” Hanna said.

Pluripotent or multipurpose cells, such as embryonic stem cells and the new cells, can morph into any type of cell in the human body.

The researchers then coaxed these mouse master cells into becoming blood-forming stem cells and replaced the faulty gene that causes sickle cell anemia with a working one. When they transplanted these cells into the diseased mice, tests showed normal blood and kidney function, they report in today’s issue of the journal Science.

"This demonstrates that iPS cells have the same potential for therapy as embryonic stem cells, without the ethical and practical issues raised in creating embryonic stem cells," Jaenisch said in a statement.

But the technique is far from perfected. The four genes needed to turn skin cells into master cells are delivered using a type of virus called a retrovirus.

"Once they enter the genome, there is the danger that they can silence some genes that are important or they can activate some dangerous genes that shouldn't be activated," Hanna said.

Another obstacle is that one of the four genes used is c-Myc, which is known to cause cancer.

Hanna and colleagues got around that by removing the c-Myc gene after it had done its job of converting the skin cells into iPS cells. "It is far from solving the problem," he said.

Scientists hope to use stem cells to treat a host of diseases like diabetes, Parkinson's disease and spinal injuries. And the new technique for making stem cells will make them easier to study.

****
Post by kmaherali »

December 18, 2007
Findings
Why Nobody Likes a Smart Machine
By JOHN TIERNEY

At a Best Buy store in Midtown Manhattan, Donald Norman was previewing a scene about to be re-enacted in living rooms around the world.

He was playing with one of this year’s hot Christmas gifts, a digital photo frame from Kodak. It had a wondrous list of features — it could display your pictures, send them to a printer, put on a slide show, play your music — and there was probably no consumer on earth better prepared to put it through its paces.

Dr. Norman, a cognitive scientist who is a professor at Northwestern, has been the maestro of gizmos since publishing “The Design of Everyday Things,” his 1988 critique of VCRs no one could program, doors that couldn’t be opened without instructions and other technologies that seemed designed to drive humans crazy.

Besides writing scholarly analyses of gadgets, Dr. Norman has also been testing and building them for companies like Apple and Hewlett-Packard. One of his consulting gigs involved an early version of this very technology on the shelf at Best Buy: a digital photo frame developed for a startup company that was later acquired by Kodak.

“This is not the frame I designed,” Dr. Norman muttered as he tried to navigate the menu on the screen. “It’s bizarre. You have to look at the front while pushing buttons on the back that you can’t see, but there’s a long row of buttons that all feel the same. Are you expected to memorize them?”

He finally managed to switch the photo in the frame to vertical from horizontal. Then he spent five minutes trying to switch it back.

“I give up,” he said with a shrug. “In any design, once you learn how to do something once, you should be able to do it again. This is really horrible.”

So the bad news is that despite two decades of lectures from Dr. Norman on the virtue of “user-centered” design and the danger of a disease called “featuritis,” people will still be cursing at their gifts this Christmas.

And the worse news is that the gadgets of Christmas future will be even harder to command, because we and our machines are about to go through a rocky transition as the machines get smarter and take over more tasks. As Dr. Norman says in his new book, “The Design of Future Things,” what we’ll have here is a failure to communicate.

“It would be fine,” he told me, “if we had intelligent devices that would work well without any human intervention. My clothes dryer is a good example: it figures out when the clothes are dry and stops. But we are moving toward intelligent machines that still require human supervision and correction, and that is where the danger lies — machines that fight with us over how to do things.”

Can this relationship be saved? Until recently, Dr. Norman believed in the favorite tool of couples therapists: better dialogue. But he has concluded that dialogue isn’t the answer, because we’re too different from the machines.

You can’t explain to your car’s navigation system why you dislike its short, efficient route because the scenery is ugly. Your refrigerator may soon know exactly what food it contains, what you’ve already eaten today and what your calorie limit is, but it won’t be capable of an intelligent dialogue about your need for that piece of cheesecake.

To get along with machines, Dr. Norman suggests we build them using a lesson from Delft, a town in the Netherlands where cyclists whiz through crowds of pedestrians in the town square. If the pedestrians try to avoid an oncoming cyclist, they’re liable to surprise him and collide, but the cyclist can steer around them just fine if they ignore him and keep walking along at the same pace. “Behaving predictably, that’s the key,” Dr. Norman said. “If our smart devices were understandable and predictable, we wouldn’t dislike them so much.” Instead of trying to anticipate our actions, or debating the best plan, machines should let us know clearly what they’re doing.

Instead of beeping and buzzing mysteriously, or flashing arrays of red and white lights, machines should be more like Dr. Norman’s ideal of clear communication: a tea kettle that burbles as the water heats and lets out a steam whistle when it’s finished. He suggests using natural sounds and vibrations that don’t require explanatory labels or a manual no one will ever read.

But no matter how clearly the machines send their signals, Dr. Norman expects that we’ll have a hard time adjusting to them. He wasn’t surprised when I took him on a tour of the new headquarters of The New York Times and he kept hearing complaints from people about the smart elevators and window shades, or the automatic water faucets that refuse to dispense water. (For Dr. Norman’s analysis of our office building of the future, go to nytimes.com/tierneylab.)

As he watched our window shades mysteriously lowering themselves, having detected some change in cloud cover that eluded us, Dr. Norman recalled the fight that he and his colleagues at Northwestern waged against the computerized shades that kept letting sunlight glare on their computer screens.

“It took us a year and a half to get the administration to let us control the shades in our own offices,” he said. “Badly designed so-called intelligent technology makes us feel out of control, helpless. No wonder we hate it.” (For all our complaining, at The Times we have nicer shades that let us override the computer.)

Even when the bugs have been worked out of a new technology, designers will still turn out junk if they don’t get feedback from users — a common problem when their customer is a large bureaucracy. Engineers have known how to build a simple alarm clock for more than a century, so why can’t you figure out how to set the one in your hotel room? Because, Dr. Norman said, the clock was bought by someone in the hotel’s purchasing department who has never tried to navigate all those buttons at 1 in the morning.

“Our frustrations with machines are not going to be solved with better machines,” Dr. Norman said. “Most of our technological difficulties come from the way we interact with our machines and with other people. The technology part of the problem is usually pretty simple. The people part is complicated.”
Post by kmaherali »

The Next Sexual Revolution
By RONALD W. DWORKIN
December 18, 2007; Page A21

Marxists divide life into real events and pseudo-events. Real events, such as wars and revolutions, have sociological significance. Pseudo-events have no such significance, no matter how exciting they are, or how much of a spectacle they are on television. The Super Bowl is a pseudo-event. So is the World Series. So are most medical discoveries.

The last "real" event in medicine (perhaps the greatest "real" medical event of the 20th century) was the creation of the birth-control pill, which helped fuel a sexual revolution that changed people's entire reproductive patterns. The political consequences reverberate to this day.

Today another "real" event looms: a practical method of storing unfertilized human eggs. Until now, only fertilized eggs (embryos) and sperm have been amenable to cryopreservation. The high water content in unfertilized eggs causes crystallization under freezing conditions, rendering the eggs useless when thawed.

A couple can store embryos indefinitely. A man can store his sperm indefinitely. But until now, a woman has been unable to store her eggs. If she wants to postpone having children, she must mix some sperm with her eggs before freezing them. That means going to the sperm bank, or getting sperm the old-fashioned way: going out on blind dates or asking friends if they know someone, all while worrying about her biological clock and working on her career.

The technology permitting egg storage, called "vitrification," is still in its infancy, but success is inevitable, and when it arrives, the sociological consequences will be enormous. Right now, one in five children world-wide is born to women over 35. When mass egg storage becomes feasible, that number will likely increase dramatically, and include not just women in their late 30s and 40s, but also women in their 50s, even 60s. The hurdle for a 50-year-old woman trying to get pregnant is not that she can't carry a baby -- supplemental hormones can fix that, even after menopause -- but that her 50-year-old eggs, assuming she has any eggs, won't implant in her uterus. But eggs harvested when she was a 20-year-old, stored for three decades, then thawed and fertilized, will implant. A uterus is ageless.

One consequence of this new technology is a potential reversal of the declining birth rate in Western countries. Low birth rates, especially in Europe, have already caused political and cultural dislocation. Raising children while building a serious career is hard for women, and when presented with the choice, many women opt for the latter. Half of Germany's female scientists, for example, are reportedly childless. By the time a career is established, say, in a woman's 40s, it may be too late to have a baby. If women could store their eggs, they could remain fertile.

Freezing unfertilized eggs gives women a way out of a complicated cultural maze. Decades ago, the lives of men and women diverged at adolescence. Men prepared for careers while women prepared for domestic life. Today, many young men and women go through high school, college and professional school often mistakenly assuming no differences in their respective trajectories.

When I suggested to a 22-year-old female medical student that she consider a career in anesthesiology because the hours were flexible enough to raise a family, she shot back: "I went to Harvard! Now I'm going to Johns Hopkins! I'm going to be a department chairman someday! And you want to put me on the mommy track?" Seven years later, when this woman applied for a job as an anesthesiologist, the first question she asked me was: "I'm trying to have a baby. Can I go part time?"

Our culture encourages women to pursue high-powered careers. Many women must pursue at least some kind of career: With the divorce rate over 50%, women can no longer rely on the integrity of the family unit to support them. The culture paints a rosy image about career and family. Then biological truth breaks through, by which time these women have lost a decade of their best childbearing years.

Women who opt to freeze their unfertilized eggs will gain those years back -- and more -- giving them the freedom to leisurely follow the male career trajectory. No more late night panicking. No more marrying a man you don't love "just to have the baby." No more lurching from Harvard to the mommy track.

True, if these women still decide not to have children when they hit their 40s or 50s, having grown accustomed to freedom, then the population in Western countries will not rise but plummet further. Yet most middle-aged people know that many careers can be pretty dull, without much chance to create. Following rules and procedures until midnight in a law firm may seem acceptable when you're 25, but not when you're 50. Armed with this insight, money and perfect eggs -- and with an expected life span of 86 years -- many women will likely choose to create a family.

But what kind of family? Women in their 30s are reluctant to use banked sperm to get pregnant, in part because they still hope to meet someone, because they can't support themselves as single mothers, or because they fear being judged by their peers.

A woman in her 50s probably has less hope of finding a man who wants to start a family than a woman in her 30s. And so a 50-year-old woman, without serious marital options, loaded with money and eggs, and far too wizened to worry about what other people say, might just go ahead and call that sperm bank if she wants a family. Or maybe she'll marry a 70-year-old man, who thinks that if women can be mothers into their 50s and 60s, why can't he be a father too?

While Marxists divide life into real events and pseudo-events, a more accurate division is between the truths of the times and the truths of fact. Young women forsaking their careers to bear children -- this is a truth of the times. Women driven by nature to procreate but having to find a new way to do so amid today's realities -- this is a truth of fact that is likely to prevail in the end.

Dr. Dworkin is a senior fellow at Hudson Institute and the author of "Artificial Happiness" (Basic Books, 2006).

Post by kmaherali »

There is an illuminating multimedia feature linked at:

http://www.nytimes.com/2007/12/26/busin ... ?th&emc=th

December 26, 2007
Hospitals Look to Nuclear Tool to Fight Cancer
By ANDREW POLLACK
There is a new nuclear arms race under way — in hospitals.

Medical centers are rushing to turn nuclear particle accelerators, formerly used only for exotic physics research, into the latest weapons against cancer.

Some experts say the push reflects the best and worst of the nation’s market-based health care system, which tends to pursue the latest, most expensive treatments — without much evidence of improved health — even as soaring costs add to the nation’s economic burden.

The machines accelerate protons to nearly the speed of light and shoot them into tumors. Scientists say proton beams are more precise than the X-rays now typically used for radiation therapy, meaning fewer side effects from stray radiation and, possibly, a higher cure rate.

But a 222-ton accelerator — and a building the size of a football field with walls up to 18 feet thick in which to house it — can cost more than $100 million. That makes a proton center, in the words of one equipment vendor, “the world’s most expensive and complex medical device.”

Until 2000, the United States had only one hospital-based proton therapy center. Now there are five, with more than a dozen others announced. Still more are under consideration.

Some experts say there is a vast need for more proton centers. But others contend that an arms race mentality has taken hold, as medical centers try to be first to take advantage of the prestige — and the profits — a proton site could provide.

“I’m fascinated and horrified by the way it’s developing,” said Dr. Anthony L. Zietman, a radiation oncologist at Harvard and Massachusetts General Hospital, which operates a proton center. “This is the dark side of American medicine.”

Once hospitals have made such a huge investment, experts like Dr. Zietman say, doctors will be under pressure to guide patients toward proton therapy when a less costly alternative might suffice.

Similar cost concerns were expressed in the past about other new technology like M.R.I. scanners. While those have become accepted staples of medical practice, there is still concern about their overuse and the impact on medical spending.

Dr. Zietman said that while protons were vital in treating certain rare tumors, they were little better than the latest X-ray technology in dealing with prostate cancer, the common disease that many proton centers are counting on for business.

“You can scarcely tell the difference between them except in price,” he said. Medicare pays about $50,000 to treat prostate cancer with protons, almost twice as much as with X-rays.

Proponents, however, are adamant that proton centers provide better treatment.

“It all comes down to the physics,” said Dr. Jerry D. Slater, the head of radiation medicine at Loma Linda University Medical Center in Southern California. “Every X-ray beam I use puts most of the dose where I don’t want it.” By contrast, he said, proton beams put most of the dose in the tumor.

Loma Linda built the nation’s first hospital-based proton center in 1990 and has treated about 13,000 patients. Its success has inspired others.

Companies have sprung up to help finance, build and operate the proton centers. In some cases, local and state governments, seeking to attract medical tourists, have chipped in. Such financing is allowing proton centers to be built by community hospitals or groups of physicians.

One of the biggest and most costly projects, with a bill exceeding $140 million, is being undertaken by Hampton University in Virginia, a historically black college that does not have a medical school.

“Here at Hampton we dream no small dreams,” said William R. Harvey, the president. He said a proton center would help African-Americans, who have higher rates of some cancers than whites. And he said a medical school was not needed — that doctors would be hired to run the outpatient center.

Some of the planned centers will be very close together, raising the odds of overcapacity. Two proton centers are planned for Oklahoma City, for example, and two more in the western suburbs of Chicago.

The institutions building the centers say there is a need for many more of them. The existing centers, which collectively can treat only several thousand patients a year, are turning people away. And patients who are accepted often have to spend weeks in a city far from their homes.

Proponents say that more than 800,000 Americans — representing nearly two-thirds of new cancer cases — undergo radiation therapy each year. If only 250,000 of them could benefit from protons, they would fill more than 100 centers.

“If they built one across the street I wouldn’t worry about it,” said James D. Cox, chief of radiation oncology at the M. D. Anderson Cancer Center in Houston, which opened a $125 million proton center last year.

X-rays, which are high-energy electromagnetic waves, pass through the body, depositing their energy all along the way, not just in the tumor. By contrast, protons — subatomic particles with a positive electrical charge — can be made to stop on the tumor and dump most of their payload there.
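The contrast between the two dose profiles can be sketched numerically. This is a toy model with made-up attenuation and Bragg-peak parameters, meant only to illustrate the shape of the argument, not clinical dosimetry:

```python
# Toy depth-dose sketch contrasting X-rays and protons.
# All curve shapes and numbers here are simplified assumptions,
# not clinical data.
import math

def xray_dose(depth_cm):
    # X-rays attenuate roughly exponentially, depositing energy
    # all along the beam path, before and after the tumor.
    return math.exp(-0.06 * depth_cm)

def proton_dose(depth_cm, range_cm=15.0):
    # Protons deposit a low "entrance" dose, then a sharp Bragg peak
    # near the end of their range, and almost nothing beyond it.
    if depth_cm > range_cm + 1.0:
        return 0.0
    entrance = 0.3
    peak = 3.0 * math.exp(-((depth_cm - range_cm) ** 2) / 0.5)
    return entrance + peak

def fraction_in_tumor(dose_fn, tumor=(14.0, 16.0), depth_max=25.0, step=0.1):
    # Fraction of the total deposited dose that lands inside the tumor.
    depths = [i * step for i in range(int(depth_max / step))]
    total = sum(dose_fn(d) for d in depths)
    in_tumor = sum(dose_fn(d) for d in depths if tumor[0] <= d <= tumor[1])
    return in_tumor / total

xf = fraction_in_tumor(xray_dose)
pf = fraction_in_tumor(proton_dose)
print(f"X-ray dose fraction inside tumor:  {xf:.0%}")
print(f"Proton dose fraction inside tumor: {pf:.0%}")
```

Even in this crude sketch, the proton beam concentrates most of its dose inside the tumor region, while the X-ray beam spreads its dose along the whole path — which is exactly the trade-off the physicians quoted here are debating.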

Tumors in or near the eye, for instance, can be eradicated by protons without destroying vision or irradiating the brain. Protons are also valuable for treating tumors in brains, necks and spines, and tumors in children, who are especially sensitive to the side effects of radiation.

When 10-year-old Brooke Bemont was about to undergo X-ray treatment for a brain tumor last summer, a doctor warned her mother, “Do not plan on your daughter ever going to Harvard.” The radiation would damage Brooke’s mental capacity, she said.

So the family, from St. Charles, Ill., spent five weeks in Boston as Brooke was treated with protons at Massachusetts General Hospital Cancer Center. “If there was a potential to save even a little of her brain tissue, there was no question that we would do it,” said Christal Bemont, Brooke’s mother. She added that Brooke was now apparently cancer-free and doing fairly well.

Head, spine and childhood cancers are rare, though. Most people undergoing proton treatment are men with localized prostate cancer.

Proton therapy can help avoid the worst side effects, like impotence, by exposing the bladder and rectum of a prostate patient to less radiation than X-rays do. The stray radiation, though, from the newest form of X-rays, called intensity-modulated radiation therapy, is already low, diminishing any advantages from proton therapy.

“There are no solid clinical data that protons are better,” said Dr. Theodore S. Lawrence, the chairman of radiation oncology at the University of Michigan. “If you are going to spend a lot more money, you want to make sure the patient can detect an improvement, not just a theoretical improvement.”

An economic analysis by researchers at Fox Chase Cancer Center in Philadelphia found that proton treatment would be cost-effective for only a small subset of prostate cancer patients.

Lack of data aside, men are flocking to proton treatment.

“I’m 67 years old, and the last thing I want to do is wear a diaper for the rest of my life,” said Pete Freeman of Spokane, Wash., who was undergoing treatment at Loma Linda.

Some men hear about proton therapy from the Brotherhood of the Balloon, a group of 3,000 men who have had the treatment. (A balloon is inserted into the rectum and filled with water to immobilize the prostate during treatment.)

The organization, which now gets some financial support from Loma Linda, was founded by Robert J. Marckini, a former Loma Linda patient who calls himself Proton Bob.

At Loma Linda, prostate cancer treatment requires about two months of daily sessions. The actual irradiation, which the patient does not feel, takes only about a minute. Most men with early prostate cancer have no symptoms from their disease and many say the treatment has few immediate side effects, other than fatigue and an urgency to urinate.

“We go have our treatments, and we go out and play golf,” said Harold J. Phillips, an accountant from Tacoma who was being treated recently at Loma Linda.

Doctors are also learning how to use protons to treat lung and breast cancer. And over time, doctors say, costs should come down as the technology improves and it becomes more routine to build and operate proton centers. One company is trying to develop a $20 million proton system and has received orders from several hospitals.

On the horizon is therapy using beams of carbon ions, which are said to be even more powerful in killing tumors. Touro University says it will build a combined proton and carbon therapy center outside San Francisco, to open as early as 2011. The Mayo Clinic is also seriously considering one. Such centers will cost even more — as much as $300 million.

*****
There are interesting videos linked at:

http://www.nytimes.com/2007/12/26/healt ... ?th&emc=th

December 26, 2007
Six Killers: Alzheimer’s Disease
Finding Alzheimer’s Before a Mind Fails
By DENISE GRADY
For a perfectly healthy woman, Dianne Kerley has had quite a few medical tests in recent years: M.R.I. and PET scans of her brain, two spinal taps and hours of memory and thinking tests.

Ms. Kerley, 52, has spent much of her life in the shadow of an illness that gradually destroys memory, personality and the ability to think, speak and live independently. Her mother, grandmother and a maternal great-aunt all developed Alzheimer’s disease. Her mother, 78, is in a nursing home in the advanced stages of dementia, helpless and barely responsive.

“She’s in her own private purgatory,” Ms. Kerley said.

Ms. Kerley is part of an ambitious new scientific effort to find ways to detect Alzheimer’s disease at the earliest possible moment. Although the disease may seem like a calamity that strikes suddenly in old age, scientists now think it begins long before the mind fails.

“Alzheimer’s disease may be a chronic condition in which changes begin in midlife or even earlier,” said Dr. John C. Morris, director of the Alzheimer’s Disease Research Center at Washington University in St. Louis, where Ms. Kerley volunteers for studies.

But currently, the diagnosis is not made until symptoms develop, and by then it may already be too late to rescue the brain. Drugs now in use temporarily ease symptoms for some, but cannot halt the underlying disease.

Many scientists believe the best hope of progress, maybe the only hope, lies in detecting the disease early and devising treatments to stop it before brain damage becomes extensive. Better still, they would like to intervene even sooner, by identifying risk factors and treating people preventively — the same strategy that has markedly lowered death rates from heart disease, stroke and some cancers.

So far, Alzheimer’s has been unyielding. But research now under way may start answering major questions about when the disease begins and how best to fight it.

Post by kmaherali »

December 30, 2007
Bright Ideas
Innovative Minds Don’t Think Alike
By JANET RAE-DUPREE

IT’S a pickle of a paradox: As our knowledge and expertise increase, our creativity and ability to innovate tend to taper off. Why? Because the walls of the proverbial box in which we think are thickening along with our experience.

Andrew S. Grove, the co-founder of Intel, put it well in 2005 when he told an interviewer from Fortune, “When everybody knows that something is so, it means that nobody knows nothin’.” In other words, it becomes nearly impossible to look beyond what you know and think outside the box you’ve built around yourself.

This so-called curse of knowledge, a phrase used in a 1989 paper in The Journal of Political Economy, means that once you’ve become an expert in a particular subject, it’s hard to imagine not knowing what you do. Your conversations with others in the field are peppered with catch phrases and jargon that are foreign to the uninitiated. When it’s time to accomplish a task — open a store, build a house, buy new cash registers, sell insurance — those in the know get it done the way it has always been done, stifling innovation as they barrel along the well-worn path.

Elizabeth Newton, a psychologist, conducted an experiment on the curse of knowledge while working on her doctorate at Stanford in 1990. She gave one set of people, called “tappers,” a list of commonly known songs from which to choose. Their task was to rap their knuckles on a tabletop to the rhythm of the chosen tune as they thought about it in their heads. A second set of people, called “listeners,” were asked to name the songs.

Before the experiment began, the tappers were asked how often they believed that the listeners would name the songs correctly. On average, tappers expected listeners to get it right about half the time. In the end, however, listeners guessed only 3 of 120 songs tapped out, or 2.5 percent.
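The gap between prediction and outcome is easy to verify from the figures above with a quick arithmetic check:

```python
# Quick check of the numbers in Newton's "tapper/listener" experiment
# as reported above: 3 correct out of 120, versus a predicted 50% rate.
correct_guesses = 3
total_songs = 120
actual_rate = correct_guesses / total_songs
predicted_rate = 0.50  # tappers' average expectation

print(f"Actual success rate:    {actual_rate:.1%}")
print(f"Predicted success rate: {predicted_rate:.0%}")
print(f"Overconfidence factor:  {predicted_rate / actual_rate:.0f}x")
```

The tappers overestimated their listeners' success by a factor of twenty, which is the quantitative face of the curse of knowledge the article describes.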

The tappers were astounded. The song was so clear in their minds; how could the listeners not “hear” it in their taps?

That’s a common reaction when experts set out to share their ideas in the business world, too, says Chip Heath, who with his brother, Dan, was a co-author of the 2007 book “Made to Stick: Why Some Ideas Survive and Others Die.” It’s why engineers design products ultimately useful only to other engineers. It’s why managers have trouble convincing the rank and file to adopt new processes. And it’s why the advertising world struggles to convey commercial messages to consumers.

“I HAVE a DVD remote control with 52 buttons on it, and every one of them is there because some engineer along the line knew how to use that button and believed I would want to use it, too,” Mr. Heath says. “People who design products are experts cursed by their knowledge, and they can’t imagine what it’s like to be as ignorant as the rest of us.”

But there are proven ways to exorcise the curse.

In their book, the Heath brothers outline six “hooks” that they say are guaranteed to communicate a new idea clearly by transforming it into what they call a Simple Unexpected Concrete Credentialed Emotional Story. Each of the letters in the resulting acronym, Succes, refers to a different hook. (“S,” for example, suggests simplifying the message.) Although the hooks of “Made to Stick” focus on the art of communication, there are ways to fashion them around fostering innovation.

To innovate, Mr. Heath says, you have to bring together people with a variety of skills. If those people can’t communicate clearly with one another, innovation gets bogged down in the abstract language of specialization and expertise. “It’s kind of like the ugly American tourist trying to get across an idea in another country by speaking English slowly and more loudly,” he says. “You’ve got to find the common connections.”

In her 2006 book, “Innovation Killer: How What We Know Limits What We Can Imagine — and What Smart Companies Are Doing About It,” Cynthia Barton Rabe proposes bringing in outsiders whom she calls zero-gravity thinkers to keep creativity and innovation on track.

When experts have to slow down and go back to basics to bring an outsider up to speed, she says, “it forces them to look at their world differently and, as a result, they come up with new solutions to old problems.”

She cites as an example the work of a colleague at Ralston Purina who moved to Eveready in the mid-1980s when Ralston bought that company. At the time, Eveready had become a household name because of its sales since the 1950s of inexpensive red plastic and metal flashlights. But by the mid-1980s, the flashlight business, which had been aimed solely at men shopping at hardware stores, was foundering.

While Ms. Rabe’s colleague had no experience with flashlights, she did have plenty of experience in consumer packaging and marketing from her years at Ralston Purina. She proceeded to revamp the flashlight product line to include colors like pink, baby blue and light green — colors that would appeal to women — and began distributing them through grocery store chains.

“It was not incredibly popular as a decision amongst the old guard at Eveready,” Ms. Rabe says. But after the changes, she says, “the flashlight business took off and was wildly successful for many years after that.”

MS. RABE herself experienced similar problems while working as a transient “zero-gravity thinker” at Intel.

“I would ask my very, very basic questions,” she said, noting that it frustrated some of the people who didn’t know her. Once they got past that point, however, “it always turned out that we could come up with some terrific ideas,” she said.

While Ms. Rabe usually worked inside the companies she discussed in her book, she said outside consultants could also serve the zero-gravity role, but only if their expertise was not identical to that of the group already working on the project.

“Look for people with renaissance-thinker tendencies, who’ve done work in a related area but not in your specific field,” she says. “Make it possible for someone who doesn’t report directly to that area to come in and say the emperor has no clothes.”

Janet Rae-Dupree writes about science and emerging technology in Silicon Valley.

Post by kmaherali »

The future of futurology

Dec 30th 2007
From Economist.com


Think small, think short—and listen

SO THERE you are on the moon, reading The World in 2008 on disposable digital paper and waiting for the videophone to ring. But no rush, because you’re going to live for ever—and if you don’t, there’s a backed-up copy of your brain for downloading to your clone.


Yes? No? Well, that’s how the 21st century looked to some futurologists 40 or 50 years ago, and they’re having a hard time living it down now. You can still get away (as we do) with predicting trends in the world next year, but push the timeline out much further, and you might as well wear a T-shirt saying “crackpot”. Besides, since the West began obsessing a generation ago about accelerating social and technological change, people in government and industry can spend weeks each year in retreats brainstorming and scenario-building about the future of their company or their industry or their world. The only thing special about a futurologist is that he or she has no other job to do.

Small wonder that futurology as we knew it 30 or 40 years ago—the heyday of Alvin Toffler’s “Future Shock”, the most popular work of prophecy since Nostradamus—is all but dead. The word “futurologist” has more or less disappeared from the business and academic world, and with it the implication that there might be some established discipline called “futurology”. Futurologists prefer to call themselves “futurists”, and they have stopped claiming to predict what “will” happen. They say that they “tell stories” about what might happen. There are plenty of them about, but they have stopped being famous. You have probably never heard of them unless you are in their world, or in the business of booking speakers for corporate dinners and retreats.

We can see now that the golden age of blockbuster futurology in the 1960s and 1970s was caused, not by the onset of profound technological and social change (as its champions claimed), but by the absence of it. The great determining technologies—electricity, the telephone, the internal combustion engine, even manned flight—were the products of a previous century, and their applications were well understood. The geopolitical fundamentals were stable, too, thanks to the cold war. Futurologists extrapolated the most obvious possibilities, with computers and nuclear weapons as their wild cards. The big difference today is that we assume our determining forces to be ones that 99% of us do not understand at all: genetic engineering, nanotechnology, climate change, clashing cultures and seemingly limitless computing power. When the popular sense of direction is baffled, there is no conventional wisdom for futurologists to appropriate or contradict.

Popcorn and prediction markets
There are still some hold-outs prophesying at the planetary level: James Canton, for example, author of “Extreme Future”. But the best advice for aspirant futurists these days is: think small. The best what-lies-ahead book of 1982 was “Megatrends”, by John Naisbitt, which prophesied the future of humanity. A quarter-century later, its counterpart for 2007 was “Microtrends”, by Mark Penn, a public-relations man who doubles as chief strategy adviser to Hillary Clinton’s 2008 presidential campaign. “Microtrends” looks at the prospects for niche social groups such as left-handers and vegan children. The logical next step would be a book called “Nanotrends”, save that the title already belongs to a journal of nano-engineering.

The next rule is: think short-term. An American practitioner, Faith Popcorn, showed the way with “The Popcorn Report” in 1991, applying her foresight to consumer trends instead of rocket science. The Popcornised end of the industry thrives as an adjunct of the marketing business, a research arm for its continuous innovation in consumer goods. One firm, Trendwatching of Amsterdam, predicts in its Trend Report for 2008 a list of social fads and niche markets including “eco-embedded brands” (so green they don’t even need to emphasise it) and “the next small thing” (“What happens when consumers want to be anything but the Joneses?”).

A third piece of advice: say you don’t know. Uncertainty looks smarter than ever before. Even politicians are seeing the use of it: governments that signed the Kyoto protocol on climate change said, in effect: “We don’t know for sure, but best to be on the safe side”—and they have come to look a lot smarter than countries such as America and Australia which claimed to understand climate change well enough to see no need for action.

The last great redoubt of the know-alls has been the financial markets, with hedge funds claiming to have winning strategies for beating the average. But after the market panic of 2007 more humility is to be expected there too.

A fourth piece of advice for the budding futurist: get embedded in a particular industry, preferably something to do with computing or national security or global warming. All are fast-growing industries fascinated by uncertainty and with little use for generalists. Global warming, in particular, is making general-purpose futurology all but futile. When the best scientists in the field say openly that they can only guess at the long-term effects, how can a futurologist do better? “I cannot stop my life to spend the next two or ten years to become an expert on the environment,” complains Mr Naisbitt in his latest book, “Mindset” (although the rewards for Al Gore, who did just that, have been high).

A fifth piece of advice: talk less, listen more. Thanks to the internet, every intelligent person can amass the sort of information that used to need travel, networking, research assistants, access to power. It is no coincidence that the old standard work on herd instinct, Charles Mackay’s “Extraordinary Popular Delusions and the Madness of Crowds”, has been displaced by James Surowiecki’s “The Wisdom of Crowds”.

The most heeded futurists these days are not individuals, but prediction markets, where the informed guesswork of many is consolidated into hard probability. Will Osama bin Laden be caught in 2008? Only a 15% chance, said Newsfutures in mid-October 2007. Would Iran have nuclear weapons by January 1st 2008? Only a 6.6% chance, said Inkling Markets. Will George Bush pardon Lewis “Scooter” Libby? A better-than-40% chance, said Intrade. There may even be a prediction market somewhere taking bets on immortality. But beware: long- and short-sellers alike will find it hard to collect.
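The price-to-probability conversion these markets rely on is simple: a binary contract pays a fixed amount if the event occurs and nothing otherwise, so its trading price, as a fraction of that payout, is the crowd's implied probability. A minimal sketch using the quotes cited above (the payout-of-100 convention is an assumption for illustration; real markets differ in their contract terms):

```python
# Sketch of how a prediction market turns contract prices into
# probabilities. Assumes binary contracts that pay 100 if the event
# happens and 0 otherwise; the quotes are the ones cited in the article.
def implied_probability(price, payout=100.0):
    # A risk-neutral trader values the contract at payout * P(event),
    # so price divided by payout recovers the crowd's probability.
    return price / payout

quotes = {
    "Bin Laden caught in 2008 (Newsfutures)": 15.0,
    "Iran nuclear by Jan 1, 2008 (Inkling Markets)": 6.6,
    "Bush pardons Libby (Intrade)": 40.0,
}
for event, price in quotes.items():
    print(f"{event}: implied probability {implied_probability(price):.1%}")
```

The consolidation happens through trading: anyone who thinks the true probability differs from the price has an incentive to buy or sell, pushing the price toward the informed consensus.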

Post by kmaherali »

'Immortality' in forecast
Future society also predicts invisibility cloak, end of privacy

Shannon Proudfoot
CanWest News Service


Thursday, January 03, 2008


Virtual immortality, a real-life version of Harry Potter's invisibility cloak and smart fabrics that release a scent to drive away unwanted company could be just over the horizon for 2008 and beyond. That's according to the World Future Society's annual forecast of upcoming innovations.

"We've strived over the years to do one thing and to really excel at it, and that is to be a neutral clearing house for ideas on the future," says Patrick Tucker, director of communications for the society and editor of its magazine, The Futurist.

To that end, the Washington, D.C.-based organization keeps tabs on technological developments, public policy shifts and non-governmental organizations. The work is not about crystal ball prophecies, but highlighting what could happen based on current trends, he adds.

"No one should take them as predictions," he says. "It's impossible to know the future, but it is possible to change it."

Since 1985, The Futurist has published a year-end list of the most relevant forecasts. Some of them may be 20 or 30 years from fruition, Tucker says, but many are immediate possibilities. They've developed a reasonably good record for accuracy, he says.

"We did do a couple of things right that we're very proud of," he says. "We spotlighted the emergence of the Internet, we had someone who wrote about virtual reality before it entered the mainstream. We forecast the end of the Cold War."

So what could 2008 bring?

The society forecasts that the growth of surveillance technologies and voyeuristic venues like YouTube will ultimately spell the death of any notions of "privateness." At the same time, increasingly sophisticated virtual reality graphics and artificial intelligence will allow computers to capture someone's voice and appearance, even their personality and knowledge. This could create "virtual immortality" in which it's possible to visit with the dearly departed long after they've shuffled off this mortal coil.

The World Future Society also believes Harry Potter's invisibility cloak will have real-world competition within a decade, with "optical cloaking" that bends light around an object and makes it disappear from view.

Right now, "smart fabrics" with computing capabilities woven into them are a $400-million market, Tucker says, confined mostly to self-heating car seats and military uses. The technology is still clunky, but as it evolves, he foresees increasingly sophisticated and playful products such as scented clothing.

"You would smell slightly different depending on whether you're listening to one genre of music or another," he says. "You could emit a different fragrance depending on whether you enjoy the company of the person you're with or if you wanted to theoretically send them away."

British researcher Jenny Tillotson has already created garments and accessories that harness some of this potential, he says.

The biggest current obstacle to smart fabric development is finding a tiny and reliable power source, Tucker says.

Wireless Internet access everywhere will be the next major technological breakthrough, according to William Halal, president of TechCast.org, which tracks upcoming innovations and their likely time frames.

Apple's iPhone and similar devices in development at Google have put the Internet in people's pockets, he says, but they have yet to overcome the limitations of a minuscule keyboard and eye-straining screen.

Within the next five to 10 years, artificial intelligence will allow people to simply tell their phones what to do, says Halal, who is also a professor of science, technology and innovation at George Washington University in the U.S. capital. At the same time, display capabilities will become so sophisticated the screen will simply be projected in front of us or onto our eyeglasses, he says -- a development that already exists in prototype.

"It's going to be enormous," he says.

Another development Halal foresees, likely within a decade, is that virtual education will go mainstream. Universities are really conservative institutions at heart, he says, and they've been reluctant to embrace the practice because it's not traditional.

Those programs or schools that have ventured into virtual education have found it prohibitively expensive because they essentially use it to duplicate the experience of a small-scale regular classroom, he says.

"The benefits of virtual education are going to occur when they really leverage the technology and have big numbers, like hundreds or thousands of students around the world, and the best professors are lecturing," Halal says.

It could take the form of sophisticated video conferencing, he says, or eventually professors and students could interact in a virtual classroom through holographic avatars.

In the intellectual sphere, Tucker also sees the deeply ambivalent potential for technology to create a society of "educated illiterates." Artificial intelligence will evolve to the point where we can simply ask our computers a verbal question to get any information we need, he says. When we can "outsource" all our typical reading tasks to computers, traditional literacy could become obsolete, he adds, possibly replaced by web-friendly skills like creativity or critical thinking.

Whether that's a good or bad thing depends on your perspective, he says, but working for the World Future Society doesn't mean he's enthusiastic about everything that might come to pass.

"I would argue that the ability to sit and read and think about information in a slow and patient and focused way is a very valuable skill, and it's one that we're cheapening, which frightens me," Tucker says.

"But other people don't necessarily share that fear."

© The Calgary Herald 2008

Post by kmaherali »

Stem cell research can co-exist with ethics

Susan Martinuk
Calgary Herald


Friday, January 04, 2008

A graduate student tests for stem cell secretion at the Reeve-Irvine Research Center at University of California.

Recent scientific breakthroughs that have been described as "the biological equivalent of the Wright brothers' first airplane ride" have demonstrated that human skin cells can be converted into the highly valued embryonic state where they theoretically have the potential to develop into any cell, tissue or organ that will treat almost any disease.

Taking fully developed cells back to an immature state may seem like a step backwards. But, once in that state, the addition of nutrients, growth factors and hormones can direct the cells to develop into new heart, nerve, muscle or any other kind of cell required to replace or regenerate damaged and diseased cells in the body. The technique is known as cell reprogramming, and its products are called induced pluripotent stem cells.

In November, researchers from the U.S. and Japan reported they had tricked cultured adult cells (grown in the lab) into converting to an embryonic-like state.

Then, in December, another U.S. group accomplished the same thing, using scrapings of skin cells from adult volunteers.

In other words, a working technique now exists to take a swab of cells from injured or diseased individuals, and then convert them into genetically identical cells that are needed to fix the problem.

The concept is remarkable. The technique is practical, simple and cheap. Most importantly, the implications are enormous, as it now renders moot the controversies that have prevented this research from reaching its full potential.

For years, scientists, politicians and financial investors have squabbled about how to best capture this potential -- and the dividing line has focused on ethics.

One group has supported the use of embryonic stem cells, obtained from human embryos created for that purpose or by destroying large numbers of embryos that are headed for the discard pile at infertility clinics.

Consequently, the research is highly controversial and any progress (or resultant cure) is laden with ethical and moral questions.

A second group skirted the issue by converting readily available adult cells (mostly from blood or bone marrow) into cells needed to treat disease.

The successes of this latter group have far outstripped those of the former, but news of scientific gains was often lost amid the overbearing voices of those who have no qualms about taking a human life to save a human life, and who believe research should be viewed from a strictly utilitarian perspective.

This debate has been driven not by science, but by the power of celebrity.

Michael J. Fox, Christopher Reeve and Mary Tyler Moore have all made heart-rending pleas in support of embryonic stem cell research as a panacea for Parkinson's, spinal cord injuries and diabetes, respectively.

Ronald Reagan's wife, Nancy, also took up the call in the wake of his death from Alzheimer's. All of this produced much public sympathy, but it also created the mistaken impression that all hope lay with embryonic cell research.

President George W. Bush was accused of withholding government money for embryonic stem cell research when, in fact, embryonic research had billions of government and private money at its disposal.

In 2001, Bush directed the majority of federal money toward adult stem cell research and, as unpopular as this may sound, he was right.

Bush backed the right science and made a courageous stand for morality and ethics in medicine.

In contrast, Governor Arnold Schwarzenegger may have some explaining to do.

He successfully encouraged voters to support his 2004 proposal for California to borrow $3 billion over 10 years to invest in embryonic stem cell research.

New Jersey voters just rejected a similar proposal (on a far smaller scale), even though their governor had promised investment returns of 16,000 per cent.

Embryonic stem cell research isn't over. But it will be difficult to justify the wholesale creation, and destruction, of human embryos, and the expense required to obtain the very same cells that are now readily available by running a cotton swab over the inside of a cheek.

The principle behind Ockham's razor stands: The simplest solution is the best.

Scottish professor Ian Wilmut, known as a leader in human cloning research and the infamous creator of Dolly (the first cloned sheep), says reprogramming "represents the future for stem cell research" and has abandoned his work utilizing other techniques.

In this case, it is a bonus that the simplest and best solution also preserves our moral obligations to upholding human dignity and life. Our society and our own humanity would surely suffer if we were to abandon those obligations just so we could live our modern lives with less pain and sorrow.

Susan Martinuk's column appears every Friday.
http://www.canada.com/components/print. ... d0eda41e96
***
Astronomer ponders ET's perspective
Alien's view of Earth helps in telescope design

Tom Spears
CanWest News Service


Friday, January 04, 2008



CREDIT: Herald Archive, AFP-Getty Images
University of Florida astronomer Eric Ford said he used space views of Earth, as an alien would see our world, to figure out how to design a space telescope to pick out Earth-like planets throughout the cosmos.

Astronomers from Boston and Florida have reversed the usual search for life on other planets, asking instead: What would we look like if ET, the extraterrestrial, pointed a telescope toward Earth?

Surprisingly, they say, even a distant view of Earth would show we're a likely place to harbour life, even if we showed up only as a single faint dot on some telescope in another solar system.

That's because our brightness -- the reflection of sunlight off our planet -- would grow brighter and dimmer in a pattern that shows the presence of land and oceans, our 24-hour day, and even clouds, they say.

This in turn teaches an excellent lesson in how we should turn the next space telescope toward distant planets, the astronomers believe.

At the University of Florida, astronomer Eric Ford had been wondering for a long time what kind of space telescope should follow Hubble and the other current ones.

"The reason we're doing this is to plan for future (telescope) missions, where we want to look for Earth-like planets," he said.

Searching for planets really needs a space telescope, he said. Observatories on Earth have to look through our atmosphere and the view is too blurry to see an Earth-sized planet far away.

He started figuring out what an eight-metre telescope would see.

(That means the mirror collecting light is eight metres wide -- large, though not the biggest that observatories have today. Coincidentally, that's the biggest size one could squeeze into the space shuttles, or a later launch vessel of similar size.)

Along with astronomers at the Massachusetts Institute of Technology and in the Canary Islands, he got to calculating whether the aliens could detect our home.

It was good news, as long as the aliens are friendly. Clouds reflect a lot of light. Land reflects a medium amount. Oceans don't reflect much at all.

That means that a spinning Earth will show ET a changing pattern of oceans and continents that repeats every 24 hours -- proving we don't always have one side toward the sun, frying, and the other side dark and freezing.

As well, little variations in the basic pattern would reveal clouds coming and going, showing it's warm enough to evaporate some water, but cool enough to have the liquid oceans, too.
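The signal the astronomers describe, a brightness pattern that repeats with the planet's rotation, can be illustrated with a toy model. The sketch below is a rough illustration, not the researchers' actual method: the albedo values and the two-landmass longitude map are invented for the example. It builds a synthetic light curve for a rotating planet and recovers the 24-hour day from the strongest Fourier component:

```python
import numpy as np

# Toy model of the disc-averaged reflectance of a rotating planet.
# Albedo values (ocean ~0.06, land ~0.3) and the longitude map are
# illustrative assumptions, not data from the study described above.

def light_curve(hours, period_h=24.0, n_bins=36):
    # Crude longitude map: a mostly-ocean planet with two landmasses.
    albedo = np.full(n_bins, 0.06)   # ocean baseline
    albedo[5:12] = 0.30              # a large continent
    albedo[20:23] = 0.30             # a smaller landmass
    # The visible hemisphere is a half-circumference window of bins
    # that slides as the planet rotates; average the albedo over it.
    phase = (hours / period_h) * n_bins
    flux = []
    for p in phase:
        idx = (np.arange(n_bins // 2) + int(p)) % n_bins
        flux.append(albedo[idx].mean())
    return np.array(flux)

# Ten days of observations, one brightness measurement per hour.
t = np.arange(0.0, 240.0, 1.0)
f = light_curve(t)

# Recover the rotation period from the strongest Fourier component.
spec = np.abs(np.fft.rfft(f - f.mean()))
freqs = np.fft.rfftfreq(len(f), d=1.0)   # cycles per hour
period = 1.0 / freqs[np.argmax(spec)]
print(f"recovered period: {period:.1f} hours")
```

A real analysis would also have to separate rotation from changing clouds and viewing geometry, but the principle is the same: a light curve that repeats every 24 hours betrays a rotating surface with contrasting terrain.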

"You may say 'Oh, big deal,' but if you did that for the rest of the solar system, there's no other planet that has continents and oceans," he said.

"This doesn't tell you there are little green men. It doesn't tell you there ever was or will be life. But it does tell you how to recognize a planet that has liquid and solids," he said.

Of the 240 planets found around other stars so far, none share these characteristics. That's partly because our current telescopes can mainly find hot planets very close to stars.

"The reason this is hard is that these (Earth-like planets) are sitting next to a very bright star. It's a bit like looking for a firefly next to stadium lights," Ford said.

NASA will launch its Kepler space telescope to search for Earth-like planets in 2009. A future telescope, called the Terrestrial Planet Finder, is still on the drawing board.

© The Calgary Herald 2008

http://www.canada.com/components/print. ... cb5a0fe06b

Post by kmaherali »

With venture capital cash in hand, 'YouTube for ideas' becomes a reality
By Tim Arango
http://www.iht.com/bin/printfriendly.php?id=9043331

Sunday, January 6, 2008
NEW YORK: In June 2006, Peter Hopkins, a civic-minded and idealistic 2004 Harvard graduate, trekked up to his alma mater from New York for a meeting with Lawrence Summers, the economist and former U.S. Treasury secretary.

Hopkins, who finagled the appointment through his friendship with Summers' assistant, had a business idea: a Web site that could do for intellectuals what YouTube, the popular video-sharing site, did for bulldogs on skateboards.

The pitch - "a YouTube for ideas" - appealed to Summers. "Larry, to his credit, is open to new ideas," Hopkins recalled recently. "He grilled me for two hours." In the age of user-generated content, Summers did have one worry: "Let's say someone puts up a porn video next to my macroeconomic speech?"

It took a while, but a year after that meeting, Summers decided to invest ("a few tens of thousands of dollars," he said, adding "not something I'm hoping to retire on") in the site, called Big Think, which officially makes its debut Monday after several months of testing.

Big Think (www.bigthink.com) mixes interviews with public intellectuals from a variety of fields, from politics to law to business, and allows users to engage in debates on issues like global warming and the two-party system. It plans to add new features as it goes along, including a Facebook-like application for social networking, and Hopkins said he would like the site to become a go-to place for college students looking for original sources.

"I've had the general view that there is a hunger for people my age looking for more intellectual content," said Summers, who resigned as Harvard president in 2006 after making controversial comments about the lack of women in science and engineering. "I saw it as president of Harvard when I saw CEOs come up to my wife and want to discuss Hawthorne." (His wife, Elisa New, is a professor of English at Harvard).

A handful of other deep-pocketed investors also decided to chip in, including Peter Thiel, the Silicon Valley venture capitalist and co-founder of PayPal, the online payments site; Tom Scott, who struck it rich by founding, and selling, the juice company Nantucket Nectars and now owns Plum TV, a collection of local television stations in wealthy playgrounds like Aspen, Martha's Vineyard and the Hamptons; the television producer Gary David Goldberg, who was behind the hit shows "Spin City" and "Family Ties"; and David Frankel, a venture capitalist who was the lead investor in Big Think.

Frankel said "the initial investors may put in more. I imagine we will go out and raise more money in the future."

Hopkins and his partner, Victoria Brown, germinated the idea for Big Think while working together at PBS on the "Charlie Rose Show" in 2006.

When they surveyed the landscape, Hopkins, 24, and Brown, 33, saw a vast array of celebrity and sophomoric video content (remember the clips of cats urinating in toilets that caused a sensation on YouTube?).

"Everyone says Americans are stupid - that's what we generally heard from venture capitalists" when trying to raise money, Hopkins said. Obviously, Hopkins and Brown felt differently - and the success of the business basically hinges on proving that Americans have an appetite for other kinds of content.

Of course, Hopkins and Brown are not the first to see the Internet as an opportunity to further public discourse. It was invented largely by academics, and numerous sites like Arts & Letters Daily, an offshoot of the Chronicle of Higher Education, seek to serve intellectuals.

Big Think's business model right now is rudimentary: attract enough viewers, then sell advertising.

So for the time being, money will be flowing one way - out the door. Over the past several months, Big Think's handful of producers, working out of a pod of desks in a New York office space, have amassed a library of about 180 interviews with leading thinkers, politicians and businesspeople like the Republican presidential candidate Mitt Romney, Supreme Court Justice Stephen Breyer and the Blackstone co-founder Pete Peterson. Many of the interviews were conducted in a closet-turned-studio in a back room off the kitchen.

The interview style, borrowed from the documentary filmmaker Errol Morris, makes the interviewer almost an afterthought. The person asking the questions sits in an even smaller closet behind a shower curtain, and the subject hears the questions from a closed-circuit monitor. The finished product eliminates the interviewer's voice, and the questions appear as text on the screen.

"The whole idea is really to take the interviewer out of the equation," Hopkins said. "It allows people to be very candid. Pete Peterson went on about how his mother never loved him. It was like he was coming in for his last testament."

Tom Freston, the former chief executive of Viacom, has shown little interest in publicly reflecting on his 2005 firing by Sumner Redstone, the Viacom chairman. But he agreed to discuss it with Big Think, saying during an interview, "Say if you're a CEO of a public company, a lot of it you're playing defense. You're dealing with problems or crises. At the moment, in the smaller life I have for myself I've got a lot less of that, which is a good thing."

Videos stockpiled over the past months will be rolled out piecemeal and used in a variety of ways. For example, the site may pose the question "are two parties enough?" and assemble clips from U.S. politicians like John McCain and Dennis Kucinich. Readers would then have an opportunity to submit their own views.

"The idea behind Big Think is that you do have to sit down for a few minutes and listen to people who know more than you do," Hopkins said.

Hopkins expects his site will naturally appeal to secular Eastern intellectuals, but he wants to challenge their secularism with sections on faith and love and happiness. "There's a ton of evangelicals," including an interview with Rick Warren, the pastor and best-selling author of "The Purpose Driven Life."

"People, whether or not they believe in God, these issues really resonate," said Hopkins. "Look at the success of 'The Secret' and 'The Purpose Driven Life.' "

Post by kmaherali »

Embryo cloning should cease
Research should focus on work that doesn't compromise ethics

Calgary Herald

Sunday, January 20, 2008

It was exactly two months ago that scientists working on independent projects at Japan's Kyoto University and the University of Wisconsin came up with a way to obtain stem cells without using embryos.

So it is rather strange that Stemagen Corp. of La Jolla, Calif., claims its researchers have successfully cloned human embryos and that their work will lead to great strides in the stem cell field. The truth is, it won't -- and it shouldn't go forward. There's no need for it, with the dazzlingly simple technique developed by the Japanese and American researchers, which involves reprogramming four genes in human skin cells. The reprogramming is like reformatting a computer disk, making the cells ready to be "initialized" to create whatever specific cell is needed.

The California researchers say they cloned five embryos at the blastocyst stage. According to the Advanced Fertility Center of Chicago, a blastocyst is an embryo that "has developed to the point of having two different cell components and a fluid cavity. Human embryos . . . usually reach the blastocyst stage by five days after fertilization."

A blastocyst is a pre-embryonic stage that occurs before the developing mass of cells normally implants itself in the wall of the uterus, to continue developing into a fetus.

The scientists in Japan and Wisconsin say their method can produce stem cells genetically matched to the donor, without the morally repugnant processes of cloning or creating and then killing human embryos for their stem cells.

There is more than a tinge of the Frankensteinish to the La Jolla lab's work. No stem cells could be taken from the blastocysts because during the scientific process of verifying that these were actually clones, the stem cells were destroyed.

These embryos, then, appear to have been created solely for the purpose of seeing if it could finally be done, to achieve some sort of misguided prestige in the scientific community at being the first to attain such a dubious distinction.

Dr. Douglas Melton, co-director of the Stem Cell Institute at Harvard University, calls the Japanese and American researchers' method "ethically uncomplicated."

There is no need for the La Jolla lab, or any other organization for that matter, to continue trying to clone human embryos; they should redirect their work in accordance with the latest development, instead.

Certain things need to firmly remain taboo in an enlightened society. One of them is performing ghoulish replications of human embryos just for the sake of it, and the other is creating human life, only to kill it for the purposes of science.

Research has come up with a new approach that liberates stem cell work from its moral shadows. It should be embraced for its impeccable ethics.

© The Calgary Herald 2008

Post by kmaherali »

http://www.abc4.com/mediacenter/local.a ... eoID=61147

Merry Go Round Empower Playground

The Empower Playground uses kid power to generate enough energy to light a school's classrooms. The project was designed by Brigham Young University engineering students. The first system will be installed in a community in Ghana.



Video story from ABC4 News in Utah.

Post by kmaherali »

Patient uses own stem cells for new jaw

Sami Torma
Reuters


Saturday, February 02, 2008


Scientists in Finland said they had replaced a 65-year-old patient's upper jaw with a bone transplant cultivated from stem cells isolated from his own fatty tissue and grown inside his abdomen.

Researchers said on Friday the breakthrough opened new ways to treat severe tissue damage and made the prospect of custom-made living spare parts for humans a step closer to reality.

"There have been a couple of similar-sounding procedures before, but these didn't use the patient's own stem cells that were first cultured and expanded in laboratory and differentiated into bone tissue," said Riitta Suuronen of the Regea Institute of Regenerative Medicine, part of the University of Tampere.

She told a news conference the patient was recovering more quickly than he would have if he had received a bone graft from his leg.

"From the outside nobody would be able to tell he has been through such a procedure," she said.

She added that the team used no materials from animals, preventing the risk of transmitting viruses that can be hidden in an animal's DNA, and that it followed European Union guidelines.

Stem cells are the body's master cells and they can be found throughout blood and tissues. Researchers have recently found that fat contains stem cells which can be directed to form a variety of different tissues. Using a patient's own stem cells provides a tailor-made transplant that the body should not reject.

© The Calgary Herald 2008

Post by kmaherali »

Clock accurate for 200 million years
Atomic instrument vies to be world's most accurate

Julie Steenhuysen
Reuters


Saturday, February 16, 2008


U.S. physicists have made a clock so accurate it will neither gain nor lose even a second in more than 200 million years, a finding sure to please even the most punctually minded.

The clock, described in the Friday issue of the journal Science, outperforms the official atomic clock used by the U.S. Commerce Department's National Institute of Standards and Technology, which promises to keep accurate time down to the second for 80 million years.
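The quoted accuracy figures translate directly into fractional frequency uncertainties: a clock that drifts by one second over a given span is off by one part in that span expressed in seconds. The back-of-the-envelope sketch below is illustrative arithmetic only; the function name and the 365.25-day year are assumptions made for the example:

```python
# Back-of-the-envelope check of the accuracy figures quoted above.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def fractional_error(years_per_second_lost):
    # A clock that gains or loses 1 second over the given span has a
    # fractional frequency uncertainty of 1 / (span in seconds).
    return 1.0 / (years_per_second_lost * SECONDS_PER_YEAR)

jila = fractional_error(200e6)   # the new strontium clock
nist = fractional_error(80e6)    # the official NIST standard
print(f"JILA clock: ~{jila:.1e}")   # roughly 1.6e-16
print(f"NIST clock: ~{nist:.1e}")   # roughly 4.0e-16
```

In other words, "one second in 200 million years" means the clock's frequency is known to better than two parts in ten quadrillion, a few times tighter than the official standard it outperforms.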

The new atomic clock is vying for the title of world's most accurate with another experimental clock developed in the same lab at the Joint Institute for Laboratory Astrophysics, a collaboration between NIST and the University of Colorado in Boulder.

"These clocks are improving so rapidly that it is impossible to tell which one will be the best," said Tom O'Brian, head of the Time and Frequency Division at NIST.

Such highly precise clocks are critical for deep space navigation, where even a slight error can make or break a space mission.

The secret to making an extremely accurate clock is speeding up how fast it ticks. "If you make a mistake, you can know about that mistake very fast," said Jun Ye, who developed the atomic clock at JILA.

Ye's clock has 430 trillion "ticks" per second.

Its pendulum uses thousands of strontium atoms suspended in grids of laser light. This allows the researchers to trap the atoms and measure the movement of energy inside.

"Essentially, we are probing the energy structure of the atom. We are probing how electrons make transitions between a set of energy levels," Ye said in a telephone interview.

"This is the time scale that was made by the universe. It is very stable."

To test his clock's accuracy, Ye and colleagues compared it with another optical atomic clock -- this one measuring calcium atoms.

***
Spacewalkers connect solar observatory

Ed Stoddard and Irene Klotz
Reuters


Saturday, February 16, 2008



CREDIT: NASA video, Agence France-Presse, Getty Images
Atlantis mission specialist Stanley Love moves the European Technology Exposure Facility from the space shuttle's cargo bay on Friday for installation on the International Space Station.

Two shuttle Atlantis astronauts wrapped up a spacewalk Friday to install a solar observatory and a science experiment on Europe's space lab.

The Columbus module, the European Space Agency's $1.9 billion permanent space laboratory, was launched aboard NASA's Atlantis last week and connected to the International Space Station Monday.

The solar observatory contains instruments which will, among other things, measure aspects of the sun's energy and help scientists decipher the impact of solar activity on Earth's climate.

The other facility attached to Columbus' hull will be used to conduct a range of space-related experiments. These include exposing lichen and fungi for around 1 1/2 years to space conditions to test the limits of their survival.

Another will evaluate the effects of space on different materials which may be used on spacecraft in low earth orbit.

"The aim is to improve components and materials for spacecraft design," Alan Thirkettle, the ISS program manager for the European Space Agency, told Reuters.

Lead spacewalker Rex Walheim and partner Stanley Love flew out of the station's airlock about 8:15 a.m. EST to begin the third and final outside excursion planned during Atlantis' nine-day visit. It lasted almost 7-1/2 hours.

They also picked up a broken gyroscope and did some inspection work on a hand rail outside the airlock, but did not have time to examine a contaminated solar wing joint that has hampered station operations since October.

© The Calgary Herald 2008
Post Reply