Monday, December 8, 2008

The Cataclysmic Death of Stars


Ever since he was a teenager, Stan Woosley has had a love for chemical elements and a fondness for blowing things up. Growing up in the late 1950s in Texas, "I did everything you could do with potassium nitrate, perchlorate, and permanganate, mixed with a lot of other things," he says. "If you mixed potassium nitrate with sulfur and charcoal, you got gunpowder. If you mixed it with sugar, you got a lot of smoke and a nice pink fire." He tested his explosive concoctions on a Fort Worth golf course: "I screwed the jar down tight and ran like hell."
Woosley, now an astronomer at the University of California, Santa Cruz, has graduated to bigger explosions—much bigger. He studies some of the most powerful explosions since the birth of the universe: supernovas, the violent deaths of stars.

The universe twinkles with these cataclysms. They happen every second or so, usually in some unimaginably remote galaxy, blazing as bright as hundreds of billions of stars and creating a fireball that expands and cools for months.

We're lucky that they rarely strike close to home. The last supernova seen in our own galaxy exploded in 1604, rivaling Jupiter's brightness in the night sky and deeply impressing Johannes Kepler, the pioneering astronomer. A nearby supernova—within a few light-years—would bathe the Earth in lethal radiation.

Yet the legacy of supernovas is as close as our own bodies. The carbon in our cells, the oxygen in the air, the silicon in rocks and computer chips, the iron in our blood and our machines—just about every atom heavier than hydrogen and helium—was forged inside ancient stars and strewn across the universe when they exploded billions of years ago. Eager to understand our origins and, in some cases, simply wild about things that go bang, astronomers have been struggling for decades to understand why stars that shine peacefully for millions of years suddenly blow up.

Lately they've had two big breaks. One is a revelation about potent blasts of high-energy gamma rays that come from distant points in the heavens. For decades astronomers have puzzled over their origins, but space probes recently clinched the answer, which Woosley proposed more than a decade ago: Many gamma-ray bursts are the early warning signals from supernovas, emitted minutes before the explosion.

The link offers a glimpse of events leading up to the actual explosion—another mystery. There, too, researchers have made headway. Looking not at the heavens but at computer models of supernovas, some think they have figured out what may trigger the final cataclysm. The missing element may be unimaginably powerful reverberations—the sound of a star singing its own swan song.

For astronomers, there's usually no rush to study something before it vanishes. "The universe usually evolves as slowly as watching paint dry," says one. But these days, hundreds of astronomers keep cell phones and beepers close by so they can rush to work like doctors on call. They're waiting for word from a spacecraft called Swift.

Swift, launched in 2004, scans the skies for gamma rays. When it detects a burst, it swivels its telescopes toward the source to get a good fix and detect the afterglow—the lingering point of light that marks the spot where a burst originated. It also sends an alert to earthbound astronomers, who can take a closer look with bigger telescopes.

Early on February 18, 2006, Swift recorded an outpouring of gamma rays from somewhere toward the constellation Aries. Within three minutes, the satellite had determined the position of the burst and broadcast an alert. Two days later, astronomers at a telescope in Arizona reported that the burst came from a small, nearby galaxy, only a fraction as far away as usual.

Astronomers had already traced a connection between bursts and supernovas. But this burst was so close, and Swift had spotted it so quickly, that scientists hoped it would help confirm what they suspected: A gamma-ray burst is an exploding star's opening act.

After an unusually long flood of gamma rays and x-rays, lasting more than half an hour rather than the typical few seconds, the February 18 burst gave way to visible and infrared light. Within three days this afterglow was fading away—and then the supernova grabbed the spotlight.

Astronomers at the Very Large Telescope in northern Chile were watching the afterglow dwindle when they noticed a brightening. The star had exploded just a minute or so after the burst, but most of its energy was invisible ultraviolet and x-ray radiation. Its visible light had brightened more slowly, and now it was finally outshining the afterglow. For the first time, astronomers had seen a gamma-ray burst evolve into a supernova from the very beginning.

Eighteen days after the supernova flared into view, astronomers were still watching. Atop Palomar Mountain in southern California, the observatory dome's twin shutters slid open under patchy clouds, letting a sliver of night sky fall onto the caged mirror of the 200-inch (508-centimeter) Hale Telescope. Caltech astronomer Avishay Gal-Yam had two hours before the supernova would dip too low in the sky for the telescope to see it.

Still more luminous than a billion suns, the supernova outshone the combined light from all the stars in its home galaxy, glowing white-hot from the radioactive decay of unstable nickel atoms forged in the explosion. Gal-Yam pointed to a computer screen showing a squiggly line—the glow broken down into its component colors, or wavelengths. Each dip in the line represented a wavelength of light absorbed by a different element—silicon, cobalt, calcium, iron—in the debris of the star.

Destruction and creation were conjoined on the screen. The elements revealed there, like those from countless earlier supernovas, will eventually find their way into new stars and perhaps new planets, Gal-Yam said. He added: "I'm just really happy to be observing this."

The star had begun its race to destruction long before that night on Palomar, when it began to lose a lifelong fight against gravity. Gravity is responsible for setting newborn stars aflame, by squeezing atoms of hydrogen in the star's core so tightly that they fuse to make helium. The fusion generates light and heat and also exerts pressure that allows the core to withstand the enormous weight of the star's outer layers.

But when the core consumes all of its hydrogen, gravity compresses it. The temperature of the shrinking core rises to about a hundred million degrees, hot enough for helium nuclei to fuse and make carbon. The new surge of energy keeps the core from collapsing much further.

For an isolated star no heavier than the sun, there is little more to the story. The star burns all of its helium and shrivels. It turns into a white dwarf about the size of Earth, aging and cooling indefinitely—unless it lies close enough to another star to steal its neighbor's outer layers of hydrogen. If enough material falls onto the white dwarf, the siphoned fuel ignites a thermonuclear explosion. As the detonation spreads, the entire star blows up in what is known as a type Ia supernova—a giant nuclear bomb.

The supernova blossoming over Palomar was a different kind: not a thermonuclear blast but a star's catastrophic collapse. This is the only kind of supernova that can unleash a gamma-ray burst, and it is the inevitable fate of a star more than eight times as massive as the sun.

Such heavyweight stars always lose their battle with gravity. With the crushing weight of the star's outer layers bearing down on its core, the fusion reactions don't stop at carbon. The star continues to cook lighter nuclei into progressively heavier elements, but each nuclear reaction runs its course faster. The transformation from carbon to oxygen takes 600 years, from oxygen to silicon 6 months, from silicon to iron a day. Once the star's core turns to solid iron—a sphere no bigger than Earth that weighs as much as the sun—its fate is sealed. In less than a second, the star will explode.

Iron marks the end of the road because unlike lighter elements, iron atoms consume rather than create energy when they fuse. Fusion can no longer provide the energy to support the star's outer layers, and the core simply implodes. Usually the result is a neutron star, a stellar cinder so dense a teaspoon would weigh more than a billion tons. In the most massive stars the collapse leaves only a voracious pit called a black hole.
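
The oft-quoted teaspoon figure holds up to back-of-the-envelope arithmetic. The density below (about 4 × 10¹⁷ kilograms per cubic meter, a typical textbook estimate for a neutron star) and the teaspoon volume are illustrative assumptions, not numbers from the article:

```python
# Rough check of the "teaspoon of neutron star" claim.
# Density is an assumed typical neutron-star value; real estimates vary.
density_kg_per_m3 = 4e17        # roughly nuclear density
teaspoon_m3 = 5e-6              # one teaspoon is about 5 milliliters
mass_kg = density_kg_per_m3 * teaspoon_m3
mass_tons = mass_kg / 907.0     # short tons (~907 kg each)
print(f"{mass_tons:.1e} tons")  # comes out to a few billion tons
```

That works out to roughly two billion tons per teaspoon—comfortably "more than a billion."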

At this point, Woosley believes—before the collapse somehow turns into an explosion—some supernovas unleash a blast of gamma rays. Woosley's interest in these bursts goes back decades, to a time when they were so mysterious that over a hundred more or less serious ideas about their cause were in play, from "starquakes" to the exhaust plumes of alien spacecraft. But his fascination deepened in the early 1990s, when a spacecraft called the Compton Gamma-Ray Observatory showed that gamma-ray bursts originate far beyond our galaxy. To appear as bright as they do, they had to be more energetic than anyone had imagined—far brighter than supernovas, Woosley's first love.

They also needed a source of energy far beyond what any ordinary star could provide. Perhaps the cataclysmic jolt of a collapsing star could somehow be harnessed to produce gamma rays. So Woosley set out to determine how a core-collapse supernova could generate a burst.

He and his collaborators, including Andrew MacFadyen of New York University, stage their explosions in computers. They start with a whopper of a star, about 40 times the mass of the sun, spinning so fast—several hundred miles a second at the equator—that it barely keeps from flying apart. Near the end of its life, unable to resist the pull of its own gravity, the core of the star collapses to make a black hole. But because the star has so much spin, some of the infalling material resists the tug of the newborn black hole. A swirling disk of material forms around the hole—a maelstrom deep within the doomed star.

"Rotation is the name of the game," says Woosley. Without spin, there would be no disk. And without a disk, there'd be no burst. Friction heats the disk, whipping around the black hole thousands of times a second, to 40 billion degrees Fahrenheit (22 billion degrees Celsius), while new material keeps cascading in. Moments after the black hole forms, jets of superheated gas blowtorch outward.

Each jet may draw its energy directly from the friction in the disk, or from the newborn black hole, via the magnetic fields that link it to its surroundings. Like the original star, the black hole spins frenetically, which could cause the fields to stretch, twist, and snap like rubber bands, dumping vast amounts of energy into the disk.

Either way, the jet shoots outward, reaching the surface of the star in a mere ten seconds. If the star has retained its original, puffy envelope of hydrogen gas, the jet stops dead and the gamma-ray burst may fizzle. But if the powerful winds that blow from some massive stars have stripped away the hydrogen earlier in the star's life, the jet escapes, arrowing into space at more than 99 percent of the speed of light.

Now comes the burst: High-speed collisions between blobs of material in each jet produce a cascade of speedy electrons. The electrons whirl around the jet's magnetic fields, flinging out gamma rays. Over many days, as the jet plows into the thin gas between the stars, it generates an afterglow at visible, infrared, and radio wavelengths.

The February 2006 burst was dimmer than most, perhaps because the star was not massive enough to form a black hole. Woosley suggests that the same sequence of events—an implosion, a spinning disk, jets—can still happen when the stellar collapse ends with the formation of a fast-spinning neutron star rather than a black hole.

Even after the jets have erupted, the star has not yet exploded. "The jet gets to the surface of the star minutes beforehand," says Woosley. "The burst is a herald of the supernova."

It's not enough, however, to cause the explosion. "Just running a jet through a star won't make a very good supernova," says Woosley. "It will unbind some of the star, but most of it will fall back." To make a collapsing star explode, he says, "there needs to be something else."

In the stars that launch gamma-ray bursts, the spinning black hole and the disk may pump out enough energy to blow the star apart. But in most collapsing stars, the collapse ends when the Earth-size core crunches into a neutron star the size of a city, at a temperature of a hundred billion degrees Fahrenheit (55 billion degrees Celsius). This is the point of maximum scrunch. The squeezed core rebounds like a squished sponge, launching a shock wave that races outward, ramming into the material that is still pouring down from the star's outer layers.

Astronomers once thought this shock would be enough to tear the star apart and generate the explosion, says Adam Burrows of the University of Arizona. Turns out it's not so simple.

Simulating a supernova gobbles enormous amounts of computer power, and even the largest supercomputers can't fully reproduce an exploding star in three dimensions. But over the years the models have improved, and the shock wave scenario has fallen apart.

Researchers found that less than a thousandth of a second after the shock wave is generated, a flood of tiny, nearly massless particles called neutrinos escapes from the center of the star. The neutrinos, born in the collapsing core, drain energy from the shock wave. The shock stalls, and—at least in the computer—the supernova is a dud.

Now Burrows and his colleagues are working with a computer model powerful enough to simulate how the core shakes and churns during the collapse, and they've finally seen how a collapsing star could turn around and explode. The turbulent infalling gas starts shaking the core, causing it to pulsate. Raining down from the star's outer layers, the gas wraps around the core, dancing over its surface and penetrating its depths.

"The core is oscillating, and the stuff falling onto the core is exciting it," says Burrows. In about eight-tenths of a second, the oscillations are so intense they send out sound waves. The waves exert a pressure that expels material, reinforcing the shock wave created by the star's collapse. They also amplify the core's vibrations in a runaway reaction, says Burrows, "until the star finally explodes."

For someone brave enough to come within hearing distance, the waves would be audible, roughly the F note above middle C.
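
For the curious, "the F note above middle C" pins the pitch near 350 hertz. A quick equal-temperament calculation, assuming the standard A4 = 440 Hz tuning reference, gives the figure:

```python
# Frequency of F4 (the F above middle C) in equal temperament,
# relative to the standard A4 = 440 Hz tuning reference.
A4 = 440.0
semitones_from_A4 = -4          # A4 -> G#4 -> G4 -> F#4 -> F4
f4 = A4 * 2 ** (semitones_from_A4 / 12)
print(f"F4 is about {f4:.0f} Hz")
```

That puts the dying star's hum at roughly 349 Hz.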

Burrows acknowledges that sound waves may not be the full story. But his model tends to produce a lopsided explosion, and stars do indeed explode asymmetrically, with more punch in some directions than others. That was true for supernova 1987A, recorded 20 years ago, the closest and brightest supernova since 1604. Astronomers also have found that some of the neutron stars left behind by supernovas zip along at 500 miles a second (800 kilometers a second), as if the explosion had imparted an enormous kick in one direction.

Stronger evidence for the sound wave idea could come from two sprawling facilities, in Hanford, Washington, and Livingston, Louisiana, designed to detect gravitational waves—ripples in the fabric of space and time. Gravitational waves, predicted by Einstein's theory of general relativity but never directly observed, should be produced whenever immense masses shake and twist, as they do in the core of a supernova.

If sound waves really are at work inside a collapsing star, it should vibrate only at certain frequencies, generating matching gravitational waves. Burrows calculates that for a supernova in or near our galaxy, the existing detectors could pick up these signals—clues to a big, big noise.

Stars, it seems, really may go kaboom. Woosley, still in love with pyrotechnics, is delighted. "It's like God built the universe just for me."

NASA Delays Mars Probe

NASA is dumping plans for next year's launch of an ambitious science probe to the surface of Mars to search for signs of life.

The new plan, outlined during a press conference on Thursday, is to fly in 2011, the next time Earth and Mars are favorably positioned.

“If we could delay the launch for a few months we would, but launch opportunities don’t allow that,” said NASA chief Michael Griffin.

The problem revolves around motors in a system being designed to lower the Mars Science Laboratory onto the planet’s surface.

The delays will add another $400 million or more to the probe’s already-ballooned $1.9 billion price tag. Managers expect to delay other Mars probes, and if necessary, other projects in NASA’s space science portfolio, to cover the costs. Those folks weren't at the press conference.

PRI's The World: Technology Podcast 222


To be honest, I've always given myself wide latitude when it comes to choosing what goes into my weekly "tech" podcast. Often, when a major public health story comes along, I find it irresponsible not to pass it along to listeners. And so, the highlight of this week's Tech Podcast (WTP 222) is "Grace," a woman in Ivory Coast who is pictured here holding her AIDS medication. We started with some basic questions. Where were those pills made? How did they get to Grace? And would it be possible to trace a shipment of pills from their point of origin and into her hands? The World's Health and Science Editor, David Baron, takes you on that journey. It's fascinating. It's informative. And there are pictures. We follow David's reported piece with an interview with Elizabeth Pisani, an epidemiologist and author of the book The Wisdom of Whores. Pisani weighs in on PEPFAR (The President's Emergency Plan for AIDS Relief), the Bush Administration's multi-billion dollar effort to tackle the disease in the developing world.

Not to worry, though, because there's plenty of tech in the WTP this week as well. We go to Croatia and hear about how several opposition activists were arrested recently after they used the social networking site Facebook to, well, poke a bit of fun at the current Prime Minister, and to organize protests against him. Did the online protest translate into boots on the streets of Zagreb? Listen in, and find out.

Next up, we speak with Drew Cogbill, a student at Parsons The New School for Design in NYC. For his thesis, he wanted to combine his twin loves of design and technology in pursuit of a social networking set-up for people without access to the Internet, but with access to phones. His answer? Pigeon. My Pigeon User Number, should you want to add me as a contact, is 345-345.

We also return to CERN, outside of Geneva, to find out how and why the Large Hadron Collider suffered a magnet meltdown, and how scientists and engineers plan to fix the problems.

And we end with a global spin on a story you've probably heard about. President Elect Obama likes his BlackBerry...a lot. But he may have to give it (and maybe even email) up because of national security and privacy concerns. So, we go in search of other world leaders who really, really like their tech. Fun. And it all came courtesy of emails (clark.boyd [at] bbc.co.uk), Facebook messages and Tweets from listeners to the WTP. Thanks!

Humans 80,000 Years Older Than Previously Thought?


Modern humans may have evolved more than 80,000 years earlier than previously thought, according to a new study of sophisticated stone tools found in Ethiopia.

The tools were uncovered in the 1970s at the archaeological site of Gademotta, in the Ethiopian Rift Valley. But it was not until this year that new dating techniques revealed the tools to be far older than the oldest known Homo sapiens bones, which are around 195,000 years old.

Using argon-argon dating—a technique that compares different isotopes of the element argon—researchers determined that the volcanic ash layers entombing the tools at Gademotta date back at least 276,000 years.
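
The age calculation behind argon-argon dating (and its parent technique, potassium-argon dating) rests on simple exponential decay: radioactive potassium-40 in the ash turns into argon-40 at a known rate, so the ratio of the two gives the time since the ash cooled. The sketch below uses the simplified potassium-argon form with standard decay constants; the isotope ratio is an illustrative number, not data from the study:

```python
import math

# Simplified potassium-argon age equation (argon-argon dating is a
# refinement of this).  Decay constants are standard published values;
# the Ar/K ratio below is illustrative, not data from the study.
LAMBDA_TOTAL = 5.543e-10   # total decay constant of 40K, per year
LAMBDA_EC    = 0.581e-10   # branch of 40K decays that produce 40Ar

def k_ar_age(ar40_over_k40):
    """Age in years from the ratio of radiogenic 40Ar to remaining 40K."""
    return (1.0 / LAMBDA_TOTAL) * math.log(
        1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * ar40_over_k40)

# A ratio of ~1.6e-5 corresponds to roughly the age of the Gademotta ash:
print(f"{k_ar_age(1.604e-5):,.0f} years")
```

With that ratio, the equation returns an age of about 276,000 years—the figure reported for the ash layers entombing the tools.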

Many of the tools found are small blades, made using a technique that is thought to require complex cognitive abilities and nimble fingers, according to study co-author and Berkeley Geochronology Center director Paul Renne.

Some archaeologists believe that these tools and similar ones found elsewhere are associated with the emergence of the modern human species, Homo sapiens.

"It seems that we were technologically more advanced at an earlier time than we had previously thought," said study co-author Leah Morgan, from the University of California, Berkeley.

The findings are published in the December issue of the journal Geology.

Desirable Location

Gademotta was an attractive place for people to settle, thanks to its proximity to fresh water in Lake Ziway and access to a source of hard, black volcanic glass known as obsidian.

"Due to its lack of crystalline structure, obsidian glass is one of the best raw materials to use for making tools," Morgan explained.

In many parts of the world, archaeologists see a leap around 300,000 years ago in Stone Age technology from the large and crude hand-axes and picks of the so-called Acheulean period to the more delicate and diverse points and blades of the Middle Stone Age.

At other sites in Ethiopia, such as Herto in the Afar region northeast of Gademotta, the transition does not occur until much later, around 160,000 years ago, according to argon dating. This variety in dates supports the idea of a gradual transition in technology.

"A modern analogy might be the transition from ox-carts to automobiles, which is virtually complete in North America and northern Europe, but is still underway in the developing world," said study co-author Renne, who received funding for the Gademotta analysis from the National Geographic Society's Committee for Research and Exploration.

Morgan, of UC Berkeley, speculates that the readily available obsidian at Gademotta may explain why the technological revolution occurred so early there.

Complicated Family Tree

The lack of bones at Gademotta makes it difficult to determine who made these specialist tools. Some archaeologists believe it had to be Homo sapiens, while other experts think that other human species may have had the required mental capability and manual dexterity.

Regardless of who made the tools, the dates help to fill a key gap in the archaeological record, according to some experts.

"The new dates from Gademotta help us to understand the timing of an important behavioral change in human evolution," said Christian Tryon, a professor of anthropology at New York University, who wasn't involved in the study.

If anything, the story has now become more complex, added Laura Basell, an archaeologist at the University of Oxford in the U.K.

"The new date for Gademotta changes how we think about human evolution, because it shows how much more complicated the situation is than we previously thought," Basell said.

"It is not possible to simply associate specific species with particular technologies and plot them in a line from archaic to modern."

Sunday, December 7, 2008

Turning an Energy Hog Into Bacon



One of the biggest offenders on an electric bill is the refrigerator, which can cost $150 annually to power if it's a clunky old one from the 1970s. That inefficiency adds up, unnecessarily stressing the grid. In the UK, the giant utility Npower and clean tech firm RLtec are launching a "smart fridge" trial to tackle this chilly problem.
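
The $150 figure squares with a simple kilowatt-hour calculation. The consumption and rate below are assumed round numbers for a 1970s-era fridge and an average residential electricity price, not figures from the trial:

```python
# Rough annual cost of an old, inefficient refrigerator.
# Both numbers are illustrative assumptions.
kwh_per_year = 1800        # a 1970s fridge can draw this much or more
usd_per_kwh = 0.085        # ballpark residential electricity rate
annual_cost = kwh_per_year * usd_per_kwh
print(f"${annual_cost:.0f} per year")
```

A modern Energy Star fridge, by contrast, draws a fraction of that.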

The trial will start next year with 300 fridges equipped with dynamic demand technology that adjusts in real time to changing grid conditions without affecting performance. Before the end of next year, the program will expand to include 3,000 fridges.
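
Dynamic demand works by watching the grid itself: mains frequency sags below its nominal value (50 Hz in the UK) when demand outstrips supply, so an appliance can defer its compressor cycle at exactly those moments. Here's a minimal sketch of the idea; the thresholds and temperatures are invented for illustration, and real controllers such as RLtec's are far more sophisticated:

```python
# Minimal sketch of a dynamic-demand decision rule for a fridge.
# Thresholds and temperatures are invented for illustration.
NOMINAL_HZ = 50.0          # UK grid frequency
LOW_HZ     = 49.9          # below this, the grid is under stress
MAX_TEMP_C = 6.0           # never let food warm past this point

def compressor_should_run(grid_hz, fridge_temp_c, target_temp_c=4.0):
    if fridge_temp_c >= MAX_TEMP_C:
        return True        # food safety always wins
    if grid_hz < LOW_HZ:
        return False       # defer cooling while the grid is stressed
    return fridge_temp_c > target_temp_c

print(compressor_should_run(49.85, 5.0))   # grid stressed, fridge cold enough
print(compressor_should_run(49.85, 6.2))   # too warm: cool regardless
```

Because the fridge's thermal mass buffers short delays, this shifting happens "without affecting performance," as the trial claims.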

Npower and RLtec just got the green light yesterday after an Imperial College study showed that the technology has the potential to save the country $328 million on energy costs, not including the $1.1 billion that goes to balancing the country's grid annually. Ironically, dynamic demand was invented by American engineer Fred Schweppe in the late 1970s. Seeing as we're in a financial pickle stateside, it'd be cool to see widespread dynamic demand here. Heck, we're already upgrading our TVs.

No More Stupid Chargers


Looking at the jumble of unique chargers and power adapters competing for our electrical outlets, something clearly has to give. Billions of these incompatible devices are manufactured, sold, and ultimately tossed. Plus, they suck up electricity even after they're done charging, and it's hard to remember to unplug them.

The San Ramon, California-based company Green Plug is gearing up to help us with a smarter system. After lugging separate chargers to a friend's wedding, Green Plug founder and CEO Frank Paniagua Jr. envisioned attaching electronics--including larger devices such as computers--to a single, intelligent AC adapter. This green hub would plug into the wall and automatically detect each device's power demands, adjusting the power supply accordingly. Plus, it would turn itself off when not in use.

Green Plug isn't a device company--they make the technology that enables the system to work. To encourage widespread use, they've been licensing their technology to consumer electronics makers for free so future gizmos will be ready. Hard to argue with something useful that's free. Revenue comes from selling chips for the adapters.

So far Green Plug's system has just been a clever concept that's well-received at demos and in the press. However, a little birdie tells me that the company will be announcing a manufacturing partnership in a few weeks. The resulting intelligent adapters are scheduled to hit the market next spring, right in time for wedding season.

"Stem Cell Tourists" Go Abroad for Unproven Treatments


When Robert Ramirez was diagnosed with Parkinson's disease in 2006, his doctor gave him medication, but little hope: "He told me there wasn't much I could do except wait to 'pass to the other side.'"

Ramirez, a Peruvian-American mechanic in northern New Jersey, watched helplessly as his symptoms worsened. His left arm grew weak. His leg muscles went rigid. His trembling intensified.

Then his wife, Elvira, saw Jorge Tuma on a Peruvian news show. The Peruvian doctor, based in Lima, claimed to be treating Parkinson's and other diseases with injections of stem cells. They checked out his Web site: For $6,000, relief could be theirs. They booked a flight to Lima.

Ramirez is one of an increasing number of patients seeking stem cell therapies overseas—experts put the number in the thousands. And Tuma is one of dozens of non-U.S. doctors offering such treatments.

The trend is significant enough that the International Society for Stem Cell Research (ISSCR) released guidelines today for doctors using stem cells and would-be "stem cell tourists."

U.S. experts fear that some foreign doctors are rashly treating patients without waiting for clinical trials to validate the safety of their procedures.

"There are many doctors tapping into the public's sense of stem cells' potential to cure in countries with looser medical regulations," said Sean Morrison, director of the University of Michigan Center for Stem Cell Biology, and treasurer of ISSCR. "But the details of stem cell treatment are much more complicated."

Yet stem cell therapies are becoming a lucrative area of medical tourism, even though science has yet to divine their potential and controversy plagues the field.

The cells, found in embryos and certain adult body tissues, have the potential to grow into many different types of cells. But ethical issues surrounding the use of embryos as stem cell sources have slowed research in countries such as the U.S. and U.K.

Researchers in the U.S. are conducting clinical trials using both adult and embryonic stem cells to treat diseases, but the U.S. Food and Drug Administration has yet to license any such treatment.

The new ISSCR guidelines require that every treatment be evaluated by experts with no vested interest in the procedure. The guidelines also advocate for an informed consent process, which would provide patients with full information about their procedure, and transparency in reporting of clinical trial results.

The ISSCR's handbook for patients advises against experimental stem cell treatments—those that are not part of official clinical trials. It also lists warning signs for dubious treatments, such as the claim that multiple diseases can be treated with the same type of cell.

That's just one of many optimistic assertions being made by foreign doctors, often on Web sites.

Timothy Caulfield at the University of Alberta's Health Law Institute surveyed 19 sites proffering stem cell treatments, and released his findings today. Ten sites portrayed the treatments as "very ready for public access," rather than experimental.

Most people learn about stem cell treatment offerings from "direct-to-consumer advertising" on the Internet, he said.

"There is a mismatch between what is being offered and what the existing scientific literature says," Caulfield said. "The people offering the treatments are able to trade on two things: the genuine excitement about stem cell research and the social controversy around it."

Clinics charge on average U.S. $21,500 for stem cell treatment, the survey noted, but recent news reports indicate that clinics in China may be charging as much as $70,000.

Researchers have found that unproven stem cell treatments can also cause complications for patients. In 2006, neurologist Bruce Dobkin, of the University of California, Los Angeles, found that some patients contracted meningitis after operations for chronic spinal cord injuries.

Nervous system complications and infections have also been reported after the use of stem cells to treat blood diseases.

Caulfield said that not all doctors offering stem cell therapy are quacks, but he also believes that anyone selling treatment should be publishing data to back up their claims.

"There are still real scientific barriers to this research—even the top stem cell researchers at Stanford University and in the United Kingdom are struggling with clinical trials," Caulfield said.

In the Trenches

Tuma, the cardiologist sought out by Ramirez, the Parkinson's patient, promises to restore ailing organs and tissues using adult stem cells harvested from the patients' own bodies.

Since 2005, Tuma has treated some 600 patients—about a quarter from outside Peru—for a range of conditions from Parkinson's to Type 2 diabetes to emphysema.

His method: injecting the affected organ with stem cells from the patient's own bone marrow.

"I always tell my patients this is not a cure, but I believe it is a tremendous new alternative to improve quality of life," Tuma said.

He operated on Ramirez in October 2007 in a simple procedure lasting 45 minutes. Tuma extracted and prepared bone marrow cells from Ramirez's spine and injected them into an artery in the brain. There, Tuma said, they began to generate new cells that would inhibit the advance of Parkinson's.

Within a week, Ramirez said he began to notice his legs loosen. Then some strength returned to his left arm. He feels better than before the operation and his symptoms are less discernible.

"I can dance with my wife and live almost a normal life," Ramirez said. "I'm very grateful to Dr. Tuma."

Tuma's approach, like those of many doctors offering stem cell treatments, has not been sanctioned by his country's government.

Though he's published small-scale results of his heart therapies in journals including the Journal of Cardiac Failure and Cardiovascular Revascularization Medicine, he's yet to release papers on the efficacy of his other treatments.

According to Insoo Hyun, a professor of bioethics at Case Western Reserve University and the chairman of the ISSCR Stem Cell Guidelines Task Force, charging patients for unproven treatments is considered unethical.

"Either you're doing research or you're offering a proven therapy, but some of these stem cell doctors seem to want it both ways," Hyun said.

No Time to Wait

Timothy Henry, a cardiologist at the Minneapolis Heart Institute/Abbott Northwestern Hospital, is authorized by the FDA to conduct randomized clinical trials using adult stem cells for heart conditions. He has treated 150 people and said the early data are promising.

But the United States has lagged behind the rest of the world in the field, he admits, because of the ethical concerns raised over embryonic stem cell research.

"Adult stem cell research has been very challenging with all the misinformation and confusion about embryonic stem cells," Henry said.

Desperate patients like Ramirez are reluctant to wait on hard evidence and FDA approval, however.

Roberto Brenes is another doctor performing adult stem cell implants. He attracts patients to a clinic in San Jose, Costa Rica, through the Web site cellmedicine.com.

He and his colleagues have treated between 50 and 70 multiple sclerosis patients with stem cells taken from fat tissue, charging between $15,000 and $25,000, with what he said are "pretty good success rates."

While Brenes acknowledges that there has never been a clinical trial demonstrating the efficacy of stem cell therapy for MS, he said many patients do not want to wait.

"This area is going to progress a lot in the next ten to 15 years, but a lot of patients need therapeutic help now and want to go through with the procedure," Brenes said.

Even if it means expensive follow-up visits: Tuma has told Ramirez to see him every six months so that he can check on his progress. And Tuma said that if the Parkinson's symptoms return, another procedure might be needed.

"I know the therapy is not a complete cure, but I don't think it's dangerous and would do it again," Ramirez said.

The ISSCR's Morrison, however, remains skeptical.

"A lot of patients will spend $6,000 to buy hope, but that still doesn't make it right to sell snake oil."

'RoboClam' Anchor Holds Ships Steady


Ships have changed dramatically over the last few thousand years, but one piece of technology -- the heavy metal anchor -- has remained largely untouched. Now scientists have created a lightweight, cigarette-sized anchor that burrows into the sea floor and could hold anything from small unmanned submersibles to, perhaps, huge oil platforms.
The new anchor is based on one of nature's fastest diggers, the oblong razor clam, Ensis directus.
"It turns out that clams are actually very fast diggers," said Anette "Peko" Hosoi of the Massachusetts Institute of Technology. "One of my students, Amos Winter, actually calls the razor clams we looked at something like the Ferrari of the clam world."
The RoboClam, as the device is called, digs itself into the ground in two ways, similar to how a razor clam digs.
First, the RoboClam vibrates, changing the relatively solid seabed into a quicksand-like fluid that is easier to dig through. Then the two "shells" of the machine expand, locking the anchor in place, while a worm-like foot pushes down. Once the foot is embedded, the shells contract and the foot pulls the rest of the machine down.
The team is still testing and refining the machine. For now, the RoboClam can push down with about 80 pounds of force, 36 times greater than a razor clam, and dig up to 15 inches deep. The researchers hope the RoboClam will eventually dig twice as far as a razor clam, which can reach depths of more than 28 inches at a rate of about 0.4 inches per second.
Once deep enough, the RoboClam is more than 10 times stronger and an order of magnitude more energetically efficient at burrowing than other vibration-based anchors. It is several orders of magnitude more efficient than traditional anchors, and, if necessary, can even dig itself out.
"I was amazed when I saw those numbers," said Hosoi. "I thought we were onto something great then."
The MIT scientists originally developed the RoboClam to anchor a small submersible known as the Bluefin, which was designed to gather information from the seabed. A traditional anchor would weigh too much, and other vibration-based anchors take too much energy to use; both factors have limited the Bluefin's usefulness. The RoboClam should solve these problems.
That's the theory, at least, said Wolfgang Lohsert, an expert in granular media at the University of Maryland who is testing the device to more fully explain the burrowing abilities of both the RoboClam and razor clams.
"If you can dig more directly into sandy soil and also control the direction of the digging, there are a number of applications, including exploration of natural resources," said Lohsert. Oil giant Chevron is considering using the RoboClam as a new way to anchor its huge off-shore oil platforms. It might even be possible to use the drill on dry land.

New Model of Jupiter's Core Ignites Planet Birth Debate


Underneath its swirling cloud layers, Jupiter may harbor a solid core roughly equal in mass to 16 Earths—more than twice as large as previously believed.

That's the conclusion of a controversial new computer simulation that represents the first radical rethinking of the planet's core in nearly two decades.
The work has reignited debate among planetary scientists over how gas giants such as Jupiter first formed.

"The biggest surprise was the large core," said study leader Burkhard Militzer of the University of California, Berkeley.

"We concluded that the planet formed by core accretion," when colliding grains of dust, ice, and small planetary bodies meld to create planetary embryos and eventually fully formed planets.

Core of the Issue

Many scientists think core accretion is a good model for the birth of rocky terrestrial planets.

But it has been hard to apply to gas giants: such planets are so gassy and massive that simulations suggest there wouldn't have been enough time for them to grow to their current size after their cores formed.

One leading alternative theory, championed by Alan Boss of the Carnegie Institution in Washington, D.C., is disk instability.

In this scenario, clumps of gas inside the disk of planet-forming material around a young star cool and collapse to form gas giants.

But if Jupiter really has a much larger rock-ice core than thought, Militzer said, the accretion model becomes a better fit.

The study comes just about two years ahead of a recently approved NASA mission called Juno that may finally put an end to the decades-long dispute.


"The real surprise [from Juno] would be if Jupiter has no core at all," Boss said. "Both models for making Jupiter would be in trouble."

For their study, Militzer and colleagues used advanced computer simulations to model changes in temperature, density, and pressure all the way to Jupiter's deep interior.
The simulation also used data about Jupiter's size and gravity field obtained by previous spacecraft, noted co-author William Hubbard of the University of Arizona.

Details of the simulation were recently published in the Astrophysical Journal Letters.

Militzer thinks that after initial accretion, Jupiter's newly formed core swept through the early solar system and gathered most of the outlying gas left over from the formation of the sun about 4.6 billion years ago.

Based on this notion, it took a mere ten million years for Jupiter to achieve its current heft of 318 Earth masses, most of which is hydrogen and helium gases.

But co-author Hubbard is not ready to endorse any given formation theory.

"There's not agreement even among the model makers about what's going on with Jupiter," he said.

"I'm not wedded to the notion that Jupiter formed by core accretion. That's just a reasonable interpretation of our result.

"With [the Juno mission], I would like to get a diagnostic that shows the signature of a core, perhaps in the circulation of the planet, perhaps in the magnetic field," Hubbard added.

Saturday, December 6, 2008

Reviewing the Proximate Future: 2010 Electric Car Round-up


Two recent articles paint pictures of the future of cars powered partly or fully by electricity. As reported by the DOE's Energy Efficiency & Renewable Energy (EERE) division, the LA Auto Show made some of the trends visible. The most prominent, to my mind, was that US companies showed only two vehicles, both new hybrids from Ford. Much busier were European and Japanese companies, including BMW with its all-electric MINI E (pictured here).

The trend towards foreign leadership in the electric car space is buttressed by this piece: "Foreign companies advance in race for electric car". The story here is positive for US interests in terms of creating a fertile infrastructure environment for electrics, as California's leaders are teaming with Shai Agassi & Project Better Place with the expressed goal of making San Francisco the “Electric Vehicle Capital of the U.S.” Notably absent in this work, however, are the Big 3 US car makers. Right now, they face short term structural and existential challenges and may not have the bandwidth to invest much in the future. Here's hoping they survive long enough to reinvent themselves ... and their cars.

Tom Leppert Has a Little Conversation with T. Boone Pickens


It was "only five, maybe ten minutes," Dallas Mayor Tom Leppert said of his conversation with T. Boone on using natural gas instead of diesel (the City is on the verge committing its bus fleet to diesel). It's interesting how competing objectives and moving target costs can obscure clear choice. Diesel saves $54 million over natural gas, proponents said at first...$200 million they said later. How are they forecasting for diesel and natural gas, given the volatile price histories of each? Is there a way for them to make account for the external costs of foreign diesel dependency? Are the fuels environmentally equivalent (diesel was dirty but is getting cleaner, but is it cleaner than natural gas today; will it get cleaner still going forward--who can weigh in authoritatively)?

In the absence of an over-arching national strategy (sustainably eliminate foreign oil dependence, for instance), it's tougher to evaluate options for local decisions with so many moving price pieces and in the swirl of so many opinions. Here's hoping that people like incoming National Security Advisor Jim Jones, a former Marine with a keen sense of how energy and national security issues intersect, can articulate a national strategy that creates a meaty backdrop for state and local decision making (like the for-now-stalled Dallas fleet fueling decision).

National strategy doesn't have to dole out incentives to influence local decision making. An example: building mechanical engineers often over-designed air conditioning and air distribution systems because "you don't get in trouble with building owners if occupants don't complain," and most owners weren't good at connecting sloppy, inflated designs with inflated construction costs and utility bills. Green building strategy helped us rethink the habit of over-designing air conditioning and brought a renewed focus on efficient design. It didn't hand out rebates to go green; it just helped people think about things differently.

Likewise, a clear national strategy to eliminate our dependence on foreign oil could help shape that swirl of opinion that decides whether city buses are diesel or natural gas.

Antarctic Cruise Ship Runs Aground; Oil Leak Spreading?


A cruise ship ran aground on Antarctica's western peninsula on Thursday and may be leaking unknown amounts of oil into the fragile surrounding waters, one expert said.

All 122 passengers and crew were rescued from the leaking ship, Ushuaia, on Friday by the Chilean Navy. The ship did not appear to be in danger of sinking.
The Chilean vessel Aquiles transported 89 passengers and 33 crew members to the Presidente Frei Naval Base in Antarctica.

Jon Bowermaster, a National Geographic Expeditions Council grantee and writer, was on the National Geographic Explorer about 30 miles (48.2 kilometers) from the cruise ship when it ran aground after hitting a rock.
"We were in the same area on Wednesday, when hurricane force winds blew for much of the day, gusting over 100 miles [161 kilometers] per hour," Bowermaster told National Geographic News in an email from the Explorer.

"The Ushuaia reported having been in heavy weather; whether or not this contributed to its [grounding] is speculation, but would make sense."

Bowermaster witnessed the sinking of another Antarctic tourist vessel in November 2007. All 154 passengers of the Canadian M.S. Explorer escaped safely.

Alarm Call

The Panamanian-flagged Ushuaia sent out alarms midday Thursday after it started leaking fuel and taking on water.

A rock damaged the hull as the vessel passed through the Gerlache Strait, Chilean Captain Pedro Ojeda told Argentina's Telam news agency. The crash left the boat adrift in Guillermina Bay.

The Chilean Navy said the cruise ship was carrying 14 Danish passengers, 12 Americans, 11 Australians, 9 Germans, 7 Argentines, 7 British, 6 Chinese, 6 Spaniards, 5 Swiss, 3 Italians, 2 French, 2 Canadians, 2 Irish, a Belgian and a New Zealander. All were in good condition.

The cruise ship, built in 1970, operates from the Port of Ushuaia in southern Argentina, transporting passengers to Antarctica and islands in the icy waters of the South Atlantic.
The navy positioned the ship Lautaro near the abandoned Ushuaia in an attempt to prevent any environmental damage from leaking fuel.
But Bowermaster said it's still unknown how much fuel oil has spilled from the ship.

"A Chilean plane reports seeing no major leak, but it [has] also reported that a fuel leak has spread for half a mile around the ship," he wrote.

"Though containment efforts are being made, it is windy in the area again and the leak is spreading."

Ushuaia may not be able to free itself from the rocks, and has at least one hole, Bowermaster added.

"A sinking ship in this pristine, narrow channel would have long-lasting impact on both the local environment and the future of tourism along the [Antarctic] Peninsula."

"Accident Waiting to Happen"

In addition to the 2007 sinking of the M.S. Explorer, another ship—the Norwegian M.S. Fram—lost engine power during an electrical outage in December 2007 and struck a glacier, smashing a lifeboat but causing no injuries among its 300 passengers.
A boom in Antarctic tourism may be an "accident waiting to happen," observers warn.
More than 30,000 tourists were estimated to have made the trek to Antarctica on some 50 different ships during the November 2007 to February 2008 cruise season, according to the International Association of Antarctic Tour Operators, a trade group.

"A big question for those who oversee and monitor tourism in Antarctica is [whether] there be limits on who can visit Antarctica, and on what kind of ship?" Bowermaster added.

Public Ambivalent About Human Enhancing Nanotech


A team of researchers from North Carolina State University and Arizona State University recently released their "Public Awareness of Nanotechnology Study," the first national survey to examine public opinion on the use of nanotechnology for human enhancement. Enhancement here means, among other things, artificial eyesight, human biomarkers that detect diseases early, implants to improve the performance of soldiers on the battlefield, and brain implants that permit basic computer-to-brain functions.

The researchers say, "Overall, we find that attitudes are largely ambivalent and dependent on the information provided in the question wording, but also that interest declines as [respondents] learn more and that equity is [a] fairly important concern regarding the long-term distribution of potential benefits."

Some interesting findings:

Most people say they have not heard anything about nano for human enhancements (61%), while just 38% say they have heard nothing about nanotechnology in general.
Of those who have heard something about nanotechnology, most people associate it with “machines and computers” (84%) rather than “consumer products” (47%), even though nano-based applications are mostly enhancements to consumer products.
Interestingly, far fewer people believed that human enhancements were important at the end of the survey after they had been asked more questions about it (55%) than at the beginning before they heard much about it (81%).
I think the last point is the most interesting and harkens back to a post I did in October about how people filter scientific information. An increase in knowledge about the benefits of a particular scientific approach isn't necessarily going to garner more support.

Walruses Threatened by Shrinking Ice


A conservation group is going to court to force the federal government to consider adding the Pacific walrus to the list of threatened species.
The Center for Biological Diversity sued the U.S. Fish and Wildlife Service and Interior Secretary Dirk Kempthorne on Wednesday for failing to act on a petition seeking protection for walruses under the Endangered Species Act.
Walruses are threatened by global warming that melts Arctic sea ice, according to the group, one of the parties that successfully petitioned to list polar bears as threatened. The group also has filed petitions to protect Arctic seals.
The walrus petition was filed in February. The Fish and Wildlife Service was required by law to decide by May 8 whether the petition had merit, which would trigger a more thorough review and a preliminary decision after 12 months. The agency missed the deadline.

Rebecca Noblin, an attorney for the Center for Biological Diversity, said the delay would harm walruses.
"Every day that goes by without protecting the walrus, we're emitting more greenhouse gases, accelerating the ice melt," Noblin said.
"In addition to the climate change, the other main threat is oil and gas development that continues to go forward without any consultation regarding walrus," she said.
Fish and Wildlife spokesman Bruce Woods said Wednesday the agency anticipates making a decision on the petition soon but has limited resources. Decisions on endangered species listings are driven by litigation, he said, forcing the agency to rank actions by court order rather than species need.
Global warming is blamed for Arctic sea ice shrinking to record low levels.
The National Snow and Ice Data Center said summer sea ice in 2008 reached the second lowest level, 1.74 million square miles, since satellite monitoring began in 1979. The loss was exceeded only by the 1.65 million square miles in 2007.
Like polar bears, listed as a threatened species in May, walruses depend on sea ice to breed and forage.
Walruses dive from ice over the shallow outer continental shelf in search of clams and other benthic creatures. Females and their young traditionally use ice as a moving diving platform, riding it north as it recedes in spring and summer, first in the northern Bering Sea, then into the Chukchi Sea off Alaska's northwest coast.
Sea ice in the Chukchi Sea, shared with the Russian Far East, for the last two years receded well beyond the outer continental shelf over water too deep for walruses to dive to reach clams. In the fall of 2007, herds congregated on Alaska and Siberia shores until ice re-formed.
According to the Center for Biological Diversity, warming sea temperatures and sea ice loss may also be reducing walrus prey at the bottom of the ocean.
The group hopes a listing could slow plans for offshore petroleum development. Oil companies in February bid on 2.7 million acres in the Chukchi Sea. Other lease sales are planned.
The Fish and Wildlife Service, along with its Russian counterparts, has nearly completed a comprehensive population count of walruses. The numbers are anticipated in the coming weeks, possibly by the end of the year, Woods said.

For Carbon Storage, Burn the Bogs?


Burning peat bogs in a controlled way may be a good way to sequester carbon from the atmosphere, according to new research.
Peatlands are dense carbon storage units -- as the woody plants growing on top of the peat die, they fall into oxygen-poor, water-logged soil that keeps their carbon-rich remains preserved indefinitely. Around the world, peat contains 30 percent of all carbon buried in soils, equivalent to slightly less than all of the carbon in the atmosphere today.
"Peatlands suck up between 1 and 2 percent of all anthropogenic carbon emissions in the U.K. annually," Fred Worrall of Durham University in the United Kingdom said.
In the United Kingdom, private land managers burn peatlands regularly to clear space for grouse habitat and sheep grazing. This is no tree-hugging exercise -- the grouse are cultivated for recreational hunting -- but Worrall and Gareth Clay, also of Durham University, think the burning may have the beneficial side effect of enhancing carbon sequestration in the bogs.

When the vegetation growing on top of peat bogs burns, some of it turns into black carbon charcoal. The charcoal can sink into the murky depths, where it is preserved. In a computer simulation, the researchers found that if controlled burns were applied to optimize this process, the bogs could absorb 20 to 30 percent more carbon than when they were left to grow naturally.
"The key is that only the top heather vegetation can be burned -- what we call a 'cool burn.' Once you start burning down into the litter and soil, all bets are off. It's definitely a carbon source," Worrall said.
There's also a risk that a burn could get out of control and turn into a wildfire, devastating the peat.

"This is kind of an up-and-coming idea," said Andrew Zimmermann of the University of Florida. "Making what is called 'biochar' to enhance carbon sequestration has potential to be used all over the world."
Zimmermann pointed to forests as having even more potential to store carbon by making charcoal, because as trees die, their carbon-rich wood and leaf litter are broken down by microorganisms and released back into the atmosphere.
"Peat is already preserving plenty of carbon. What we need is to preserve what is not already being preserved," he said.
But poor land management has severely damaged peat bogs, Worrall said, and erosion is already releasing up to 400 tons of carbon per square kilometer of peat each year from its sodden layers. To reverse that trend, close attention to the bogs is needed -- and perhaps even a fire every now and then.
"If you do nothing, peatlands are sure to become part of our greenhouse gas emissions," Worrall said. "But if we do something and it's the right thing, we can turn this system around and make it part of the solution."

Alien-like Squid With "Elbows" Filmed at Drilling Site

A mile and a half (two and a half kilometers) underwater, a remote-control submersible's camera has captured an eerie surprise: an alien-like, long-armed, and—strangest of all—"elbowed" Magnapinna squid.

In a brief video from the dive recently obtained by National Geographic News, one of the rarely seen squid loiters above the seafloor in the Gulf of Mexico on November 11, 2007.

The clip—from a Shell oil company ROV (remotely operated vehicle)—arrived after a long, circuitous trip through oil-industry in-boxes and other email accounts.

"Perdido ROV Visitor, What Is It?" the email's subject line read—Perdido being the name of a Shell-owned drilling site. Located about 200 miles (320 kilometers) off Houston, Texas (Gulf of Mexico map), Perdido is one of the world's deepest oil and gas developments.

The video clip shows the screen of the ROV's guidance monitor framed with pulsing inputs of time and positioning data.

In a few seconds of jerky camerawork, the squid appears with its huge fins waving like elephant ears and its remarkable arms and tentacles trailing from elbow-like appendages.

Despite the squid's apparent unflappability on camera, Magnapinna, or "big fin," squid remain largely a mystery to science.

ROVs have filmed Magnapinna squid a dozen or so times in the Gulf and the Pacific, Atlantic, and Indian Oceans.

The recent video marks the first sighting of a Magnapinna at an oil development, though experts don't think the squid's presence there has any special scientific significance.

But the video is evidence of how, as oil- and gas-industry ROVs dive deeper and stay down longer, they are yielding valuable footage of deep-sea animals.

Some marine biologists have even formed formal partnerships with oil companies, allowing scientists to share camera time on the corporate ROVs—though critics worry about possible conflicts of interest.

Real Deal

The Perdido squid may look like a science fiction movie monster, but it's no special effect, according to squid biologist Michael Vecchione of the U.S. National Oceanic and Atmospheric Administration (NOAA), who is based at the National Museum of Natural History in Washington, D.C.
In 1998 Vecchione and University of Hawaii biologist Richard Young became the first to document a Magnapinna, based on juveniles of the Magnapinna pacifica species. M. pacifica was so unusual that the scientists had to create a new classification category to accommodate it: the family Magnapinnidae, which currently boasts four species.

In 2001 the pair released the first scientific report based on adult Magnapinna specimens, as seen via video. The study demonstrated that Magnapinna are common worldwide in the permanently dark zone of the ocean below about 4,000 feet (1,219 meters).
In 2006 a single damaged specimen from the North Atlantic led to the naming of a second Magnapinna species, M. talismani. And in 2007 the scientists documented two more: M. atlantica and a species based on a specimen from the mid-Atlantic.

That fourth Magnapinna species remains nameless, because its arms were too badly damaged for a full study. "However, it was clearly different from the three known species," Vecchione said.

The Magnapinna species apparently have only slight physical differences, mainly related to tentacle and arm structure in juveniles.

The subtlety of those variations makes it impossible to identify which species is in the oil-rig video, given that at least two Magnapinna species—M. atlantica and M. pacifica—are known to inhabit the Gulf of Mexico, Vecchione said.

Enduring Mystery

Based on analysis of videos not unlike the one captured at the Perdido site, scientists know that the adult Magnapinna observed to date range from 5 to 23 feet (1.5 to 7 meters) long, Vecchione said. By contrast, the largest known giant squid measured about 52 feet (16 meters) long.

And whereas giant squid and other cephalopods have eight short arms and two long tentacles, Magnapinna has ten indistinguishable appendages that all appear to be the same length.

"The most peculiar structure is that of the arms," said deep-sea biologist Bruce Robison of the Monterey Bay Aquarium Research Institute in California.

Referring to the way the tentacles hang down from elbow-like kinks, Robison said: "Judging from that structure, we think the animal feeds by dragging its arms and the ends of its tentacles along the seafloor as it drifts slowly above it."

The elbow-like angles allow the tentacles to spread out, perhaps preventing them from getting tangled.

"Imagine spreading the fingers of a hand and dragging the fingertips along the top of a table to grab bits of food," he added.

But NOAA's Vecchione suggests a feeding behavior that is more like trapping than hunting. He speculates that Magnapinna passively waits for prey to bump into the sticky appendages.

Strange Bedfellows?

As oil companies and their ROVs spend more time in the bathypelagic zone, more discoveries are sure to follow, experts say.

Eager for hard-to-come-by deep-sea video and data, some biologists are formally aligning themselves with the companies.

The U.K.-based SERPENT (Scientific and Environmental ROV Partnership using Existing iNdustrial Technology) project, for example, matches oil companies with researchers "to make cutting-edge ROV technology and data more accessible to the world's science community," according to the project's Web site.

Despite such partnerships, Monterey Bay's Robison said, most sightings of the Magnapinna squid have come from research vessels, not oil companies. The November 2007 video, for the record, was captured without scientific involvement.

Some scientists, including Robison, are not entirely comfortable relying on corporations for new data.

Andrew Shepard, director of NOAA's Undersea Research Center, is excited about the potential for new ocean resources, but he does have concerns.

"Oil companies are there to develop hydrocarbons, not find new species," Shepard said.

"These discoveries may, in fact, have a negative impact on very expensive and valuable lease tracts if someone decides a rare species needs to be protected."

But given how expensive and time consuming ROV-based deep-sea research is, scientific cooperation with industry is crucial, SERPENT project oceanographer Mark Benfield said.

"There are relatively few research vessels and far fewer ROVs and manned submersibles capable of working down through [extremely deep regions of the ocean]," said Benfield, who teaches at Louisiana State University.

Research funds are getting scarcer, he added, and "with SERPENT we gain access to sophisticated ROVs for free.

"These systems are based on vessels or rigs that spend months to years at a single location. This allows us to build up a much more complete picture of life in the deep-sea than would be possible with [only] academic ships and deep-submergence vehicles."

NOAA's Vecchione said he has "gotten a lot of interesting observations from the SERPENT project and other petro sources."

But the oil-industry collaborations "should not get in the way of purely scientific exploration," Vecchione said. "We need to be careful about deep-sea conservation."

Mars Phoenix's Twitter Proves a Huge Success


If the Phoenix Lander comes back to life on Mars, Twitter users could be among the first to know.
NASA gave the historic Space Age mission an Internet Age spin by adding a Twitter page, enabling the robotic interplanetary explorer to answer the hot micro-blogging Web site's trademark query: "What are you doing?"
Twitter rocketed to popularity with technology that lets people use mobile telephones or personal computers to continually keep friends updated on their activities with "tweets," text messages of no more than 140 characters.
When NASA Jet Propulsion Laboratory News Services manager Veronica McGregor was tasked with delivering word of the agency's first-ever robotic landing on Mars during a holiday weekend, she turned to the social-networking Web site.
"Readership and viewership in traditional news media usually goes down over a three-day weekend," said McGregor, a former CNN correspondent.
"The fact that Twitter could send messages right to people's cell phones -- it seemed like a good idea to let people know about the landing."
So McGregor created a plucky persona for the 420-million-dollar robot and planted a flag on a new NASA frontier: Twitter-verse.
"I dig Mars!" was among Lander Tweets. Blog posts after its unprecedented May touch-down included an ice-discovery message ending with "w00t!!! Best day ever!!"
Tweets at twitter.com/MarsPhoenix won numerous Internet awards and garnered nearly 40,000 dedicated followers -- 2,000 of whom joined after NASA lost contact with the Lander in November.
"There was a certain joy and exuberance that came with every day, and every sight it was seeing," McGregor said. "I think people really related to that."
The Lander's writing style helped the blog stand out, according to Twitter co-founder Biz Stone.
"It was the way she chose to send out the updates -- in the first person and anthropomorphizing the Lander -- that really made all of the difference," Stone said. "As a result, NASA gets a level of engagement with citizens they didn't have before."
NASA is not the first major organization to get its message out on Twitter. Computer maker Dell began using Twitter in January to publish Internet-only sales bargains. US cable television giant Comcast sent service-trouble missives over the service.
Some media organizations have also been using Twitter to gather timely, first-hand accounts from witnesses or to provide news updates to subscribers. CNN regularly presents live responses from Twitter users on the cable network, with footage of the service's feed on a computer screen.
Many businesses have turned to social-networking services MySpace and Facebook to spread messages.
McGregor prefers Twitter for its speed. "A lot of people said that the short posts were exactly the right amount of information they wanted to know," she said.
That pithiness, mixed with a little attitude, impressed Wired.com writer Alexis Madrigal.
"In the near-term, NASA has found an absolutely outstanding way to reach large numbers of influential people at a fairly low investment," Madrigal said.
NASA has begun setting up Twitter accounts for other missions in hopes of repeating the Phoenix account's success.
"I'm hoping the lesson they'll take from this is that you need freedom to craft a character and a feed that will be appealing to people," Madrigal said.
McGregor said she is giving presentations throughout NASA on her successful experiment while pressing the agency to explore new communications strategies.
As for the Lander, it sent this farewell tweet: "I should stay well-preserved in this cold. I'll be humankind's monument here for centuries, eons, until future explorers come for me."

NASA Space Probe to Track CO2 on Earth


The occasionally acrimonious debate about the planet's climate has been missing a key component: accurate measurements of how much carbon dioxide is in the air and how it is being recycled by Earth.
That is the heart of a new NASA mission called the Orbiting Carbon Observatory, which is set to launch early next year.
"We will uncover all kinds of patterns and cycles in carbon dioxide that people never thought existed. It'll be just like when the first ozone measurements were made," said project scientist Chip Miller, with the Jet Propulsion Laboratory in Pasadena, Calif.
"We get at the question of the sources of carbon dioxide and see how much is pulled out (of the atmosphere) by land and how much by seas," he said.
Many scientists consider carbon dioxide to be the telltale gas of global warming. Once it is released into the air, there is little chemistry to remove it. Its presence traps heat radiating from Earth's surface. Plants, soils, and the oceans reabsorb the gas, but only slowly. Miller says that the average lifetime of atmospheric carbon dioxide is about 300 years. About 20 percent of it, however, lasts for 10,000 years or longer.
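The two timescales Miller mentions can be pictured with a toy two-pool decay model. The pool split and e-folding times below are illustrative assumptions loosely based on the figures in the article, not NASA's actual carbon-cycle model:

```python
import math

def co2_remaining(years, fast_frac=0.8, fast_tau=300.0, slow_tau=10000.0):
    """Toy two-pool model: a fast pool decays on a ~300-year timescale,
    while a slow pool (about 20 percent of the pulse) persists on a
    ~10,000-year timescale. Returns the fraction still airborne."""
    slow_frac = 1.0 - fast_frac
    return (fast_frac * math.exp(-years / fast_tau)
            + slow_frac * math.exp(-years / slow_tau))

# After three centuries most of the fast pool is gone, but the slow
# pool still holds nearly a fifth of the original pulse in the air.
print(round(co2_remaining(300), 3))  # → 0.488
```

Even in this crude sketch, roughly half the pulse remains after 300 years, which is why the long tail matters for climate projections.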

New Field Could Explain How Salmon, Turtles Find Home


Sea turtles and salmon may use their sensitivity to Earth's magnetic field to guide them home at the end of their epic coming-of-age journeys, suggest scientists aiming to solve one of nature's enduring mysteries.

The newly proposed theory is one of several ideas being explored under the banner of an emerging scientific field dubbed movement ecology.

According to the field's proponents, the study of movement is central to understanding where animals and plants live, how they evolve and diverge, and why they become extinct.

By making movement central to ecological studies, scientists hope general theories about movement will emerge.

Such theories could, for example, help scientists predict how organisms will respond to global climate change and prevent the spread of pests and diseases.

Kenneth Lohmann, a marine scientist at the University of North Carolina, Chapel Hill, applied the concept of movement ecology to sea turtles and salmon.

His aim was to develop a hypothesis for how such animals navigate to their natal areas from distant oceanic locations.

Juvenile sea turtles and salmon leave their birthplaces with an inescapable wanderlust, swimming hundreds or even thousands of miles away.

But after years on the high seas, the biological urge to reproduce calls the creatures home, and they return to the very spots in which they were born.

How they do this has eluded scientists for decades. Lohmann says the secret to the marine animals' navigational success may lie in the variability of Earth's magnetic field.

Each coastal area has a unique magnetic signature, he said.

Previous studies, including work in Lohmann's lab on sea turtles, indicated both the turtles and salmon are sensitive to the magnetic field.

"What we're proposing is the sea turtles and salmon, when they begin life, basically learn, or imprint, on the magnetic field that marks their home area," Lohmann said.

"They retain this information. And years later, when it is time for them to return, they are able to exploit this information in navigating back to their home area."
Once the animals reach their native coastal areas, other senses, such as vision or smell, may guide them the rest of the way. Salmon, for example, are known to use smell to locate spawning grounds once the fish are nearby.
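Lohmann's imprinting hypothesis can be sketched as a simple signature-matching procedure. The field values (total intensity in microtesla, inclination in degrees) and the coastline below are invented purely for illustration; real geomagnetic navigation would be far noisier:

```python
# Toy sketch of magnetic imprinting: a hatchling records the magnetic
# signature of its natal beach, and the returning adult picks the
# stretch of coast whose present-day field best matches that memory.

def imprint(field):
    """Store the natal magnetic signature (intensity, inclination)."""
    return field

def best_match(imprinted, coastline):
    """Return the coastal site whose field is closest to the imprint,
    scored by squared difference in intensity and inclination."""
    def score(site):
        intensity, inclination = site["field"]
        di = intensity - imprinted[0]
        dc = inclination - imprinted[1]
        return di * di + dc * dc
    return min(coastline, key=score)

coast = [
    {"name": "north beach", "field": (48.1, 62.0)},
    {"name": "natal beach", "field": (45.3, 57.5)},
    {"name": "south beach", "field": (42.6, 53.0)},
]

home = imprint((45.3, 57.5))
print(best_match(home, coast)["name"])  # → natal beach
```

In this cartoon, the magnetic field only gets the animal to the right stretch of coast; as the article notes, senses such as smell would then take over for the final approach.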

Lohmann and colleagues propose the theory in a paper published this week in a special package about movement ecology in the journal Proceedings of the National Academy of Sciences.

"We are excited about [the theory], because it really is the first plausible explanation for how sea turtles and salmon might be able to return," he said.

An Ancient Idea

Some 2,300 years ago, the Greek philosopher Aristotle searched for common features that unified animal movements of all types, noted Ran Nathan, an ecologist at the Hebrew University, Jerusalem.

This kicked off a long tradition of movement-ecology research.

But over the years, Nathan said, researchers have focused on different types of movement in specific species or landscapes, without looking at how different patterns impacted each other.

These scientists "never meet each other, they never talk to each other, they never go to the same conference, they publish in different journals," Nathan said.

In an effort to bring the scientific community together, Nathan led a yearlong project to establish a unifying framework for studying movement ecology.

Twelve teams of scientists were asked to address four basic questions: Why, how, where, and when do organisms move?

The methodology, Nathan said, applies to all types of organisms, from animals such as salmon, sea turtles, and elephants to bacteria and plants.

"If you give a legitimate field for the study of movement itself ... then people will study movement-related questions more thoroughly," Nathan said.

Martin Wikelski is a zoologist at the Max Planck Institute of Ornithology in Seewiesen, Germany, who specializes in animal movement.

The initiative to raise the prominence of movement ecology is "absolutely essential" to the understanding of wild animals, especially in an era complicated by a changing climate, Wikelski said.

"Every animal moves around and if we don't know the fate of these animals during movement, and how movement contributes to selection, then I think we are pretty much lost," he said.

For example, by understanding what animals encounter as they move about their environment, scientists may be able to determine the factors that cause some to go extinct.

Birds and Bees

James Mandel, an ecologist at Cornell University in Ithaca, New York, said the new paradigm is ideal for his research, which seeks connections between weather patterns and animal movement.

His team outfitted turkey vultures with GPS tags and two-way radio transmitters to collect data on the birds' hourly and daily movements.

One turkey vulture even carried a heart rate monitor to measure how much energy the bird expended during flight.

The researchers combined this data with information on the wind speed, atmospheric turbulence, and cloud height wherever the birds were.

The team found that turkey vultures soar from one billowing updraft of warm air to the next as they migrate thousands of miles between their summer and winter homes.

While many questions remain, Mandel said the data indicate the birds "are highly dependent on favorable weather conditions from energy source to energy source as they go."

Other teams applied the movement ecology framework to the study of elephants in Africa, elk in Canada, lynx in Spain, and butterflies from Estonia, Finland, and China.

Still other groups tested the methodology on seeds in Panama and various plants in the eastern U.S.

The Max Planck Institute's Wikelski, who is also a 2008 National Geographic emerging explorer (the National Geographic Society owns National Geographic News), is pioneering new tracking technology that allows scientists to study the movement of even the smallest creatures, such as bees.

The combination of Nathan's movement-ecology initiative and new technologies, he said, will open "a new era in biology."

Saturday, December 8, 2007

Dark Matter May Have Powered Universe's First Stars


Dark matter may have fueled the formation of the universe's first stars—vast, invisible giants totally unlike the blazing suns of today—scientists say.

According to a new theory, disintegrating fragments of the mysterious substance could have created "dark stars" hundreds of thousands of times wider than the sun around 13 billion years ago, just after the big bang.

Because these stars weren't fueled by fusing hydrogen and helium like known present-day stars, they would have been completely invisible—but scorching hot.

The findings "drastically alter the current theoretical framework for the formation of the first stars," said study co-author Paolo Gondolo, an astrophysicist at the University of Utah.

Scientists still don't know what dark matter is exactly, so the research could shed light on it and other astronomical mysteries, he added.

"We are always searching for ways to determine the nature of dark matter," Gondolo said.

The paper will appear in next month's issue of the journal Physical Review Letters.

Annihilation Engine

According to some theories of the universe, dark matter likely consists of hypothetical particles called neutralinos.

The new paper suggests that neutralinos annihilated each other in the early universe, producing subatomic particles called quarks and their antimatter counterparts, antiquarks.

The heat from this process was enough to prevent embryonic clouds of hydrogen and helium from cooling and shrinking. Such contraction ignites the self-sustaining fusion process that powers conventional stars.

"The heating can counteract the cooling, and so the star stops contracting for a while, forming a dark star" some 80 million to 100 million years after the big bang, Gondolo said.

Dark stars, made up mostly of hydrogen and helium, would be vastly larger than the sun and other stars—up to 15,000 times the size of our solar system. And instead of shining, they would glow in the infrared.

"With your bare eyes, you can't see a dark star," Gondolo said. "But the radiation would fry you."

Wide-Ranging Implications

The theory may have wide-ranging implications about the importance of dark matter in the earliest stages of the universe.

For example, dark matter is widely believed to have helped with early galaxy formation, said Rennan Barkana, an astrophysicist at Tel Aviv University in Israel who was not involved with the new paper.

But until now it was thought that "the dark matter does not play any significant role in the formation of the star itself," he said.

That's important, because the substance is believed to make up most of the universe's matter, partly because galaxies rotate faster than can be explained by the visible matter within them.

In total, about 23 percent of the universe is thought to be dark matter, as opposed to 4 percent for the ordinary matter that makes up stars, planets, and moons.

The remaining 73 percent is thought to be dark energy, an even more mysterious force helping the universe to expand at increasing rates.

Search Is On

Emanuele Ripamonti, an astronomer at the Università dell'Insubria in Como, Italy, said that in order for the new research to be plausible, the formation of stars from dark matter must rely on a cascade of events that are not yet well studied.

"Every time they make a choice, the authors pick the 'most likely' option, but in some cases there is no real consensus" about what would happen, he said.

Barkana called the theory intriguing and novel but agreed that more research is necessary.

"It is unclear whether in the end an observational prediction will come out that will allow the dark star possibility to be clearly distinguished from other scenarios," he said.

If dark stars exist, however, they would likely give themselves away by spewing gamma rays, neutrinos, and antimatter, study author Gondolo said. The stars would also be associated with clouds of cold, molecular hydrogen gas that wouldn't normally harbor such energetic particles.

If found, dark stars wouldn't only provide insights into dark matter, he added. They could also help unravel phenomena like the formation of heavy elements—thought to come from exploding conventional stars—and the rapid formation of black holes, which defies theoretical predictions.

"Without detailed simulations, we cannot pinpoint the further evolution of dark stars," he said. "We have to search for them."