Tuesday, December 26, 2006

Study shows extreme contrast in ozone losses at North, South Poles

A new study shows just how dramatic the ozone loss in the Antarctic has been over the past 20 years compared to the same phenomenon in the Arctic.

The study found "massive" and "widespread" localized ozone depletion in the heart of Antarctica's ozone hole region, beginning in the late 1970s but becoming more pronounced in the 1980s and 1990s.

The US government scientists who conducted the study said there was an almost complete absence of ozone in certain air samples taken after 1980, compared with earlier decades. In contrast, ozone losses in the Arctic were sporadic, and even the greatest losses did not begin to approach the regular losses in the Antarctic, the researchers said.

"Typically the Arctic loss is dramatically less than the Antarctic loss," said Robert Portmann, an atmospheric scientist with the National Oceanic and Atmospheric Administration in Boulder, Colorado.

Scientists have been tracking the expanding ozone hole over Antarctica for some 20 years now.

In October, NASA scientists reported that this year's hole is the biggest ever, stretching over nearly 11 million square miles.

In Antarctica, local ozone depletion at some altitudes frequently exceeded 90 percent, and often reached 99 percent, during the Antarctic winter in the period after 1980 compared with earlier decades, the researchers said.

In the Arctic, losses occasionally peaked at 70 percent, and losses of 50 percent were seen in the mid-1990s, when temperatures were particularly low, but the scale and scope of the problem were much less than in the Antarctic.

Recent studies have also pointed to large ozone losses in the Arctic, but the NOAA researchers said their study showed that these events were rare and did not appear to signal a trend.

"We saw small to moderate ozone losses in the very coldest winters, when the stratospheric conditions are ripe for ozone loss, but they were rarer than we expected," said Portmann.

The study, published in the Proceedings of the National Academy of Sciences, was based on more than 40 years of ozone readings from polar observation stations and balloon-borne instruments.

(c) www.physorg.com

Saturday, December 23, 2006

Space shuttle Discovery completes successful ISS mission

Space shuttle Discovery and its seven-member crew have landed in Florida after a 13-day mission that advanced construction of the International Space Station.

Discovery touched down at 2232 GMT on the landing strip at the Kennedy Space Center, near Cape Canaveral, where it had lifted off in a nighttime launch on December 9.

"You have seven thrilled people here," shuttle commander Mark Polansky said just after landing.

A controller at the Mission Control Center in Houston, Texas, replied: "Congratulations on what was probably the most complex assembly mission of the Station to date."

The Discovery astronauts spent eight days at the International Space Station (ISS), continuing the construction of the orbiting laboratory by attaching a two-tonne truss to its girder-like structure in the first of four space walks.

During the next two space walks, they rewired the station's power system, switching it from its temporary setup to a permanent configuration.

A fourth space walk was added Monday to shake loose a solar array panel that had gotten stuck as it was being folded.

"It was a wonderful end to a great mission," NASA Administrator Michael Griffin said at a news conference at Kennedy Space Center, where he welcomed the astronauts after they left the spacecraft.

"The crew on orbit and the crew on the ground could not have done better," he said.

The shuttle's return to Earth was initially scheduled for Thursday, but the mission was extended a day to allow for the unplanned space walk.

Poor weather conditions in Florida had scrubbed a landing attempt scheduled for Friday, and National Aeronautics and Space Administration officials had considered bringing the shuttle down at alternative sites.

A decision to land at Kennedy was made only at the last minute and "it turned out to be a great one," Griffin said.

NASA officials had hoped weather conditions would improve enough in Florida to avoid having to land the space shuttle at Edwards Air Force Base in California or White Sands Space Harbor in New Mexico.

A landing at either location in the western United States would have meant NASA would have had to fly the shuttle back to Florida, in the southeast, on the back of a modified Boeing 747, at a cost of some 1.7 million dollars.

The shuttle was traveling at more than 26,500 kilometers (16,500 miles) per hour as it descended through Earth's atmosphere, triggering a double sonic boom as it slowed over the runway on Florida's Atlantic coast.

On board were six US astronauts and one from the European Space Agency (ESA), Christer Fuglesang, Sweden's first astronaut.

After this mission, NASA plans at least 13 more shuttle flights -- including five in 2007 -- to complete construction of the International Space Station by 2010, when the three-spacecraft shuttle fleet is due to be retired.

ISS construction fell years behind schedule after the 2003 Columbia tragedy, in which the spacecraft disintegrated minutes before landing, killing all seven astronauts aboard.

NASA suspended the shuttle program to deal with safety problems. The space shuttle Atlantis mission in September marked the resumption of ISS construction.

Discovery carried astronaut Sunita Williams to the ISS, where she will remain for six months. Williams replaced Thomas Reiter, a German ESA astronaut who had been on the station since July and who returned on Discovery.

(c) www.physorg.com

NIST laser-based method cleans up grubby nanotubes

Before carbon nanotubes can fulfill their promise as ultrastrong fibers, electrical wires in molecular devices, or hydrogen storage components for fuel cells, better methods are needed for purifying raw nanotube materials. Researchers at the National Institute of Standards and Technology (NIST) and the National Renewable Energy Laboratory (NREL, Golden, Colo.) have taken a step toward this goal by demonstrating a simple method of cleaning nanotubes by zapping them with carefully calibrated laser pulses.

When carbon nanotubes--the cylindrical form of the fullerene family--are synthesized by any of several processes, a significant amount of contaminants such as soot and graphite is also formed. Purifying the product is an important issue for commercial application of nanotubes.

Image: Before-and-after electron microscope images of a pyroelectric detector coated with single-walled nanotubes (SWNTs) visually demonstrate the effect of the laser cleaning process. The SWNTs look visibly blacker after laser treatment, suggesting less graphitic material and increased porosity. Credit: NIST

In a forthcoming issue of Chemical Physics Letters, the NIST/NREL team describes how pulses from an excimer laser greatly reduce the amount of carbon impurities in a sample of bulk carbon single-walled nanotubes without destroying the tubes. Both visual examination and quantitative measurements of material structure and composition verify that the resulting sample is "cleaner." The exact cleaning process may need to be slightly modified depending on how the nanotubes are made, the authors note. But the general approach is simpler and less costly than conventional "wet chemistry" processes, which can damage the tubes and also require removal of solvents afterwards.

"Controlling and determining tube type is sort of the holy grail right now with carbon nanotubes. Purity is a key variable," says NIST physicist John Lehman, who leads the research. "Over the last 15 years there's been lots of promise, but when you buy some material you realize that a good percentage of it is not quite what you hoped. Anyone who thinks they're going into business with nanotubes will realize that purification is an important--and expensive--step. There is a lot of work to be done."

The new method is believed to work because, if properly tuned, the laser light transfers energy to the vibrations and rotations in carbon molecules in both the nanotubes and contaminants. The nanotubes, however, are more stable, so most of the energy is transferred to the impurities, which then react readily with oxygen or ozone in the surrounding air and are eliminated. Success was measured by examining the energy profiles of the light scattered by the bulk nanotube sample after exposure to different excimer laser conditions. Each form of carbon produces a different signature.

Changes in the light energy as the sample was exposed to higher laser power indicated a reduction in impurities. Before-and-after electron micrographs visually confirmed the initial presence of impurities (i.e., material that did not appear rope-like) as well as a darkening of the nanotubes post-treatment, suggesting less soot and increased porosity.

The researchers developed the new method while looking for quantitative methods for evaluating laser damage to nanotube coatings for next-generation NIST standards for optical power measurements (see http://www.physorg.com/news2821.html). The responsivity of a prototype NIST standard increased 5 percent after the nanotube coating was cleaned.

Citation: K.E. Hurst, A.C. Dillon, D.A. Keenan and J.H. Lehman. Cleaning of carbon nanotubes near the π-plasmon resonance. Chemical Physics Letters, in press, corrected proof. Available online 15 November 2006.

(c) www.physorg.com

Wednesday, December 20, 2006

Secret Worlds: The Universe Within

View the Milky Way at 10 million light years from the Earth. Then move through space towards the Earth in successive orders of magnitude until you reach a tall oak tree just outside the buildings of the National High Magnetic Field Laboratory in Tallahassee, Florida. After that, begin to move from the actual size of a leaf into a microscopic world that reveals leaf cell walls, the cell nucleus, chromatin, DNA and, finally, the subatomic universe of electrons and protons.

Notice how each picture is actually an image of something that is 10 times bigger or smaller than the one preceding or following it. The number that appears on the lower right just below each image is the size of the object in the picture. On the lower left is the same number written in powers of ten, or exponential notation. Exponential notation is a convenient way for scientists to write very large or very small numbers. For example, compare the size of the Earth to the size of a plant cell, which is a trillion times smaller:

Earth = 12.76 x 10^6 = 12,760,000 meters wide
(12.76 million meters)
Plant Cell = 12.76 x 10^-6 = 0.00001276 meters wide
(12.76 millionths of a meter)
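
To make the notation concrete, here is a small Python sketch (ours, not part of the original article) that reproduces the comparison above in exponential notation:

    # Compare the size of the Earth to the size of a plant cell using powers of ten.
    earth_m = 12.76e6        # 12.76 x 10^6 meters = 12,760,000 meters
    plant_cell_m = 12.76e-6  # 12.76 x 10^-6 meters = 0.00001276 meters

    print(f"Earth:      {earth_m:,.0f} m = {earth_m:.2e} m")
    print(f"Plant cell: {plant_cell_m:.8f} m = {plant_cell_m:.2e} m")

    # The ratio confirms the "trillion times smaller" claim: 10^6 / 10^-6 = 10^12.
    print(f"Ratio: {earth_m / plant_cell_m:.0e}")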

Scientists examine things in particular ways using a combination of very sophisticated equipment, everyday instruments, and many unlikely tools. Some phenomena that scientists want to observe are so tiny that they need a magnifying glass, or even a microscope. Other things are so far away that a powerful telescope must be used in order to see them. It is important to understand and be able to compare the size of things we are studying. To learn more about the relative sizes of things, visit our Perspectives: Powers of 10 activity site.

(C) www.micro.magnet.fsu.edu

Tuesday, December 19, 2006

Planets Were Formed From A Giant Mix, Suggests New Analysis

Our Solar System may have been created in a gigantic mixing process far more extensive than previously imagined, according to research published today.

Image: NASA's dust collector, which brought samples of Comet Wild-2 to Earth. (Image courtesy of Imperial College London)

The findings, reported in the journal Science, come from the first analysis of dust fragments from Comet Wild-2, captured by NASA's Stardust spacecraft and brought to Earth in January 2006. Because comets are among the oldest objects in the Solar System, the team, which includes researchers from Imperial College London and the Natural History Museum, believes their sample of dust can provide insights into how Earth and other planets came to be formed.

Using spectroscopy technology which does not damage the mineral content of the particles, the team found that the comet dust is made up of many different mineral compositions rather than a single dominant one. This implies that the dust was formed in many different environments before coming together to make the comet, indicating a great deal of mixing in the early Solar System prior to the formation of planets.

Particularly significant was the discovery of calcium aluminium inclusions, which are amongst the oldest solids in the Solar System and are thought to have formed close to the young Sun. This discovery suggests that components of the comet came from all over the early Solar System, with some dust having formed close to the Sun and other material coming from the asteroid belt between Mars and Jupiter. Since Wild-2 originally formed in the outer Solar System, this means that some of its composite material has travelled great distances. Dr Phil Bland of Imperial's Department of Earth Science and Engineering says:

"We weren't expecting to find such widely-spread material in the sample of dust we were given to examine. The composition of minerals is all over the place, which tells us that the components that built this comet weren't formed in one place at one time by one event. It seems that the Solar System was born in much more turbulent conditions than we previously thought."

The researchers have also found evidence of surprising variety in cometary composition. NASA's 2005 Deep Impact mission, which provided images of material blasted from the nucleus of the comet Tempel 1, revealed evidence of aqueous activity within that comet. However, the dust from Wild-2 has none of those characteristics and apparently has not interacted with water at all. Anton Kearsley of the Natural History Museum says:

"This is a very interesting mismatch, and it seems that comets are not all the same. Perhaps they vary as much in their evolution as in the composition of the dust from which they are made."

This is the first time scientists have had the opportunity to study samples from a comet, having previously relied on studying comets from afar or analysing interplanetary dust particles of uncertain origin. Dr Bland adds:

"Comets are likely to be the oldest objects in our Solar System and their components have remained largely unchanged, so discovering more about what they have experienced gives us a snapshot of the processes that formed the planets over four and a half billion years ago. Fundamentally we still don't know how you make planets from a cloud of dust and gas. Hopefully the Wild-2 samples will help us towards an answer."

The analysis was carried out by the Impacts and Astromaterials Research Centre, a joint Imperial-Natural History Museum research group funded by the Particle Physics and Astronomy Research Council.

[Thanks goes to www.sciencedaily.com for this article]

Saturday, December 16, 2006

Mind Controllable Robots: Too Late?

In the war between robots and humans, the humans just scored a major victory. Researchers at the University of Washington have successfully demonstrated a robotic interface operated through mind control. Using an electrode cap (a non-invasive tool that produces a noisy signal), an operator's mental commands directed the robot to walk to a block, pick it up, and set it down in a designated area.
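
To give a flavor of what such an interface involves, here is a deliberately toy Python sketch of a brain-computer control loop: sample a noisy signal, average it to suppress the noise, and map the result to a robot command. All names and the classification rule are invented for illustration; this is not the University of Washington team's software.

    import random

    COMMANDS = ["walk_to_block", "pick_up_block", "set_down_block"]

    def read_eeg_window(n_samples=64):
        """Stand-in for an electrode-cap driver: returns one noisy signal window."""
        return [random.gauss(0.0, 1.0) for _ in range(n_samples)]

    def classify(window):
        """Toy classifier: average the window to suppress noise, then map the
        smoothed value onto one of the known commands."""
        mean = sum(window) / len(window)
        index = min(int(abs(mean) * 10), len(COMMANDS) - 1)
        return COMMANDS[index]

    def send_to_robot(command):
        # In a real system this would be a network or serial call to the robot.
        print(f"robot <- {command}")

    for _ in range(3):
        send_to_robot(classify(read_eeg_window()))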

Hit the link for the video demonstration. Now we just need scientists to hone the "don't blow my head off with that laser" command and we'll be all set. – Mark Wilson

[original post: www.gizmodo.com]

Friday, December 15, 2006

Global Warming Affects Space Station Orbit

By Robert Roy Britt
LiveScience Managing Editor

As the climate warms near Earth's surface, the upper atmosphere is getting less dense, a change that will mean less drag on satellites, scientists announced today.

Carbon dioxide emissions from the burning of fossil fuels will cause a 3 percent reduction in air density in the outermost layer of the atmosphere by 2017, the researchers predict.

Among the affected satellites: The International Space Station and the Hubble Space Telescope.

"We're seeing climate change manifest itself in the upper as well as lower atmosphere," said Stan Solomon of the National Center for Atmospheric Research (NCAR). "This shows the far-ranging impacts of greenhouse gas emissions."

The finding was presented today at a meeting of the American Geophysical Union in San Francisco.

The thermosphere extends from about 60 miles above Earth to 400 miles. The air there is incredibly thin but still causes drag on satellites in low Earth orbit. NASA routinely boosts the orbit of the space station because drag makes its orbit decay constantly. Other satellites have limited life spans in part because the thin air up there eventually drags them down.

A thinning thermosphere means satellites can stay aloft longer.

Carbon dioxide molecules absorb radiation. Near Earth's surface, the molecules collide frequently with other molecules and the energy is released as heat, warming the air, the scientists explained. In the much thinner thermosphere, a carbon dioxide molecule has ample time to radiate energy to space because collisions are infrequent. The result is a cooling effect. And as it cools, the thermosphere settles, so that the density at a given height is reduced.
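
Since drag on a satellite scales linearly with air density (F = 0.5 x rho x v^2 x Cd x A), a 3 percent density reduction translates directly into roughly 3 percent less drag. The Python sketch below illustrates this with order-of-magnitude values for a station-like object; the numbers are illustrative assumptions, not NASA figures.

    # Drag force on a satellite: F = 0.5 * rho * v^2 * Cd * A
    rho = 4e-12      # kg/m^3, illustrative thermospheric density at station altitude
    v = 7660.0       # m/s, approximate orbital speed in low Earth orbit
    cd = 2.2         # drag coefficient, a typical assumption for satellites
    area = 1000.0    # m^2, illustrative cross-sectional area

    drag_now = 0.5 * rho * v**2 * cd * area
    drag_2017 = 0.5 * (rho * 0.97) * v**2 * cd * area  # 3 percent less dense

    print(f"Drag today:   {drag_now:.3f} N")
    print(f"Drag in 2017: {drag_2017:.3f} N ({(1 - drag_2017 / drag_now):.0%} less)")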

The effect varies with changes in the 11-year cycle of the Sun's activity, too. Being able to now predict the changes will help satellite operators plan better, the researchers said.

"Satellite operators noticed the solar cycle changes in density at the very beginning of the Space Age," Solomon said. "We are now able to reproduce the changes using the NCAR models and extend them into the next solar cycle."

The findings are also detailed in the journal Geophysical Research Letters and explained in a video presentation.

[original post: www.livescience.com]

Thursday, December 7, 2006

A terabyte of data on a regular DVD?

This is the promise of the 3-D Optical Data Storage system developed at the University of Central Florida (UCF). The technology makes it possible to record and store at least 1,000 GB of data on multiple layers of a single disc. The system uses lasers to compact large amounts of information onto a DVD, and the process involves shooting two different wavelengths of light onto the recording surface. By using several layers, this technique will increase the storage capacity of a standard DVD to more than a terabyte.
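
As a rough illustration of the arithmetic (our assumption, not a figure from UCF): if each recorded layer held about as much as a standard single-layer DVD, roughly 4.7 GB, then reaching a terabyte would take on the order of two hundred layers.

    # Back-of-the-envelope layer count, assuming DVD-like capacity per layer.
    layer_gb = 4.7      # GB per layer (standard single-layer DVD; our assumption)
    target_gb = 1000.0  # 1 terabyte expressed in gigabytes

    layers_needed = target_gb / layer_gb
    print(f"Layers needed for 1 TB: {layers_needed:.0f}")  # about 213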

This technology has been developed by Kevin D. Belfield, Department Chair and Professor of Chemistry at UCF, and his colleagues in the Belfield Research Group. So how does this work?

The process involves shooting two different wavelengths of light onto the recording surface. The use of two lasers creates a very specific image that is sharper than what current techniques can render. Depending on the color (wavelength) of the light, information is written onto the disc. The information is highly compacted, so the disc isn't much thicker than a typical DVD.

The challenge scientists faced for years was that light is also used to read the information. The light couldn’t distinguish between reading and writing, so it would destroy the recorded information. Belfield’s team developed a way to use light tuned to specific colors or wavelengths to allow information that a user wants to keep to stay intact.

Below is a picture showing how this two-photon 3D optical system reads the data. "This 3D image was reconstructed from successive two-photon fluorescence imaging (readout) of 33 XY data planes along the axial direction (1 micron distance between each image). The principle for this novel two-photon 3D optical storage device was based on a bichromophoric mixture consisting of diarylethene and fluorene derivative, suitable for recording data in thick storage media." (Credit: Dr. Zhen-Li Huang, UCF)

This research work has been published by Advanced Materials under the title "Two-Photon 3D Optical Data Storage via Fluorescence Modulation of an Efficient Fluorene Dye by a Photochromic Diarylethene" (Volume 18, Issue 21, Pages 2910-2914, Published online on October 30, 2006). Here is a link to the abstract.

This work has also been reviewed by Rachel Pei Chin Won in Nature Photonics under the title "Two photons are better than one" (November 16, 2006). Here are more details about this "Two-Photon 3-D Optical Data Storage" system.

[The researchers] have fabricated a two-photon three-dimensional optical data system using a photochromic polymer. They show that the system is suitable for recording data in thick storage media and for providing a readout method that does not erase existing stored information — they perform 10,000 readout cycles with only a small reduction in contrast. Also, contrary to other techniques, this method allows reading and writing of data at the same wavelength, which is achieved by changing the intensity of the laser light.

Nature Photonics also describes what kind of lasers were used by Belfield and his team.

Although the authors used a relatively expensive femtosecond Ti-sapphire laser to both read and write the information, they suggest that the data could be read using cheaper nanosecond laser diodes with comparable laser intensity, making this high density data-storage system more cost effective.

But when will we be able to use DVDs with a terabyte capacity? Not for several years. In fact, the researchers have just received a $270,000, three-year grant from the National Science Foundation to continue their work.

In the meantime, you can still visit — virtually — Belfield's lab. In particular, you should take a look at this page about High-Density Optical Data Storage, from which the above illustration has been extracted, and a photo gallery about One vs Two-photon Excitation.

[original post: http://blogs.zdnet.com]

Wednesday, December 6, 2006

Danger? Nanotube-Infested Waters Created in the Lab

Carbon nanomaterials can mix in water despite being hydrophobic, raising the possibility of a spreading spill in the future.

Carbon nanotubes--and their spherical cousins known as buckyballs--are proving to have myriad uses, finding employ in improved solar cells, electronics and medical probes. But the production volume of the tricky nanomaterials remains nanoscale when compared with the production volume of other industrial components. Nevertheless, environmental engineers have begun investigating how such materials might interact with natural environments if accidentally released and have discovered that at least some of the hydrophobic (water fearing) materials persist quite readily in natural waters.

Jae-Hong Kim of the Georgia Institute of Technology and his colleagues investigated how so-called multiwalled carbon nanotubes--layered, straws-within-straws of carbon atoms--interacted with natural water, in this case samples taken from the nearby Suwannee River. To their surprise, the carbon nanomaterial did not clump together to avoid water molecules; rather, it interacted with the negatively charged natural organic matter in the river water. This organic matter seemed to shield the nanotubes and allow them to disperse throughout the water after an hour of mixing, instead of clumping and settling. "At the beginning, the solution is very black and, over time, it becomes grayish," Kim says. "What is interesting is that it is still grayish after a month." In other words, the nanotubes do not settle even after this time period.

This monthlong suspension means that Suwannee River water was actually better at promoting the dispersal of carbon nanotubes than chemical surfactants, which can maintain nanotubes in suspension for roughly four days, according to the paper presenting the finding, published online in Environmental Science and Technology. Similar studies with buckyballs--stable balls of 60 carbon atoms, also known as C60--had required copious organic solvents to maintain suspensions.

Because of the presence of such solvents, toxicity tests on C60 have been open to question as to whether the buckyballs or the solvents caused the damaging effects. Environmental engineers Volodymyr Tarabara and Syed Hashsham of Michigan State University and their colleagues tested the toxic effects of such buckyballs in water--without solvents--on lymphocytes, human immune cells. The researchers created solutions of C60 and water using ethanol at levels previously shown to have no toxic impact and weeks' worth of magnetically powered stirring.

At concentrations as low as 2.2 micrograms per liter, the clumps of C60 damaged the DNA of the immune cells, according to microscopic analysis presented in the December 1 issue of Environmental Science and Technology. The exact mechanism by which C60 causes the DNA damage remains unclear, particularly because imaging could not detect the smallest of the buckyball clumps, but its DNA-damaging effect was dose-dependent. "We are not sure if very, very small particles exist, one or two nanometers big," Tarabara says. "They may be very important as far as cellular damage."

Regardless, such nanopollution is unlikely to occur anytime soon: "The fact of the matter is that it takes weeks of mixing to generate appreciable concentrations in the size range where the particles are small enough not to settle," Tarabara notes. "It's not something that we can expect to be out there loose." But the environmental engineers argue that such research should be carried out before any widespread adoption of the new carbon nanomaterials takes place, especially because they seem to have a few surprises in store. "One thing is definite," Kim says, "these materials were not traditionally considered an aqueous-based contaminant." He adds: "I am saying, 'Well, it seems possible.'" --David Biello
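
One way to see why small clumps resist settling is Stokes' law for the terminal velocity of a sphere in a fluid, v = 2 r^2 (rho_p - rho_f) g / (9 mu). The Python sketch below plugs in illustrative values (our assumptions, not measurements from either paper):

    # Stokes settling velocity for a small sphere: v = 2*r^2*(rho_p - rho_f)*g / (9*mu)
    g = 9.81        # m/s^2, gravitational acceleration
    mu = 1.0e-3     # Pa*s, viscosity of water
    rho_f = 1000.0  # kg/m^3, density of water
    rho_p = 1650.0  # kg/m^3, illustrative density for a C60 clump
    r = 50e-9       # m, radius of a 100-nanometer-diameter clump

    v = 2 * r**2 * (rho_p - rho_f) * g / (9 * mu)
    per_day = v * 86400  # meters fallen per day
    print(f"Settling speed: {v:.2e} m/s ({per_day * 1000:.2f} mm/day)")

At that speed a clump would take decades to fall a meter, and Brownian motion would keep particles of this size stirred in any case, consistent with the monthlong suspensions reported above.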

[source: www.sciam.com]
