Wednesday, January 3, 2007

Lost lakes of Titan are found at last

Titan Has Liquid Lakes, Scientists Report

This colorized radar view from Cassini shows lakes on Titan. Color intensity is proportional to how much radar brightness is returned. The colors are not a representation of what the human eye would see. Image credit: NASA/JPL/USGS

Lakes of methane have been spotted on Saturn's largest moon, Titan, boosting the theory that this strange, distant world bears beguiling similarities to Earth, according to a new study.

Titan has long intrigued space scientists, as it is the only moon in the Solar System to have a dense atmosphere -- and its atmosphere, like Earth's, mainly comprises nitrogen.

Titan's atmosphere is also rich in methane, although the source for this vast store of hydrocarbons is unclear.

Methane, on the geological scale, has a relatively limited life: a molecule of the compound lasts several tens of millions of years before it is broken up by sunlight.

Given that Titan is billions of years old, the question is how this atmospheric methane gets to be renewed. Without replenishment, it should have disappeared long ago.

A popular hypothesis is that it comes from a vast ocean of hydrocarbons.

But when the US spacecraft Cassini sent the European lander Huygens down to Titan in 2005, the images it returned showed a rugged landscape veiled in an orange haze.

There were indeed signs of methane flows and methane precipitation, but nothing at all that pointed to any sea of the stuff.

But a flyby by Cassini on July 22 last year revealed, thanks to a radar scan, 75 large, smooth, dark patches between three and 70 kilometers (two and 42 miles) across that appear to be lakes of liquid methane, scientists reported on Thursday.

They believe the lakes prove that Titan has a "methane cycle" -- a system that is like the water cycle on Earth, in which the liquid evaporates, cools and condenses and then falls as rain, replenishing the surface liquid.

As on Earth, Titan's surface methane may well be supplemented by a "table" of liquid methane that seeps through the rock, the paper suggests.
Some of the methane lakes seem only partly filled, and other depressions are dry, which suggests that, given the high northerly latitudes where they were spotted, the methane cycle follows Titan's seasons.

In winter, the lakes expand, while in summer they shrink or dry up completely -- another parallel with Earth's hydrological cycle.

The study, which appears on Thursday in the British weekly journal Nature, is headed by Ellen Stofan of Proxemy Research in Virginia and University College London.

Titan and Earth are of course very different, especially in their potential for nurturing life. Titan is frigid, dark and, as far as is known, waterless, whereas Earth is warm, light and has lots of liquid water.

But French astrophysicist Christophe Sotin says both our planet and Titan have been sculpted by processes that are, fundamentally, quite similar.

The findings "add to the weight of evidence that Titan is a complex world in which the interaction between the inner and outer layers is controlled by processes similar to those that must have dominated the evolution of any Earth-like planet," Sotin said in a commentary.

"Indeed, as far as we know," Sotin added, "there is only one planetary body that displays more dynamism than Titan. Its name is Earth."

(c) www.physorg.com

Tuesday, January 2, 2007

A Robot in Every Home

by Bill Gates

Imagine being present at the birth of a new industry. It is an industry based on groundbreaking new technologies, wherein a handful of well-established corporations sell highly specialized devices for business use and a fast-growing number of start-up companies produce innovative toys, gadgets for hobbyists and other interesting niche products. But it is also a highly fragmented industry with few common standards or platforms. Projects are complex, progress is slow, and practical applications are relatively rare. In fact, for all the excitement and promise, no one can say with any certainty when--or even if--this industry will achieve critical mass. If it does, though, it may well change the world.

AMERICAN ROBOTIC: Although a few of the domestic robots of tomorrow may resemble the anthropomorphic machines of science fiction, a greater number are likely to be mobile peripheral devices that perform specific household tasks.

Of course, the paragraph above could be a description of the computer industry during the mid-1970s, around the time that Paul Allen and I launched Microsoft. Back then, big, expensive mainframe computers ran the back-office operations for major companies, governmental departments and other institutions. Researchers at leading universities and industrial laboratories were creating the basic building blocks that would make the information age possible. Intel had just introduced the 8080 microprocessor, and Atari was selling the popular electronic game Pong. At homegrown computer clubs, enthusiasts struggled to figure out exactly what this new technology was good for.

But what I really have in mind is something much more contemporary: the emergence of the robotics industry, which is developing in much the same way that the computer business did 30 years ago. Think of the manufacturing robots currently used on automobile assembly lines as the equivalent of yesterday's mainframes. The industry's niche products include robotic arms that perform surgery, surveillance robots deployed in Iraq and Afghanistan that dispose of roadside bombs, and domestic robots that vacuum the floor. Electronics companies have made robotic toys that can imitate people or dogs or dinosaurs, and hobbyists are anxious to get their hands on the latest version of the Lego robotics system.

Meanwhile some of the world's best minds are trying to solve the toughest problems of robotics, such as visual recognition, navigation and machine learning. And they are succeeding. At the 2004 Defense Advanced Research Projects Agency (DARPA) Grand Challenge, a competition to produce the first robotic vehicle capable of navigating autonomously over a rugged 142-mile course through the Mojave Desert, the top competitor managed to travel just 7.4 miles before breaking down. In 2005, though, five vehicles covered the complete distance, and the race's winner did it at an average speed of 19.1 miles an hour. (In another intriguing parallel between the robotics and computer industries, DARPA also funded the work that led to the creation of Arpanet, the precursor to the Internet.)

What is more, the challenges facing the robotics industry are similar to those we tackled in computing three decades ago. Robotics companies have no standard operating software that could allow popular application programs to run in a variety of devices. The standardization of robotic processors and other hardware is limited, and very little of the programming code used in one machine can be applied to another. Whenever somebody wants to build a new robot, they usually have to start from square one.

Despite these difficulties, when I talk to people involved in robotics--from university researchers to entrepreneurs, hobbyists and high school students--the level of excitement and expectation reminds me so much of that time when Paul Allen and I looked at the convergence of new technologies and dreamed of the day when a computer would be on every desk and in every home. And as I look at the trends that are now starting to converge, I can envision a future in which robotic devices will become a nearly ubiquitous part of our day-to-day lives. I believe that technologies such as distributed computing, voice and visual recognition, and wireless broadband connectivity will open the door to a new generation of autonomous devices that enable computers to perform tasks in the physical world on our behalf. We may be on the verge of a new era, when the PC will get up off the desktop and allow us to see, hear, touch and manipulate objects in places where we are not physically present.

From Science Fiction to Reality
The word "robot" was popularized in 1921 by Czech playwright Karel Capek, but people have envisioned creating robotlike devices for thousands of years. In Greek and Roman mythology, the gods of metalwork built mechanical servants made from gold. In the first century A.D., Heron of Alexandria--the great engineer credited with inventing the first steam engine--designed intriguing automatons, including one said to have the ability to talk. Leonardo da Vinci's 1495 sketch of a mechanical knight, which could sit up and move its arms and legs, is considered to be the first plan for a humanoid robot.

Over the past century, anthropomorphic machines have become familiar figures in popular culture through books such as Isaac Asimov's I, Robot, movies such as Star Wars and television shows such as Star Trek. The popularity of robots in fiction indicates that people are receptive to the idea that these machines will one day walk among us as helpers and even as companions. Nevertheless, although robots play a vital role in industries such as automobile manufacturing--where there is about one robot for every 10 workers--the fact is that we have a long way to go before real robots catch up with their science-fiction counterparts.

One reason for this gap is that it has been much harder than expected to enable computers and robots to sense their surrounding environment and to react quickly and accurately. It has proved extremely difficult to give robots the capabilities that humans take for granted--for example, the abilities to orient themselves with respect to the objects in a room, to respond to sounds and interpret speech, and to grasp objects of varying sizes, textures and fragility. Even something as simple as telling the difference between an open door and a window can be devilishly tricky for a robot.

But researchers are starting to find the answers. One trend that has helped them is the increasing availability of tremendous amounts of computer power. One megahertz of processing power, which cost more than $7,000 in 1970, can now be purchased for just pennies. The price of a megabit of storage has seen a similar decline. The access to cheap computing power has permitted scientists to work on many of the hard problems that are fundamental to making robots practical. Today, for example, voice-recognition programs can identify words quite well, but a far greater challenge will be building machines that can understand what those words mean in context. As computing capacity continues to expand, robot designers will have the processing power they need to tackle issues of ever greater complexity.

COMPUTER TEST-DRIVE of a mobile device in a three-dimensional virtual environment helps robot builders analyze and adjust the capabilities of their designs before trying them out in the real world. Part of the Microsoft Robotics Studio software development kit, this tool simulates the effects of forces such as gravity and friction.

Another barrier to the development of robots has been the high cost of hardware, such as sensors that enable a robot to determine the distance to an object as well as motors and servos that allow the robot to manipulate an object with both strength and delicacy. But prices are dropping fast. Laser range finders that are used in robotics to measure distance with precision cost about $10,000 a few years ago; today they can be purchased for about $2,000. And new, more accurate sensors based on ultrawideband radar are available for even less.

Now robot builders can also add Global Positioning System chips, video cameras, array microphones (which are better than conventional microphones at distinguishing a voice from background noise) and a host of additional sensors for a reasonable expense. The resulting enhancement of capabilities, combined with expanded processing power and storage, allows today's robots to do things such as vacuum a room or help to defuse a roadside bomb--tasks that would have been impossible for commercially produced machines just a few years ago.

A BASIC Approach
In February 2004 I visited a number of leading universities, including Carnegie Mellon University, the Massachusetts Institute of Technology, Harvard University, Cornell University and the University of Illinois, to talk about the powerful role that computers can play in solving some of society's most pressing problems. My goal was to help students understand how exciting and important computer science can be, and I hoped to encourage a few of them to think about careers in technology. At each university, after delivering my speech, I had the opportunity to get a firsthand look at some of the most interesting research projects in the school's computer science department. Almost without exception, I was shown at least one project that involved robotics.

At that time, my colleagues at Microsoft were also hearing from people in academia and at commercial robotics firms who wondered if our company was doing any work in robotics that might help them with their own development efforts. We were not, so we decided to take a closer look. I asked Tandy Trower, a member of my strategic staff and a 25-year Microsoft veteran, to go on an extended fact-finding mission and to speak with people across the robotics community. What he found was universal enthusiasm for the potential of robotics, along with an industry-wide desire for tools that would make development easier. "Many see the robotics industry at a technological turning point where a move to PC architecture makes more and more sense," Tandy wrote in his report to me after his fact-finding mission. "As Red Whittaker, leader of [Carnegie Mellon's] entry in the DARPA Grand Challenge, recently indicated, the hardware capability is mostly there; now the issue is getting the software right."

Back in the early days of the personal computer, we realized that we needed an ingredient that would allow all of the pioneering work to achieve critical mass, to coalesce into a real industry capable of producing truly useful products on a commercial scale. What was needed, it turned out, was Microsoft BASIC. When we created this programming language in the 1970s, we provided the common foundation that enabled programs developed for one set of hardware to run on another. BASIC also made computer programming much easier, which brought more and more people into the industry. Although a great many individuals made essential contributions to the development of the personal computer, Microsoft BASIC was one of the key catalysts for the software and hardware innovations that made the PC revolution possible.

After reading Tandy's report, it seemed clear to me that before the robotics industry could make the same kind of quantum leap that the PC industry made 30 years ago, it, too, needed to find that missing ingredient. So I asked him to assemble a small team that would work with people in the robotics field to create a set of programming tools that would provide the essential plumbing so that anybody interested in robots with even the most basic understanding of computer programming could easily write robotic applications that would work with different kinds of hardware. The goal was to see if it was possible to provide the same kind of common, low-level foundation for integrating hardware and software into robot designs that Microsoft BASIC provided for computer programmers.

BIRTH OF AN INDUSTRY: iRobot, a company based in Burlington, Mass., manufactures the Packbot EOD, which assists with bomb disposal in Iraq.


Tandy's robotics group has been able to draw on a number of advanced technologies developed by a team working under the direction of Craig Mundie, Microsoft's chief research and strategy officer. One such technology will help solve one of the most difficult problems facing robot designers: how to simultaneously handle all the data coming in from multiple sensors and send the appropriate commands to the robot's motors, a challenge known as concurrency. A conventional approach is to write a traditional, single-threaded program--a long loop that first reads all the data from the sensors, then processes this input and finally delivers output that determines the robot's behavior, before starting the loop all over again. The shortcomings are obvious: if your robot has fresh sensor data indicating that the machine is at the edge of a precipice, but the program is still at the bottom of the loop calculating trajectory and telling the wheels to turn faster based on previous sensor input, there is a good chance the robot will fall down the stairs before it can process the new information.
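
To see why this is dangerous, here is a minimal sketch of such a single-threaded loop in Python; the helper functions (read_sensors, compute_trajectory, set_wheel_speeds) are hypothetical stand-ins for illustration, not any real robot API.

```python
# Minimal sketch of the single-threaded "sense, plan, act" loop
# described above. All helpers are hypothetical stand-ins.
import time

def read_sensors():
    # Poll every sensor once and return a snapshot of the world.
    return {"cliff_ahead": False, "obstacle_distance_m": 2.0}

def compute_trajectory(snapshot):
    # The slow planning step: long enough for the world to change.
    time.sleep(0.5)  # simulate expensive computation
    return {"left_speed": 1.0, "right_speed": 1.0}

def set_wheel_speeds(command):
    print("wheels <-", command)

for _ in range(3):  # a few trips around the loop for demonstration
    snapshot = read_sensors()               # 1. read all sensor data
    command = compute_trajectory(snapshot)  # 2. process it (slowly)
    # If the robot reached a stair edge during step 2, this command is
    # based on stale data; the fresh reading is not seen until the
    # next iteration -- possibly after the robot has already fallen.
    set_wheel_speeds(command)               # 3. act on possibly stale data
```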

Concurrency is a challenge that extends beyond robotics. Today as more and more applications are written for distributed networks of computers, programmers have struggled to figure out how to efficiently orchestrate code running on many different servers at the same time. And as computers with a single processor are replaced by machines with multiple processors and "multicore" processors--integrated circuits with two or more processors joined together for enhanced performance--software designers will need a new way to program desktop applications and operating systems. To fully exploit the power of processors working in parallel, the new software must deal with the problem of concurrency.

One approach to handling concurrency is to write multithreaded programs that allow data to travel along many paths. But as any developer who has written multithreaded code can tell you, this is one of the hardest tasks in programming. The answer that Craig's team has devised to the concurrency problem is something called the concurrency and coordination runtime (CCR). The CCR is a library of functions--sequences of software code that perform specific tasks--that makes it easy to write multithreaded applications that can coordinate a number of simultaneous activities. Designed to help programmers take advantage of the power of multicore and multiprocessor systems, the CCR turns out to be ideal for robotics as well. By drawing on this library to write their programs, robot designers can dramatically reduce the chances that one of their creations will run into a wall because its software is too busy sending output to its wheels to read input from its sensors.
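
The CCR itself is a .NET library, and the sketch below does not use its actual API; it is only a rough Python analogue of the coordination idea, in which sensor handling and motor control run as independent activities and the newest reading is always acted on first.

```python
# Rough analogue (not Microsoft's CCR API) of coordinating concurrent
# activities: the sensor thread publishes readings as they arrive, and
# the control thread always acts on the most recent one.
import queue
import threading
import time

readings = queue.Queue()

def sensor_task():
    # Publish readings immediately, independent of the planner's pace.
    # The "cliff" stays in view once reached (readings 15 onward).
    for i in range(20):
        readings.put({"cliff_ahead": i >= 15, "seq": i})
        time.sleep(0.02)

def control_task():
    while True:
        reading = readings.get()        # wake as soon as data arrives
        while not readings.empty():     # discard the stale backlog
            reading = readings.get()    # keep only the newest reading
        if reading["cliff_ahead"]:
            print("emergency stop at reading", reading["seq"])
            return
        time.sleep(0.1)                 # stand-in for slow planning

threading.Thread(target=sensor_task, daemon=True).start()
control_task()
```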

In addition to tackling the problem of concurrency, the work that Craig's team has done will also simplify the writing of distributed robotic applications through a technology called decentralized software services (DSS). DSS enables developers to create applications in which the services--the parts of the program that read a sensor, say, or control a motor--operate as separate processes that can be orchestrated in much the same way that text, images and information from several servers are aggregated on a Web page. Because DSS allows software components to run in isolation from one another, if an individual component of a robot fails, it can be shut down and restarted--or even replaced--without having to reboot the machine. Combined with broadband wireless technology, this architecture makes it easy to monitor and adjust a robot from a remote location using a Web browser.
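
As a loose illustration of that isolation property -- a sketch assuming a simple supervisor pattern, not the actual DSS programming model -- the following Python runs a stand-in sensor service as its own process and restarts it when it fails, leaving everything else running.

```python
# Sketch of the isolation idea: each service is a separate process, and
# a supervisor restarts a failed service without "rebooting" the robot.
# This illustrates the architecture only; it is not the DSS API.
import multiprocessing as mp
import random
import time

def sensor_service():
    # Stand-in service; a real one would read hardware and publish state.
    time.sleep(random.uniform(0.1, 0.3))
    raise RuntimeError("simulated sensor fault")  # the service crashes...

if __name__ == "__main__":
    for attempt in range(3):
        proc = mp.Process(target=sensor_service)
        proc.start()
        proc.join()  # returns when the service exits or crashes
        print(f"sensor service died (exit code {proc.exitcode}); restarting")
    # ...while other services (motors, navigation) keep running untouched.
```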

What is more, a DSS application controlling a robotic device does not have to reside entirely on the robot itself but can be distributed across more than one computer. As a result, the robot can be a relatively inexpensive device that delegates complex processing tasks to the high-performance hardware found on today's home PCs. I believe this advance will pave the way for an entirely new class of robots that are essentially mobile, wireless peripheral devices that tap into the power of desktop PCs to handle processing-intensive tasks such as visual recognition and navigation. And because these devices can be networked together, we can expect to see the emergence of groups of robots that can work in concert to achieve goals such as mapping the seafloor or planting crops.

These technologies are a key part of Microsoft Robotics Studio, a new software development kit built by Tandy's team. Microsoft Robotics Studio also includes tools that make it easier to create robotic applications using a wide range of programming languages. One example is a simulation tool that lets robot builders test their applications in a three-dimensional virtual environment before trying them out in the real world. Our goal for this release is to create an affordable, open platform that allows robot developers to readily integrate hardware and software into their designs.

Should We Call Them Robots?
How soon will robots become part of our day-to-day lives? According to the International Federation of Robotics, about two million personal robots were in use around the world in 2004, and another seven million will be installed by 2008. In South Korea the Ministry of Information and Communication hopes to put a robot in every home there by 2013. The Japanese Robot Association predicts that by 2025, the personal robot industry will be worth more than $50 billion a year worldwide, compared with about $5 billion today.

As with the PC industry in the 1970s, it is impossible to predict exactly what applications will drive this new industry. It seems quite likely, however, that robots will play an important role in providing physical assistance and even companionship for the elderly. Robotic devices will probably help people with disabilities get around and extend the strength and endurance of soldiers, construction workers and medical professionals. Robots will maintain dangerous industrial machines, handle hazardous materials and monitor remote oil pipelines. They will enable health care workers to diagnose and treat patients who may be thousands of miles away, and they will be a central feature of security systems and search-and-rescue operations.

Although a few of the robots of tomorrow may resemble the anthropomorphic devices seen in Star Wars, most will look nothing like the humanoid C-3PO. In fact, as mobile peripheral devices become more and more common, it may be increasingly difficult to say exactly what a robot is. Because the new machines will be so specialized and ubiquitous--and look so little like the two-legged automatons of science fiction--we probably will not even call them robots. But as these devices become affordable to consumers, they could have just as profound an impact on the way we work, communicate, learn and entertain ourselves as the PC has had over the past 30 years.

---

BILL GATES is co-founder and chairman of Microsoft, the world's largest software company. While attending Harvard University in the 1970s, Gates developed a version of the programming language BASIC for the first microcomputer, the MITS Altair. In his junior year, Gates left Harvard to devote his energies to Microsoft, the company he had begun in 1975 with his childhood friend Paul Allen. In 2000 Gates and his wife, Melinda, established the Bill & Melinda Gates Foundation, which focuses on improving health, reducing poverty and increasing access to technology around the world.

(c) www.sciam.com

Monday, January 1, 2007

India to test space capsule as part of moon mission

India plans to launch a capsule into orbit early next year and bring it back to Earth, an initial step towards an unmanned mission to the moon by 2010, a news report said Sunday.

The Indian Space Research Organisation (ISRO), which hopes to send an unmanned probe to the moon in the next three years, said it needs to test its re-entry and recovery technology, the Indian Express reported.

The agency will launch a 50-kilogram (110 pound) capsule and then have it re-enter and splash into the Bay of Bengal after 15 to 30 days of orbit around the Earth, the newspaper said.

The announcement is the latest by India's space agency to show an expansion in policy from projects meant to aid national development to a growing interest in space exploration, the report said.

The space agency also said last month that it planned to send an unmanned mission to Mars by 2013 to look for evidence of life.

The six-to-eight-month mission would cost three billion rupees (67 million dollars), the Hindustan Times reported.

© 2006 AFP

(c) www.physorg.com

Molecular Anatomy Of Influenza Virus Detailed

Science Daily — Scientists at the National Institute of Arthritis and Musculoskeletal and Skin Diseases (NIAMS), part of the National Institutes of Health in Bethesda, Md., and colleagues at the University of Virginia in Charlottesville have succeeded in imaging, in unprecedented detail, the virus that causes influenza.

The three-dimensional structure of influenza virus from electron tomography. The viruses are about 120 nanometers -- about one ten thousandth of a millimeter -- in diameter.

A team of researchers led by NIAMS' Alasdair Steven, Ph.D., working with a version of the seasonal H3N2 strain of influenza A virus, has been able to distinguish five different kinds of influenza virus particles in the same isolate (sample) and to map the distribution of molecules in each of them. This breakthrough has the potential to identify particular features of highly virulent strains, to provide insight into how antibodies inactivate the virus, and to show how viruses recognize and enter susceptible cells in the act of infection.

“Being able to visualize influenza virus particles should boost our efforts to prepare for a possible pandemic flu attack,” says NIAMS Director Stephen I. Katz, M.D., Ph.D. “This work will allow us to ‘know our enemy’ much better.”

One of the difficulties that has hampered structural studies of influenza virus is that no two virus particles are the same. In this fundamental respect it differs from many other viruses; poliovirus, for example, has a coat that is identical in each virus particle, allowing it to be studied by crystallography.

The research team used electron tomography (ET) to make its discovery. ET is a novel, three-dimensional imaging method based on the same principle as the well-known clinical imaging technique called computerized axial tomography, but it is performed in an electron microscope on a microminiaturized scale.

The mission of the National Institute of Arthritis and Musculoskeletal and Skin Diseases (NIAMS), a part of the Department of Health and Human Services' National Institutes of Health, is to support research into the causes, treatment, and prevention of arthritis and musculoskeletal and skin diseases; the training of basic and clinical scientists to carry out this research; and the dissemination of information on research progress in these diseases. For more information about NIAMS, call the Information Clearinghouse at (301) 495-4484 or (877) 22-NIAMS (free call) or visit the NIAMS Web site at http://www.niams.nih.gov.

The National Institutes of Health (NIH) — The Nation's Medical Research Agency — includes 27 Institutes and Centers and is a component of the U.S. Department of Health and Human Services. It is the primary federal agency for conducting and supporting basic, clinical and translational medical research, and it investigates the causes, treatments, and cures for both common and rare diseases. For more information about NIH and its programs, visit http://www.nih.gov.

Reference: Harris A, et al. Influenza virus pleiomorphy characterized by cryoelectron tomography. PNAS 2006;103(50):19123-19127.

(c) www.sciencedaily.com

Gene-engineered cattle resist mad cow disease: study

WASHINGTON (Reuters) - U.S. and Japanese scientists reported on Sunday that they had used genetic engineering to produce cattle that resist mad cow disease.

They hope the cattle can be the source of herds that can provide dairy products, gelatin and other products free of the brain-destroying disease, also known as bovine spongiform encephalopathy or BSE.

Writing in the journal Nature Biotechnology, the researchers said their cattle were healthy at the age of 20 months, and sperm from the males made normal embryos that were used to impregnate cows, although it is not certain yet that they could breed normally.

The cattle lack the nervous-system prion protein that, in misfolded form, causes BSE and related diseases such as scrapie in sheep and Creutzfeldt-Jakob disease, known as CJD, in humans, the researchers said.

"(Prion-protein-negative) cattle could be a preferred source of a wide variety of bovine-derived products that have been extensively used in biotechnology, such as milk, gelatin, collagen, serum and plasma," they wrote in their report.

Yoshimi Kuroiwa of Kirin Brewery Co. in Tokyo, Japan, and colleagues made the cattle, known as knockouts because a specific gene has been "knocked" out of them, using a method they call gene targeting.

"By knocking out the prion protein gene and producing healthy calves, our team has successfully demonstrated that normal cellular prion protein is not necessary for the normal development and survival of cattle. The cows are now nearly 2 years old and are completely healthy," said James Robl of Hematech, a South Dakota subsidiary of Kirin.

"We anticipate that prion protein-free cows will be useful models to study prion disease processes in both animals and humans," Robl, an expert in cloning technology, said in a statement.

Misfolded prion proteins are blamed for BSE and other, similar brain diseases. It is known that certain genetic variations make animals more susceptible to the diseases.

BSE swept through British herds in the 1980s and people began developing an odd, early-onset form of CJD called variant CJD or vCJD a few years later. CJD normally affects one in a million people globally, usually the elderly, as it has a long incubation period.

There is no cure and it is always fatal.

As of November 2006, 200 vCJD patients were reported worldwide, including 164 patients in Britain, 21 in France, 4 in the Republic of Ireland, 3 in the United States, 2 in the Netherlands and 1 each in Canada, Italy, Japan, Portugal, Saudi Arabia and Spain.

The disease may have first started to infect cattle when they were fed improperly processed remains of sheep, possibly sheep infected with scrapie. Although people are not known to have ever caught scrapie from eating sheep, BSE can be transmitted to humans.

BSE occasionally occurs in cattle outside Britain although it is now rare.

(c) www.sciam.com

Tuesday, December 26, 2006

Study shows extreme contrast in ozone losses at North, South Poles

A new study shows just how dramatic the ozone loss in the Antarctic has been over the past 20 years compared to the same phenomenon in the Arctic.

The study found "massive" and "widespread" localized ozone depletion in the heart of Antarctica's ozone hole region, beginning in the late 1970s but becoming more pronounced in the 1980s and '90s.

The US government scientists who conducted the study said that there was an almost complete absence of ozone in certain atmospheric air samples taken after 1980, compared to earlier decades. In contrast, the ozone losses in the Arctic were sporadic, and even the greatest losses did not begin to approach the regular losses in the Antarctic, the researchers said.

"Typically the Arctic loss is dramatically less than the Antarctic loss," said Robert Portmann, an atmospheric scientist with the National Oceanic and Atmospheric Administration in Boulder, Colorado.

Scientists have been tracking the expanding ozone hole over Antarctica for some 20 years now.

In October, NASA scientists reported that this year's hole is the biggest ever, stretching over nearly 11 million square miles.

In Antarctica, local ozone depletion at some altitudes frequently exceeded 90 percent, and often reached up to 99 percent during the Antarctic winter in the period after 1980 compared to earlier decades, the researchers said.

In the Arctic, the losses occasionally peaked at 70 percent, and some losses of 50 percent were seen in the mid-1990s, when temperatures were particularly low, but the scale and scope of the problem were much less than what was seen in the Antarctic.

Recent studies have also pointed to large ozone losses in the Arctic, but the NOAA researchers said their study showed that these events were rare and did not appear to signal a trend.

"We saw small to moderate ozone losses in the very coldest winters, when the stratospheric conditions are ripe for ozone loss, but they were rarer than we expected," said Portmann.

The study, published in the Proceedings of the National Academy of Sciences, was based on more than 40 years of ozone readings from polar observation stations and balloon-borne instruments.

(c) www.physorg.com

Saturday, December 23, 2006

Space shuttle Discovery completes successful ISS mission

Space shuttle Discovery and its seven-member crew have landed in Florida after a 13-day mission that advanced construction of the International Space Station.

Discovery touched down at 2232 GMT on the landing strip at the Kennedy Space Center, near Cape Canaveral, where it had lifted off in a nighttime launch on December 9.

"You have seven thrilled people here," shuttle commander Mark Polansky said just after landing.

A controller at the Mission Control Center in Houston, Texas, replied: "Congratulations on what was probably the most complex assembly mission of the Station to date."

The Discovery astronauts spent eight days at the International Space Station (ISS), continuing the construction of the orbiting laboratory by attaching a two-tonne truss to its girder-like structure in the first of four space walks.

During the next two space walks, they rewired the station's power system, putting it into its permanent configuration.

A fourth space walk was added Monday to shake loose a solar array panel that had gotten stuck as it was being folded.

"It was a wonderful end to a great mission," Michael Griffin, a NASA administrator, said at a news conference at Kennedy Space Center where he welcomed the astronauts after they left the spacecraft.

"The crew on orbit and the crew on the ground could not have done better," he said.

The shuttle's return to Earth was initially scheduled for Thursday, but the mission was extended a day to allow for the unplanned space walk.

Poor weather conditions in Florida had scotched a landing scheduled for Friday, and National Aeronautics and Space Administration officials had considered bringing the shuttle down at alternative sites.

A decision to land at Kennedy was made only at the last minute and "it turned out to be a great one," Griffin said.

NASA officials had hoped weather conditions would improve enough in Florida to avoid having to land the space shuttle at Edwards Air Force Base in California or White Sands Space Harbor in New Mexico.

A landing at either location in the western United States would have meant NASA would have to fly the shuttle to Florida, in the southeast, on the back of a modified Boeing 747 plane, which would cost some 1.7 million dollars.

The shuttle reentered the atmosphere at more than 26,500 kilometers (16,500 miles) an hour, triggering a double sonic boom as it descended toward the runway on Florida's Atlantic coast.

On board were six US astronauts and one from the European Space Agency (ESA), Christer Fuglesang, Sweden's first astronaut.

After this mission, NASA plans at least 13 more shuttle flights -- including five in 2007 -- to complete construction of the International Space Station by 2010, when the three-spacecraft shuttle fleet is due to be retired.

ISS construction fell years behind schedule after the 2003 Columbia tragedy, when the spacecraft disintegrated minutes before landing, killing all seven astronauts aboard.

NASA suspended the shuttle program to deal with safety problems. The space shuttle Atlantis mission in September marked the resumption of ISS construction.

Discovery carried astronaut Sunita Williams to the ISS, where she will remain for six months. Williams replaced Thomas Reiter, a German ESA astronaut who had been on the station since July and who returned on Discovery.

(c) www.physorg.com
