More than a year after Hurricane Katrina hit the Gulf Coast, Americans are still wondering how prepared the nation is for the next "big one." They have reason to be concerned. A recent assessment by the Department of Homeland Security indicates that 27% of the states and 10% of the cities evaluated were not prepared to handle a "catastrophic event" of any kind. And while the National Oceanic and Atmospheric Administration (NOAA) now believes this hurricane season will not be as severe as it once predicted, the agency still can't specify with any real certainty when and where the next tropical storm or hurricane will strike.
"Science has not evolved enough to accurately predict on seasonal timescales when and where these storms will likely make landfall," says Gerry Bell, NOAA's lead seasonal hurricane forecaster. "Exactly when and where landfall occurs is strongly controlled by the weather patterns in place as the storms approach land. These weather patterns generally cannot be predicted more than several days in advance."
Can technology help? It already has. But so far, most predictions—or warnings—of natural disasters have come out of the constant tweaking of computer models running on supercomputers, fed by weather-specific satellites and radar.
A DIRE NEED
The need for innovation is obvious. Hurricane Katrina taught the U.S. a lot of hard lessons. When the storm hit the Gulf in August 2005, the region's emergency communications infrastructure had difficulty coping with the demand. What's more, the various groups of first responders, who already faced the monumental challenge of evacuating a city that, even with the storm bearing down, was ill-prepared to leave, were unable to communicate with each other. The cause? Radios that weren't interoperable.
Washington Technology, a bimonthly magazine for IT system integrators and resellers, highlighted the problem in its coverage of the first anniversary of the tragedy. "The preparation for and response to Hurricane Katrina show we are still an analog government in a digital age," said the final report of the congressional Select Bipartisan Committee on Preparations for and Response to Hurricane Katrina. "We must recognize that we are woefully incapable of storing, moving and accessing information—especially in times of crisis."
Communications interoperability is another huge issue. When local government officials from across the country met at the United States Conference of Mayors this past June, it was revealed that municipal public safety agencies in 80% of U.S. cities use equipment that is not interoperable. Most of these systems operate on different frequencies, and there are currently no public safety radios that operate on more than one frequency band.
In most areas, public safety communications require users to link incompatible radios by plugging them into a switch/programmable interconnect device. If responders with VHF radios arrive at an incident in which the radio system in use is an 800-MHz system, for instance, a VHF radio and the 800-MHz radio would be plugged into the switch. When a responder with a VHF radio talks, the VHF radio connected to the switch would output the audio through the switch, and the 800-MHz radio would rebroadcast the same audio.
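The patch arrangement described above can be modeled as a simple cross-band gateway: each radio is an audio endpoint, and the switch forwards whatever one endpoint transmits to every other endpoint. The sketch below is purely conceptual; the class names are invented and no real interconnect product's API is implied.

```python
# Conceptual model of a public-safety audio patch: radios on different
# bands are wired into a switch that rebroadcasts received audio.
# All names here are illustrative; no real interconnect API is implied.

class Radio:
    def __init__(self, band):
        self.band = band          # e.g. "VHF" or "800 MHz"
        self.last_heard = None    # most recent audio rebroadcast to us

    def receive(self, audio):
        self.last_heard = audio

class PatchSwitch:
    """Forwards audio from the transmitting radio to all other ports."""
    def __init__(self):
        self.ports = []

    def plug_in(self, radio):
        self.ports.append(radio)

    def transmit(self, source, audio):
        for radio in self.ports:
            if radio is not source:   # don't echo back to the sender
                radio.receive(audio)

vhf = Radio("VHF")
p800 = Radio("800 MHz")
switch = PatchSwitch()
switch.plug_in(vhf)
switch.plug_in(p800)

switch.transmit(vhf, "Engine 7 on scene")
print(p800.last_heard)   # the 800-MHz radio rebroadcasts the VHF audio
```

Note the limitation this models: the patch only works where both radios and the switch are physically co-located, which is exactly the constraint SDR aims to remove.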
Perhaps the most promising solution to the interoperability problem is SDR, or software-defined radio, which can update and change modulation schemes, protocol standards, and frequency bands (see "SDR Tuning Up To Provide Disaster Relief," p. 43).
The SDR Forum's "Report on SDR Technology for Public Safety" calls for SDR-based multiband radios, which would enable first responders to have a single radio that could be configured to operate on radio systems regardless of band. That by itself won't solve the interoperability problem, but the SDR trade group says it would provide capabilities that would address many situations in which responders to emergencies have incompatible radio systems.
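In software terms, the multiband radio the SDR Forum describes amounts to treating band, modulation, and protocol as loadable configuration rather than fixed hardware. A minimal sketch, with invented profile names and parameters:

```python
# Minimal sketch of an SDR-style reconfigurable radio: the operating
# band, modulation, and protocol are data, not hardware, so one unit
# can be re-profiled at an incident scene. Profile contents are invented.

PROFILES = {
    "city-pd":   {"band_mhz": 851.0125, "modulation": "FM", "protocol": "P25"},
    "county-fd": {"band_mhz": 154.2800, "modulation": "FM", "protocol": "analog"},
}

class SoftwareDefinedRadio:
    def __init__(self):
        self.config = None

    def load_profile(self, name):
        """Retune and reconfigure entirely in software."""
        self.config = dict(PROFILES[name])

    def describe(self):
        c = self.config
        return f"{c['band_mhz']} MHz {c['modulation']} ({c['protocol']})"

radio = SoftwareDefinedRadio()
radio.load_profile("city-pd")
print(radio.describe())
radio.load_profile("county-fd")   # same hardware, new system
print(radio.describe())
```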
Another major step, the SDR Forum believes, would be the ability to license the protocols of proprietary systems so the responders could have radios that operate independently of frequency band and vendor protocols. Market researcher Venture Development Corp. (VDC) suggests that local, regional, and state first responders may be leading the way in deploying SDR.
VDC recently collected detailed Web survey responses from more than 300 U.S. first-responder units (Fig. 1). Its findings revealed that SDR is currently considered a critical part of their next-generation communications infrastructure—despite the fact that the typical pipeline for this type of technology is the Pentagon. But interoperability is only part of the issue.
"Though there are interoperability channels right now in most public safety frequency allocations, those channels, and all others, become useless where the communications infrastructure of public safety facilities becomes inoperative," says Harold Kramer, chief operating officer of the American Radio Relay League (ARRL), a national association for amateur radio operators that has been in existence for over 90 years.
"Hardening" of public safety facilities is called for, Kramer says, but the ARRL sees an increasing role for decentralized portable amateur radio stations that aren't infrastructure-dependent in providing interoperability communications on-site (see "Hams To The Rescue" at www.electronicdesign.com, ED Online 13650).
Immediately at the onset of Katrina, about a thousand Federal Communications Commission-licensed amateur radio operators began providing continuous high-frequency (HF), VHF, and UHF communications for state, local, and federal emergency workers in and around Louisiana, Mississippi, and Alabama while agencies like local fire, EMS, and disaster management teams struggled to communicate at all.
The ability to produce more precise short-term forecasts has long been the desire of weather forecasters. Now, a new tool may help them accomplish this goal.
The National Center for Atmospheric Research (NCAR), for the first time, has been testing multiple Doppler weather radars to track water vapor in the lower atmosphere. Measuring the low-level moisture is expected to help forecasters pin down the locations and timing of storms that might rage a few minutes to a few hours later.
Named REFRACTT (for Refractivity Experiment For H2O Research and Collaborative Operational Technology Transfer), the project enables researchers to measure changes in the speed of radar signals caused by refraction, which in turn reveal the presence or absence of atmospheric moisture. If the project is successful, this refractivity technique could be added in the next few years to the national network of Doppler radars operated by NOAA's National Weather Service.
"Nobody's ever seen such high-resolution data on moisture before," says Rita Roberts, NCAR scientist and lead investigator of REFRACTT. "We believe this could greatly help forecasters predict where heavy rains might develop."
REFRACTT is being funded by the National Science Foundation, NCAR's primary sponsor. Along with four radars, scientists are using computer models, satellites, NCAR radiosondes (weather balloons), and ground-based sensors that intercept GPS signals and infer atmospheric moisture.
When meteorologists use Doppler radar to track storms, they normally monitor signals that strike raindrops, hailstones, or snowflakes and bounce back toward the radar. The strength of the returning signals indicates the intensity of rain, hail, or snow, while the change in signal frequency holds information on wind speed.
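The link between the returning signal's frequency shift and wind speed is the standard Doppler relation: radial velocity v = f_d·λ/2, where f_d is the measured shift and λ is the radar wavelength. A quick check with figures typical of an S-band weather radar (the exact numbers are illustrative):

```python
# Radial velocity from a Doppler frequency shift: v = f_shift * wavelength / 2.
# Figures are typical of an S-band weather radar, chosen for illustration.

C = 2.998e8            # speed of light, m/s

def radial_velocity(freq_shift_hz, radar_freq_hz):
    wavelength = C / radar_freq_hz
    return freq_shift_hz * wavelength / 2.0

shift = 500.0          # Hz, measured Doppler shift
radar = 2.8e9          # Hz, S-band transmit frequency (~10.7-cm wavelength)
v = radial_velocity(shift, radar)
print(f"{v:.1f} m/s toward the radar")   # ~26.8 m/s
```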
With REFRACTT, scientists are adding a third variable—the speed of the radar signals. They're using fixed targets such as power lines and silos to see how much the radar signal is sped up or slowed down by variations in water vapor. The resulting data on refractivity is plotted on a map that shows scientists where the moisture is located.
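The quantity being mapped is atmospheric refractivity N, which at radio wavelengths is commonly related to pressure, temperature, and water-vapor pressure by the Smith-Weintraub expression N ≈ 77.6·P/T + 3.73×10⁵·e/T². The strong vapor term is why signal speed is such a sensitive moisture probe; the sample conditions below are illustrative only.

```python
# Refractivity from the Smith-Weintraub relation:
#   N = 77.6 * P/T + 3.73e5 * e/T**2
# P = total pressure (hPa), T = temperature (K), e = vapor pressure (hPa).
# The water-vapor term dominates changes in N near the surface, which is
# why radar signal speed reveals low-level moisture.

def refractivity(p_hpa, temp_k, vapor_hpa):
    dry = 77.6 * p_hpa / temp_k
    wet = 3.73e5 * vapor_hpa / temp_k**2
    return dry + wet

# Illustrative surface conditions: 1000 hPa, 25 degrees C (298.15 K)
dry_air   = refractivity(1000.0, 298.15, 5.0)    # fairly dry air
moist_air = refractivity(1000.0, 298.15, 25.0)   # humid air
print(f"dry: {dry_air:.1f} N-units, moist: {moist_air:.1f} N-units")
```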
Currently, National Weather Service radars detect rainfall and winds, but not water vapor. Moreover, weather stations and weather balloon launches that measure water vapor are often separated by 50 to 100 miles or more. Consequently, there is no regular monitoring of low-level moisture between surface stations.
IT CAME FROM OUTER SPACE
Users of space weather forecasts are also expecting greater accuracy in predicting solar storms following a federal review of the National Space Weather Program, which focuses on tracking solar disturbances in Earth's space environment.
"As our basic commercial infrastructure becomes more reliant on electronic equipment, wireless communications, and satellite services, our national economy is more vulnerable to space weather," says Conrad Lautenbacher, a Navy vice admiral and undersecretary of commerce for oceans and atmosphere and NOAA administrator.
"NOAA's Space Environment Center is the first line of defense against damage to critical equipment," Lautenbacher says. "We have been given a clear roadmap by the National Space Weather Program Assessment Committee for how to move the program into the stages of preparedness."
The panel has drawn up a series of 23 recommendations for improvements. These include the use of small, cost-effective microsatellites to provide long-term, continuous observations for operational forecasts.
NOAA's GOES-13 satellite, now in orbit and ready to replace an earlier GOES when needed, carries new instruments for measuring solar X-rays and extreme ultraviolet radiation. X-rays travel through space at the speed of light, arriving in Earth's atmosphere eight minutes after the start of a solar storm. At 90 miles up, they disrupt high-frequency communications critical for airline and military operations.
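The eight-minute figure is just the light-travel time from the Sun: one astronomical unit divided by the speed of light.

```python
# Light-travel time from the Sun to Earth: X-rays from a solar flare
# reach Earth's atmosphere about 8.3 minutes after the event.

AU = 1.496e11      # mean Sun-Earth distance, m
C  = 2.998e8       # speed of light, m/s

seconds = AU / C
print(f"{seconds:.0f} s = {seconds / 60:.1f} minutes")   # ~499 s, ~8.3 min
```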
The costs of space weather disruptions to government satellites alone are estimated at $100 million a year. Diverting commercial flights to avoid radiation exposure and communications problems can cost an airline $100,000 per flight, says NOAA.
UAV TRACKS HURRICANE
One of the newer tools for weather tracking and forecasting is unmanned aircraft. This became a reality last year when hurricane researchers at the NOAA Atlantic Oceanographic and Meteorological Laboratory (AOML) in Miami kept an unmanned aircraft in the air for 10 hours during Tropical Storm Ophelia. Known as Aerosonde, the aircraft provided first-ever detailed observations of the near-surface, high-wind hurricane environment, an area often too dangerous for manned NOAA and U.S. Air Force Reserve aircraft to observe directly (Fig. 2). The unmanned aerial vehicle (UAV) was designed and built by Australia-based Aerosonde Pty. Ltd., which was acquired by AAI Corp. in June.
The Aerosonde platform that flew into Ophelia was specially outfitted with instruments used in traditional hurricane observations, including GPS, dropwindsondes, and a satellite communications system that relayed information on temperature, pressure, humidity, and wind speed every half second in real-time.
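A half-second relay cadence amounts to a 2-Hz telemetry stream of small observation records. The sketch below follows the article's field list (temperature, pressure, humidity, wind speed), but the record format and field names are invented, not the actual Aerosonde encoding.

```python
# Sketch of the kind of 2-Hz telemetry record the article describes:
# one observation relayed every half second. Field names and the JSON
# encoding are invented for illustration, not the actual Aerosonde format.

import json

def make_record(t_s, temp_c, pressure_hpa, humidity_pct, wind_ms):
    """Pack one observation for relay over a satellite link."""
    return json.dumps({
        "t": t_s,              # seconds since launch
        "temp_c": temp_c,
        "p_hpa": pressure_hpa,
        "rh_pct": humidity_pct,
        "wind_ms": wind_ms,
    })

SAMPLE_PERIOD_S = 0.5          # one record every half second = 2 Hz
records_per_flight = int(10 * 3600 / SAMPLE_PERIOD_S)
print(records_per_flight, "records over a 10-hour flight")   # 72000
```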
The Aerosonde also carried a downward-positioned infrared sensor that was used to estimate the underlying sea surface temperature. All data was transmitted in near real time to the NOAA National Hurricane Center and AOML, where the NOAA Hurricane Research Division is located.
Another new joint NOAA/NASA project will investigate why some thunderstorms that develop off the coast of Africa and work their way toward the Americas cause damage while others don't. Essentially, NOAA and NASA will attempt to predict the intensity of these storms using specially equipped NASA aircraft and satellites, weather radar, and balloons.
So what's it going to take to improve the forecasting of natural disasters? More of the same, only better—that is, more computing power to improve modeling techniques, data gathering, and analysis; more sophisticated weather-specific satellites; and better sensors.
NEED MORE INFORMATION?