“Typically, engineers are becoming more specialized and are less likely to understand the hazards of the space environment. Often we have to relearn things that were known by people who retire or move on…it is also recognized that private companies have a disinclination to release information on their own problems” [NOAA/SEC Satellite Working Group, 1999]

Sometimes we work too hard to turn coincidence into cause and effect: recall the example of the Exxon Valdez accident during the March 1989 space weather event. And sometimes it’s not easy to grasp just how complex space weather issues have become in the last ten years. There are many facets to the story, and like a diamond, the impression you get depends on your perspective. When I first started learning about this subject, I was overwhelmed by the lack of careful documentation for many of the outages I had heard about, and by the impossibility of ever finding it. I was also nervous about the basic problem of simultaneous events masquerading as cause and effect. If you are intent on seeing every mishap as a demonstration that satellites are inherently vulnerable, there is ample circumstantial evidence to support the claim.

Why is it that satellites are assumed to be unaffected by events like cosmic rays, solar flares, and energetic electrons until such a claim can be ‘proven’ by recovering the ‘body’? If we know that space is a hostile environment for off-the-shelf and non-radiation-hardened systems, why do multi-national corporations greet the inevitable failures with suspicion when they do happen? Where did this presumption of innocence enter the professional discussion?

So long as there is any scientific uncertainty about cause and effect, well founded or otherwise, there will be no measurable change in the attitudes of satellite manufacturers. As Goddard space scientist Michael Lauriente notes,

“Because of the statistical nature of the risks, apathy reigns among some of the spacecraft community”

Space physicist A. Vampola from the Aerospace Corporation also confronts the irony of the sparring between scientists and satellite owners,

“The space environment is hostile. It is not a benign vacuum into which an object can be placed and be expected to operate exactly as it did when it was designed and tested on the ground”

There are plenty of interested parties who would dismiss each satellite ‘kill’ as simply a technological problem with insulation or some other factor, and there is ample circumstantial cause to support this point of view as well. Scientific data is too sparse for us to look into every cubic meter of space and say with authority that ‘this satellite was killed by that event’. This means you are completely free to argue that satellites are actually quite invulnerable to space weather effects, and that the failures are purely a matter of quality control. Why, then, do satellite owners take out insurance policies? Why do they over-design solar panels and go on record saying that they expect to lose up to six satellites per year?

A satellite communications corporation may very well be intent on maintaining the status quo in supporting the rag-tag effort of space weather research, and may find just cause to believe this strategy is a safe one. After all, severe solar storms capable of producing a blackout happen very infrequently compared to other causes of outages (ice storms, line sags, and the like). Because satellite owners suppress evidence of anomalies traceable to solar storms, there is no public trail of problems to justify better space weather models. Yet NOAA keeps many satellite owners on confidential lists of clients who receive daily updates about space conditions. In a classic ‘Catch-22’, these clients would like better forecasts but cannot openly support such an effort at NOAA or NASA without implying that their own services are vulnerable.

While satellite owners and electric power utilities continue to expand their vulnerability and to reap hundreds of billions of dollars in annual profits, space weather forecasting continues to languish. Ironically, only about $50 million per year supports the scientists developing the forecasting techniques that safeguard $100 billion in satellite assets and $210 billion in electrical power company profits. Moreover, both NASA and NOAA have been forced into flat or declining budgets for these specific activities. In June 1997, NOAA’s Space Environment Center was facing a funding crisis, with ‘flat funding’, or zero growth, planned for the years following 1997. According to Ernest Hildner, the Director of NOAA’s Space Environment Center in Boulder, Colorado,

“Without increased funding, we will be facing the next solar cycle with two-thirds as much staff as we had during the last solar cycle”
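A little arithmetic makes the mismatch concrete. Here is a minimal back-of-the-envelope sketch, assuming only the round dollar figures quoted above; they are the chapter's own numbers, not audited values:

```python
# A rough sense of scale for the funding mismatch described above,
# using the chapter's own round figures (not audited values).
forecast_budget = 50e6       # annual space weather research funding, USD
satellite_assets = 100e9     # satellite assets at risk, USD
power_profits = 210e9        # annual electric power company profits, USD

ratio = forecast_budget / (satellite_assets + power_profits)
print(f"Forecasting support is {ratio:.3%} of the value it protects")
# -> about 0.016%, or less than two cents per hundred dollars at stake
```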

At NASA, through much of the 1990s, efforts to develop better forecasting models were also stuck on the slow track, even as new floods of data entered the NASA archives. Space physics, and the modeling research that supports space weather forecasting, tends to be underfunded given the complexity of the subject. It doesn’t seem to have a popular constituency the way cosmology and planetary exploration do. A mission like SOHO costs $1.2 billion to build and support, of which $400 million was supplied by NASA and the rest by the European Space Agency. According to Art Poland, the former NASA Project Manager for SOHO,

“SOHO can support 20-25 scientists full time, and has done so for the last 4 years. Including institutional overhead costs, it costs about $100,000 to support each scientist. The total research budget for a mission like SOHO is about $2 million per year. To support the entire ISTP program each year including technicians, engineers and scientists costs about $50 million.”

So, despite the billions of dollars that go into building and supporting satellite hardware, in some cases less than a few percent of the total mission cost ever shows up as support for scientists to analyze the incoming windfall of new data. Researchers may get dazzling images of CMEs from one satellite like SOHO (1996-2001), solar wind data from another like ACE or WIND (1999-2002), and geospace data from a third such as IMAGE (2000-2002). This information may not overlap in time, and it may not even overlap in wavelength, yet it all has to be knitted together. Researchers make the constant plea not to shut off one working satellite before a new one can be launched with instruments able to provide overlapping coverage. All of this makes developing space weather models a frustrating enterprise: it is easier to decipher the language of space weather from complete sentences than from fragments.
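To see how small that fraction is, here is a rough check using Poland's figures; the ten-year operating lifetime is my own assumption for illustration, not a number from the text:

```python
# A rough check of the claim that research support amounts to only a
# few percent of total mission cost, using the SOHO figures above.
mission_cost = 1.2e9          # SOHO build-and-support cost, USD
research_per_year = 2e6       # annual research budget, USD (Poland's figure)
mission_years = 10            # assumed operating lifetime (an assumption)

share = (research_per_year * mission_years) / mission_cost
print(f"Research support is about {share:.1%} of total mission cost")
# -> roughly 1.7% over an assumed decade of operations
```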

No agency, NASA included, is specifically mandated to build satellites that keep watch on space weather. Instead, space weather monitoring activities are forced to piggyback on satellites optimized for pure research. The organizations that benefit most from NASA research satellites receive the space weather data free of charge and make little or no financial investment in data acquisition themselves. They rely on NASA to launch research satellites, then use the data that follows to temporarily improve their forecasting. This solution is certainly easier than investing in dedicated space weather satellites, but it also means that newer, better forecasting techniques, in the electric power industry for example, are held hostage to ongoing, yet uncertain, NASA funding.

The National Academy of Sciences evaluated the readiness of NASA, NOAA, DoD, and DOE to develop a comprehensive space weather forecasting model. Although it identified many successes of the existing programs, it also identified glaring problems that work against accomplishing the National Space Weather Program (NSWP) goals anytime soon. NASA, up until 1997, was planning to stop funding its ISTP program after fiscal year 1998; NOAA’s Space Environment Center had suffered a 33% staff reduction and had stopped supporting the translation of data-based models into forecasting tools. “The lack of commitment at NOAA to this unique and critical role will have a fundamental impact on the success of the NSWP.” A similar critique was leveled against the DoD, where staff turnover time was less than a single 11-year solar cycle, so there was no institutional memory of what had been learned during past cycles.

Fortunately, NASA’s Living with a Star program, set up in 1999, promises to keep ISTP fully funded until its spacecraft expire from wear and tear. The program will also set in motion more aggressive support of space weather research during the next decade. To be ready for the next solar maximum in 2011, missions must be planned today so that the necessary hardware will be in place when the new round of solar activity rises to a climax.

Beyond the interests of the research community, many different elements of our society have recently begun to appreciate the need for a deeper understanding of the space environment. Military and commercial satellite designers, for example, are not happy that they have to use 20- to 30-year-old models to predict space weather conditions. This forces them to fly replacement satellites more frequently, or to over-design satellites to withstand even the occasional major solar flare. Unfortunately, current trends in satellite design seem to be headed toward increasing vulnerability to disabling radiation damage. As William B. Scott points out in Aviation Week and Space Technology,

“Austere defense budgets also have increased reliance on more affordable, but perhaps less robust, commercial off-the-shelf hardware…expensive radiation-hardened processors are less likely to be put on some military satellites or communication systems now, than was once the case according to USAF officers…newer chips are much more vulnerable than devices of 10-15 years ago”

In an age when ‘cheaper, faster, and smaller’ drives NASA, military, and commercial satellite designs, satellites have become more susceptible to solar storm damage than their less sophisticated but more reliable predecessors. As more satellites are disabled by ‘mysterious’ events that owners prefer not to discuss openly, old lessons in satellite design need to be rediscovered. In recent advertisements, some satellite manufacturers boast that they employ “…advanced composite materials which improve performance while reducing weight”. This also makes for poor shielding, because weight, and quite a bit of it, has proven to be a good radiation mitigation strategy.

In all of this concern over satellite survivability, there looms another harsh circumstance that may make long satellite lifetimes impossible. Large networks of satellites in LEO are suffering something of a shakeout that has nothing to do with the rise and fall of solar storms. The pace of communications technology, and of consumer needs, has begun to change so quickly that by the time a billion-dollar satellite network is in place, it is nearly obsolete. Companies have to install their networks within one to two years, or run the risk of becoming a technological backwater before the planned life span has been reached. The two companies that had put the largest networks in place by 1999, Iridium and Globalstar, have already filed for bankruptcy. Iridium was able to sell only 10,000 of the 100,000 wireless telephone handsets it had planned, and these phones could not send data along with voice transmissions. No sooner had Iridium gone on the air than its owners regretted the decision not to include data lines. Globalstar, meanwhile, had not even finished installing its full satellite system before it followed Iridium LLC into bankruptcy.

Yet this kind of record, so early in the fledgling LEO industry, doesn’t seem to bother some of its participants. Satellite communications is still seen as a highly lucrative business, despite lightweight satellites that will face space weather problems. The tide of technological progress is sweeping the industry at an ever-faster pace, and a sense of urgency now pervades it: make profits as quickly as possible. For example, Bruce Elbert, Senior Vice President of Hughes Space and Communications International, the leading manufacturer of geosynchronous communications satellites, suggested with great enthusiasm,

“The next millennium may see all land-line communications going wireless. You could wait [to enter this market], but why put off gaining the economic and potentially competitive advantages of using satellite technology today?”

Meanwhile, communication technology has not stopped evolving. New devices and systems are on the rise that may eventually make communication satellites less desirable, at least by the peak of the next solar cycle in 2011.

The first fiber optic cable, TAT-8, entered commercial service in 1988, and by 2000 no fewer than 408,000 miles of fiber had been laid across the oceans. The current investment in undersea fiber exceeds $30 billion and is expected to surpass $50 billion by 2003. One project alone, Project Oxygen, will be a $14 billion cable tying together 30 countries in Europe and North America. It will take no more than three months to lay, and when it is completed it will carry 25 million phone calls and 10,000 video channels. It could carry all of the world’s Internet traffic on just one of its four fibers, each of which delivers 320 gigabits per second of capacity. The combined bandwidth of roughly 1.3 terabits per second is enough to transmit the entire text contents of the Library of Congress in about a minute.
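For the curious, the arithmetic behind those figures is easy to check. A minimal sketch follows; the size of the Library of Congress as plain text is a commonly cited ballpark of about ten terabytes, which is my assumption rather than an official number:

```python
# Sanity check on the Project Oxygen figures quoted above. The size of
# the Library of Congress as plain text is a commonly cited ballpark
# (~10 terabytes), not an official number.
fibers = 4
gbits_per_fiber = 320                     # gigabits per second, per fiber
total_tbps = fibers * gbits_per_fiber / 1000.0

loc_terabytes = 10                        # assumed LoC text size
loc_terabits = loc_terabytes * 8
seconds = loc_terabits / total_tbps

print(f"Combined capacity: {total_tbps:.2f} Tbit/s")
print(f"Transfer time for ~{loc_terabytes} TB of text: {seconds:.0f} s")
# -> 1.28 Tbit/s; roughly a minute for the whole corpus
```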

The driving force behind the spectacular growth in fiber optic technology is the Internet and the insatiable appetite it creates for massive volumes of data delivered immediately. At the same time, the explosive growth of the cellular phone market has driven the satellite industry to meet travelers’ desires to stay in touch with family and co-workers wherever they may be on Earth. The main drawback to fiber optic communications is that taking advantage of its high data rates requires landlines to individual users. Satellite users, by contrast, need only a portable handset or a satellite dish to receive signals directly, so satellites work well for connecting large numbers of rural or remote users. Yet fiber optic cables still offer the highest data rates, and they do not have to be replaced every 5-10 years the way satellites must. Modern cables are not rendered useless as new, faster technologies arise: only the equipment at the cable’s end stations has to be upgraded, not the undersea fiber itself. In addition, when a cable is damaged, it can be repaired underwater using remote-controlled robots. Satellite transponders, on the other hand, become a bottleneck as communications technology advances, and they can only be upgraded by launching a new $200 million satellite.

Unlike satellites, which largely support our entertainment and financial needs, the pervasive use of electrical power can create far more serious problems than even the most dramatic satellite outage. Electrical power outages can completely shut down an entire region’s commerce for hours or days, and they can cause deaths. Geomagnetic storms will continue to bombard the power grid, and there is little we can do to harden the system. The blackouts we experience in the future will have as much to do with our failure to keep electrical supply ahead of demand as with the electrical power industry’s failure to put proper forecasting techniques in place. Unlike satellite design, there are few easy solutions to declining operating margins, because new power plants seem to be universally unpopular. Our vulnerability to the next blackout has become as much a sociological issue as a technological one.

When the ice storm of January 1999 darkened the lights of nearly half a million residents in Virginia, Maryland, and Washington D.C., no one thought it was especially amusing or trivial, even though only 0.1% of the U.S. population was involved. My family and I were among the last to have our electrical service restored, after waiting in the cold and dark for five days. The first night was a curious mix of concern and genuine delight at the new experience. We huddled together under down comforters and actually sang campfire songs as our home slowly gave up its latent heat. We were delighted to see the beautiful trees, like sculptures of ice, bent over along the street, which was now an impassable skating rink. Elsewhere, emergency rooms were filling up with people who had broken bones, twisted wrists and ankles, or blood streaming down from head wounds and concussions. Hundreds of traffic accidents offered up a handful of fatalities as people died for no reason other than being in the wrong place at the wrong time. By the third night, we had joined thousands of others in Montgomery County in calling the electrical utility company to find out when we might be reconnected; ours was the only street in our community without electrical service. At night, we watched our neighbor’s security light cast ghostly figures of tree limbs on our bedroom walls; by day, we frequented local shopping malls to keep warm and ate our meals at restaurants. It wasn’t the Sun and its mischief that had brought this on, but a common natural occurrence of more mundane origin. Still, the discomfort, expense, physical pain, and even death that it caused are a potent reminder that we can no longer be as tolerant of power and communication outages as we once were.

When I first bemoaned how astronomy and astrophysics seldom have practical consequences, I hardly suspected that the mere search for answers to questions in cosmology and galactic astronomy could cause a domino-fall making blackouts more likely. The very satellites that I backed as a professional astronomer, to further my particular area of research and curiosity, were, in the zero-sum game of modern budget analysis, robbing space scientists of the tools they would need to improve space weather forecasting. The next power outage my family and I have to endure may, in some sense, be the consequence of my own professional choices elsewhere in life.

Our vulnerability to solar storms during Cycle 23 follows a pattern seen in other, more familiar arenas. Between 1952 and 1986, the cost of natural disasters (earthquakes, tornadoes, hurricanes, and floods) averaged $8 billion every five years. In 1987-1991 this doubled to $20 billion, and in 1992-1996 the cost skyrocketed to over $90 billion. The reason for this sudden increase over the last five to ten years is that more people, often with higher incomes, are moving in droves to coastal regions, the areas typically battered by hurricanes and earthquakes. As Greg van der Vink and his colleagues in Princeton University’s Department of Geophysics have noted,

“We are becoming more vulnerable to natural disasters because of the trends of our society rather than those of nature. In other words, we are placing more property in harm’s way”

High-cost events, such as a devastating earthquake, simply are not factored into the way our society plans and develops communities. In fact, economic incentives that would encourage more responsible land use are actively denounced or, worse, stifled. The institutions and programs that have been established actually subsidize bad choices: they offer low-interest rebuilding loans after an earthquake, for example, or help to rebuild beachfronts after a hurricane.

There is much in this terrestrial process that we now see reflected in the way we conduct activities in space. Satellite insurers underwrite bad satellite designs by charging only slightly higher rates for poor or risky designs (3.7% per year, compared to 1.2% for ‘good’ satellites). And once the stage has been set, it is difficult to change old habits. Like homeowners rebuilding property along receding coastlines after a hurricane, satellite manufacturers insist on orbiting low-cost, inherently vulnerable satellites. Electrical utilities, meanwhile, forego investment in even the simplest mitigation technologies for geomagnetically induced currents (GICs), preferring to view GIC problems as local technical difficulties with specific pieces of equipment.
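The weakness of that price signal is easy to quantify. Here is a small illustrative sketch using the premium rates just quoted; the $200 million satellite value reuses the replacement cost mentioned earlier and is purely for illustration:

```python
# Illustration of the weak insurance price signal described above.
# The premium rates are the ones quoted in the text; the $200 million
# satellite value is an illustrative replacement cost.
satellite_value = 200e6      # USD, illustrative replacement cost
risky_rate = 0.037           # annual premium rate, risky design
good_rate = 0.012            # annual premium rate, 'good' design

penalty = satellite_value * (risky_rate - good_rate)
print(f"Annual premium penalty for a risky design: ${penalty/1e6:.0f} million")
# -> $5 million per year, modest next to a $200 million replacement
```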

In the next 10 years, the excitement of Cycle 23 will fall behind us, just as the heyday of Cycle 22 has become an historical footnote. In 2006, we will find ourselves at the beginning of a new round of solar storminess. The great experiment of the LEO communication satellite networks will have run its inevitable course, either as an intermittent success or as a technological dinosaur. Barring any major blackouts this time around, electrical utilities operating in a deregulated climate will debate, region by region, whether the investment in GIC forecasting makes sense. We will also have weathered several years of round-the-clock occupancy of the International Space Station. That experience will teach us many lessons, some harsh, about what it really takes to deal with space weather events and what is needed to become permanent occupants of space. At some time, perhaps in the 22nd century, our present obsession with harmful space weather events will be a thing of the past. There is even the hope that the Sun may unexpectedly cease its stormy, cyclical behavior for a century or two, as it has in the past.

But for now, for this year, for this Cycle, we must remain vigilant even in the face of what seems like little cause for outright concern. For every pager that is temporarily silenced, others will work just fine. For every city that is darkened or loses air conditioning, a hundred others will experience just another ordinary day. Space weather is like a shark whose fin we see gliding across the interplanetary waters. We know it is there, but we haven’t completely figured out what to do when it strikes.
