Measuring the Abilities and Predicting the Future of Computer Modeling as Applied to GCMs

When trying to predict the behavior of a forest system there are many variables to consider. Not only must stimuli be monitored and predicted across a wide range of structural scales, from the organelle to the stand, but the sheer number of stimuli involved is dizzying. That number makes it impossible for any model to weigh all of their effects, so a model's creator must pick and choose among the stimuli, selecting those that have the greatest impact on the development of the population.

When considering soils, one must take into account the levels of various nutrients, acidity (hydrogen ion concentration), toxin levels, soil type, water content, the soil's susceptibility to change, its structure, and how that structure varies with depth. Weather variables include the changing of the seasons, changes in solar activity, precipitation levels, the acidity of that precipitation, wind patterns, and temperature variation. Animal impact must also be considered. Insects and browsing damage (such as snowshoe hares feeding on willow seedlings in winter) can have a lasting effect on a population. Animals not only affect their environment, they also serve as reliable indicators of biological change. The effects that an environmental change has on a species can be measured reliably, and a new field of study called zooindication has recently formed; through zooindication the impacts of industrial pollution and climate change can be monitored. (Krivolutzky, 1985)

Human intervention can have the greatest impact of all on a population, but trends such as human-caused forest fires, forest harvesting, and urban and rural development are very difficult to account for. Some byproducts of industrialization can, however, be monitored in the form of pollutants in the airways and waterways. Worse yet is the large influence the various stimuli have on one another. Cooling trends can lead to deeper snow; deeper snow increases mortality and deprives animals of understory browse, so they turn to eating tree bark to survive, and tree mortality rises in turn. The spread of a city results in the direct removal of forests, while the construction of buildings calls for more trees to be harvested for building materials. Factories and city wastes can pollute the air and waterways and affect forests downstream and downwind. Even then, weather patterns and stream and river conditions determine how far, and how strongly, these impacts are felt.

Among the many techniques used to monitor and predict climate change, general circulation models (most often referred to as GCMs) are the most sophisticated as well as the most popular. GCM output is used to drive global vegetation dynamics models, which predict the kinds and rates of change in global vegetation communities in response to climate change. (Dale, 1994) GCMs are the only type of climate model that includes the physical and geographical detail required for long-term analyses of climate impact at a regional level, and the results from a GCM are used to evaluate the effect of a given climatic change on resources such as agriculture, forests, and water. (Cushman, 1988) There are, however, major limitations to the GCMs in use today. These limitations show up at the environmental, organizational, and technological levels. It is the shortcomings in the definition of GCMs given above that reveal the environmental limits of models past and present: a shortage of information on the interplay between the weather and the affected systems. Not only are agriculture, forests, and water resources sensitive to climate; climate is sensitive to the condition of agriculture, forests, and water resources. Taking this one logical step further, water resources depend on agriculture and forests, and so on.
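Before turning to those limits in more detail, here is a minimal, hedged sketch of the "pick and choose" step described at the outset: perturb each candidate stimulus a little and keep the ones that move the prediction the most. The growth function, the weights, and the baseline values below are invented placeholders, not taken from any model cited in this paper.

    # One-at-a-time sensitivity screen over a toy stand-growth model.
    # All numbers are invented placeholders for illustration only.
    baseline = {
        "soil_moisture": 0.6,       # fraction of field capacity (assumed)
        "temperature": 12.0,        # mean growing-season temperature, deg C (assumed)
        "browsing_pressure": 0.2,   # index of animal browsing (assumed)
        "acid_deposition": 0.1,     # index of acid precipitation (assumed)
    }

    def toy_stand_growth(state):
        # Arbitrary linear response standing in for a real stand-growth model.
        return (10.0 * state["soil_moisture"] + 0.4 * state["temperature"]
                - 8.0 * state["browsing_pressure"] - 3.0 * state["acid_deposition"])

    def sensitivity(name, delta=0.1):
        # Perturb one stimulus by ten percent and measure the change in predicted growth.
        perturbed = dict(baseline, **{name: baseline[name] * (1.0 + delta)})
        return abs(toy_stand_growth(perturbed) - toy_stand_growth(baseline))

    # Stimuli whose perturbation moves predicted growth the most are the ones to keep.
    ranked = sorted(baseline, key=sensitivity, reverse=True)
    print(ranked)

In practice a modeler would use something more careful than one-at-a-time perturbation, but the idea of screening stimuli by their effect on the output is the same.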
The problem with most of these models is that they have been "stand-alone" in nature, which makes it difficult to introduce interdependencies among them in a reliable manner. (Bindingnavle, 1995) In other words, it is the feedbacks that are hard to calculate and that cause many of the problems. In the case of global warming, for example, warmer temperatures allow the atmosphere to hold more water vapor, and water vapor is a powerful greenhouse gas that can further increase warming. On the other hand, increased cloud formation can decrease temperatures. Add the observation that increased CO2 could result in increased plant growth, and thus decreased temperatures, and you have a very complicated set of interactions. (PUC, 1994) Implementing biological factors alongside those of weather and geography is not an easy step, nor is it the final solution.

Other environmental shortcomings of present systems involve the impact of many of the stimuli mentioned earlier, most notably those of human origin. Biophysical models estimate potential, not expected, vegetation, and these estimates do not take into account the human drain on land and water resources to produce food, forest products, habitation, transportation, and industrial products. This human impact has plainly reshaped the land over the last few hundred years: we have consciously manipulated the landscape and its ecosystems for human use, introducing non-native species, destroying others, depositing airborne and waterborne pollutants and nutrients, and exhausting natural resources. Pollutants in particular, especially the combined effects of multiple pollutants and of pollutants interacting with existing conditions and nutrients, are studied far less than the scale of the problem warrants. (Kozuharov, 1985) The resulting lack of realism in the models is not necessarily a failure of the biophysical model, but it is a clear indicator that nonbiophysical factors must also be taken into account. (Frederick, 1994)

The next set of problems with current modeling methods is organizational in nature and involves the coordination and assimilation of widely varying types and sources of data. Interfacing geophysical and ecological data means combining data from widely varying sources for the purpose of modeling at various scales. The data being interfaced can be the product of a single integrated study or can be drawn from several studies performed at different times and places using different methods. The data need not have been collected with integration in mind, but the process of integration is greatly complicated by differing observation techniques, purposes, levels of detail, and degrees of accuracy. The data available for analysis is often inconsistent and adds its own margin of error to the equation, and as a result the ways in which the data is integrated are often ill-defined and constantly changing. (NRC, 1995) At its most basic level, interfacing involves the identification, reading, and combination of data. In practice, however, these simple concepts are technically complex, stretching existing knowledge to its limits, and the very act of interfacing frequently requires crossing disciplinary, administrative, and international boundaries, adding yet another layer of complexity.
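As a minimal sketch of what "identification, reading, and combination" can look like in practice, the fragment below merges two hypothetical archives that record the same kind of observation with different units, date formats, and field names. The archives, field names, and values are invented for illustration; real interfacing efforts face far messier records.

    # Harmonize two hypothetical data sources into one common schema.
    # Field names, formats, and values are invented for illustration only.
    from datetime import datetime

    archive_a = [  # older agency records: Fahrenheit, month/day/year dates
        {"obs_date": "07/15/1962", "air_temp_f": 71.5, "plot_id": "A-12"},
    ]
    archive_b = [  # newer records: Celsius, ISO dates
        {"date": "1994-07-15", "temp_c": 22.4, "plot": "A-12"},
    ]

    def from_archive_a(rec):
        return {"date": datetime.strptime(rec["obs_date"], "%m/%d/%Y").date(),
                "temp_c": round((rec["air_temp_f"] - 32.0) * 5.0 / 9.0, 1),
                "plot": rec["plot_id"]}

    def from_archive_b(rec):
        return {"date": datetime.strptime(rec["date"], "%Y-%m-%d").date(),
                "temp_c": rec["temp_c"],
                "plot": rec["plot"]}

    combined = sorted([from_archive_a(r) for r in archive_a] +
                      [from_archive_b(r) for r in archive_b],
                      key=lambda r: r["date"])
    print(combined)

Real interfacing also has to reconcile spatial scales, measurement techniques, and accuracy, which no small script captures.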
Interfacing efforts can be confounded by a variety of obstacles. The challenges facing global change research are extreme because of the massive volume of data, the geographic scale, the scope of the modeling efforts, the number of organizations involved, and the ever-changing nature of the research itself. To solve these problems, or at least cope with them, a number of steps can be taken. The problem of the amount of data is twofold: data from past years is either unavailable or goes unprocessed, and the sheer quantity of new data poses significant challenges for nearly every type of data storage. To process existing data there must be a steady effort to carefully and gradually collect, interpret, and enter the countless and priceless records kept by the Forest Service and other government agencies, and the relevance, weight, and format of these data must be continually judged and monitored. To cope with the large influx of new information, especially the ever more detailed information being collected daily, data management and interfacing methods must be weighed carefully in terms of their ability to handle large volumes of data, and better, more efficient storage methods must constantly be sought. (NRC, 1995)

To solve the problem of inadequate data there are two obvious steps. First, process the information lying in filing cabinets and small computer systems across the country and around the world; this would involve a large-scale effort to find the information and make it compatible with existing systems. Second, increase the gathering of pertinent information at both the national and international level. What would make all of this possible is worldwide cooperation among governments, at a level such as the United Nations, to establish a wide-reaching agreement on resource sharing, data research, and data compatibility. If organizational and governmental cooperation stays at today's levels, there is a definite limit to how effective any GCM can be.

The scientific limitations of predicting global change are frustrating. We can be reassured, however, by the fact that advances in computing power and observational techniques arrive almost faster than they can be understood and implemented. Computers in particular are advancing at such a rate that the GCMs of only five years from now will likely make current models look like a Magic 8-Ball. (Are global temperatures rising? Outlook is promising. Or: Will ocean levels rise? Concentrate and ask again.) New methods such as microwave modeling and synthetic aperture radar (SAR) are being used to investigate the characteristics of forest stands. For example, a mixed coniferous forest stand has been modeled at SAR frequencies: extensive measurements of ground truth and canopy geometry were performed in a 200 m square, hemlock-dominated plot inside a forest, and the hemlock trees were modeled by characterizing their trunks, branches, and needles. (IEEE, p. 630)

The lack of computing power remains a very limiting factor. Environmental assessment requires reliable computation of the dynamic behavior of complex systems under future and novel conditions, and that requires modeling and simulation. Descriptive models that merely fit regression functions to historical time-series observations are ill-suited to reliable dynamic assessment of possible development paths; what is needed are models that represent the relevant interactions and processes and use them to simulate real system behavior. (IT+TI, p. 20)
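As a toy illustration of that difference, the loop below steps a handful of interacting quantities forward in time in the spirit of a process model, using the water-vapor, cloud, and CO2-fertilization feedbacks mentioned earlier. Every coefficient is an invented placeholder; a real GCM resolves these processes physically rather than with scalar gains.

    # Toy process-style simulation of interacting feedbacks (not a real GCM).
    # All coefficients are invented placeholders for illustration only.
    temp_anomaly = 0.0   # departure from a baseline temperature (arbitrary units)
    co2_index = 1.0      # relative CO2 level (arbitrary units)

    for year in range(1, 21):
        water_vapor = 0.20 * temp_anomaly    # warmer air holds more vapor: positive feedback
        cloud_cover = -0.15 * temp_anomaly   # more cloud reflects sunlight: negative feedback
        plant_growth = 0.10 * co2_index      # CO2 fertilization boosts vegetation growth

        temp_anomaly += 0.05 * co2_index + water_vapor + cloud_cover
        co2_index += 0.02 - 0.05 * plant_growth   # emissions minus extra uptake by plants

        print(f"year {year:2d}: temperature anomaly {temp_anomaly:5.2f}, CO2 index {co2_index:4.2f}")

The point is only the structure: each quantity is updated from the others at every step, which is exactly what a regression fit to past observations cannot do.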
From a processing standpoint there is never enough computing power to handle the job at hand, much less any increase in data. This shortage leads to perhaps the largest area of concern: the scale of the models. The size of the plots on which a model makes its calculations can greatly affect its accuracy, and current models vary widely in gridcell size. The GISS model has gridcell dimensions of 7.83° × 10°, while the NCAR model uses 4.44° × 7.5°; these correspond to areas of 650,000 km² and 330,000 km², respectively. The most accurate model now in use is the UKMO model, with a gridcell size of 3° × 330 km, or about 110,000 km², though the UKMO has been used only for weather forecasting. Advances in the next five to ten years should produce gridcell dimensions in the range of 1.2° × 1.2° (15,000 km²) to an optimistic 0.5° × 0.5° (2,000 km²). (Cushman, 1988)

In software design, the advances looming closest are mostly in the category of "smart" programs: programs that are, to a degree, able to learn from the information they are presented, and that can be considered the practical side of artificial intelligence. One example is the neural network. Neural networks are a new, capable technology that is as yet poorly understood by most programmers; they learn to recognize patterns and solve problems that befuddle other types of programs, and once coded, a good neural network program (simulation) can be trained to solve a variety of different problems. (Phillips, 1996) The application to GCMs is obvious. A program given a general algorithm for simulating a climate process, such as a pattern of floods, can compare its predicted results with observed results and use the discrepancies to form a more accurate algorithm. Over time the program develops more and more accurate algorithms, and even algorithms to predict the interactions between its various algorithmic parts.
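A hedged sketch of that self-correcting idea follows: a single model coefficient is nudged toward agreement with observations using the discrepancy between prediction and observation, in the spirit of the neural-network training Phillips describes. The linear flood model, its data, and the learning rate are all invented for illustration; a real climate component would involve far more than one parameter.

    # Adjust a toy flood model from prediction errors (gradient descent on squared error).
    # The data and the single-coefficient model are invented for illustration only.
    rainfall    = [10.0, 25.0, 40.0, 60.0]   # hypothetical seasonal rainfall (cm)
    flood_index = [1.1, 2.4, 4.2, 5.9]       # hypothetical observed flood severity

    coeff = 0.0            # model: predicted flood index = coeff * rainfall
    learning_rate = 0.0002

    for epoch in range(300):
        for rain, observed in zip(rainfall, flood_index):
            predicted = coeff * rain
            error = observed - predicted            # discrepancy the program learns from
            coeff += learning_rate * error * rain   # move coeff to shrink the error

    print(f"learned coefficient: {coeff:.3f}")      # settles near the best-fit slope (about 0.10)

A neural network does the same thing with many coefficients at once; the principle of learning from the prediction-observation discrepancy is identical.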

To store and manage the data being collected it is necessary to relate it all geographically. Current efforts focus on the Global Positioning System (GPS) and on geographic information systems (GIS). Unfortunately, the practical accuracy of GPS is only to within about 76 meters, which currently rules out small-scale prediction within a GCM. Although methods such as Differential GPS (DGPS) and survey-grade GPS allow precise measurement of events down to the scale of continental drift, the hundreds of thousands of dollars and the time required make such measurements impractical for forest and ecosystem modeling. (Rozmiarek, 1995) GIS is therefore well suited to tracking and interpreting information at the stand-to-forest level (a rough comparison of these scales is sketched at the end of this section). The drawback is that considerable ambiguity remains in a system this coarse: we cannot get away from the fact that many of the stimuli affect forests at the level of the individual tree. (Hall, 1977) There is, however, presently little effort to build a GPS-based mapping system capable of tracking trees individually. This is perhaps the most frustrating aspect of global modeling, because almost every problem associated with the field hinders any quick stride toward an individual-by-individual system.

At this point we can conclude that extremely accurate models depend on three things: high-powered, accurate satellite systems capable of monitoring human expansion, the slightest change in stands of trees, the patterns of the weather, and the behavior of ocean currents; an international database to hold this information; and the self-correcting software algorithms just over the horizon, needed to compensate quickly for even the slightest variations. In the long run, then, GCMs are most dependent upon technological advances. Problems associated with gathering, coordinating, and reconciling data will become a thing of the past. Although it will always be necessary to suggest new algorithms and new relationships between natural systems to the programs, much of the dirty work will fall out of human hands. Perhaps Isaac Asimov was right when he wrote of such a machine. Perhaps one day, and one day soon, we will have a computer capable of answering our questions; of learning, thinking, and solving problems; of adapting to the future; and we will sit back and ask: How does it do that? And we will answer: We don't know.
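As a rough, hedged illustration of the scale mismatch noted above: a GPS fix good to about 76 meters is more than precise enough to place a stand inside even an optimistic future 0.5° gridcell, but it is hopeless for telling individual trees apart. The coordinates, the assumed tree spacing, and the kilometers-per-degree figure below are illustrative approximations, not values from any cited source.

    # Compare the 76 m GPS error against gridcell and individual-tree scales.
    # Specific values are assumptions chosen only for illustration.
    import math

    def gridcell_index(lat_deg, lon_deg, cell_deg=0.5):
        """Snap a coordinate to a 0.5 x 0.5 degree gridcell (optimistic future resolution)."""
        return (math.floor(lat_deg / cell_deg), math.floor(lon_deg / cell_deg))

    gps_error_m = 76.0                # stated practical GPS accuracy
    cell_width_m = 0.5 * 111_000.0    # roughly 111 km per degree of latitude
    tree_spacing_m = 3.0              # assumed spacing between neighboring trees

    print(gridcell_index(45.42, -122.37))                                     # e.g. (90, -245)
    print(f"GPS error is {gps_error_m / cell_width_m:.2%} of a gridcell width")
    print(f"but spans about {gps_error_m / tree_spacing_m:.0f} tree spacings")

This is why GIS-scale work is feasible today while individual-tree tracking is not.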


Bibliography

Bindingnavle, U., Knox, R., Kalb, V. "An Object-Oriented Environment for Re-Use of Ecosystem Models". Edited by Roberts, C. A., Beumariage, T., Herring, C., Wallace, J. SCS. San Diego, CA. 1995.

Bossel, H. "Modeling and simulation for environmental applications". IT+TI Informationstechnik und Technische Informatik. Vol. 36, No. 4-5. pp. 20-25. August 1994.

Chauhan, N. S., Lang, R. H., Ranson, K. J. "Radar modeling of a boreal forest". IEEE Transactions on Geoscience and Remote Sensing. Vol. 29, No. 4. pp. 627-638. July 1991.

Cushman, R., Farrell, M., Koomanoff, F. "Climate and Regional Analysis: The Effect of Scale on Resource Homogeneity". Climatic Change. Vol. 13, No. 2. October 1988.

Dale, V., Rauscher, H. "Assessing Impacts of Climate Change on Forests: The State of Biological Modeling". Climatic Change. Vol. 28, No. 1-2. October 1994.

Frederick, K., Rosenberg, N. "Conclusions, Remaining Issues, and Next Steps". Climatic Change. Vol. 28, No. 1-2. October 1994.

Gold, H. J. Mathematical Modeling of Biological Systems: An Introductory Guidebook. Wiley-Interscience. New York. 1977.

Hall, C.A.S., Day, J.W. Ecosystem Modeling in Theory and Practice: An Introduction with Case Histories. Wiley-Interscience. New York. 1977.

Kozuharov, S.I. "Plants as Bioindicators". Biological Monitoring of the State of the Environment: Bioindicators. IRL Press. Oxford, UK. 1985.

Krivolutzky, D.A. "Animals as Bioindicators". Biological Monitoring of the State of the Environment: Bioindicators. IRL Press. Oxford, UK. 1985.

National Research Council (NRC). Finding the Forest in the Trees. National Academy Press. Washington, DC. 1995.

Our Changing Planet: The FY 1996 U.S. Global Change Research Program. 1996.

Phillips, D. "The Backpropagation Neural Network". C/C++ Users Journal. Vol. 14, No. 1. January 1996.

Preparing for an Uncertain Climate. 1994

Rozmiarek, A. "Global Positioning System: The New North Star". WIRED. Vol. 3, No. 10. October 1995.

Thomas, W., Goldstein, G., Wilcox, W. Biological Indicators of Environmental Quality. Ann Arbor Science. Ann Arbor, MI. 1973.