Good Geoscience Data Yield Good Decisions
Should you pack your umbrella today? Is there too much pollution in the air to breathe freely during your morning run without your inhaler? Will the coastal road flood before your evening commute? Americans increasingly access and rely on real-time and historical scientific data to improve personal decision-making and to better their lives. In particular, geoscience data are a critical resource for decision-making on local, national, and international scales. This week's ongoing Advances in Earth Science Briefing Series focused on these data as an asset for decision-making. Speakers highlighted the current state of data collection and its myriad uses.
Seismic data are frequently used for national and global earthquake monitoring. They have also helped researchers better understand the dynamics of earthquakes, learn about processes occurring within Earth's deep interior, generate tsunami warnings, and even identify North Korean nuclear tests. Katrin Hafner (Program Manager, Global Seismic Network (GSN), Incorporated Research Institutions for Seismology (IRIS)) described the 150 state-of-the-art seismic monitoring stations that make up the GSN, a globally distributed collection of high-end seismometers, as well as other seismic networks. IRIS, a consortium of seismic research institutions, has already archived data from over 1,000 seismic networks, enabling the compilation of data for global and national earthquake monitoring. Ms. Hafner hopes that new ocean-based seismic stations will come online in the near future to enhance global seismic coverage.
Geoscience data provide more benefits to society than improved earthquake monitoring alone. Monitoring and studying surface water and groundwater flow can mitigate hazards like floods, sea level rise, and groundwater contamination. Water monitoring can also help increase our understanding of ecosystem changes, climate variability, and water availability and use. These data are particularly helpful when they are collected continually and become long-term datasets. According to Mark Bennet (Director, United States Geological Survey, Virginia and West Virginia Water Science Center), "Consistent, long-term data [are] the foundation for our science and [provide] the basis for decision-making by water resource managers." As an example, Mr. Bennet noted that using only 30 years of stream gauge data to set the height of a bridge above a river cannot guarantee that the bridge would withstand a 100-year storm; longer-term monitoring is required.
The benefits of ongoing atmospheric monitoring were also noted by Tim Dye (Senior Vice President and Chief Business Development Officer, Sonoma Technology, Inc.). To generate real-time, high-precision air quality assessments throughout the U.S., AirNow, Sonoma Technology's program funded by the Environmental Protection Agency, draws on more than 2,000 observations from more than 90 local agencies spanning all 50 states. The AirNow website can be used by the public, researchers, industry, and the media to inform decision-making. In fact, some of the users of Sonoma Technology's data are schoolchildren. The company's educational program provided students with personal air testers to collect local air quality data and to teach these budding scientists about scientific methodology. One group of students was surprised to learn that idling buses and trucks in the faculty parking lot behind their school produced higher levels of pollutants than the busy road nearby. This information could be used to formulate improved school parking policies or to identify student-safe zones away from pollutants. "One thing about generating data, you never know who's going to use it or how they're going to use it," Mr. Dye stated.
In the modern age of inexpensive electronics and rapid data transfer, opportunities to generate ever more Earth observation data abound. Every wrist with a Fitbit or home with a carbon monoxide detector could soon have a personal air quality sensor as well. With the prospect of multitudes of new data, all speakers stressed the importance of collecting high-quality data that are reliable and consistent, making those data accessible, and continuing long-term monitoring to improve decision-making.