"In times of disaster, either manmade or natural, the government must be able to rapidly and efficiently share and disseminate critical information -- particularly spatial information -- across agencies, jurisdictions, rescue and recovery responders, and citizens. Effective emergency response requires that the government be able to quickly assess:
- Where the emergency is occurring;
- What kind of structures are involved;
- What buildings or open spaces can be used for staging emergency responses; and
- How best to move emergency responders to the site quickly and safely."
The common theme in this message is the "where" factor. Some in the geospatial industry profess that everything (and I mean everything) has a spatial component that can be identified and used to analyze entities with respect to others in our world. I have engaged in many a debate about this assertion and have concluded that even if not every element in our world has a spatial component, enough do to warrant methods designed to collect and maintain spatial dimensions.
But why re-examine National Preparedness? The concept, or better yet the very definition, is in transition due to changes in technology such as database design, geospatial applications, hardware, and the speed and methods of transmission, as well as a strong sense of urgency to protect ourselves following the 9/11 attacks in New York, at the Pentagon, and in southwestern Pennsylvania. Most can attest that 9/11 changed everything...
Yet even apart from 9/11 and its aftermath, the geospatial industry was, and still is, changing. User requirements are moving from internally produced, discrete hardcopy products toward interoperable, seamless, online delivery of geospatial information via web-enabled infrastructures.
In the past, national preparedness relied on organizations such as the USGS, through the National Mapping Program, to provide geographic, cartographic, and remote sensing information, maps, and technical assistance. Of course, the best-known USGS maps were and still are the 7.5-minute, 1:24,000 topographic quadrangles. The 7.5-minute product depicts greater detail for a smaller area than intermediate-scale (1:50,000 and 1:100,000) and small-scale (1:250,000, 1:2,000,000 or smaller) products. This source of geospatial information solved the interoperability problem simply by being printed on paper: ready-made to be handed from person to person and, more importantly, generally understood by most users. You can draw on them, index and file them, copy and share them. Life was good.
But the demands on geospatial information, as on all other data types in the computer age, required it to become fully interactive and intuitive. For example, first responders using hardcopy maps to plot alternative routes to an emergency were quickly overtaken by softcopy applications designed to spatially delineate the "best route" with respect to specific parameters of the current situation. Users want and demand accurate, assured, relevant delivery of geospatial information ready to be integrated into their everyday lives, whether that means just finding their way across town to pick up the kids or responding to a major disaster.
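The "best route" computation behind such softcopy applications is typically a shortest-path search over a weighted road network. As a minimal sketch (the node names, road graph, and costs below are hypothetical; real routing engines use far richer cost models incorporating closures, hazards, and traffic), Dijkstra's algorithm could look like:

```python
import heapq

def best_route(graph, start, goal):
    """Dijkstra's shortest-path search over a weighted road graph.

    graph: dict mapping node -> list of (neighbor, cost) pairs, where
    cost could encode distance, travel time, or hazard penalties.
    Returns (total_cost, path), or (float('inf'), []) if unreachable.
    """
    frontier = [(0, start, [start])]  # (cost so far, node, path taken)
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, step_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(frontier,
                               (cost + step_cost, neighbor, path + [neighbor]))
    return float('inf'), []

# Hypothetical road network; a closed road can simply be dropped
# from the adjacency lists to re-plan around an incident.
roads = {
    "station": [("A", 2), ("B", 5)],
    "A": [("B", 1), ("incident", 7)],
    "B": [("incident", 2)],
}
cost, path = best_route(roads, "station", "incident")
```

Here the direct-looking leg through "A" to the incident (cost 2 + 7) loses to the detour through "B" (cost 2 + 1 + 2), exactly the kind of non-obvious result that makes automated routing worth having.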
To meet this demand, data and software vendors began filling the market with different ways to digitally organize, interact with, and communicate geospatial data. The amount of data continued to grow exponentially, to the point where users of all levels of experience and need became increasingly frustrated with the burden of large, undifferentiated geospatial data stores, ambiguous displays, and less-than-satisfactory query results. In short, users were retrieving more data, but systems provided less-than-desirable support in differentiating accurate, assured, relevant data from data that served no purpose in the user's mission.
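One basic building block for separating relevant data from an undifferentiated store is a spatial filter: keep only the features inside the user's area of interest. A minimal sketch (the feature records and query window below are hypothetical):

```python
def within_bbox(features, min_x, min_y, max_x, max_y):
    """Return only the point features inside the query window."""
    return [f for f in features
            if min_x <= f["x"] <= max_x and min_y <= f["y"] <= max_y]

# Hypothetical point features with attributes attached.
features = [
    {"name": "shelter",  "x": 3.0, "y": 4.0},
    {"name": "hospital", "x": 9.5, "y": 1.2},
]

# Only features within the 5x5 window around the origin survive.
nearby = within_bbox(features, 0, 0, 5, 5)
```

Production systems push this same idea into spatial indexes (R-trees, quadtrees) so that the filter scales to millions of features instead of scanning every record.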
This current situation in the geospatial marketplace is one of the major reasons why the Federal Emergency Management Agency (FEMA) established the Interagency Geospatial Preparedness Team (IGPT). "On a daily basis, state and local governments are engaged in activities that save lives, protect property and promote the safety of Americans across the country," said FEMA Deputy Director Mike Brown. "This initiative is designed to help them better access and use geospatial tools such as remote sensing, mapping, predictive modeling, charting and geographic information systems."
As stated in the March 5, 2003, news release from the National Imagery and Mapping Agency (NIMA), the IGPT's work will involve:
- Developing a strategy for establishing and maintaining geospatial preparedness that includes data, standards, systems, and expertise;
- Leveraging existing intergovernmental and public/private partnerships; and
- Linking to relevant existing federal programs and initiatives.
Needless to say, accomplishing these data, technology, and organization assessments will require significant resources. The question at hand: has the IGPT been allocated sufficient resources (budget and staff) to accomplish such a monumental task? Since its establishment, the team has actively sought the involvement of industry and the private sector to share some of the load, and it has found some willing participants; but will this be enough to get this very large, but needed, investigation where it needs to be?
As the geospatial industry changes under technological advancement and increasingly complicated user demands, reevaluating our Nation's geospatial preparedness is vital to the security of our way of life. Let's be sure not to shortchange such an important part of our security.