If we go back a decade or more - ancient history in the information technology industry - accessing information on a research topic was a central issue. Remember when following up on a promising reference involved an interlibrary loan that took one or two months? Today, the issue has shifted from access to search. There is instant access to information, but finding the right information and judging whether it is current, accurate and high quality is more difficult.
Much of today's electronic information is stored in folders organized in a hierarchical fashion. Searching for a file or an e-mail message usually implies traversing the hierarchy, trying to remember exactly where we placed it. An item could have been placed in more than one folder, with multiple versions of it. For example, a report on a trip to New York in 2003 could be in several different folders such as "Reports," "New York 2003" or "Miscellaneous Reports." Much of a search's success depends on personal organizational habits and memory. This is a familiar process to many of us. Computer file storage is basically the electronic equivalent of a big, bulky file cabinet. Spreadsheets and databases are not much better. In both situations computer technology offers the ability to query and search based on keywords. While a tremendous improvement over the old world of paper searches, this is still a rather primitive concept.
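The kind of keyword search described above can be sketched in a few lines: walk the folder hierarchy and match file names against a keyword. This is a minimal illustration, not any particular product's implementation, and it shows the limitation too - the same report filed under two folders simply turns up twice.

```python
import os

def keyword_search(root, keyword):
    """Walk a folder hierarchy and return the paths of all files
    whose names contain the keyword (case-insensitive)."""
    matches = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if keyword.lower() in name.lower():
                matches.append(os.path.join(dirpath, name))
    return matches
```

Note that the search says nothing about which copy is current or authoritative - that judgment is still left to the user's memory and filing habits.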
The geospatial industry is confronted with the same searching issues in relation to data, functionality and Web services. There is significant activity exploring the creation of queryable spatial data infrastructures at various levels to help address search difficulties. To support this development, the industry is placing an increasing emphasis on metadata and service catalogues that rely on international standards such as those from ISO and the OGC. These efforts rely on the current technologies - keyword search engines, spreadsheets and databases. The search quandary is not just a technology issue, but also a public policy issue. For instance, standardization of metadata for both data and services plays a fundamental role in improving the quality of search results. But implementing and deploying these standards on a global basis is very hard because of awareness, coordination, management and economic considerations. These difficulties are compounded by security and privacy considerations. Who can have access to which data? Is the data free?
The geospatial community will continue to work on these issues. Searching for data and services is the focus of intense activity in the IT industry, fueled by the explosion of text, digital images, music and video data accessible via the Web. There is also a data explosion in the geospatial industry that encompasses all forms of data.
What are the trends that will impact future geospatial search technology? Extending the Web with information that gives it well-defined meaning is the idea of the Semantic Web. This concept is described by Tim Berners-Lee, James Hendler and Ora Lassila in their article "The Semantic Web" in Scientific American magazine, May 2001. The World Wide Web Consortium is enabling a collaborative effort for the development of the Semantic Web. The activity is based on XML and the Resource Description Framework (RDF). More information on this initiative is available from the W3C. Once these concepts make their way into usable implementations, they should influence research on semantic interoperability applied to geospatial technology.
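At the heart of RDF is a simple data model: every statement is a (subject, predicate, object) triple, and a collection of triples forms a graph that machines can traverse. The following sketch illustrates the idea with plain Python data structures; the URIs are hypothetical, chosen only for illustration.

```python
# Each RDF statement is a (subject, predicate, object) triple.
# The example.org URIs below are invented for this illustration.
triples = [
    ("http://example.org/reports/42",
     "http://example.org/schema#coversCity",
     "http://example.org/cities/NewYork"),
    ("http://example.org/cities/NewYork",
     "http://example.org/schema#inCountry",
     "http://example.org/countries/USA"),
]

def objects_of(subject, predicate, graph):
    """Return every object linked to a subject by a given predicate."""
    return [o for s, p, o in graph if s == subject and p == predicate]
```

Because the predicates carry well-defined meaning, a query can follow links across the graph - for instance from a report to its city to that city's country - rather than matching opaque keywords.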
There are many other innovative changes starting to happen. Leading search engine players such as MSN, Google and Yahoo! continue to introduce features like desktop search and geospatial processing and content. Of special interest to the geospatial community is the company MetaCarta. Its products enhance text searches with a geographic context (1, 2). At the core of the system is 'geoparsing,' a concept that appeared in the OGC's Geospatial Fusion Testbed in 2001. MetaCarta uses sophisticated algorithms to identify and tag geographic references.
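The essence of geoparsing can be shown with a toy gazetteer lookup. This is a deliberately naive sketch - production geoparsers such as MetaCarta's use much larger gazetteers plus disambiguation algorithms (is "Paris" the city in France or in Texas?) - and the coordinates below are illustrative.

```python
import re

# A toy gazetteer mapping place names to (latitude, longitude).
# Coordinates are approximate and for illustration only.
GAZETTEER = {
    "New York": (40.71, -74.01),
    "Paris": (48.86, 2.35),
}

def geoparse(text):
    """Identify and tag geographic references in free text,
    returning (place, character offset, coordinates) tuples."""
    tags = []
    for place, coords in GAZETTEER.items():
        for match in re.finditer(re.escape(place), text):
            tags.append((place, match.start(), coords))
    return tags
```

Once text is tagged this way, an ordinary document search can be filtered or ranked by geography - the enhancement MetaCarta's products provide.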
At a fundamental level, Microsoft has unveiled ideas for a revolutionary new file system. Initially part of the Longhorn release of Windows (2006), it has now been pushed back a couple of years. With WinFS, Microsoft is moving the "search" battle to the operating system level. Microsoft explains that WinFS technology is about the ability to "find, relate and act" on information. The find ability will be supported by new capabilities that extend the file system with richer information. Relate capabilities will rely on encoded object relationships to enable richer data exploration. The act part of the technology will be implemented by a system that handles events and creates agents to facilitate automatic data management. Macintosh enthusiasts are quick to point out that with Mac OS X Tiger, Apple is already delivering some of the advanced search concepts planned for WinFS.
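The "find" and "relate" ideas can be sketched in miniature: items carry typed metadata properties, and explicit relationships link items to one another, so queries follow properties and links instead of folder paths. This is a conceptual sketch only, not WinFS's actual API; all item names and relationship labels are hypothetical.

```python
# Items carry metadata properties instead of living at one folder path.
# These item names, properties and relationship labels are invented.
items = {
    "report.doc":  {"kind": "document", "city": "New York", "year": 2003},
    "photo1.jpg":  {"kind": "photo",    "city": "New York", "year": 2003},
    "invoice.pdf": {"kind": "document", "city": "Boston",   "year": 2004},
}
relationships = [("report.doc", "illustrated_by", "photo1.jpg")]

def find(**criteria):
    """Find items whose metadata matches every given property."""
    return [name for name, meta in items.items()
            if all(meta.get(k) == v for k, v in criteria.items())]

def related(item, relation):
    """Follow an explicit relationship from one item to others."""
    return [dst for src, rel, dst in relationships
            if src == item and rel == relation]
```

A query like "everything about the 2003 New York trip" then becomes a metadata match rather than a hunt through folders, and the report leads directly to its photo via the relationship link.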
Geospatial data and services searches present a substantial challenge with technical and political ramifications. The Semantic Web; the efforts by MSN, Google and Yahoo!; and the WinFS and Mac OS X Tiger technologies collectively represent exciting new advancements. It is up to the geospatial industry to leverage these developments to improve access to data and services.