We thought readers would be interested in learning more about why RedSpider Web 3.1 is important to users of remotely sensed imagery, so Directions Magazine chatted with Chris Tucker, President and CEO of IONIC Enterprise, about it. RedSpider Web 3.1 is deep-down enabling technology that isn't visible to a user. As Tucker succinctly puts it, "We do plumbing - we do shiny pictures too - but the good stuff is the plumbing. If you can't get the water to the shiny picture, who cares?"
Directions Magazine (DM): We noticed the press release from April 19 about the SPOT Image and NASA deployment of OpenGIS Web Coverage Services using your RedSpider Web 3.1 product. Can you tell us more about it?
Chris Tucker (CT): Yes, we'd love to. In a sense, there are two different things going on that we packaged into one press release, but they are related through the OGC's OWS-2 test bed. The OGC's test bed process is key for deriving new interoperability specifications across the industry. Sponsors like NASA, NGA, and other US federal agencies, as well as European, Canadian, and other public agencies, along with private sponsors, derive a set of use cases, or business cases, that they need to have satisfied through standard interoperability specifications. They put money and resources into this process, and then we, the vendor community, work together in a collaborative engineering environment to hammer out the next version of these specifications. So this is where the real hands-on work goes on. A big thread, called Image Handling for Decision Support, aimed to build the next level of specifications for distributed image processing.
That sounds kind of "high-falutin", but it is the logical next step for the Web Coverage Service interface. It allows you to expose an image archive, and the implementation details behind that could be whatever they are - whatever vendor platform, wherever it is hosted, it doesn't really matter. It's just a standard interface that lets you request spatially and spectrally subsetted, gridded data. (That's a geeky way of saying, "I need to be able to request grids where each pixel has a meaning, not just the pretty picture.")
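To make the "standard interface" idea concrete, here is a minimal sketch of how a client might assemble a WCS 1.0.0 GetCoverage request as a key-value-pair URL. The endpoint and coverage name below are hypothetical, not from the interview; a real server advertises its actual coverages through its GetCapabilities and DescribeCoverage responses.

```python
from urllib.parse import urlencode

# Hypothetical endpoint; real deployments publish their own WCS base URLs.
BASE_URL = "https://example.com/wcs"

def get_coverage_url(coverage, bbox, width, height, fmt="GeoTIFF"):
    """Build a WCS 1.0.0 GetCoverage request as a key-value-pair URL.

    bbox is (minx, miny, maxx, maxy) in the requested CRS.
    """
    params = {
        "service": "WCS",
        "version": "1.0.0",
        "request": "GetCoverage",
        "coverage": coverage,       # coverage name is illustrative
        "crs": "EPSG:4326",
        "bbox": ",".join(str(v) for v in bbox),
        "width": width,
        "height": height,
        "format": fmt,
    }
    return BASE_URL + "?" + urlencode(params)

# Request a 512x512 grid over a bounding box on the US East Coast.
url = get_coverage_url("spot5_pan", (-77.5, 36.0, -75.0, 39.5), 512, 512)
print(url)
```

The point of the KVP form is exactly what Tucker describes: any client, from any vendor, can issue the same request against any compliant server without knowing what platform sits behind it.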
We're the first to commercially implement the Web Coverage Service interface - it's one of several OGC interfaces supported in RedSpider Web 3.1. NASA, for example, is using it to expose their "data pools" of earth science enterprise data so that any remote third party can access that service with any client application they want, from any vendor. The kick-off was at Raytheon, the contractor who does that work for NASA. Our work was used as one of the base data sources for the Image Handling for Decision Support thread. SPOT Image also deployed our Web Coverage Service atop their SPOT 5 imagery. Our server supports their DIMAP metadata standard format, and conversion to ISO 19115, which is the next generation beyond the FGDC metadata. SPOT deployed a hefty amount of imagery so that the other vendors, including Intergraph and PCI, could access our Web Coverage Services and use the imagery in remotely hosted image processing services.
You could start to imagine a world where you actually have imagery from NASA and SPOT, and the other commercial and national imagery providers, exposed at the gridded-data level through Web Coverage Service, and then remote service providers enabling orthorectification, correction, and other value-added processing on top of that base data. And that's the vision the OGC sponsors brought in - to achieve a set of standards to accomplish that.
NASA's GeoViewer application shows ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) data of Hurricane Isabel over the East Coast, and MODIS (Moderate Resolution Imaging Spectroradiometer) data of the 2003 California fires. Image courtesy of NASA.
DM: So is this an effort to establish a rather large archive of imagery, or at least a template so that anybody's archive of imagery can be accessed in a more rapid fashion, or at least a set of metadata standards that lets somebody quickly retrieve remotely sensed data?
CT: Yes; the way I would characterize it is as follows. There's a related concept in OGC called the Image Archive Service, where Web Coverage Service is one technical interface, and it specifically allows you to request the gridded coverage. In an Image Archive Service, you would also use the catalog interface to query metadata about the imagery you want. So you could say, for example, "Show me all the imagery you have of the following resolution, over the following geography, with less than 10% cloud cover, that was collected between October and December of 2003." You can ask that kind of question of the catalog, and it gives you references to all the data available through Web Coverage Service interfaces.
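The catalog query Tucker describes is essentially a predicate over metadata records. Here is a toy, in-memory sketch of that predicate; the record fields are illustrative stand-ins, not the actual ISO 19115 or OGC catalog property names, and a real catalog would evaluate the same kind of filter server-side and return references to WCS-accessible coverages.

```python
from datetime import date

# Toy records standing in for catalog metadata; field names are illustrative.
records = [
    {"id": "scene-001", "cloud_cover": 5,  "acquired": date(2003, 11, 2)},
    {"id": "scene-002", "cloud_cover": 40, "acquired": date(2003, 11, 9)},
    {"id": "scene-003", "cloud_cover": 8,  "acquired": date(2004, 1, 15)},
]

def query(records, max_cloud, start, end):
    """Mimic the catalog predicate: cloud cover below a threshold and
    acquisition date within the inclusive range [start, end]."""
    return [r["id"] for r in records
            if r["cloud_cover"] < max_cloud and start <= r["acquired"] <= end]

# "Less than 10% cloud cover, collected between October and December of 2003."
hits = query(records, 10, date(2003, 10, 1), date(2003, 12, 31))
print(hits)  # only scene-001 satisfies both conditions
```

Each returned identifier would, in the real architecture, resolve to a coverage that the client can then fetch through the Web Coverage Service interface.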
DM: So what's really driving this process?
CT: The demand is mostly coming from the hosting organizations, the companies and agencies that have this data. They are trying to make their data more available and accessible. Right now there are no service-level interfaces that these companies and agencies have to allow somebody to dynamically discover data and dynamically bind those services into their applications. So what you do now is search a web interface for a catalog, find out what might be available, send in an order, the order gets processed, something gets posted to an FTP site, you pull it down, and then if you want to put it in a web application, you have to stand it up behind your web application.
That's an onerous process that requires a fair amount of technical skill and time in the day. So if you've got a researcher with two grad students who can spend the time to do that, it might be doable. But NASA, for example, finds it's a real barrier to the use of their data, so they would prefer a standard interface that anyone can use to query the archive, find the data, and request the data. They're looking to greatly simplify access to their data and thereby increase demand by reducing these barriers. So it's clearly about increasing application use of the data.
DM: You followed many of OGC's recently adopted standards, the OWS-2 specifications among them. How much were you involved in designing those specs?
CT: We sit on all the revision working groups. It's really one of our strategic placements in the market. We implement the entire OGC architecture more extensively than anyone else. In the last test bed we worked through the image archive service use case. So I guess we're intimately involved in all of those discussions. The way it really works in OGC is that it's not just commenting on papers. You have to do real-world implementations. Our strength is that we show up with actual working implementations - not just of the interface, but with high-quality software implementing that interface. One of my comments about OGC is that the specs are only as strong as the software implementations of those interfaces, and we see it as our responsibility to have enterprise-class implementations of every spec. Otherwise, earlier in the spec process we say, "this cannot be done in an enterprise-class manner, so that's a bad idea."
DM: Do you anticipate the other imaging agencies such as the NGA, and commercial vendors will eventually look to you to deploy the same technology?
CT: NASA's been thrilled with what was accomplished over the last year. They're committing significant resources to try to accomplish more over production data pools this year - 2004. We just closed a deal with a large commercial imagery vendor that is looking to manage their entire image archive using our software. So that's at least one of the most significant commercial imagery providers that has bought our entire suite of software to actually stand up an image archive service using OGC specs. And key to that is serving out their "finished products" through OGC interfaces, wrapped in a security and commerce model so that paying customers can get access.