LI GeoCloud: The Cloud Matures

September 27, 2010

A passing comment at the reception following the one-day Location Intelligence GeoCloud event, held Sept. 22 in Washington, D.C., summarized both the event and the state of the art: "I didn't really learn anything new, but I did get lots of perspectives." The attendees, and hopefully many readers, have a good sense of what the cloud is; the real question now is how, not whether, to implement it. The day was a parade of stories about different ways to implement the technology, create business models and position offerings in the market. The presentations came from huge companies (Google, Amazon), specialty GIS cloud providers (Skygone, GIS Cloud) and small but feisty startups, young and not-so-young (FortiusOne, Rhiza Labs).

Themes across the Presentations
While presenters shared a variety of understandings of the cloud and visions of its implementation, several themes popped up during the question and answer sessions and during hallway conversations. 

Security: Some cloud users flock to it for security; others stay away because they fear its lack of security, per John Sheppard of Netezza. It's easier for someone to lose a laptop than to lose the cloud implementation, he went on. One question from a lawyer on the privacy/security issue received this honest but realistic response from Nigel Taylor of Trillium Software: be transparent about it throughout the process while working with your customers, and be realistic about the differences in laws from country to country, as well as their changes over time. It was noted more than once that perhaps everyone in the room has their credit card on file with Amazon, suggesting that, at least individually, we are comfortable storing such valuable information in the cloud.

Hybrid: Discussions of "hybrid clouds" arose more than once and seemed to refer both to implementations that combine a publicly hosted portion with an in-house one, and to ones that include both software as a service (SaaS) and desktop seats. The consensus seemed to be that maximum flexibility in implementation options is required in these early years.

Data: Whose data does a vendor offer via the cloud? What data does an end user use? That's a very complex question since data are now a service (data as a service, DaaS) and the choices run far beyond the two big data providers, NAVTEQ and TomTom, to Google, OpenStreetMap and other crowdsourced offerings which are popping up all the time.

ROI: Despite repeated statements that the cloud provides efficiency, speed and cost savings, when the question of return on investment was put to panelists by Esri's Victoria Kouyoumjian, who moderated a panel titled "Business Models & ROI," the responses did not include numbers. Scott Robinson of PBBI made it clear that solving the customer's problem was a measure of return. Esri's Chris Cappelli stated it was too early to say.

Business Models: Philip O'Doherty of eSpatial was clear that his business was aimed at small- to medium-sized users. His pricing model: inviting an organization with 30 seats to pay $45/month/seat instead of licensing desktop solutions. Robinson of PBBI outlined how the convenience of the company's data as a service model meant users would pay a premium, just as you do for a bottle of water or a candy bar in the hotel room mini-bar. Dino Ravnic of GIS Cloud detailed its "freemium" model, offering a free version, but also more features for a fee.
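
For a back-of-the-envelope feel for O'Doherty's pitch, consider the arithmetic below. Only the $45/month/seat figure comes from the talk; the desktop license and maintenance numbers are hypothetical placeholders of my own.

    # Back-of-the-envelope comparison of eSpatial's quoted SaaS pricing with a
    # desktop licensing model. Only the $45/month/seat figure comes from the
    # talk; the desktop numbers are hypothetical placeholders.

    SEATS = 30
    SAAS_PER_SEAT_MONTHLY = 45        # quoted at the event
    DESKTOP_LICENSE_PER_SEAT = 1_500  # assumed one-time license cost
    DESKTOP_MAINT_RATE = 0.20         # assumed 20% annual maintenance

    saas_annual = SEATS * SAAS_PER_SEAT_MONTHLY * 12
    desktop_year_one = SEATS * DESKTOP_LICENSE_PER_SEAT * (1 + DESKTOP_MAINT_RATE)

    print(f"SaaS, year one:    ${saas_annual:,}")           # $16,200
    print(f"Desktop, year one: ${desktop_year_one:,.0f}")   # $54,000 under these assumptions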

Naming: Is the cloud new? Is cloud the right name for this technology platform? For many of the speakers, the shine had already worn off the term and the cloud is simply part of today's computing environment. As Dylan Lorimer of Google put it, you can just substitute "Internet-based services" for cloud.

Moving Data: Ryan Hughes of Skygone noted, along with others, the challenge of simply uploading big datasets to the cloud. "FedEx [i.e. literally shipping huge data sets via other means of transportation] is faster than FTP," he noted. The "winners" will be those who figure out how to speed up the Internet, or those who already have the data in the cloud and thus need not move it around.
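
Hughes' quip is easy to verify with a little arithmetic. Here is a minimal sketch, using illustrative bandwidth and shipping figures of my own choosing:

    # Why "FedEx is faster than FTP" for big datasets: network transfer time
    # grows linearly with size, while shipping time is roughly constant.
    # The bandwidth and shipping figures below are illustrative assumptions.

    SHIPPING_HOURS = 24.0    # overnight courier (assumed)
    UPLOAD_MBPS = 10.0       # sustained uplink, megabits per second (assumed)

    def upload_hours(terabytes: float, mbps: float = UPLOAD_MBPS) -> float:
        """Hours needed to push `terabytes` of data over an `mbps` link."""
        bits = terabytes * 8e12               # 1 TB = 8 * 10**12 bits
        return bits / (mbps * 1e6) / 3600

    for tb in (0.1, 1.0, 10.0):
        hours = upload_hours(tb)
        winner = "courier" if hours > SHIPPING_HOURS else "network"
        print(f"{tb:>4} TB: {hours:7.1f} h over the wire -> {winner} wins")

At 10 megabits per second, even a single terabyte takes over 200 hours to upload; the courier wins for anything but small datasets.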

Timing: Many of the products discussed had only been launched this summer or fall and most speakers agreed on an expected three-year rollout of cloud geospatial implementations. While some suggested the desktop is dead, others were sure that desktops and cloud implementations will share space for quite some time. Several people also made statements indicating that providers do not yet offer the answers to geospatial business problems that buyers need solved.

Noteworthy Presentations
While all the presenters brought their insights to the topic of the cloud, three stood out.

Blue Raster Tackles Perishable Geodata Site for Healthcare Grant
Michael Lippmann, Blue Raster

Blue Raster is not a company I'd heard of (though it was Esri's partner of the year in 2006), but it seems it was one of the early users of ArcGIS Server. The project the company took on was an interesting one: build a mapping site to help bidders for federal government grants determine where they would propose new health centers. The vision: a site that could turn around requested maps in one second. Why go to the cloud? The site was only needed for 90 days; it was "perishable." The cost comparison of an implementation using three-year-old tech (servers, data center, etc. at about $300,000) to a cloud implementation was about ten to one, per Michael Lippmann.

The lessons learned from a successful implementation:

  • use a mix of tech and find best-of-breed offerings (they used AWS, Esri and CloudFront)
  • have one or more "health checks" on your suite of services (they had failures on launch day due to a Cisco switch failure; a minimal sketch follows this list)
  • cache repeatable operations to speed things up
  • cache in the cloud (as opposed to moving cached data to the cloud; there's that moving data issue again!)
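
Lippmann did not share implementation details, but a health check in the spirit he described can be as simple as polling each dependent service. Here is a minimal sketch, with hypothetical endpoint URLs:

    # Minimal "health check" in the spirit of Blue Raster's lesson: poll every
    # service the site depends on and flag anything that does not respond.
    # The endpoint URLs below are hypothetical placeholders.
    from urllib.request import urlopen

    ENDPOINTS = {
        "map service": "http://example.com/arcgis/rest/services?f=json",
        "tile cache":  "http://example.com/tiles/health",
        "web app":     "http://example.com/",
    }

    def check(name: str, url: str, timeout: float = 5.0) -> bool:
        """Return True if the endpoint answers with HTTP 200 within the timeout."""
        try:
            with urlopen(url, timeout=timeout) as resp:
                ok = resp.status == 200
        except OSError:                      # covers URLError, timeouts, DNS failures
            ok = False
        print(f"{'OK  ' if ok else 'FAIL'} {name}: {url}")
        return ok

    if __name__ == "__main__":
        results = [check(name, url) for name, url in ENDPOINTS.items()]
        raise SystemExit(0 if all(results) else 1)   # nonzero exit lets a scheduler alert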

eGlobalTech Plays in the FGDC Geocloud Sandbox
Dave Merrill and Robert Patt-Corner, eGlobalTech

eGlobalTech is the company investigating the Federal Cloud Computing Initiative, which includes the FGDC's "geocloud sandbox." Dave Merrill and Robert Patt-Corner explained exactly what they are doing for FGDC. In short, they are figuring out what geoapps are out there now and laying out a path to move them to the cloud.

The company selected 10 federal agency apps to drive the design and build-out of a first version of a cloud platform. They have not and will not share which ones were selected, but were clear the pilot apps come from many agencies.

The company explained it had found four use cases for the cloud for FGDC:

  • pure data case (get data via REST in a file; a minimal sketch of this case follows the list)
  • Web services (compute intensive; parameters go in, computation happens, data comes out)
  • full service and data (no user data required; you put in parameters and get results back from a Web service)
  • existing apps
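
The pure data case is the simplest to picture: one REST call that lands a file on disk. A minimal sketch, with a hypothetical endpoint since the pilot apps were not named:

    # Sketch of the "pure data" use case: one REST call that lands a file on
    # disk. The endpoint is hypothetical; the pilot apps were not named.
    import urllib.request

    URL = "https://data.example.gov/api/datasets/hydrography?format=shapefile"

    def fetch(url: str, dest: str) -> None:
        """Download the file behind a REST data endpoint to `dest`."""
        with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
            out.write(resp.read())

    fetch(URL, "hydrography.zip")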

The big idea new to me (though I confess, hinted at by one of my students this semester) was that building a platform that you host as a service (platform as a service, PaaS) is not trivial. In the same way that every desktop is different (different drivers, video cards, etc.), so is a cloud platform, though there the variety lies in app servers, enablers like databases, service buses, frameworks, runtime engines and the like. If something is missing in either case, the software on the desktop or the app in the cloud will simply not run!

The goal is to build two platforms that have "all the required bits" to power cloud-based services. The two platforms in development are:

  1. Esri on Windows, with AWS and other bits
  2. an open source platform on Linux, with a geo-specific open source stack

How do you find out all the "bits" required? Put together a prototype platform, try to run the apps on it, see what breaks, then fix, then try again! Once all 10 apps run on one of the platforms, you've got two prototype PaaSs. Ideally, they will support up to "n" applications as they mature.
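
Reduced to pseudocode, that loop looks something like the sketch below; the platform object and its methods are illustrative stand-ins of my own, not anything eGlobalTech described:

    # The iterate-until-it-runs process described for the FGDC pilot, reduced
    # to a loop. The `platform` object and its methods are illustrative
    # stand-ins, not anything eGlobalTech described.

    def build_platform(apps, platform):
        """Grow `platform` until every app in `apps` runs on it."""
        pending = list(apps)
        while pending:
            failures = [app for app in pending if not platform.runs(app)]
            if not failures:
                break                             # everything runs: a prototype PaaS
            for app in failures:
                missing = platform.diagnose(app)  # e.g. app server, database, framework
                platform.add(missing)             # fill the gap...
            pending = failures                    # ...then retry just the broken apps
        return platform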

Salesforce.com Envisions Cloud 2.0
Dan Burton, Salesforce.com

Dan Burton of Salesforce.com gave the final presentation of the day. He reviewed the company's vision now and what's ahead.

Salesforce.com has 82,000(!) clients. All of them (big, small and in between) get three upgrades per year and take advantage of "pay as you go" pricing, very fast deployment and lower prices than in-house customer relationship management tools. Burton used these numbers to describe his solution: "It's five times faster and half the cost." He offered a very effective "cloud computing or not" checklist. Sadly, I could not find that table on the company website and was not fast enough to document it.

The next step, Cloud 2.0, relates to communication tools. In July 2009 the number of social network users passed the number of e-mail users, indicating people are communicating differently. As Burton put it, "the feed is the new desktop." By that he meant not just RSS feeds, but Facebook feeds, Twitter feeds and their siblings as well. There's also a move from using the Internet for search to using it for collaboration, and another from the PC to mobile devices, the mobile phone in particular.

Salesforce.com's response to those trends was to ask: "Why is enterprise software not like Facebook?" And, of course, Salesforce.com is busy putting social tools into its offering, adding Chatter in June of this year. The difference, he explained, is that in this model each employee selects feeds of interest, so information comes to the individual, not the other way around.

I have to say that, because it looked at the cloud challenge from outside the geospatial world in which we live, Burton's presentation was very refreshing and informative. And I was amused that when Burton was asked what geospatial apps were missing from his platform, he declined to answer.

Conclusion
The cloud and cloud computing are here for information technology, whether you call them that or not. And, they’re here for geospatial technology as well. Should you rush in? The sense seems to be no. Rather, the accumulated wisdom of this event is the same practical and valuable approach to any hyped technology: turn to it when there is a problem it can solve. The term "solution in search of a problem" came up more than once over the course of the day, suggesting some are diving in ahead of demand.

What's missing? My sense is that we are still in search of a best practices document, one that guides implementers and end users in putting their best foot forward while exploring cloud computing for geospatial. Perhaps it is too soon to produce such a definitive document while the technology, players, services, definitions and business models are still very much in flux.
