Fifty Years of Commercial GIS – Part 2: 1994-2019

May 29, 2019

The World Wide Web of Information

I remember clearly the day I first encountered the World Wide Web. It was 1994, and it was as if a new portal to the world of information had been opened. It was fast (for the time), intuitive, and the hyperlinks to more pages of information were truly mind-boggling. The browser was Mosaic, the forerunner of Netscape. It wasn’t long thereafter that putting maps on the WWW became a possibility. Early web maps weren’t very functional, serving mostly as visualization tools, but you could still see the potential for new business models.

Who would lead web mapping? Who should lead this new phenomenon? Several companies dove in. Lancaster, Pennsylvania-based R.R. Donnelley, whose mapping unit later became MapQuest, was already in the hard copy mapmaking business; they were a natural fit. GPS maker DeLorme seemed a likely candidate as it was already offering CDs to work with its handheld devices. But perhaps the most obvious candidate to lead web mapping, Rand McNally, was slow to move to a web e-commerce model, missed the Internet mapping business almost completely, and has simply stuck to its paper road atlas publishing business. It was hard to foresee that a decade later Google would stomp into the geospatial community with a very disruptive business model. (More on this below.)

Rise of the GIS Professional

In the mid-’90s, professional GIS organizations flourished; ASPRS, GITA, URISA and MAPPS, as well as AASHTO and ASCE, each supported a subset of the GIS profession. The conferences these organizations established rose in importance as well. In addition, more local and statewide conferences supported user cohorts that were able to leverage the knowledge of colleagues in smaller geographic regions.

URISA had a particularly strong vision to establish the GIS profession through certification. The creation of the GIS Certification Institute in 2003 established the criteria by which an individual could put forth their bona fides and claim Certified GIS Professional, or GISP, status. Employers could then seek individuals who met a certain standard of experience and expertise.

But by the early to mid-2000s conference attendance was dropping. Among local government users, budgets became tighter and choices had to be made. If users were going to attend only one conference each year, they were more likely to choose their GIS vendor’s event. User conferences hosted by Esri, Intergraph and Autodesk, as well as Bentley Systems and MapInfo, held sway over professional events. There were simply too many conferences, and some, like GIS/LIS and GITA, ceased to exist.

As attendance at some GIS conferences declined, other events with a unique differentiator were just beginning. Directions Magazine’s Location Intelligence Conference (2004), O’Reilly’s Where 2.0 (2006) and the U.S. Geospatial Intelligence Foundation’s (USGIF) GEOINT Symposium (2008) catered to business applications, location-based services and the intelligence community, respectively. In the open source community, OSGeo, State of the Map and the Eclipse Foundation’s LocationTech event catered to a niche but growing community of professionals.

The Importance of the Open Geospatial Consortium

The significant rise of geospatial technology among the many information technology companies entering this sector, the growth in the global user community, and the innovations that were occurring gave rise to the need for standards to promote interoperability. The Open Geospatial Consortium (OGC) was founded in 1994. Started by David Schell, the organization was an outgrowth of Schell’s work at the Open GRASS Foundation; GRASS was one of the first open source geospatial software packages. In the early years, Sun Microsystems and PCI supported its existence. The OGC grew to become a collaborative organization that establishes interoperability guidelines, and it has fostered the development of many geospatial innovations that are shared by its members in a collegial atmosphere that benefits all. Recently retired OGC president Mark Reichardt steered the organization to join with other standards bodies to make certain that the geospatial technology sector was substantially reflected in the work of the W3C, the buildingSMART alliance and many others. In many ways, the OGC became the glue that allowed organizations to put competitive influences aside and work toward a common goal of geospatial innovation for mutual benefit.

Spatial Databases — Patterns and Processes

Also in the mid-’90s, spatial databases became differentiators for the major IT companies, including Oracle, Microsoft and Informix. When clients found it necessary to offload computationally intensive geospatial operations to mainframes or servers, software solutions such as Oracle Spatial, Microsoft SQL Server’s spatial extensions and Informix’s spatial DataBlades became an option. Intergraph was keen to be seen as database-agnostic and was supporting several databases at that time. There was a significant push by Esri, Autodesk, MapInfo and Intergraph to make direct connections possible from desktop GIS software. Today, integration with spatial databases is required and is even more important as the volume of location-based data rises. Where big data environments are essential, natively embedded geoprocessing is growing in popularity.
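
To make the offloading pattern concrete, the sketch below pushes a proximity search down to the database so only the matching rows travel back to the client. It is a minimal example only, written against a PostGIS-style database with a hypothetical "parcels" table; the OGC-style ST_* functions shown here differ in name and syntax across Oracle Spatial, SQL Server and other vendors mentioned above.

```python
# Minimal sketch: let the spatial database do the heavy geoprocessing.
# Assumes a PostGIS-style database and a hypothetical "parcels" table
# with a geometry column "geom"; vendor spatial SQL syntax will differ.
import psycopg2

conn = psycopg2.connect("dbname=gisdemo user=gis")  # hypothetical connection
cur = conn.cursor()

# Find parcels within 500 m of a point of interest. The distance test runs
# server-side, so only matching rows are returned to the desktop client.
cur.execute(
    """
    SELECT parcel_id, owner
    FROM parcels
    WHERE ST_DWithin(
        geom::geography,
        ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography,
        500
    )
    """,
    (-75.165, 39.953),
)
for parcel_id, owner in cur.fetchall():
    print(parcel_id, owner)

cur.close()
conn.close()
```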

Earth Observation Platforms — Too Many Pixels

EarthWatch, Space Imaging, Spot Image and DigitalGlobe, followed by Surrey Satellite Technology Ltd. and BlackBridge, all launched Earth observation satellites during this period to fill the gaps left by government-funded satellites such as Landsat. The problem was “too many pixels” and not enough customers outside of government and academia. When contracts such as NextView were let by the U.S. government, the private EO satellite operators had few incentives to change their business model to sell to commercial companies. And commercial companies, though in need of EO data, rarely knew how to consume it, analyze it or justify the cost.

For some companies, however, the need was too great to ignore. I remember visiting Mars, Inc., the candy company. Their need was to assess the availability of Brazilian cocoa so they could manufacture chocolate bars. They hired GIS and remote sensing analysts. It was one of the few examples I observed of a company trying to integrate geospatial analysis not only into its upstream operations but also into marketing and sales. And, as mentioned in Part 1, commercial real estate and retail were heavy users of GIS. During this 25-year period, however, retail companies expanded and then, unfortunately, contracted during the “un-malling” of America in the mid-2000s as e-commerce came of age and the impact of Amazon was felt.

These challenges have not deterred the venture capital community. During the period 2006-2012, venture capital poured into companies such as Skybox Imaging (later acquired by Google, which then sold the renamed company, Terra Bella, to Planet Labs in 2017), Planet Labs, UrtheCast, PlanetIQ and others. Launching lighter-weight smallsats with highly agile payloads offered coverage of the Earth’s surface in a shorter time period. This came with a tradeoff, however: sometimes lower spatial resolution than the 5,000 kg satellites of DigitalGlobe and others. Application-specific smallsats for weather and agriculture, for example, were an attempt to differentiate the imagery products.

Smallsats now compete, however, with the rise of unmanned aerial vehicles and the companies creating consortia to pilot them, much as Uber deploys ride-sharing services today. UAVs can be positioned more quickly, albeit with smaller coverage areas, and capture still images as well as full-motion video. Here again, the geospatial technology profession faces the challenge of educating buyers about, and differentiating among, complex products that require an understanding of spectral and spatial resolution and the knowledge to convert “dumb pixels” into location intelligence. The advent of machine learning, however, has given rise to better feature extraction technology that is fostering the ability to create unique data by-products.
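
To make the “dumb pixels” point concrete, the toy sketch below shows the simplest flavor of machine-learning feature extraction: classifying each pixel from its band values. The band data and class labels here are synthetic stand-ins rather than real imagery, and production pipelines today tend toward deep learning rather than the random forest used here; this is only an illustration of the idea.

```python
# Toy sketch: per-pixel classification as a simple form of ML feature
# extraction from imagery. The "imagery" below is synthetic; a real workflow
# would read bands from a GeoTIFF and use labeled training data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Fake a 4-band image (e.g., blue, green, red, near-infrared), 100 x 100 pixels.
bands = rng.random((4, 100, 100)).astype(np.float32)
pixels = bands.reshape(4, -1).T          # shape: (n_pixels, n_bands)

# Fake training labels for 300 pixels (0 = water, 1 = vegetation, 2 = built-up).
train_idx = rng.choice(pixels.shape[0], size=300, replace=False)
train_labels = rng.integers(0, 3, size=300)

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(pixels[train_idx], train_labels)

# Predict a class for every pixel and reshape back into a map.
class_map = clf.predict(pixels).reshape(100, 100)
print(class_map.shape, np.unique(class_map))
```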

Google Earth

In 2001, I was working for Vectiv, an Accenture-backed startup in Berkeley, Calif., that was developing a solution for the retail real estate market; its map visualization component was built on MapInfo’s MapBasic programming language. It was there that we took a meeting with John Hanke of Keyhole. The visualization was truly unique, and, as many know, Keyhole was purchased by Google and became the foundation for Google Earth.

The Google Earth phenomenon cannot be overstated as a defining paradigm shift in GIS, but it is only one of a few attempts at developing a comprehensive “globe.” NASA’s WorldWind, released as an open source platform in 2003, and Microsoft’s Virtual Earth, which had its beginnings as TerraServer, are examples. Both Microsoft Virtual Earth and Google Earth were vying for early adopters in mid-2005 as adjuncts to their commercial search engines, but Google Earth clearly won this competition.

Google’s entry into what had heretofore been the domain of GIS stirred more than just deep concern among geospatial companies. Esri, for example, was compelled to develop ArcGlobe in response. Consumer imagery was not a target market for GIS vendors, and yet Google stepped into the fray with a platform that allowed Earth observation imagery to be viewed and queried, upending both the software and data markets. Visualization of expensive satellite imagery was now free.

Business Intelligence vs. Location Intelligence

In 2003, there was clearly a change in the way geospatial technology was being viewed. It ceased to be an isolated technology. The tide was rising for business intelligence solutions, and location-based data was a key ingredient. Would synergies exist between the two software categories? Where those synergies lay was not yet well understood. I have a clear recollection, just a few years before, of an Oracle marketing manager for the retail sector telling me that GIS was just “maps.” He had not quite grasped the fundamental value proposition of visualizing spatially related data, the juxtaposition of which offered significant insights and, to some, a competitive advantage.

In 2004, Directions Magazine sponsored the first Location Intelligence Conference, held in May at the Wharton School of the University of Pennsylvania. I chaired this event, as I did for the next 10 years. It was an apt venue to bring together leaders in industries such as retail, banking and insurance with GIS technologies. In addition, business intelligence companies like Oracle, PeopleSoft and Siebel were invited. I remember a conversation with a representative from PeopleSoft who simply did not understand what geospatial technology offered in terms of deep analytics. Fast forward to today and, ironically, every BI company that purports to offer advanced analytics must include some degree of geospatial analysis. There is now a realization that the sources of location-based data needed for deeper analytics (mobile phones, traffic sensors, “smart lighting” and others) are simply too numerous to ignore.

Two questions arise for the future of BI and location analytics: 1) How much GIS functionality will BI software solutions incorporate into their products? And 2) How many non-mapping visualization tools will be incorporated into GIS? Desktop GIS is already overloaded with functionality. Web mapping tools, however, should be sufficiently extensible to include supporting visualizations such as bar and pie charts. BI software providers don’t really want to be in the GIS business, but GIS vendors are finding it hard not to overlap with BI. Solution provider Alteryx began as a workflow modeling solution utilizing geospatial data, then determined that BI was a bigger opportunity, and now works with companies like Tableau to support a broad set of BI applications with some geospatial querying.
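
As a small illustration of that overlap, the sketch below renders the same attribute table two ways: as a choropleth map and as a plain bar chart, the side-by-side view that both BI and GIS tools now aim to offer. The regions.geojson file and its "name" and "sales" columns are hypothetical, and geopandas with matplotlib is used here simply as a convenient open stack, not as any vendor’s approach.

```python
# Sketch: one attribute table, two views -- a choropleth map and a bar chart.
# Assumes a hypothetical regions.geojson with "name" and "sales" columns.
import geopandas as gpd
import matplotlib.pyplot as plt

regions = gpd.read_file("regions.geojson")

fig, (ax_map, ax_bar) = plt.subplots(1, 2, figsize=(12, 5))

# Map view: shade each region by its sales value.
regions.plot(column="sales", legend=True, ax=ax_map)
ax_map.set_title("Sales by region (map)")
ax_map.set_axis_off()

# BI-style view: the same attribute as a bar chart.
attrs = regions[["name", "sales"]].sort_values("sales", ascending=False)
ax_bar.bar(attrs["name"], attrs["sales"])
ax_bar.set_title("Sales by region (chart)")

plt.tight_layout()
plt.show()
```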

SaaS and Pricing Models

Perhaps unlike much other application software, GIS often lends itself to projects limited by time. Bentley Systems, and now others, will allow buyers to purchase limited-use licenses for periods such as three or six months. This flexibility is a welcome change from perpetual license models that do not fit today’s buying habits. Software as a service (SaaS) and APIs provide the kind of limited-use, pay-as-you-go option that limits the financial commitment. These models are used today by companies such as CARTO, Galigeo and Mapbox, which have carved out market niches for geospatial data visualization and analytics.
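
In practice, the pay-as-you-go model usually looks something like the sketch below: a metered REST call billed per request rather than a perpetual desktop license. The endpoint, parameters and API key here are hypothetical placeholders for illustration only, not the actual CARTO, Galigeo or Mapbox APIs.

```python
# Sketch of the pay-per-request model: each geocoding call is metered against
# an API key instead of a perpetual license. The endpoint and parameters are
# hypothetical placeholders, not any specific vendor's API.
import requests

API_KEY = "YOUR_API_KEY"  # hypothetical key tied to a usage-based plan
BASE_URL = "https://api.example-geocoder.com/v1/geocode"  # hypothetical endpoint

def geocode(address):
    """Return (longitude, latitude) for an address, or None if not found."""
    resp = requests.get(
        BASE_URL,
        params={"q": address, "key": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    results = resp.json().get("results", [])
    if not results:
        return None
    first = results[0]
    return first["lon"], first["lat"]

if __name__ == "__main__":
    print(geocode("1600 Pennsylvania Ave NW, Washington, DC"))
```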

Open Source Software

Open source software, QGIS and MapServer for example, has served a growing community of users who want lower-cost solutions. Most of it is freely downloaded but not truly free of cost. Customization and development take resources, and communities such as OSGeo, founded in 2006, and the Eclipse Foundation have organized to support these efforts. In addition, companies have sprouted to serve the need for continued software development, such as Boundless, which was acquired by Planet in 2018. This business model works to a point, and the lines blur between “free” software and the need to produce fit-for-purpose solutions on top of it, which increases total cost of ownership. But make no mistake: open source software competes with commercial off-the-shelf software.
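
The “free to download, not free to operate” point is easiest to see in the open source stack itself: the libraries are capable, but the integration work falls to the user. Below is a minimal sketch using the GDAL/OGR Python bindings, the kind of building block that projects such as QGIS and MapServer sit on top of; the shapefile name is a hypothetical placeholder.

```python
# Minimal sketch with the open source GDAL/OGR Python bindings: read a vector
# layer, report its spatial reference, and print feature centroids.
# "cities.shp" is a hypothetical placeholder file.
from osgeo import ogr

ds = ogr.Open("cities.shp")
if ds is None:
    raise SystemExit("could not open cities.shp")

layer = ds.GetLayer(0)
srs = layer.GetSpatialRef()
print("Feature count:", layer.GetFeatureCount())
print("Spatial reference:", srs.GetName() if srs else "unknown")

for feature in layer:
    geom = feature.GetGeometryRef()
    centroid = geom.Centroid()
    print(feature.GetFID(), centroid.GetX(), centroid.GetY())

ds = None  # GDAL convention: dereference the dataset to close it
```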

Acquisitions Shape Direction of GIS

During the early to late 2000s, several acquisitions shaped the competitive landscape. Pitney Bowes purchased Group 1 Software in 2004 and MapInfo Corporation in 2007, strengthening its position in geocoding and GIS. Hexagon AB has acquired five geospatial companies: Leica Geosystems in 2005, ERDAS in 2009, Intergraph in 2010 and, more recently, Luciad in 2018 and Thermopylae Sciences and Technology in 2019. Bentley Systems seems to have been on a tear recently, acquiring several companies to strengthen its position in 3D data analysis and building information modeling. Esri’s acquisitions have included CACI in 2002 for demographic data and, more recently, ClearTerra in 2018 and indoo.rs in 2019. All point to a restructuring of the geospatial industry to capitalize on the need to manage more location-based data and serve an intense appetite by government and private industry for geospatial information.

Data Marketplaces

Over the intervening years of this second quarter century, the one constant is that you cannot separate geospatial data from software. Georeferenced data is unique to Earth observation and analysis, and the software that processes these data must include functionality for coordinate geometry, topology, topography, projections and more. The availability of these data has been problematic for users, who are presented with myriad vendors of raster or vector data and left to fend for themselves in figuring out how to acquire data for their use.

Numerous attempts have been made to develop data marketplaces, both commercial and government. Portals such as those from the USGS, as well as more generic data portals such as Data.gov in the U.S. and numerous European geospatial data portals, exist to support open data standards. Commercial data marts such as MapMart (now owned by Harris) attempted to provide a wide selection of data for download. Today, Maxar/DigitalGlobe, Pitney Bowes and Esri offer portals for data discovery, download and purchase.

Market Research Companies Recognize LI

In the last several years, market research firms have increasingly recognized the impact of location intelligence on the broader IT domain. Firms such as Gartner, Forrester, Ventana and IDC, which have traditionally focused on business intelligence, have finally taken note of the abundance of companies that rely on location-based data for a competitive advantage, and of those that are now de facto data companies because of the enormous amounts of data they capture: from insurance companies that develop risk models based on historical perils and need to understand how their book of business would be affected by even the slightest uplift in geocoding accuracy, to wireless telecommunications companies that ping mobile devices to ascertain signal quality. The market research companies have therefore acknowledged that location intelligence is a fundamental component of the IT infrastructure and that GIS has become a general purpose technology; that is, try doing without it.

Ventana conducted a market research study in 2013 validating that commercial organizations were growing their investment in location intelligence. Forrester has published two “Waves,” its method for identifying the key technology providers, in 2016 and 2018; in each, Esri and Pitney Bowes were recognized as the leading providers. Gartner, while not creating either a Magic Quadrant or a Hype Cycle for location intelligence, has placed “Geospatial and Location Intelligence” on several Hype Cycle curves, though it seems to have been challenged to place the technology either at the “peak of inflated expectations” or along the “plateau of productivity.” Either way, Gartner has deprived those of us who have been in this business for many years of the opportunity to see the technology enter the “trough of disillusionment.” For a technology that has been around as long as GIS, Gartner’s assessment seems confusing.

Embedded Geo

Recognizing all that has transpired during this second quarter century of GIS, I would be remiss if I did not mention how certain businesses could not function today without location-intelligent solutions. Uber and Lyft, for example, would not exist had they not embedded geospatial technology as the foundation of their business models and technology platforms, resulting in a customer experience where the user expects to see a map. They exemplify the importance and potential of location-based information to expand beyond the world of GIS. But the fact is, the underlying technology, the use of geocoding and machine learning, is transparent to the user. And in that lies the future of commercial GIS.

If you missed Part 1, you can catch up here.

