SJ Camarata, ESRI
A major theme of 2007 was the exceptionally strong growth and emergence of server- and Web-based GIS in the market, especially in the enterprise arena. This in turn has helped drive a plethora of new and innovative uses of GIS, both within existing organizations and in new ones. For example, on the governmental side, NATO is now moving to implement a complete, organization-wide GIS through a strategy that encompasses enterprise and Web-based GIS for a wide variety of applications.
On the private sector side of the market, many large corporations are embracing and deploying enterprise-wide GIS, both extending the use of GIS throughout the company and integrating it more deeply with IT operations and applications to leverage IT-based investments.
People have been saying this for a while now, but it has never been more true than today - GIS is going mainstream. And it is becoming part of the Web's 'ecosystem'. One example is that recent advancements in GIS servers allow for greater use of useful mash-ups that leverage technologies like REST APIs. Powerful geoprocessing functionality is also being made available to much larger user bases through GIS servers. Another example is that geospatial content is shifting from an operational expense to a financial asset in many organizations. Yet another significant impact is that the use of mobile geospatial/location-based applications is growing and will continue to grow at an even greater pace. With billions of mobile devices now in use throughout the world, and with GIS tools, applications and content readily available, both from a technology perspective and from a business perspective, GIS will expand even further. Enterprise and Web-based GIS infrastructure, with GIS servers as its backbone, is driving this.
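As a hedged illustration of the server-based mash-up pattern described above - the host, service path and parameters here are hypothetical, not any vendor's actual API - a mash-up client typically just assembles a parameterized REST request against a GIS server:

```python
from urllib.parse import urlencode

# Hypothetical GIS server REST endpoint; the host and service name
# are invented for illustration, not a real published API.
BASE = "https://gis.example.com/rest/services/Roads/MapServer/export"

def export_map_url(bbox, size=(400, 400), fmt="png"):
    """Build a map-export request URL from a (minx, miny, maxx, maxy) bounding box."""
    params = {
        "bbox": ",".join(str(v) for v in bbox),
        "size": f"{size[0]},{size[1]}",
        "format": fmt,
        "f": "json",  # ask the server to describe the result as JSON
    }
    return f"{BASE}?{urlencode(params)}"

url = export_map_url((-118.3, 33.7, -118.1, 34.0))
```

The point of the pattern is that any Web page or script that can fetch a URL can consume the server's output, which is what makes these services easy to combine into mash-ups.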
GIS use at all levels (mobile, desktop, Web based, server based, enterprise wide) will continue to grow. GIS will be integrated even further with mainstream IT applications and operations. And content-focused players in the market (such as Google, Microsoft, Nokia via its NAVTEQ acquisition, Tele Atlas via its new ultimate acquirer, DigitalGlobe, GeoEye, etc.) will make greater investments in higher quality and greater volumes of content.
In 2007 we continued to see public and private organizations pile up remote sensing and photogrammetric data at an increasing rate. Despite making huge investments, organizations are swimming in unused and rapidly aging data that should be quickly applied to solving their end users' problems. Leica Geosystems Geospatial Imaging took great strides toward solving this problem in 2007 and, I believe, started an important, ongoing process that will revolutionize the geospatial industry.
Leica Geosystems has displayed a keen understanding of customer needs by developing technology to support geospatial business systems across the enterprise. In 2007, the company announced three strategic acquisitions - Acquis, IONIC and ER Mapper - each supporting a different aspect of its ongoing strategy to meet geospatial business needs, and each further enabling Leica Geosystems to provide comprehensive, dynamic geospatial solutions for these growing areas. Acquis is very strong in multi-user topological editing everywhere, including support for mobile, Web and rich client desktop environments. IONIC provides an open, interoperable, secure and scalable geospatial platform in its RedSpider suite; IONIC's market position and strength in the Open Geospatial Consortium/International Organization for Standardization (OGC/ISO) community extend the company's geospatial domain expertise, and with strong OGC Web Service and ISO metadata support, IONIC also provides a business platform for Leica Geosystems, partners and customers to use in building vertical market solutions. Finally, ER Mapper's Image Web Server (IWS) delivers imagery rapidly to thousands of users in a variety of Web and desktop applications. By acquiring ER Mapper, Leica Geosystems also now has two world-leading remote sensing image-processing products, ER Mapper Professional and ERDAS IMAGINE. In addition to the acquisitions, Leica Geosystems released Leica TITAN, an innovative online solution for sharing geospatial data, Web services and location-based content in a single, secure environment.
Starting from a position of strength, Leica Geosystems is preserving and improving customers' investments in producing accurate and reliable geospatial content, accessible everywhere, and is extending the use of source content to solve the location-specific business problems customers encounter on a daily basis. These strategic acquisitions and solution integrations were important industry events in 2007, positioning Leica Geosystems, the experts in geospatial imaging, as a leader in the rapidly expanding geospatial information market well into the future.
In 2007 we witnessed a surge in activity relating to SDIs, or Spatial Data Infrastructures, driven by a key event of the year: on 15 May 2007, INSPIRE [Directive 2007/2/EC of the European Parliament and of the Council of 14 March 2007 establishing an Infrastructure for Spatial Information in the European Community (INSPIRE)] came into force.
Its significance will ultimately be measured by how well public sector data are combined to drive cross-administrative-boundary infrastructure planning decisions. I choose that somewhat ugly phrase because the key measure of success will be the impact at a regional level; Europe refers to this process not as federalism, but as subsidiarity. Drafting Teams are working diligently to implement INSPIRE principles. In keeping with the current mantra of joined-up thinking, there seem to be areas where INSPIRE must intersect with other EU initiatives, namely Public Sector Information and the Lisbon Strategy.
The Lisbon agenda was set for a ten-year period in 2000 in Lisbon, Portugal by the European Council. It broadly aims to "make Europe, by 2010, the most competitive and the most dynamic knowledge-based economy in the world," with its broader objectives to be attained by 2010. INSPIRE's implementing phases start in 2009, so the two are parallel activities. If INSPIRE is viewed as a supporting policy instrument for delivering on the Lisbon agenda, that raises the question of who is responsible for the knowledge management aspects of INSPIRE.
Public Sector Information is a directive in its own right [Directive 2003/98/EC of the European Parliament and of the Council of 17 November 2003 on the re-use of public sector information]. It seeks to extend the knowledge economy by re-using public sector data. There is a vast amount of information relating to the business rules that govern spatial datasets in Europe (and elsewhere). These spatial data have accumulated across the public sector (and other sectors) over the last 15-20 years, and the information relating to these datasets is typically stored in people's heads, Excel spreadsheets, Microsoft Word documents and UML models, scattered across organizations. There is no evidence yet that INSPIRE is going to change this situation. In fact, there is a body of evidence suggesting that the GIS industry in Europe doesn't understand the re-use concept: funding within the eContentplus program for spatial data was under-utilized in 2005.
INSPIRE should create an opportunity to deliver benefit to the European knowledge economy by creating repositories of rules for each of the 34 themes (combining Annex I, II and III) that transcend map sheets and national boundaries. It will be an interesting challenge.
Dr. Carl Reed
2007 was the year that location, and geography, became cool. By cool, I mean not just that millions of consumers are using more and more Web based mapping and location based decision tools, but also that key business markets have changed their perceptions regarding the value of digital geography and the importance of location. Further, there is increased understanding that the properties of geography and location are fundamental to almost every individual or business decision. As just one example, just about the hottest techie gift this year is a GPS based navigation system! There are even prime time advertisements for these technologies. The recent acquisitions of NAVTEQ and Tele Atlas, location based services (finally) beginning to realize their potential, the increased activity in solving the CAD-GIS integration problem, the continuing stream of product announcements in many non-traditional GIS sectors that incorporate location applications - these are all evidence.
In the background, 2007 also saw considerable activity in the standards and policy forums that will ensure the continued integration of geography into an ever-broadening set of consumer and business applications as well as decision support systems. For example, the INSPIRE directive has mandated the use of international standards to enable the development of a pan-European SDI. NATO C3, NGA, and GEO (GEOSS) have policies that provide guidance on which specific geospatial standards must be used for applications that require data sharing and geospatial service integration.
At an even more fundamental level, there is now a set of Internet standards that mandate standard location payload encodings. These Internet standards include extensions to DHCP, PIDF, and SIP. While many have never heard of these standards, SIP, for example, is used to establish, modify, and terminate multimedia IP sessions including IP telephony, presence, and instant messaging. A DHCP packet is transmitted from your IP enabled device every time you connect to the Internet. The point is that location and location payloads are quickly becoming ubiquitous and an integral component of Internet and Web applications - what has come to be known as the Geospatial Web. And 2007 is the year that market forces, standards activities, and policy all converged to create the "tipping point" making geography cool - and important. Perhaps now, as a global community, we can take this enthusiasm and use that energy to help address the many social and environmental issues facing humanity. Then the importance of geography and location will truly be transcendent.
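To make the idea of a standard location payload concrete, the PIDF Location Object (PIDF-LO, defined in RFC 4119 and later clarified by RFC 5491) carries a GML point inside an ordinary presence document. The fragment below is simplified and adapted from the RFC examples; the entity and coordinates are invented for illustration:

```xml
<presence xmlns="urn:ietf:params:xml:ns:pidf"
          xmlns:gp="urn:ietf:params:xml:ns:pidf:geopriv10"
          xmlns:gml="http://www.opengis.net/gml"
          entity="pres:someone@example.com">
  <tuple id="loc1">
    <status>
      <gp:geopriv>
        <gp:location-info>
          <gml:Point srsName="urn:ogc:def:crs:EPSG::4326">
            <gml:pos>32.86726 -97.16054</gml:pos>
          </gml:Point>
        </gp:location-info>
        <gp:usage-rules/>
      </gp:geopriv>
    </status>
  </tuple>
</presence>
```

Because the same payload format rides inside SIP messages and can be referenced via DHCP-delivered configuration, any Internet application that understands presence can also understand location.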
While most industry professionals agree that geospatial technology has indeed become mainstream, it is apparent that we still have a lot of work to do to explain why it is such a valuable tool. This is particularly true for non-geospatial managers and those executives who sign the checks for acquiring geospatial information technology (GIT) products and services. I think that the next hurdle for broader use of the technology is more a financial barrier than a technology-related one: that is, the value of implementing geospatial technology must be provable.
Up until recently, this has not been as easy to deal with as one would think. Geospatial applications are often complex, lengthy and expensive. Because of the significant up-front costs involved in getting a large project off the ground, the rate of achieving anticipated benefits may not coincide with the quarterly bottom line approach characteristic of businesses these days. Further, geospatial proponents are typically focused upon the technology itself and not so much on the end results of the use of the technology from a strictly business perspective. On the other hand, business managers - those controlling increasingly tight budgets - are less concerned with the intricacies and innate technical capabilities of GIT than with having applications pay for themselves as soon as possible (and sooner, if possible) and deliver increasing business value going forward. Thus we have seen a natural tension between investment and payback, increased by the pressure of time on tangible results.
This is a significant issue in that lack of articulated financial benefit, or in most cases, the inability to express it, could eventually delay or even stifle geospatial implementations where long term vision is not part of the organizational approach to technology.
Fortunately, there is a flip side to this problem. Where specific benefits from GIT implementations can be identified and quantified, geospatial practitioners are becoming increasingly successful in convincing executives that their geospatial investments not only pay for themselves, but can stimulate additional and often unplanned profit centers. Studies such as GITA's "Business Case Development and Return on Investment Methodology: A Practitioner's Guide to Strategic Financial Analysis," have yielded documented results in translating tangible and intangible benefits of applications into bottom line dollars and cents. This information, when accurately assembled, prepared and displayed, makes a very convincing argument that is beginning to show executive managers why geospatial is good for the company in a language they know: ROI. Several case studies conducted in conjunction with GITA's project have resulted in successful budget defenses - and even increases - at a time when financial resources are becoming increasingly scarce. Ultimately, geospatial practitioners are in competition with their organizational counterparts for these funds, and a clearly articulated, quantifiable statement of the value that a project will bring to the organization is a nice arrow to have in the quiver.
So, expect the impact of a solid ROI analysis on geospatial applications to become increasingly important in the coming year, as the technology continues to expand into more and more organizations and disciplines.
After all, in order to tell the story, you've got to sell the story.
From my perspective, 'convergence' seems to be the overriding influence in 2007. By this I mean the melding of content, technologies, platforms and even companies. I think it is safe to say it is not about the data, most certainly not about the map; even the platform might be considered only the stage where answers to everyday business problems are served up. From Tele Atlas and NAVTEQ, to MapInfo and the three major BI players, all getting gobbled up and integrated into other offerings, it is clear to me that the total solution is what providers are striving for. IT is looking for more capabilities in fewer technologies, and the winners going forward will likely be those that can provide the appropriate amount of convergence to deliver against these desires.
This continued convergence is important because it signifies increased pressure, primarily from the corporate world, that point solutions risk extinction unless they are integrated into environments that allow for defining, building and deploying 'best practices' for businesses. Let's face it, ETL, data quality and geospatial are just three of many spokes on a complex wheel needed to drive business answers and enhance corporate profits. And everyone in an organization is affected: power users need the flexibility to define best practices, corporate developers need rapid application development tools to deploy those best practices, and business users need simple interfaces to run them.
Technology providers of all sizes are impacted by the convergence. Some will choose to embrace the idea and others will take the huge risk of ignoring it. At SRC, we have advanced our own Geo-BI offering of Alteryx with this convergence in mind and the response has been phenomenal as business leaders and IT professionals have rid themselves of island technologies in favor of environments that are orders of magnitude faster in solving simple to complex business problems.
Convergence shows no signs of slowing down and in 2008 we will likely see a heightened pace of consolidation and a corporate exodus from island technologies like GIS. Perhaps 2008 will be the year of "Better and Faster ....even if it's not Cheaper."
I would identify a key theme of the year as being the growth in non-traditional forms of geospatial data creation and update. This takes many forms. One is "crowdsourcing," or community-generated content. I have talked recently on my blog about the OpenStreetMap project, which has established great momentum this year. For those not familiar with it, their site describes it as "a free editable map of the whole world. It is made by people like you. OpenStreetMap allows you to view, edit and use geographical data in a collaborative way from anywhere on Earth." In several cases, its data is now more detailed and complete than data available from commercial sources - Oxford University uses OpenStreetMap data on its Web site. The Cambridge Cycling Campaign is another example; it uses crowdsourced data for planning cycling routes. And leading navigation system provider TomTom is allowing its users to update its data too, so large commercial companies are getting in on the approach as well as open source projects. Obviously there are valid concerns about crowdsourced data, but there are interesting parallels like Wikipedia, which has been ranked in multiple studies as being as good as or better than traditional encyclopedias like Britannica, as well as some impressive success stories in the geospatial area already, even though we are in the very early days of this approach.
In addition to "traditional" geospatial data like these examples being created by users, we are seeing huge amounts of non-traditional data, one of the simplest examples being geo-referenced photos. The photo sharing web site flickr now has over 32 million georeferenced photos, after launching this capability about a year ago. As location aware cameraphones become much more common, we will see huge growth in the numbers of georeferenced photos being posted online in near real time. This data will have many potential applications in "traditional" areas such as emergency response and outage management, for gaining near real time situational awareness. Integration of real time video from the many security and traffic cameras extends this same idea.
This leads us into non-traditional types of data which don't fit into the crowdsourcing category. There are many examples of georeferenced imagery being used in innovative ways - including Google's Street View (which is a simplified version of the rich data provided by Immersive Media), Microsoft Photosynth, MapJack and EveryScape. Microsoft has driven significant capture of Pictometry oblique imagery for Virtual Earth, and is generating detailed 3D building models on a scale we haven't seen before, using technology it acquired from Vexcel. Google's approach to 3D building models takes us back to the crowdsourced model using its SketchUp product - though Google has also contracted professionals such as architecture firms to help accelerate the process in some cases.
I see a combination of all these factors having a profound impact on the industry.
First, Handheld LBS. The events are Apple's launch of the iPhone, Google's release of Android, Google's intent to bid in the 700MHz auction, and Verizon Wireless announcing they are opening up their network. These things are going to fundamentally alter the LBS business for all time. They lay the groundwork for an explosion of pent-up creativity in LBS applications, which I think consumers will eat up. Consumers have shown an appetite for LBS systems in the way they've been buying in-vehicle navigation systems. I think the overwhelmingly positive experience those nav systems have provided paves the way towards rapid uptake of handheld LBS. The key is that the handheld LBS can't be walled-garden, "toy" applications.
Second, Crowdsourcing. OpenStreetMap has hit critical mass. Maybe in the US we're not as aware of OSM as people in other parts of the world are, but crowdsourcing is here to stay. Google now allows user "corrections" of locations, and MassGIS has experimented with corrections of street data. Crowdsourcing is not a slam-dunk, but I think over the next few years, the ability of individuals to contribute to their larger world-view is going to become significant. This is not just about creating street data and other traditional map data. Rather, think more along the lines of Flickr, YouTube, etc. being used to create a locative cultural layer. What's missing, and what will get built, are the mechanisms for finding just the information you want about a place.
The biggest event in LBS of 2007 that didn't happen is that the iPhone launched without GPS.
How is it possible that a phone that has been awarded best invention of the year by Time Magazine, has already sold 1.5 million handsets, and has captured roughly 0.1% browser market share on the Web and mobile (Windows CE is half that despite being on over 15 million handsets) launched without GPS? With a delightful user interface, generous screen size, and a comparatively easy development environment, this device could have spawned an explosion in consumer LBS products in areas such as buddy finding, LBS games, and all of the other interesting applications we all want to see garner mass consumer adoption.
My prediction for the biggest LBS event in 2008 ... easy ... the iPhone launches with GPS.
There are exceptional writers and thinkers who act as mine canaries for important changes in our technology and culture. William Gibson's Neuromancer (1984) provided a window into the possibility and potential of the World Wide Web, then almost a decade away. Gibson coined the term "cyberspace," which we now use almost universally to express the electronic matrix so important to us today. His 2007 book, Spook Country, is another bellwether novel, forecasting the rise of what Gibson calls the "locative," in which media content is assigned spatial coordinates so that it can only be accessed from a specific location with a GPS-enabled device.
This novel signals the most significant change in Geospatial/LBS technology that is occurring today. We are very rapidly shifting from software locational awareness to hardware locational awareness. GIS software knows where things are relative to other things because of the coordinate system used to locate those things. Everything is represented, measured, analyzed and processed by the coordinate geometry embedded in the software.
Hardware locational awareness relies on an embedded GPS receiver (and its associated software). The hardware always knows its position and accesses content based on that position. The content is software based, of course, but the user will be able to access (or not) "locative" information based on their personal preferences and searches, just as they now access information from the Web. The significant difference will be that the content is limited to a specified distance around the user's device.
Software locational awareness will not decline in importance or mass, because it will still be required to provide a spatial context to content. However, hardware locational awareness will increase exponentially in importance over the next few years. In my opinion, based on over thirty years of geospatial research, implementation and observation, 2007 marks the beginning of the rapid rise of the "locative." How we measure "importance" is a key question. Two ways come to mind: sales and "seats." I would argue that both sales and seats of locationally aware hardware and its associated content will dwarf traditional GIS and contemporary LBS in the next few years.
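The core mechanism of a "locative" client can be sketched in a few lines: keep only the content whose coordinates fall within a set radius of the device's GPS fix. This is a minimal illustration, not anyone's actual product; the item names and coordinates are invented, and a real client would use a spatial index rather than a linear scan:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 points."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def locative_content(device_pos, items, radius_m):
    """Return only the content items within radius_m of the device's GPS fix."""
    lat, lon = device_pos
    return [it for it in items
            if haversine_m(lat, lon, it["lat"], it["lon"]) <= radius_m]

# Hypothetical geotagged content layer
items = [
    {"name": "mural video", "lat": 40.7415, "lon": -73.9897},
    {"name": "audio tour stop", "lat": 40.7420, "lon": -73.9890},
    {"name": "distant artwork", "lat": 40.7600, "lon": -73.9700},
]
nearby = locative_content((40.7416, -73.9895), items, radius_m=200)
```

The filter is the "hardware" half of the equation: the device supplies the fix, and the software merely restricts what the user sees to that neighborhood.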
"Traffic is the New Black" - I only wish I would have coined this phrase - but that distinction goes to Directions Magazine's own Adena Schutzberg as she reflected on some of the major themes she observed at the deCarta DevCon event last month.
Indeed, traffic is the new black - an expression that quite clearly epitomizes the demand for and tremendous uptake of traffic services that we have seen take place this past year, as well as the rapidly changing nature of the LBS market overall. Clearly 2007 was the pivotal year for adoption of traffic in the PND, mobile navigation and telematics space.
Just one year ago, the actual penetration of traffic information into PNDs, automobiles, wireless and other devices was relatively low in North America. By the end of 2006, only about 8% of vehicles were equipped with in-vehicle navigation systems, and most had no way of integrating real-time traffic. We have come a long way in one year, driven by three major shifts in the market: 1) huge growth in the availability of real-time traffic information, from 8,000 miles of roads to over 50,000 miles and from 25 cities to over 100; 2) significant increases in the quality of traffic data; and 3) new business models and technology innovations that have made the bundling of traffic with devices more compelling for consumers.
According to a just-released study from ABI Research, real-time traffic information will become a key feature for navigation, and will be further enhanced by the addition of historical and predictive traffic data to assist drivers in determining the best route. The traffic information market for in-dash vehicle and portable navigation is now projected to catapult to a 229% growth rate between now and 2011, with more than 83 million paid or registered users worldwide by 2012. These are just a few examples of key trends that are driving the aggressive adoption of traffic information services and applications.
As recently as September, I speculated that roughly 30 PNDs would be offering real-time traffic information by the 2007 holiday season. The reality is that INRIX traffic alone will ship in nearly 40 models by the end of this year. "Black Friday" last month dismissed any speculation that the popularity and demand for GPS in the consumer sector was still an "emerging" category. According to the NPD Group, unit sales of GPS devices that week increased six-fold over last year and broke the $100 million barrier. GPS-enabled devices and traffic information are simply a match made in heaven.
All of the leading PND makers now offer real-time traffic - via FM sideband (primarily RDS-TMC), satellite radio, or GPRS - on some of their models. The white hot PND market reflects the tremendous efforts that Garmin, TomTom, Mio and others have invested in developing compelling product lines with a range of affordable price points.
All in all, the undeniable surge in demand for telematics and location-based services, with traffic information leading the charge, indicates that the stakes have been raised in the competition for applications and services that will truly differentiate companies such as personal navigation device manufacturers, mobile services providers, automotive manufacturers, Web portals, and others. The integration of traffic with navigation applications across these devices offers compelling solutions to consumers and businesses.
INRIX is working with many customers on their next generation dedicated and connected solutions, building upon those available this year. However, like Apple's iPhone, these solutions will continue to confuse and confound those who want to categorize them in a single market, as it will be nearly impossible to draw lines between portable and embedded, wireless and PND, disconnected and connected, etc.
The one theme that will be constant in these solutions is the dependence on dynamic content. With the broadest coverage, the most accurate data and relentless innovation in our technology, INRIX offers a triple threat that suggests traffic services could be the next billion dollar baby.
As these next-generation navigation solutions hit the market - the ETA on real-time traffic is now.
The growing importance of BIM (Building Information Modeling) to the GIS industry rates as a significant theme in 2007. This presents both interesting opportunities and new challenges for our industry. On the opportunity side, BIM opens up a new and better way for organizations to create better simulations and more realistic portrayals of our world. As the technology to support 3D in the GIS market matures, people will have vastly superior ways to visualize and understand the world they live in. This will also help address the imminent challenge our industry faces as the majority of the workforce moves toward retirement, taking deep GIS expertise and knowledge with it. The timing of BIM's emergence in the market will help new employees make better decisions based on even more realistic, real world data. For example, our world's ability to manage disasters more effectively can and will be significantly improved with the addition of BIM information to supplement traditional GIS. Not only will fire fighters know where a fire is located within a building, they will be able to visualize the precise layout of the room, down to the actual paint color on the walls.
With the addition of BIM to our market, new challenges are arising and traditional issues are becoming even more complex. Gamers have created fantastic technology for creating realistic worlds - now all we (the GIS industry) need to do is find ways to add real world data. To do this, organizations face the daunting challenge of getting their spatial data into the right format and data model. BIM adds a whole new set of data formats to an already fragmented industry in which spatial data is available today in over 200 vector, raster, CAD and database formats. This leads to even more complex data interoperability challenges as organizations try to address data model mismatches and integrate data from so many formats. Accurate and efficient data model conversion will be key to successfully managing BIM interoperability.
Adding BIM into the vocabulary of our industry is going to have a tremendous impact on our market over the next few years. Support for visualizing and manipulating 3D objects will become commonplace and expected in major desktop GIS packages. People who make BIM will need to become more spatially aware. We're excited to be a part of this amazing journey.
COLLADA (COLLAborative Design Activity) has the potential to significantly affect the spatial data industry and propel GIS into the 3D environment. I think we would all agree that we have the game industry to thank for the graphic performance we enjoy on our computers today. As games have become more sophisticated and lifelike, developers have increasingly sought to use many different software packages to build them. These Digital Content Creation (DCC) tools have specialized capabilities (e.g. scene creation and architectural rendering). As we are all painfully aware, exchanging data between software programs is fraught with problems. In 2005, frustrated game industry folks got together to solve this problem. The result is Collada, an XML based schema for the exchange of 3D digital content. Collada documents are XML files with a .dae file extension. Within these documents, everything is explained so that the receiving program can easily translate the XML file into its native language. For example, something as simple as declaring which axis is "up" is stated explicitly in the document's asset metadata.
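As a simple illustration, declaring that the Z axis is "up" takes a single element in the document's asset block. The fragment below follows the COLLADA 1.4 schema; a real .dae file would also contain libraries of geometry, materials and scenes:

```xml
<COLLADA xmlns="http://www.collada.org/2005/11/COLLADASchema" version="1.4.1">
  <asset>
    <up_axis>Z_UP</up_axis>
  </asset>
  <!-- library_geometries, library_materials, library_visual_scenes, etc. follow -->
</COLLADA>
```

Because every such convention is spelled out in the file itself, the receiving program never has to guess at the exporting tool's assumptions.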
Why is Collada important to the GIS industry? People around the world have created 3D content using many DCC tools. As Collada catches hold, I envision a large amount of digital data becoming available for inclusion in my GIS. Rumor has it that Collada will be supported in the ArcGIS 9.3 release, expected in the summer of 2008. This should mean that I could add data created in any DCC tool that supports Collada to my GIS as a feature. Right now, only SketchUp Pro 6 supports the creation of ArcGIS 3D features, and this requires users to rely on a free, Google created plug-in. I am sure many people see this as a potential impediment to 3D GIS development. Collada can eliminate this apprehension. Already the Google 3D Warehouse has started supporting the download of 3D building models in Collada. As more and more programs support the read and write of Collada files, more data will become available to a much wider audience. Since ArcGIS does not have a TIN editor, Collada might mean that I could use a third party scene creator tool and move my work into my GIS's 3D environment. I can only hope.
Could GIS provide data for the game industry? What if a municipal GIS could provide the data for the high school's driving simulator to help train new drivers? The future is 3D and our audience is going to grow rapidly.
The world's construction industry, whose annual spend is estimated to be $2.3 trillion ($1.2 trillion in 2006/2007 in the US), is facing serious challenges including global climate change, aging infrastructure, a shrinking workforce, and lagging productivity.
To enable architects, engineers, and owners and operators of buildings and infrastructure to address these challenges, the design industry is investing in technologies such as geospatial enabling, model-driven design, interoperability, and 3D simulation.
These technologies are not only going to change dramatically how we design, manage, and operate buildings and infrastructure, but are going to provide important benefits for urban planners, land developers, citizen participation, emergency planners, and first responders.
One of the most exciting things happening in the design software world is the convergence of architectural and engineering design, geospatial technology, and 3D simulation. The vision that the design industry is pursuing is to be able to design a structure and experience it before it is built. The business drivers for this transformative technology advance are productivity and efficiency in the construction and facilities management industry, and improved performance of facilities over their full life-cycle.
Together with recent advances in photogrammetric, laser scanning and other technologies, convergence means that it is now feasible to simulate complete urban environments with engineering precision, including the inside and outside of structures and aerial and underground utility infrastructure.
The good news is that most, if not all, of the basic geometric data that is required often already exists in precision digital form, as architectural plans in the form of building information models (BIM), CAD drawing files, network infrastructure databases, and geospatial vector and raster data.
To leverage this data, it needs to be integrated into a single, interactive model supporting 3D visualization, so that you can visualize and analyze all aspects of the facility - inside, outside and underneath - quickly and easily. This can be achieved if the integrated model is intelligent - for example, if it recognizes different classes of objects such as skeletal structure, walls, floors and ceilings, plumbing, heating and ventilation, telecommunications and utility networks, and terrain.
The objective is to integrate the widest range of precision data - including computer-aided design (CAD), geospatial (GIS), 3D modeling, architectural, and subterranean utility infrastructure data - to deliver a precise synthetic environment that can be used to exploit the inside (utilities, HVAC systems, furniture, elevators, walls, doors, windows, and structural details), outside (aerial utilities, full city blocks of 3D detail, road access), and under (underground water, wastewater, gas, power, and telecommunications systems) of an urban location and make this accessible in web-based 3D visualization engines. We foresee that this will have significant benefits for urban planners, land developers, emergency planners, first responders, and most importantly citizens.