Directions Magazine



Inundation Analysis of Houston

Thursday, August 29th 2013


The City of Houston's relatively flat topography, plentiful rainfall and large areas of impervious surfaces make it prone to flooding.  This article focuses on scenario-based HAZUS-MH riverine flood model information, including where flooding might occur and how much water might be present in different areas of the city.  The project taps ArcGIS Server and Flex, along with crowdsourcing, to understand inundation during and after major rain events.  Author Larry T. Nierth is a GIS Supervisor for the City of Houston’s Enterprise GIS Division within the Planning Department.

Introduction and Background:
Three known sources of flood data for the Houston area are the City of Houston, the Harris County Flood Control District (HCFCD) and the Federal Emergency Management Agency (FEMA).  While these sources have varying levels of national, regional and local focus, there is a need for a single place where a Houstonian can go to get simple answers about a specific parcel's susceptibility to flooding.  Much of the time, flood data are presented to citizens as general polygon data, which designate an area as "in" or "out" of a particular floodplain (e.g. 100-year or 500-year).  While this is very good information, the polygon data do not attempt to inform the user about the actual inundation depths that could occur.
Figure 1:  Polygon vs. raster data for the same location
This project provides Houston residents a more granular look at where and how much water can be expected through the use of inundation analysis with raster layers.  Citizens can not only get an idea of where flooding may occur based on HAZUS-MH, but they are also provided with a mechanism for contributing their own observations directly on the map.  The software used for mapping and analysis was ArcGIS (10 and 10.1), the HAZUS-MH 2.1 plug-in module for flooding, as well as the Esri extensions of Spatial Analyst and 3D Analyst.  For the Web components, ArcGIS for Server, Esri CloudBuilder and Adobe Flex were used.   
Goals and Objectives:
There were four primary goals to accomplish with this project.  The first goal was to create a separate set of inundation raster layers for three scenarios, including 100-, 500- and 1,000-year flood events for the stream reaches in each study area.  The second goal was to calculate the percentage of flooded area for all affected parcels.  This involved capturing the maximum inundation cell value per parcel.  In the example below (Figure 2, left), the parcel outlined in red is 100% flooded at various depths for this particular scenario.  
Figure 2:  Flood depth parcel example (left) and 3D rendering of Study Area 2 (right)


The third goal was to create a map set, summarizing the analysis and findings, incorporating various three dimensional (3D) scenes, rendered to show the extent of each scenario's flood waters (Figure 2, right).  Actual building heights from full-feature 2008 Lidar points (aggregated to building footprints) were used for extrusion in ArcScene.  The color ramp being used starts with green (shallow water depths) and moves to dark blues (deeper water). These maps explore each study area and attempt to answer questions such as, "Where is inundation occurring?" and "Which land-use category floods the most?"
The project is broken down into four key areas within south-central Houston.  These areas include Downtown, Midtown and Third Ward, the Medical Center, and Linkwood Park areas.  Figure 3 shows the results of running a preliminary 1,000-year maximum flood event across the region with HAZUS-MH.  Inundated areas are in black on a bare-earth DEM shown in blue hillshade.  Many factors had to be considered, including data size constraints, PC processing power, software limitations and time for project completion.  Additionally, the City of Houston is locally divided into 88 regions (for cooperative planning purposes) collectively known as “Super Neighborhoods.”  Flooding totals needed to be summarized for these locations as well.
Figure 3:  Study area locations
The final goal was to create a Flex application website that displays the inundation data, and allows citizens to get involved and input their own flood observations.  Users of the application can find addresses and examine inundation values nearby.  The Flex application runs in a standard Web browser (Internet Explorer, Chrome or Firefox) that citizens can pull up on their computers and enter their own observations about the flood depths on their property by creating a point on the map within the application.  The points added by the user have geodatabase domain-controlled input for inundation value attributes and they can also add comments into a separate field. 
The data used for the creation of the inundation surfaces were the National Elevation Dataset (NED) 1/9 arc-second DEMs, resulting in raster layers that were approximately 3 meters in resolution.  The Manning's Roughness Coefficient (or n-value) used was 0.050, which corresponds to the categories of "Developed, Medium Intensity" and "Developed, High Intensity" land cover (FEMA, 2013).  
The first step was to generate the inundation raster grids for the chosen flood scenarios.  In order to generate flood plains and corresponding raster inundation grids, this study used DEM data that extended far beyond the stream reaches of the study area.  This was necessary to achieve successful hydraulics and hydrology analysis processes within the HAZUS-MH extension of ArcGIS.  Once the DEMs were added, HAZUS-MH generated the stream reach system.  This process was repeated many times, varying the inputs, before a stream network was produced that mimicked reality.  This project used the 2001 Lidar-derived stream network to validate the model's output for stream placement.  The final input variable used for stream generation was a 4.1-square-mile search radius.  Once the stream reaches were built, the model generated inundation raster surfaces by running flood scenarios against the stream and surrounding DEM area.  The result, shown in Figure 4, is a set of raster layers that vary in extent and contain cell-based inundation values across the study area for all three floods at a 3-meter resolution.
Figure 4: Example of a single 3-meter pixel value for the same location across all three scenarios
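The hydraulics internals of HAZUS-MH are not exposed, but the core idea behind each inundation grid can be sketched: a cell's depth is the modeled flood water-surface elevation minus the ground elevation from the DEM, floored at zero for dry cells.  The minimal Python/NumPy sketch below uses hypothetical elevations and water levels, not actual model output:

```python
import numpy as np

# Hypothetical 4x4 bare-earth DEM (ground elevation, meters)
dem = np.array([
    [12.0, 11.5, 11.0, 10.5],
    [11.8, 11.2, 10.6, 10.2],
    [11.5, 10.9, 10.3,  9.8],
    [11.2, 10.5,  9.9,  9.5],
])

def inundation_depth(dem, water_surface_elev):
    """Depth grid: water surface minus ground, floored at zero (dry cells)."""
    return np.maximum(water_surface_elev - dem, 0.0)

# Illustrative water-surface elevations standing in for the 100-, 500-
# and 1,000-year model runs (hypothetical values, not HAZUS output).
scenarios = {"100yr": 10.4, "500yr": 10.9, "1000yr": 11.4}
grids = {name: inundation_depth(dem, wse) for name, wse in scenarios.items()}
```

As in Figure 4, a given pixel stays dry in the 100-year grid but takes on progressively deeper values in the 500- and 1,000-year grids.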
Once the inundation surfaces were generated, this study used parcel polygons to determine flood area and maximum depths per parcel.  The “Zonal Analysis” tool in Spatial Analyst was used to determine the maximum depth pixel of inundation for each parcel that was flooded.   The next step was to run the Tabulate Area tool in Spatial Analyst in order to determine total inundation area per parcel and calculate percent of total area flooded.  The flooded area and maximum water depth information was then appended to the parcel polygons for each flood scenario (Figure 5).  The fourth step was to aggregate the parcel data by land-use type and report the maximum flood depth for each scenario per land-use type.  Additionally, the data were aggregated to Super Neighborhood boundaries.  This study summarized pixels for all parcels that contained flooding within a Super Neighborhood to calculate overall flood volume.  Figure 5 shows a flooded area and highlights the pixel of maximum flood depth.
Figure 5:  Flooded area and maximum depth calculation
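The two per-parcel steps described above, maximum depth and flooded area, can be sketched in plain Python/NumPy.  The project itself used the Spatial Analyst Zonal Analysis and Tabulate Area tools; the depth grid and parcel IDs below are hypothetical:

```python
import numpy as np

# Hypothetical depth grid (meters) and parcel-ID grid of the same shape
depth = np.array([
    [0.0, 0.2, 0.5, 0.0],
    [0.0, 0.4, 1.1, 0.3],
    [0.0, 0.0, 0.8, 0.6],
])
parcels = np.array([
    [1, 1, 2, 2],
    [1, 1, 2, 2],
    [3, 3, 2, 2],
])

def parcel_flood_stats(depth, parcels, cell_area=9.0):  # 3 m pixels -> 9 m^2
    """Per parcel: maximum depth, percent of area flooded, and flooded area."""
    stats = {}
    for pid in np.unique(parcels):
        mask = parcels == pid
        flooded = depth[mask] > 0
        stats[int(pid)] = {
            "max_depth": float(depth[mask].max()),          # zonal maximum
            "pct_flooded": 100.0 * flooded.sum() / mask.sum(),
            "flooded_area": float(flooded.sum() * cell_area),  # tabulate area
        }
    return stats

stats = parcel_flood_stats(depth, parcels)
```

Each parcel record then carries the attributes that were appended to the parcel polygons for each flood scenario.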
To make the information available to citizens, the data needed to be in a Web-enabled map application.  Utilizing Adobe Flex, the application was built with CloudBuilder and managed on Amazon.  Creating the application in the cloud gave considerable advantages with regard to flexibility of application development.  Once all prototyping and development was complete, the application was moved to city servers.  The application not only displays the inundation surface information, but also allows for citizen input. Citizens contribute their own flooding observations through an editable point feature service, stored in SQL Server with domain controls.  Users of the application can locate a parcel based on address or coordinates, and query the different scenario-based inundation surfaces to get estimated flood values that correlate to the 100-, 500- or 1,000-year HAZUS-MH generated flood scenarios. 
Project Results:
The inundation raster grids were compiled together for each model-run scenario and displayed with parcel information.  An example of a 100-year inundation surface can be seen in Figure 6 (left) and the zoomed-in polygon extents converted from these raster layers are shown in Figure 6 (right).  
Figure 6:  Several close-ups of the inundation surface raster (left) and extent polygons (right)
The completion of Tabulate Area analysis gave every parcel a flooded area, based on overlapping raster inundation grid coverage, and the maximum inundation 3-meter pixel value was appended to each parcel through the Zonal Analysis tool.  An example of the Zonal Analysis (Figure 7, left) and Tabulate Area analysis (Figure 7, right) is shown below.  Having this information in the attribute table allowed parcel polygons to be mapped quantitatively by maximum depth or flooded area, and any parcel affected could have its maximum flood depth and flooded area compared across the three different scenarios.  An example of this can be seen below (Figure 8), where six locations were sampled across an area encapsulating the University of Houston-Downtown campus.  Two of the sample locations (#3 and #4) fall inside the parcel boundaries of the university, and they only display inundation values from HAZUS-MH at the 1,000-year flood extent.
Figure 7:  Examples of parcel-level spatial analysis: Zonal Analysis (left) and Tabulate Area (right)
Having the raster data as well as the vector polygon data serves two purposes.  First, mapping parcel polygons in two dimensions (2D) is most helpful for citizens looking to see how much water may occur on their land.  Second, the raster layers allow for a more organic view of each flood scenario unrestrained by invisible parcel lines.  Inundation values can be attained across the entire study area outside of parcels, within right-of-way and within utility easements.  This information helps officials to better understand the dynamics of the flooding event.
Figure 8: Sampling results and locations
The aggregation of the parcel-polygon data to the Super Neighborhood level revealed that some of these neighborhoods are more apt to flood than others, based on the HAZUS-MH generated raster grids.  Six of the Super Neighborhoods affected by the three flood scenarios are compared below.  The Super Neighborhood with the greatest amount of flooded area as observed from the models was Greater Eastwood.  
Figure 9:  Super Neighborhood flood comparison chart
At the 1,000-year flood scenario, this Super Neighborhood received approximately 3.9 square miles of inundated surface area (Figure 9).  The Super Neighborhood with the least amount of area affected was Midtown.  At the 100-year flood scenario, this area received about 1/23rd of a square mile of inundation area, and most of this was confined to the sub-grade areas of Highway 59 on its southeastern border (Figure 10).
Figure 10:  Map of minimum and maximum flood scenario examples
The parcel-polygon flood areas were also aggregated to each study area, and compared across the three scenarios based on land-use types.  While not all land-use types were present in all three scenarios for a given study area, there were enough categories present in each scenario to paint an overall picture of the severity of flooding for each study area.  The charts below (Figure 11) represent the summarized flooded area for each study area by land-use type.  Study Area 1 (Chart A) shows the greatest amount of flooding occurs on single-family residential property.  Ideally, undeveloped areas would experience the most flooding, not residential locations.  The flooded areas for single-family residential were so high that the chart was interrupted between the values of 1,000,000 and 4,000,000, so that all other land-use values could be displayed.  Study Area 2 has less residential flooding, but has massive flood totals for public and institution property (Chart B).  Study Area 3 showed high flood totals for single-family residential and undeveloped parcels, and somewhat less for industrial property (Chart C).  Study Area 4 has the most flooded area in the category of undeveloped land (Chart D), and there were no single-family residential areas.  This chart's values were interrupted between 300,000 and 1,100,000 for display.
Figure 11: The flooding statistics for the four study areas, broken down by land-use type
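The land-use aggregation behind these charts works like a standard group-by over the parcel table.  A minimal Python sketch, with hypothetical parcel records and flooded-area values:

```python
# Hypothetical per-parcel records; real attributes came from the
# Zonal Analysis / Tabulate Area steps described earlier.
parcels = [
    {"land_use": "single-family residential", "flooded_area": 4200.0},
    {"land_use": "single-family residential", "flooded_area": 1500.0},
    {"land_use": "undeveloped",               "flooded_area": 3000.0},
    {"land_use": "industrial",                "flooded_area":  800.0},
]

# Sum flooded area per land-use category
totals = {}
for p in parcels:
    totals[p["land_use"]] = totals.get(p["land_use"], 0.0) + p["flooded_area"]

# Rank categories by total flooded area, largest first
ranked = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```

The same pattern, run once per scenario per study area, produces the values plotted in Figure 11.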


Four maps highlight each of the inundation surfaces for each HAZUS-MH scenario of each study area. In addition to the main map, several parcel examples are displayed, broken down by scenario-based flooded area.  A land-use inset map is also included for each study area in the table of contents.  

Figure 12:  Map set of the study areas


For the public-facing Web map, the Flex application inundation surfaces were converted to non-generalized polygon layers.  This was done for two reasons.  First, it speeds up the identify function.  Second, the identify results (when highlighted on the map) show all areas with the same water elevation.  This allows citizens to not only see the value of a selected location, but other areas in the view extent with the same water depth.  The list of layers can be brought up and the user can toggle between 100-, 500- and 1,000-year flood events on the display (Figure 13).  Additionally, an edit service allows users to add points and comments to the maps, as well as to select an inundation value from the domain-controlled, pull-down menu.     

Figure 13:  Screenshot of the Flex application
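The equal-value identify behavior can be sketched as a simple mask over the depth values.  The grid below is hypothetical; the real application queries the non-generalized polygon layers through ArcGIS for Server rather than raw cells:

```python
import numpy as np

# Hypothetical inundation depth values (meters); an "identify" click
# returns the value at one cell, then highlights every cell sharing it.
depth = np.array([
    [0.5, 0.5, 1.0],
    [0.5, 1.0, 1.0],
    [0.0, 0.0, 0.5],
])

def identify(depth, row, col):
    """Return the clicked value and a mask of all cells with that value."""
    value = depth[row, col]
    return value, (depth == value)

value, mask = identify(depth, 0, 0)  # click on a 0.5 m cell
```

Highlighting the whole mask, instead of just the clicked feature, is what lets a citizen see every other area in the view extent with the same water depth.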

These crowdsourced points will be compiled after major rain events (not just those classified as 100-, 500- or 1,000-year events) to identify where citizens are experiencing localized flooding.  Comparing the actual flood observations from major rain events against the model inundation grids will help determine how closely an actual event matches a particular modeled flood scenario.  Additionally, the crowdsourced points will be converted to raster layers yearly using point density analysis so that each successive year's flooding raster can be compared with the prior year's.  Comparing each yearly flood density raster to its predecessor can show whether an area's likelihood to flood is increasing or decreasing over time.
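The yearly density comparison could be sketched with a simple per-cell count grid as a stand-in for the Spatial Analyst point density tool.  The report coordinates below are hypothetical, placed in a local 100-meter square:

```python
import numpy as np

# Hypothetical crowdsourced flood-report coordinates (x, y) for two years
year1 = np.array([[10, 12], [11, 13], [50, 60], [52, 61]])
year2 = np.array([[10, 11], [12, 12], [11, 14], [90, 90]])

def density_grid(points, size=100, cell=25):
    """Count reports per grid cell; a crude stand-in for point density."""
    bins = np.arange(0, size + cell, cell)
    grid, _, _ = np.histogram2d(points[:, 0], points[:, 1], bins=[bins, bins])
    return grid

# Positive cells = more flood reports than last year, negative = fewer
change = density_grid(year2) - density_grid(year1)
```

Cells that trend positive year over year flag areas whose likelihood to flood may be increasing.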
There are several advantages of this project for Houston residents.  This project informs City of Houston citizens about possible flood locations and depths using maps of scenario-based floods.  The public-facing Web map interface enables Houston residents to participate directly in refining the model by adding their own observations to the editable feature service within the inundation Web application.  Future flood modeling efforts will be aided by the gathering of actual flood observations from citizen inputs.
Future Project Use and Refinement:
This project encapsulates approximately 47 square miles of Houston (shown in the black box in Figure 14).  The study area would need to be greatly expanded, probably in phases, to incorporate the roughly 2,316 square miles of area needed to create inundation surfaces citywide (see below).  The intention is to expand the stream reach network and create inundation grids first within Loop 610, and then, out to Beltway 8.  The crowdsourced data will be gathered for areas beyond the study area in this project, but scenario-based inundation data will not be visible within the application.
Figure 14:  Project area and expansion
There are also limitations to the accuracy of these inundation grids, since they are based on hydraulics and hydrology processes from HAZUS-MH, and assume a standard roughness coefficient across the entire reach. Additionally, these scenario-based inundation raster layers are based on U.S. Geological Survey historical stream gauge data (not continuously updated water elevation levels).  Enhancements to the inundation grids through either level 2 (or higher) HAZUS-MH analysis or HEC-RAS study will increase the accuracy of the model data.  
Additional enhancements to the project include automating the method of capturing the maximum flood value per parcel and total flood area per parcel with model builder and Python.  Aggregation of parcel level data to study areas, land-use types and Super Neighborhood boundaries can also be scripted in the same manner.  The generation of the inundation grids themselves, however, cannot be automated at this point.
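A library-agnostic sketch of that automation: loop the per-parcel maximum-depth step over all three scenario grids at once.  The arrays here are hypothetical; the production script would drive the Spatial Analyst tools through arcpy and ModelBuilder rather than NumPy:

```python
import numpy as np

# Hypothetical parcel-ID grid and one depth grid per scenario
parcels = np.array([[1, 1, 2], [1, 2, 2]])
scenarios = {
    "100yr":  np.array([[0.0, 0.1, 0.0], [0.0, 0.3, 0.2]]),
    "500yr":  np.array([[0.1, 0.3, 0.2], [0.1, 0.6, 0.4]]),
    "1000yr": np.array([[0.3, 0.5, 0.4], [0.2, 0.9, 0.7]]),
}

def max_depth_per_parcel(depth, parcels):
    """Zonal maximum: deepest inundation pixel within each parcel."""
    return {int(pid): float(depth[parcels == pid].max())
            for pid in np.unique(parcels)}

# One pass replaces the manual per-scenario steps
results = {name: max_depth_per_parcel(grid, parcels)
           for name, grid in scenarios.items()}
```

Scripting the aggregation the same way would let the city rerun all parcel, land-use and Super Neighborhood summaries whenever the inundation grids are regenerated.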
References:
FEMA (2013). Multi-hazard Loss Estimation Methodology: Flood Model, Ch. 4, p. 4-74, Figure 4.49.  Retrieved June 2013 from the FEMA Hazus Documentation Center.
