[ Ø ] Harsh Prakash – GIS Blog

Quiet Musings On Applied Spatial (Health, Disaster, Technology, Planning et al.)

Archive for the ‘Education’ Category

Meanwhile, Thirteen Years Later…


So, does it hold up?

The Map (GIS Growth Study) v. The Thing Mapped (Demographics, Plan)

PS: I smell a decentralist –

“A Caveat (from 2001)

Such a planning methodology of data collection and projection does have some intrinsic faults: it relies heavily on knowledge-based skills. It assumes that ‘correct solutions’ to social problems can be obtained from a scientific analysis of various data. It must be noted that a solution-driven approach and heavy reliance on physical sciences as opposed to social sciences, is inherently inaccurate since the ‘best planning answer’ is a non-existent variable, changing with time, society, culture, resource availability, etc. And there is always a danger of being consumed by this technique, and confusing the result for a solution.

The nature of this study involved making some basic assumptions about the way our study-area could evolve in the not-so-distant future. There have been doubts raised about the correctness of such a clinical technique wherein an urban settlement is ‘stripped’ of its various attributes, and these attributes then individually graded. Appreciation of the intricate complexity of human society, where each individual is a separate factor, is absent. Lack of importance to these inter-relationships is a flaw of such an analysis.

For example: in the current study, if we were to discover one other attribute, say a desert, how would it affect the final map? We would, using this approach, simply grade each cell one more time. Then we would add this new map to our list of maps, and calculate the new final map. However, we would fail to evaluate how the addition of a desert affects each of the other attributes individually.

But this flaw may not be as aggravated as it seems. Each cell gains its final value from all attributes. If in a hypothetical case, one could gather a ‘complete list of attributes’ that would impact future growth, and assign them ‘correct values’ (without even breaking them into distance-bands which are only for convenience), finally adding them in the ‘right equation’, one would come up with a case-specific fairly accurate growth forecast (however, even then, any sudden future changes would still get missed).

There have also been some other approximations:

* The integer weights assigned to attributes.

* Or, areas outside the study-area that exert significant impact on urban growth, but were ignored because of study limitations.

* Also, on examining the Cultural Points table, it is found that Cemetery was included as a row category. Cultural Points have been considered as having positive influence on future growth. But a cemetery would not have an entirely positive influence on urban growth. Furthermore, parts of UVA were used as cultural points. The university was also used as a major employer. Thus, there has been some overlapping. This results in disproportionate values for some cells.

But this study is an illustration more of a proactive planning approach, than an accurate projection of urban growth for an area. And even though limited in its effectiveness, any attempt to administer planning remedies would have to include some such non-arbitrary problem-solving technique.”
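For readers unfamiliar with the technique the caveat critiques, the grade-and-sum overlay can be sketched in a few lines of Python. The attribute names, grids and weights below are illustrative only, not values from the 2001 study:

```python
# Hypothetical sketch of a weighted-overlay grading: each attribute map
# scores every cell, and integer weights combine the maps into a final map.

def overlay(attribute_maps, weights):
    """Sum weighted per-cell scores across same-shaped attribute grids."""
    rows = len(next(iter(attribute_maps.values())))
    cols = len(next(iter(attribute_maps.values()))[0])
    final = [[0] * cols for _ in range(rows)]
    for name, grid in attribute_maps.items():
        w = weights[name]
        for r in range(rows):
            for c in range(cols):
                final[r][c] += w * grid[r][c]
    return final

maps = {
    "roads": [[3, 2], [1, 0]],   # illustrative per-cell grades
    "slope": [[1, 1], [2, 3]],
}
weights = {"roads": 2, "slope": 1}  # illustrative integer weights
print(overlay(maps, weights))  # [[7, 5], [4, 3]]
```

Note how adding a new attribute (the hypothetical desert) would only add one more term to the sum; the code never re-evaluates how the new map interacts with the existing ones, which is exactly the flaw the caveat describes.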

Conference Presentation: GIS TECH 201 – Mapping Mashups


Technology Division of the American Planning Association (APA) Report: Summer 2011


This report describes existing services, both external and in-house, available to APA divisions for hosting and broadcasting webcasts to their members and other interested professionals, with a specific look at the external Planning Webcast series. It also analyzes options for expanding these services. The report was produced in response to a request from the APA Divisions Council (DC).

Options for Division Webinars: Summer 2011 (PDF, DOC)

* Planning Webcast series
* APA Audio/Web Conference series

Written by Harsh

September 30th, 2011 at 12:16 pm

Posted in Education, Planning, Technology


New Media


Written by Harsh

June 25th, 2011 at 12:28 pm

Technology Division of the American Planning Association (APA) Awards for 2011


Category 4: The award for the ‘Best Paper on Technology in Planning’ goes to Omar J. Peters’ (University at Albany, SUNY) ‘Why-Fi: A Look at Information Technology as a Strategy for Urban Development’ for the outstanding paper on the use of technology in planning.

Our Award Committee was composed of elected members from the Division Leadership, namely Jennifer Evans-Cowley, Amiy Varma and yours truly. Join us at the award distribution ceremony at our Division Business meeting (National Planning Conference) on Sunday, April the 10th (11:45 AM – 1:00 PM) in Beacon G, Sheraton Boston Hotel. Congratulations again to our award winner!

* Technology Division of the American Planning Association (APA) Awards for 2010

Written by Harsh

March 30th, 2011 at 6:30 pm

Posted in Education, Planning, Technology


Webinar Series: GIS TECH 101 – Mapping Mashups


Technology Division of the American Planning Association (APA) Awards for 2010


Category 1: The award for the ‘Best Use of Technology to Improve a Plan or Planning Process’ goes to Marc Schlossberg‘s (University of Oregon) ‘Engaging Citizens in Active Transportation Planning with Mobile GIS‘ for its creative use of technology in improving planning processes.

Category 2: The award for the ‘Best Use of Technology for Public Participation’ goes to Michael Baker Jr.‘s ‘More For 1604 Social Media Program‘ for its good use of technology to enhance public involvement and participation in planning and decision making processes.

Category 3: The award for the ‘Best Use of Technology for a University Urban and Regional Planning Program’ goes to the School of Policy Planning and Development‘s (University of Southern California) ‘Multimedia Boot Camps‘ for its effective use of teaching with technology in preparing future planners for professional work.

Our Award Committee was composed of elected members from the Division Leadership, namely Jennifer Evans-Cowley, Amiy Varma and yours truly. Join us at the award distribution ceremony at our Division Business meeting (National Planning Conference) on Monday, April the 12th (7 AM) in the Hilton New Orleans Trafalgar Room. Congratulations again to all our award winners!

* Technology Division of APA
* Planning & Technology Today

Written by Harsh

March 30th, 2010 at 3:48 pm

Interview: “Geographic Information Systems (GIS) – It’s Much More Than Google Maps – A Chat With GIS Experts”


Written by Harsh

March 18th, 2010 at 9:27 am

Cost of Living and Higher Education


As I returned from the American Planning Association‘s 2007 National Planning Conference in Philadelphia, I rummaged through some past papers and chanced upon a letter.

Thomas Jefferson or William Penn?

When I look back to why I chose UVA over UPenn, the cost of living in Charlottesville v. Philadelphia, not Public Ivy v. Ivy League, proved to be the determining factor given my finances. Although Charlottesville’s small-town vibe didn’t reconcile well with the “urban” in Urban Planning, and UVA did not play to my love of physical design (focusing more on the sociological aspects of planning that I, well, now believe to be closer to the core principles of planning), it was an enriching ride.

So, as some of you may be deciding on which offer letter to accept this fall, here is a little advice – focus on the one you really want and everything else might just fall in place.

Good luck!

PS: Compared to UPenn, UVA has smaller graduate programs and endowments. And it feeds the Washington DC metro’s job market, while UPenn feeds the New York metropolitan region. So spare a thought for where you would like to spend, or at least start, your professional career. A note for foreign students – UVA has a good number of, for lack of a better word, “southern aristocracy” flocking to its classes, while UPenn has a larger international student population. So stay north of the Mason-Dixon line, if you have a choice.

* USATODAY – ‘Mr. Jefferson would be proud’: Charlottesville is No. 1
* Rural Clusters and Relative Rurality:
  * Albemarle VA | $37,638 | 0.358 | $13,474.40
  * Philadelphia PA | $29,755 | 0.037 | $1,100.935
  * Roughly, the higher the Relative Rurality, the further the dollar would go
* Cities Ranked & Rated: ‘The Ten Best Places to Live [2005]’ and ‘2005 Best Places to Live’
  * 1 | Charlottesville VA
  * 76 | Philadelphia PA-NJ
* Frost, Robert. The Road Not Taken. http://www.poets.org/viewmedia.php/prmMID/15717
* More
* Ways to give

Written by Harsh

April 22nd, 2007 at 12:55 pm

Posted in Education, Social


International Outreach


One of the pleasures of my current job is the annual opportunity to interact with professionals from around the world, thanks to the International Visitor Leadership Program. During these interactions, I share with the visiting delegations how regional government works in the Virginias.

[Photo] Mayoral Delegation from the Republic of Tajikistan, 2006

[Photo] Public and Private Sector Delegation from the Russian Federation, 2005

I always end my presentation on regional governance and GIS with a quick display of Google Earth, when we try to locate the remote places the delegation members come from. As can be deduced from these pictures, the members watch in rapt attention as one private enterprise gives back to the greater common good.

* Theories and Approaches in Local Government Studies

Written by Harsh

January 24th, 2006 at 8:09 pm

Brain Hypnosis


An intriguing article that may help those interested in best meeting project expectations in a team setting. Here is my take on it: for rewards, it is often best if expectations are lower than the actual; for punishments, it is often best if expectations are higher than the actual, so that in both cases the resulting momentum keeps pointing upward. The old adage of “under-promise, over-deliver” follows the same line.


“… The probe, called the Stroop Test, presents words in block letters in the colors red, blue, green and yellow. The subject has to press a button identifying the color of the letters. The difficulty is that sometimes the word ‘Red’ is colored green. Or the word ‘Yellow’ is colored blue.

For people who are literate, reading is so deeply ingrained that it invariably takes them a little bit longer to override the automatic reading of a word like ‘Red’ and press a button that says green. This is called the Stroop effect.

Sixteen people, half highly hypnotizable and half resistant, went into Dr. Raz‘s lab after having been covertly tested for hypnotizability. The purpose of the study, they were told, was to investigate the effects of suggestion on cognitive performance. After each person underwent a hypnotic induction, Dr. Raz said:

‘Very soon you will be playing a computer game inside a brain scanner. Every time you hear my voice over the intercom, you will immediately realize that meaningless symbols are going to appear in the middle of the screen. They will feel like characters in a foreign language that you do not know, and you will not attempt to attribute any meaning to them.

This gibberish will be printed in one of four ink colors: red, blue, green or yellow. Although you will only attend to color, you will see all the scrambled signs crisply. Your job is to quickly and accurately depress the key that corresponds to the color shown. You can play this game effortlessly. As soon as the scanning noise stops, you will relax back to your regular reading self’…

In highly hypnotizables, when Dr. Raz’s instructions came over the intercom, the Stroop effect was obliterated, he said. The subjects saw English words as gibberish and named colors instantly. But for those who were resistant to hypnosis, the Stroop effect prevailed, rendering them significantly slower in naming the colors.

When the brain scans of the two groups were compared, a distinct pattern appeared. Among the hypnotizables, Dr. Raz said, the visual area of the brain that usually decodes written words did not become active. And a region in the front of the brain that usually detects conflict was similarly dampened.

Top-down processes overrode brain circuits devoted to reading and detecting conflict, Dr. Raz said, although he did not know exactly how that happened. Those results appeared in July in The Proceedings of the National Academy of Sciences…”

Sandra Blakeslee

• NYT Article

Written by Harsh

November 22nd, 2005 at 7:10 pm

Posted in Education, Social


Digital conversion of Flood Insurance Rate Maps (DFIRMs): White Paper


I have also added this post to this Wiki, in case you want to expand on it and guide those who follow – the copy here just helps me ensure the data doesn’t get spammed out too easily:

Parent document copied with permission from the original white paper at the GIS Technical Center. The objective was to add notes reflecting procedural changes brought about by the integration of CITRIX WISE Tools. The initial notes were created during a 2005 DFIRM Production.


In August 2003, the West Virginia GIS Technical Center (WVGISTC) became a Cooperating Technical Partner with the Federal Emergency Management Agency (FEMA). Our mission is to create digital flood themes from paper Flood Insurance Rate Map (FIRM) and Flood Boundary and Floodway Map (FBFM) panels and to deliver the data in specified formats and with appropriate documentation. FEMA prepares Mapping Activity Statements (MAS) that outline the scope of work and deliverables for each county-based project. Final products are primarily seamless, countywide geospatial data files in the ESRI shapefile format, along with associated metadata.

According to FEMA (Michael Craghan, pers. comm.), the final vector products will have the following qualities:

1. A seamless county-wide dataset, with no gaps or overlaps
2. The lines and polygons end up in their real-world locations
3. There is no scale distortion (i.e., spatial relationships are maintained; if the paper map is 1” = 500’, the digital version should be too).

The FIRM/FBFM features collected by WVGISTC are:

1. Base Flood Elevations (BFE-lines)
2. Cross Sections (Xsection-lines)
3. Flood Hazard Areas (polygons in final format)

The current Mapping Activity Statement for conversion of Jefferson and Berkeley counties specifies these deliverables:

1. Written certification that the digital base data meet the minimum standards and specifications.
2. DFIRM database and mapping files, prepared in accordance with the requirements in Guidelines and Specifications for Flood Hazard Mapping Partners (see references for citation); (S_ Base_Index, S_Fld_Haz_Ar, S_BFE, S_XS, S_FIRM_Pan).
3. Metadata files describing the DFIRM data, including all required information shown in Guidelines and Specifications for Flood Hazard Mapping Partners.
4. Printed work maps showing the 1- and 0.2-percent-annual-chance floodplain boundary delineations, regulatory floodway boundary delineations, cross sections, and BFEs at a scale of 1:100,000 or larger.
5. A Summary Report that describes and provides the results of all automated or manual QA/QC review steps taken during the preparation of the DFIRM.
6. An ESRI shape file showing points where mapping problems are discovered during the digitizing process.

The following sections describe the procedures we follow to (1) prepare the base material for digitizing, (2) digitize features, (3) perform quality control, and (4) prepare final files using ESRI ArcMap 8.x software. This document assumes the user is skilled with ESRI ArcMap 8.x GIS software and has the ability to use reference materials. For help using ESRI ArcMap, consult the help files or ESRI on-line support.


Source Material (Source Material Inspection)
In the MAS cost estimation phase it is advantageous to become familiar with the FIRM and FBFM panels that cover the geographic extent of the county. In the back of our FEMA binder, there are 3 CDs with scanned panels for 10 high priority counties. The scanned or paper FIRM and FBFM panels should be visually inspected to check for insets and other format issues that may impact the amount of time it takes to digitize and attribute. At the on-line FEMA Flood Map Store search for FEMA issued flood maps. Follow the prompts for state, county, and community. This is one way to become familiar with the number of panels in a county and also to gather information on the effective date. The effective date on-line may be compared to the effective date on the paper panels to determine if we have the newest source. This is important because FEMA may have done some digital conversion in the counties we are digitizing; in Berkeley County, for instance, 2 of the panels were available in a digital CAD format. We received the CAD files (DLG) and copied the line vectors into our Arcmap project.

Base Layer Compilation
As part of the MAS, a ‘base map’ is obtained for georeferencing the FIRM and FBFM panels in a county. The MAS states: “the base map is to be the USGS digital orthophoto 3.75-minute quarter-quadrangles (DOQQs), or other digital orthophotography that meets FEMA standards.” Currently, we use the DOQQs to georeference the panels; when it becomes available, we will use the Statewide Addressing and Mapping photography. Countywide mosaics of the DOQQs are available either from CDs in our office or from the NRCS geospatial data gateway. Before beginning panel georeferencing, gather all the base map photography to cover the geographic extent of the county. Check DOQQ tiles and the ortho mosaic, if used, for agreement with each other. Also check the individual DOQQ tiles against the quarter quadrangle index to make sure that they are NAD83 and not NAD27. Finally, check to make sure that the spatial properties (coordinate system and projection) are defined for each quarter quad.

FEMA provides scanned (TIFF) images of the paper FIRMs and FBFMs. Not all counties have separate floodway panels (FBFMs).

You can download county FIRMs and FBFMs from the FEMA Map Store. For Summers and Fayette Counties WV, aerial photographs from the SAMB were reprojected on-the-fly and used as base.

“ArcMap will not project data on-the-fly if the coordinate system for the dataset has not been defined. The coordinate system for any dataset can be defined using ArcCatalog” [ESRI Help].

It is advisable to load the aerials, FIRMs and FBFMs in different Raster Catalogs for quicker refreshes. It is best to start off by georeferencing the index and then nailing each semi-transparent panel into its approximate location through corner points [“spreading in all the right directions”]. Again, it is best to concentrate around your area of interest, in this case the floodplain. It is also advisable to adjust the visible scale for the aerials for easier navigation.

Also, try to keep the clipboard empty, since on aging systems a full clipboard may cause incomplete raster refreshes. To avoid related spikes in CPU usage, you may adjust the display settings, page-file size and Task Manager priorities accordingly. And if you have upgraded to ArcGIS 9.1 without the patch and are having raster display problems, consult the following ESRI thread.

“In general, if your raster dataset needs to be stretched, scaled, and rotated, use a first-order transformation. If, however, the raster dataset must be bent or curved, use a second- or third-order transformation. Add enough links for the transformation order. You need a minimum of 3 links for a first-order transformation, 6 links for a second-order, and 10 links for a third-order” [ESRI Help].
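The minimum-link counts in the ESRI note follow from the number of coefficient pairs in a polynomial transformation of a given order. As a quick check (the helper below is a hypothetical sketch, not an ESRI function):

```python
def min_links(order):
    """Minimum control-point links for a polynomial transformation of the
    given order: a 2-D polynomial of order n has (n+1)(n+2)/2 terms, and
    each link constrains one pair of x/y equations."""
    return (order + 1) * (order + 2) // 2

for n in (1, 2, 3):
    print(n, min_links(n))  # 1 3 / 2 6 / 3 10
```

This reproduces the 3 / 6 / 10 minimums quoted above for first-, second- and third-order transformations.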

Priority should be given to georeferencing individual panels over interlocking adjacent panels. Once satisfied with the adjustments and associated RMS Error, you may either update if using first-order transformation, or rectify if using higher-order transformation.

Note that first-order transformations update the *.TFW files; higher-order transformations also update the *.AUX files.

Once the groundwork is done, it takes less than half an hour per panel on a machine with the following specifications:

MS Windows 2000 SP4 | Dell PWS 340 | Pentium 4 CPU, 1700 MHz | 1.05 GB RAM

The steps taken to georeference the scanned FIRMs/FBFMs using ArcMap are:

1. Start an Arcmap project in the desired coordinate system. When using West Virginia DOQQs that will primarily be UTM 83 zone 17 (although Jefferson County was zone 18).
2. Add the DOQQs for the area of interest to the project.
3. Add the scanned TIFF to the project. The first panel to be georeferenced is the most difficult, because locating the correct spot on the base map photographs using the landmarks on the panel can be frustrating without a good reference system. One way to do this is to warp the panel index first—hence giving a rough estimate of panel location on the photographs. Alternatively, after warping one panel, work with adjacent panels to make landmark location easier.
4. Use “fit to display” on the georeferencing toolbar pull-down menu to move the TIFF to the current extent.
5. Use the georeferencing toolbar to create control points on the DOQQs and the scanned TIFF, using roads and other major features appearing on the FIRM.
6. It is recommended that “Auto Adjust” be checked on the georeferencing dropdown and that the layer being georeferenced is partially transparent. As control point links are added the scanned TIFF will be shifted over the DOQQs, making finding and adding additional links easier.
7. As you are adding control points, check the residual values and total RMS value in the link table. The goal is for a total RMS value of 10 or less (units are mapping units, meters). After adding as many control points as possible it is sometimes useful to remove links that have very high residual values to improve the overall RMS value of the warp. Sometimes it is not possible to get an RMS below 10.
8. Concentrate control points around areas with flood features to improve the fit of areas that will be digitized. We recommend adding at least 10 sets of control points, although in some cases we used over 20 sets to improve fit.
9. Record the total RMS value of the transformation for each panel in a spreadsheet for the county.
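The RMS bookkeeping in steps 7-9 can be sketched as follows; the residual values here are hypothetical, and the 10-meter goal is the one stated above:

```python
import math

def total_rms(residuals):
    """Root-mean-square of per-link residuals (in map units, meters here)."""
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

links = [4.2, 7.9, 3.1, 12.5, 6.0]   # hypothetical per-link residuals
print(round(total_rms(links), 2))    # 7.51, under the 10 m goal

# Removing the worst link (step 7) always lowers the total RMS:
trimmed = [r for r in links if r != max(links)]
assert total_rms(trimmed) < total_rms(links)
```

In practice the trade-off is that every removed link also removes a constraint on the warp, so links should be dropped only when their residuals are clear outliers.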

Vertical Datum Conversion [optional]
The estimate of the basic shape of the earth was inconsistent under the National Geodetic Vertical Datum [NGVD] 1929, which resulted in less accurate vertical data computations. Hence the decision to shift to the North American Vertical Datum [NAVD] 1988, which uses more reliable means for this estimation. A Vertical Datum is required for DFIRM panels and the D_V_Datum table. Note that Vertical Datum conversion will not result in any change in flood depths.

Begin with 7.5-minute USGS Quadrangles. For Summers and Fayette Counties WV, this data was downloaded from the WV GIS Technical Center. Next, buffer your county by 2.5 miles to select all the Quad corners that fall inside the buffer. Then reproject the corner points thus selected to GCS_North_American_1983 and add XY coordinates. Now you have all the latitude/longitude coordinates required for orthometric height-difference computations using the National Geodetic Survey’s VERTCON software. Alternatively, you may use the Corps of Engineers’ CORPSCON software.

In VERTCON, if you have generated an input data file for your latitude/longitude coordinates, you would typically select the ‘Free Format Type 2’ option. Else, you would simply enter individual Station Names and associated latitude/longitude coordinates. VERTCON generates an output data file for use in the following calculations [Sample Worksheet].

Once Conversion Factors for all points have been determined, calculate their Average, Range and Maximum Offset. If the Average is less than 0.1 foot, only a “passive” Vertical Datum conversion may be applied. Typically, when the Maximum Offset is <= 0.25 feet, a single Conversion Factor can be applied; else, stream-by-stream Conversion Factors need to be applied.
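The decision rules just described can be sketched as follows; the 0.1-foot and 0.25-foot thresholds come from the text, while the function name, structure and sample factors are my own:

```python
def datum_conversion_plan(factors, avg_threshold=0.1, offset_threshold=0.25):
    """Summarize NGVD29->NAVD88 conversion factors (feet) and pick a plan:
    passive only, a single county-wide factor, or stream-by-stream factors."""
    avg = sum(factors) / len(factors)
    rng = max(factors) - min(factors)
    max_offset = max(abs(f) for f in factors)
    if abs(avg) < avg_threshold:
        plan = "passive conversion only"
    elif max_offset <= offset_threshold:
        plan = "single county-wide conversion factor"
    else:
        plan = "stream-by-stream conversion factors"
    return avg, rng, max_offset, plan

# Hypothetical VERTCON output for three quad corners:
print(datum_conversion_plan([0.18, 0.21, 0.19])[3])
# -> single county-wide conversion factor
```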


Example (converting decimal degrees to degrees-minutes-seconds):

dd = 37.87511679110 degrees ~ 37 degrees
mm = .87511679110*60 = 52.50700746600 ~ 52 minutes
ss = .50700746600*60 = 30.42044796 ~ 30 seconds
==> 37 degrees 52 minutes 30 seconds

2. Appendix B: Guidance for Converting to the North American Vertical Datum of 1988
3. FIA-20 June 1992, Ada County OH
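The degree conversion shown above can be wrapped in a small helper; this is a generic sketch, not part of the original white paper:

```python
def dd_to_dms(dd):
    """Convert decimal degrees to (degrees, minutes, seconds),
    with seconds rounded to the nearest whole second."""
    deg = int(dd)
    minutes_float = (dd - deg) * 60
    minutes = int(minutes_float)
    seconds = round((minutes_float - minutes) * 60)
    if seconds == 60:        # carry if rounding overflows the seconds
        seconds = 0
        minutes += 1
    if minutes == 60:
        minutes = 0
        deg += 1
    return deg, minutes, seconds

print(dd_to_dms(37.87511679110))  # (37, 52, 30)
```

This reproduces the worked example: 37.87511679110 degrees becomes 37 degrees 52 minutes 30 seconds.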

Digitizing and Attributing Flood Features (Arcmap Project and File Specifications)
The UTM NAD83 projection, zone 17 is used for all West Virginia countywide flood mapping projects, with the exception of Jefferson County, which is zone 18. All features are initially collected as lines, although special flood hazard areas (e.g., Zone A, AE) are later converted to polygons. All features are drawn in one line shapefile and are later separated into the separate files required to meet MAS deliverables. For the purposes of drawing the flood feature lines we are using a line shapefile with the following attribute fields: Type (text, 10), Letter (text 2), Elev (long integer, precision 5). A description of the values we use in those fields is given below with each different feature type. In the first round of digitizing the shapefile was named All_Lines.shp, although in the future we may switch to using a county name in combination with employee name. Save edits frequently while digitizing, both by using the save edits button in Arcmap and by making backup copies of the file with Arccatalog.

Begin an edit session and set up the snapping environment. Having snapping turned on is important to allow snapping of BFEs to the edges of flood hazard areas and for snapping the flood zone line segments together. We generally use a snapping tolerance between 7 and 10 pixels; this is a personal drawing preference and may vary from person to person. Use the appropriate snapping mode for each type of feature, i.e. ‘vertex’ for closing zone boundaries, ‘end’ for snapping arc ends together and ‘edge’ for snapping BFE lines to zone boundaries. Note that having ‘vertex’ snapping on can make it more difficult to accurately place BFE endpoints. The goal is clean intersections and BFEs that are snapped to flood hazard area boundaries.

Feature Collection
We generally draw flood map features in this order: floodway, flood zone, BFE, and cross-sections. Some counties have floodway features on a separate map (FBFM) from the FIRM. When working with two maps, collect floodways and cross sections from the FBFM and collect flood hazard zones, BFEs, and streams and channels from the FIRM maps. When working with a FIRM and a FBFM for a panel, it is recommended that lines are drawn from the FBFM first and the FIRM second. Features are to be seamless across panel boundaries, meaning when the same feature type occurs on both sides of a panel boundary, it should be drawn with no interruption. Adjacent panels digitized by different people should have the endpoints of flood feature lines snapped together in the final line shapefile. Be sure to check panel edges carefully for small flood zone polygons.

Panel Index and Base Index
Collection and attribution of flood features will be discussed in detail below. In addition to the flood features, we also submit 2 polygon index shapefiles to FEMA for each county. One of the shapefiles is called S_FIRM_Pan and is an index of the FIRM panels for a county. It is created by digitizing the lines on the scanned and warped county FIRM index. Only unincorporated areas are included in the panel index, not the incorporated areas. Secondly, an index of the “base” data for a county is to be provided in a polygon shapefile called S_Base_Index. In our case, the base data is the DOQQs. The S_Base_Index shapefile can be generated by clipping out the appropriate quarter quads from the DOQQ index. As with all other shapefiles we submit, both the S_FIRM_Pan and S_Base_Index shapefiles have a required attribute table format, discussed later in this document.

Flood Feature Symbology and Attributes

The floodway is the channel of a river plus any adjacent floodplain areas. Floodways won’t be found on all panels. There are 2 different presentations of floodways on FEMA panels, which vary by county. In some counties, Berkeley for example, floodway symbology is included on the FIRM (Figure 1a). Other counties have separate floodway panels (FBFM, Figure 1b) and they must be added as a separate layer for floodway line collection.

In the initial drawing, lines defining the floodway are given the following attributes:

Type: floodway

Flood Hazard Areas
Flood hazard areas will also be referred to as ‘flood zones’ or ‘zones’ and they identify areas of different levels of flood risk. Flood zones are labeled on the FIRMs with letters; commonly used zone names are A, AE, B, C, X and they are shown on the paper maps with different densities of shading and text labels (Figure 2a). Zones are collected as lines, although later they will be converted to polygons. Digitizing proceeds from the inside out, i.e., collect the innermost zones first (In Figure 2a, the floodway would be collected first, and then AE, then X). Where an outer zone line flows into an interior zone line, they should be snapped (Figure 2c). Each line defining flood zones should be collected only ONCE. In areas where zone boundaries are coincident, only one line is collected (Figure 2c). There are zone division lines (Figure 2c and d, also referred to as gutter lines), which separate “special” flood hazard areas (generally zones A and AE). The zone division lines are thin white strips that are hard to see in the shaded zones. Gutter lines should be considered the border of those particular zones and treated as any zone boundary would be (i.e., collected once, continuous with other zone lines).

In the initial drawing, lines defining the flood hazard areas are given the following attributes:

Type: zone

Base Flood Elevations
Base Flood Elevation (BFE) is the height of the base (100-year) flood in relation to a specified datum. BFEs are symbolized on the FIRM panels with a wavy line (Figure 3a), but the feature is usually collected as a straight line (Figure 3b) that is snapped to the edge of the flood hazard area. If there is a significant bend in the BFE as drawn on the panel, then additional points may be added to follow the curve. Ends should always be snapped to the flood hazard area.

In the initial drawing, lines defining the BFEs are given the following attributes:

Type: bfe
Elev: numeric elevation value on FIRM (e.g., 405)

Cross Sections
Cross sections (Figure 4a) show the location of floodplain cross sections used for computing base flood elevations. Cross sections are normally collected as a straight line, crossing and exiting the flood hazard area (Figure 4b). It is not necessary to follow bends in the cross section line that occur outside of the flood hazard area, nor is it necessary to extend the line through the hexagons at the end of the line symbol. If there are bends in the cross section within the flood hazard area, place only as many vertices as are needed to maintain shape. Cross section lines should not be snapped to the flood hazard area lines, and should instead extend beyond them.

In the initial drawing, lines defining the cross sections are given the following attributes:

Type: xsection
Letter: letter of cross section, found in hexagon symbol (e.g., z)

Channels and Streams
Channels and streams (Figure 5a and 5b) are collected in the flood hazard areas for QC purposes. No snapping is required and the stream or channel line should extend just beyond the flood hazard area when applicable. Streams are collected as single lines and both lines of a channel are collected.

In the initial drawing, lines defining the channels and streams are given the following attributes:

Type: channel or stream, as appropriate


Visual QC Of Linework
After all lines are digitized and in a countywide, seamless file, a visual check is done to ensure that all features have been collected. The “Type” field in the line shapefile can be used to categorically symbolize the different feature types for the visual QC. Different colors and line styles can be used to represent separate feature types and the legend symbols can be saved as a layer file to preserve the symbol assignments. Turn on the labels for BFEs (elevation) and xsections (letter) and select a font style and color that allows them to be easily seen and checked in the visual QC process. Each person will probably have a different method of doing a systematic visual inspection. Some suggestions: a grid could be used to scan the linework, drainages can be followed, or the check can be done panel by panel. The important thing is to scan at a level such that all of the panel raster features can be identified and vectors examined. The person doing the QC should have a full understanding of what features are supposed to be collected and the symbology variations (e.g., floodways on FIRMs vs FBFMs). Any missed features should be digitized. This is also a good time to make note of any unusual problems or non-conformities in the scanned panels (e.g., zone type changes at panel or corporate boundary). This is the time to check that features are seamless across panel boundaries; BFEs and cross sections in particular should be checked at panel boundaries because there is no further geometric processing with these lines that will reveal continuity errors.
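The categorical symbolization step can be mimicked in a sketch: group the collected lines by their “Type” attribute so each category can be colored and inspected separately. Plain Python; the feature dictionaries are hypothetical stand-ins for shapefile records:

```python
from collections import defaultdict

def group_by_type(features):
    """Group digitized lines by their Type attribute so each category
    (zone, floodway, bfe, xsection, channel, stream) can be symbolized
    and checked separately during the visual QC."""
    groups = defaultdict(list)
    for f in features:
        groups[f["Type"]].append(f)
    return dict(groups)
```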

Spatial Adjustments (otherwise known as “Adjusting To The Real World”)
Post-drawing manipulation of lines to improve “fit” is hard to quantify and subjective. As stated in the introduction, FEMA requires the digital data to have a reasonably good fit to the “real world”. The “real world” in our case is the DOQQs. The scanned panels do not warp perfectly, and in some areas the digitized lines will not overlay real-world features very well. Current adjustment procedures involve these steps:

1. Compile the following layers in Arcmap:
a. DOQQs
b. Line shapefile with county-wide seamless flood features
c. 1:24,000-scale NHD centerline data layer (route.rch, in catalog unit coverages)
d. Problem point file (discussed in the next section)
2. Determine a systematic method for visually scanning the data (similar to that used in the visual QC) and adjust “Type” symbology for easy differentiation.
3. Begin a visual check of the linework, this time concentrating on how well the streams and channels drawn from the flood panels line up with the DOQQ and the NHD data. It is strongly recommended that you do not use the FIRM panels at this point, as they will increase confusion.
4. NHD data are a fairly good guide to where the flood panel waterways “should” be; however, they are not perfect. While visually scanning the linework, check that the streams and channels collected from the FEMA panels line up fairly well with the NHD data, and also that the NHD data appear to overlay the hydrologic features on the DOQQ. There is never going to be a perfect fit; the panel streams will wander back and forth over the NHD vectors. What you are looking for is areas of consistent difference that extend for a noticeable distance (again, hard to quantify). In Figure 6a, the blue dashed panel stream channel lines are not aligned with the DOQQ stream channel edges.
5. When areas of consistent difference are found, ALL the linework surrounding the area is shifted at the same time, until the panel stream has a better fit to the real-world stream. This is accomplished by first breaking all the continuous flood zone, floodway, and stream lines at about the same point on 2 imaginary lines that run perpendicular to the “flow,” one at each end of the area to be shifted. Then, the cut lines are selected, along with any BFEs or cross sections that are in the area (Figure 6b), and all the selected features are moved until the streams are better aligned (Figure 6c). The adjustment is accomplished mostly with the move tool in Arcmap, although on occasion the rotate tool may be used to improve the fit of the selected lines with the DOQQ.
6. Lastly, snap the dangling ends together and smooth out the curves of the reattached lines by moving or adding vertices (Figure 6d). This is the only time lines should be moved or stretched individually, as doing so distorts proportions.
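The block shift in step 5 amounts to translating every vertex of the selected features by the same offset. A minimal sketch of that operation (plain Python; the coordinate lists are hypothetical):

```python
def shift_features(features, dx, dy):
    """Translate every vertex of the selected features by (dx, dy),
    mimicking the ArcMap move tool applied to a whole selection.
    Each feature is a list of (x, y) vertex tuples."""
    return [[(x + dx, y + dy) for (x, y) in line] for line in features]
```

Because every selected vertex receives the identical offset, the internal proportions of the linework are preserved — which is why the procedure moves everything together rather than stretching lines individually.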

Mapping Problem File
One of the required deliverables is a point file indicating areas where certain “problem” situations arise. At the same time as adjustments are being performed, the problem point file can be edited. FEMA defined mapping problems are outlined in the draft Technical Memo, dated October 3, 2003, a copy of which is found in the FEMA project notebook; they have also been listed below for convenience. A point shapefile is created for each county with the following fields: Error_type (text, 10) and Descrip (text, 75).

Error_type Descrip
BFE Base Flood Elevation problem
XSECT Cross-section problem
SFHA-PAN Special Flood Hazard Area changes at map panel edge
SFHA-BDY Special Flood Hazard Area changes at a political boundary
SFHA-STR Special Flood Hazard Area different on each side of a stream
SFHA-OTH Other Special Flood Hazard Area problems
STR-FW Stream outside of floodway
STR-SFHA Stream outside of Special Flood Hazard Area

As of this writing, we have primarily found the STR-SFHA, STR-FW, and SFHA-BDY types of errors. Note: errors should be determined AFTER lines are adjusted in a given area, as the adjustment may correct the problem. Place a point in the shapefile at the location where the problem occurs. In Figure 7 the pink point indicates a location where the stream (orange) is outside of the flood hazard area (blue line).
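Problem-point records can be checked against the FEMA error codes above before they are written to the shapefile. A sketch (plain Python; `make_problem_point` is a hypothetical helper, not part of any FEMA tool, and the field widths follow the Error_type text-10 / Descrip text-75 spec stated earlier):

```python
VALID_ERROR_TYPES = {
    "BFE", "XSECT", "SFHA-PAN", "SFHA-BDY",
    "SFHA-STR", "SFHA-OTH", "STR-FW", "STR-SFHA",
}

def make_problem_point(x, y, error_type, descrip):
    """Build one problem-point record, enforcing the shapefile field specs
    (Error_type: text 10, Descrip: text 75) and the valid code list."""
    if error_type not in VALID_ERROR_TYPES:
        raise ValueError("unknown Error_type: %s" % error_type)
    return {"x": x, "y": y,
            "Error_type": error_type[:10],
            "Descrip": descrip[:75]}
```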


The flood hazard zones and floodways must be converted to polygons for final processing. Select all lines with a “Type” of zone or floodway and export them to a separate line shapefile. Topological checks will be performed on the line file before polygons are built. Topology work can only be done in Arcmap via the geodatabase model. Import the line shapefile into a geodatabase feature class that is under a feature dataset (a feature dataset is required to create a topology). If you are starting with a geodatabase feature class, use Export | Geodatabase to Geodatabase in Arccatalog to transfer the feature class into the dataset.

Add a new topology under the feature dataset. Set the cluster tolerance relatively high (0.1 was used in the first 2 MAS, which corresponds to 10 centimeters on the ground) to reduce the number of small pieces formed. Only the flood hazard zone lines feature class will participate in the topology. The topology rules used are: must not have pseudos, must not have dangles, and must not self-overlap. After creating the topology for the lines, validate it. Bring the validated topology layer into an Arcmap project to view the errors found. Use the topology tools to analyze and correct all errors before proceeding. See the Topology section in the ArcGIS book “Building a Geodatabase” for help.

After validating the topology and fixing all topological errors, convert the lines feature class to a polygon feature class. To do this, right click on the feature dataset in Arccatalog and select ‘new’ and then ‘polygon feature class from lines’. A wizard helps with the conversion; accept the default tolerance.

Once the polygon layer is created, create a new topology for it. Use the default cluster tolerance, which is very small. Only the polygon feature class participates in the topology, and the rules are: must not overlap and must not have gaps. Bring the validated polygon topology into Arcmap as with the line topology. Ideally, there will be no errors in this topology. After checking for and fixing topological errors, another check should be done for sliver polygons. This can be done by viewing the polygon attribute table in Arcmap and sorting the table based on the shape_area attribute field in ascending order. Examine the smallest polygons to be sure they are not slivers.
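The sliver check — sort on shape_area ascending and eyeball the smallest features — can be sketched as follows (plain Python; the polygon dictionaries stand in for attribute-table rows):

```python
def smallest_polygons(polygons, n=5):
    """Sort polygons by area ascending, like sorting the attribute table
    on shape_area, and return the n smallest for manual sliver inspection."""
    return sorted(polygons, key=lambda p: p["shape_area"])[:n]
```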

Next, the polygon flood hazard features need to be attributed. This can be done in the geodatabase, setting up a domain so that attributes can be chosen from a drop-down list. Overlay the flood hazard polygon layer with the FIRM/FBFM panels and attribute the polygons. It saves time if the shapefile you are using to add attributes has the same column structure as the required final product (see Table 2). In the future we hope to have template files available, so that the required structure will already be in place. We have tried merging with a template file in the geodatabase, but that resulted in features shifting; this process is still being developed.


For the final deliverables, the flood features collected in the line shapefile must be processed into separate shapefiles with specified fields. Table 1 gives an overview of the shapefile names and contents. Attribute fields have required field types (e.g., text, number) and sizes; details can be found on the pages of Guidelines & Specifications for Flood Hazard Mapping Partners Appendix L referred to in Table 1. These pages from Appendix L have been printed out and are in the guidelines/technical section of the FEMA project binder. Table 2 provides details on the required fields.

Table 1. Deliverable shapefile description
Shapefile Name Contents Pages in Appendix L
S_Base_Index Grid of base data, in our case, DOQQs. Polygons. L-270 to L-271
S_FIRM_Pan Grid of FEMA panels; digitized from county panel index. Polygons. L-286 to L-290
S_Fld_Haz_Ar Flood hazard zone polygons L-291 to L-293
S_BFE Base flood elevation lines collected from FEMA panel L-272 to L-273
S_XS Cross section lines collected from FEMA panel L-350 to L-354

Table 2. Shapefile attribute field requirements
Shapefile Field Name What Goes In It
S_Fld_Haz_Ar (polygon) FLD_AR_ID A unique feature number. Can be copied from FID field. [Text, 11]
FLD_ZONE Flood zone from FIRM. Use values in FLD_ZONE field of Table D_Zone on pg L-452 of Appendix L. [Text, 55]
FLOODWAY “FLOODWAY” if polygon is a floodway. Null if not. [Text, 30]
SFHA_TF “T” if any zone beginning with A. “F” for any other zone. True or false. [Text, 1]
SOURCE_CIT 11-digit FIRM panel number that majority of feature is on. If polygon crosses many panels, use downstream panel. [Text, 11]

S_XS (line) XS_LN_ID A unique feature number. Can be copied from FID field. [Text, 11]
XS_LTR Upper case letter(s) of cross-section from FIRM. [Text, 12]
XS_LN_TYP “LETTERED” in all cases. [Text, 20]
WTR_NM Name of water feature (stream) cross section is on. From FIRM or FIS. [Text, 100]
SOURCE_CIT 11-digit FIRM panel number cross section is on. If on two, list panel with majority. [Text, 11]

S_BFE (line) BFE_LN_ID A unique feature number. Can be copied from FID field. [Text, 11]
ELEV Numeric elevation of BFE, from FIRM [Double, Prec. 13, Scale 2]
LEN_UNIT “FEET” in all cases. [Text, 20]
V_DATUM Vertical datum of panel. Listed on panel, and values must come from the V_DATUM field of the D_V_Datum table on page L-444 of Appendix L. [Text, 6]
SOURCE_CIT 11-digit FIRM panel number BFE is on. If on two, list panel with majority. [Text, 11]

S_Base_Index (polygon) BASE_ID A unique feature number. Can be copied from FID field. [Text, 11]
FILENAME Name of DOQQ or other image file used as base map. [Text, 50]
BASE_DATE Date image was captured. For DOQQs can be found in header file. [Date]
SOURCE_CIT BASE1 or other abbreviation that corresponds to metadata [Text, 11]

S_FIRM_Pan (polygon) FIRM_ID A unique feature number. Can be copied from FID field. [Text, 11]
FIRM_PAN FIRM panel number. [Text, 11]
EFF_DATE Effective date on FIRM panel. [Date]
SCALE Scale of FIRM panel. If map scale on FIRM is 1” = 500’, then scale is 6000. Multiply feet by 12 to get true scale. [Text, 5]
SOURCE_CIT 11-digit FIRM panel number. [Text, 11]
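Two of the attribute rules in Table 2 are pure string logic and arithmetic, and can be sketched directly (plain Python; the helper names are made up for illustration):

```python
def sfha_tf(fld_zone):
    """SFHA_TF is 'T' for any zone beginning with A, 'F' for any other zone."""
    return "T" if fld_zone.upper().startswith("A") else "F"

def firm_scale(inches, feet):
    """Convert a FIRM bar scale like 1 inch = 500 feet to a representative-
    fraction denominator: multiply feet by 12 so both sides are in inches."""
    return int(feet * 12 / inches)
```

For the 1" = 500' example in the SCALE row, `firm_scale(1, 500)` yields 6000, matching the table.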
BFE Shapefile Creation

From the line shapefile that was used for digitizing, use the Type field to select and export BFEs to a separate shapefile. Either modify the resulting shapefile to match the required format, or merge the digitized lines with a pre-formatted template. The output merge file can be a geodatabase feature class, which allows for the use of an attribute domain drop-down for the SOURCE_CIT field. Use Arcmap editing tools to assign attributes to the fields shown in the preceding table. BFE lines are submitted in the S_BFE shapefile.

Cross-section Shapefile Creation
From the line shapefile that was used for digitizing, use the Type field to select and export cross-sections to a separate shapefile. Either modify the resulting shapefile to match the required format, or merge the digitized lines with a pre-formatted template. The output, as with BFE, can be a geodatabase feature class. Attribute domains can be created for the XS_LTR, XS_LN_TYP, WTR_NM (a list of stream names is available in the county FIS book) and SOURCE_CIT fields. Cross-section lines are submitted in the S_XS shapefile.

One of the required deliverables relating to the base map (DOQQs in our case) is a “written certification that the digital data meet the minimum standards and specifications.” A text file with the following statement was created:

“This text file serves as written certification that the base map digital data meet the minimum standards and specifications in Guidelines and Specifications for Flood Hazard Mapping Partners Appendix K. On page K-42 (Section K.4.1.1) of that document it is written “The most common form of raster image map is the digital orthophoto, especially the standard Digital Orthophoto Quadrangle (DOQ) produced by the U.S. Geological Survey.” DOQQ’s were used as the base map for georeferencing scanned paper FIRMs and for visually locating features of interest.”


* Refer to the DOQQ Metadata and the Digital Orthophoto Standards. Appendix L is the primary document of interest.
* Refer to the NHD website.
* Retrieve the 1:24,000 NHD coverages to use as reference.
* FEMA flood documents in the black FEMA 3-ring binder.
* Arcmap editing and geodatabase manuals.
* Mapping Activity Statement documents – be sure to understand all deliverables.




Summary Report
QA/QC Review Steps During Digital Conversion of Flood Insurance Rate Maps
Mapping Activity Statement 2003-02, West Virginia GIS Technical Center
Prepared 4/15/04

The following QA/QC checks were performed during the digital conversion of Flood Insurance Rate Maps by the West Virginia GIS Technical Center (WVGISTC):

1) Source Material Inspection
a) Visually reviewed scanned panels received in .tif format; compared with printed paper maps to check for completeness

2) Base Layer Compilation/Verification
a) Used a vector quarter quad index certified by WVGISTC to confirm that the USGS Digital Ortho Quarter Quads (DOQQs) were in the UTM NAD83 projection; DOQQS were used for the georegistration base map
b) Checked the spatial integrity of a county-wide ortho mosaic (used as a reference; obtained from the NRCS Geospatial Data Gateway)

3) Georegistration of Scanned Panel Source Material
a) Ensured data were correctly referenced to the UTM coordinate system
i) Set Arcmap software data frame projection to UTM NAD83, Zone 17 or 18, as appropriate
ii) Georeferenced scanned panels to real-world coordinates using DOQQs to establish reference links
(1) The mean RMS value for warped panels was 5.63 meters (mapping units). This was the best attainable georeferencing that could be accomplished without stretching features and impacting length relationships
iii) Re-warped portions of scanned panels in areas of poor fit to attain a better visual real-world correlation
b) Checked that the scale of warped raster (.tif) and original paper maps were compatible
i) Plotted georeferenced FIRMS at the same scale as paper maps; conducted manual ruler measurements on paper map in comparison to plotted data to confirm accuracy of feature location and length relationships

4) Digitizing of Flood Features
a) Digitized SFHA, BFE, and cross section features from the georeferenced panels as line feature types
i) SFHAs and Floodways were digitized first; BFEs and Xsections were digitized next and BFEs were snapped to AE zone boundaries (Arcmap snapping tolerance set to 10 pixels)
ii) Streams and channel banks were partially digitized as additional reference features
b) Systematically visually scanned collected vectors and compared them with underlying georeferenced paper flood maps
i) Checked that character of features was maintained
ii) Checked that required features were collected
c) Edgematched features on adjacent panels
i) Checked that features were snapped seamlessly at panel boundaries

5) Spatial Adjustments
a) National Hydrography Dataset (NHD) vector stream centerlines were used to assist in identifying real-world (DOQQ) stream position
b) Proportional piecewise adjustments
i) Adjusted all features (SFHAs, BFEs, cross sections) in small sections of the floodplain when:
(1) the DOQQ stream was not located within the SFHA or
(2) there was a visibly constant difference between location of the DOQQ stream and location of the digitized stream
ii) Attempted to bring the digitized FIRM stream in line with the NHD stream or the stream on the ground, if it was visible on the DOQQ
iii) Used Arcmap editing functions such as line moving and rotating
c) Created a point shapefile to mark location of “mapping problems” as defined in the FEMA technical memo dated October 3, 2003. Examples of problems found:
i) Stream outside of SFHA
ii) Stream outside of floodway
iii) SFHA changes at political boundary

6) Topology
a) Used the ArcGIS geodatabase model and topology rules on SFHA and floodway line features
i) Corrected pseudo-nodes, dangles, and self-overlapping lines
b) Generated polygons from SFHA and floodway line features and used the ArcGIS geodatabase model and topology rules for polygons
i) Confirmed there were no polygon overlaps or gaps
ii) Removed sliver polygons

7) Feature Attribution
a) Reviewed technical memo and MAS to format the 5 required shapefiles (S_Base_Index, S_FIRM_Pan, S_Fld_Haz_ar, S_BFE, S_Xs)
i) Checked that file names, attribute names, types and sizes meet specs
b) Checked that correct attributes were assigned to digitized flood features
i) Completed a systematic visual scan of vector flood features overlaid with georeferenced panels; used symbology variation and labeling to confirm proper attributes had been applied
ii) Checked that valid domain values were used in attribute table columns

8) Map plot for final visual inspection and scale check
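The mean RMS value reported in step 3(a)(ii)(1) above is the root-mean-square of the residual distances at the georeferencing links. A minimal sketch of that computation (plain Python; the residual list is hypothetical):

```python
import math

def rms_error(residuals):
    """Root-mean-square of link residual distances, in mapping units
    (metres here), as reported for the warped panels."""
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))
```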


File Backup
Everything pertaining to the current flood mapping project should be backed up to Vesta. This includes warped panels, line shapefiles, and other reference documents.

A FEMA backup folder is set up at this location:


It is visible from the TechCenter network under Vesta and is shared openly. This is where all the files for a MAS in progress should be stored. Use sensible file and folder names to help everyone identify the pieces of the project.

A final backup of everything was kept in this location:


It is recommended that drawing shapefiles be backed up every time they are changed; a file versioning system may be preferable to overwriting the same file each time.
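One simple versioning scheme is to stamp each backup copy with the save time instead of overwriting the previous file. A sketch (plain Python; `backup_dir` is a placeholder for the Vesta share path):

```python
import datetime
import os
import shutil

def versioned_backup(path, backup_dir):
    """Copy a file to backup_dir under a timestamped name instead of
    overwriting the previous backup - a simple versioning scheme.
    Returns the path of the new backup copy."""
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    root, ext = os.path.splitext(os.path.basename(path))
    dest = os.path.join(backup_dir, "%s_%s%s" % (root, stamp, ext))
    shutil.copy2(path, dest)
    return dest
```

Note that a real shapefile is several files (.shp, .shx, .dbf, etc.); each component would be copied the same way.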

Naming Conventions/Path Structure
FEMA has requested that we name the metadata files in this format:


So, for example, the metadata files submitted for Jefferson County were named:


On the CD containing the final deliverable files, this is the requested structure:


The county name behind the first backslash will change for each countywide project completed and submitted. The Arcshape folder contains the S_Base_Index, S_FIRM_Pan, S_Xs, S_BFE, and S_FLD_Haz_Ar shapefiles, plus the problem shapefile. The Ortho_photos subdirectory contains the DOQQs or other imagery used for the base map. The document subfolder contains the metadata, QA/QC report, and base map certification. I made subfolders for each of those items under the document folder. The RFIRM folder contains all the georeferenced panels.

* Digital conversion of Flood Insurance Rate Maps (FIRMs)
* WIKI: Edit Lock Schema

Written by Harsh

July 7th, 2005 at 10:03 am


without comments

I have also added this post to this Wiki, in case you want to expound and guide those who follow – The post just helps me ensure the data doesn’t get spammed-out that easily:

  • I am getting a ‘jsForm.htm not found’ error? If you are using Internet Explorer, first make sure you have the latest version of that browser. Then remove the Arcims site from your browser favorites, reopen the browser and try again.

  • How do I import Arcims maps inside ESRI Arcmap? If you have Arcmap 9.x, you can import Arcims maps by connecting to the services of an Arcims server. In Arccatalog 9.x, simply click on ‘GIS Servers’ to add the Arcims server and type in its URL. Note that this does lead to a noticeable performance drop.

  • How do I accurately rescale the map when that functionality is provided? True scale depends on monitor resolution, the default being 96 DPI (Dots Per Inch). To make sure that your monitor is configured correctly, for MS Windows, check Display Properties–>Settings–>Advanced–>General. Note that when the map is rescaled to, say 1:12000, 1 inch on the map should represent 12,000 inches. Also note that you can use the Esc button on your keyboard to stop the map from rescaling at any time. Refer to Map Scales for related information.

  • I click on the print button but nothing happens? Make sure pop-ups are allowed for your Arcims site, then try the Print Tool again.
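The monitor-resolution arithmetic behind the rescaling answer above can be sketched as follows (plain Python; 96 DPI is the Windows default noted in the FAQ):

```python
def ground_inches_per_pixel(scale_denominator, dpi=96):
    """At a map scale of 1:scale_denominator on a monitor rendering dpi
    pixels per inch, each screen pixel spans this many ground inches.
    If the monitor is not really running at the assumed DPI, the on-screen
    scale is off by the same ratio."""
    return scale_denominator / float(dpi)
```

At 1:12,000 and 96 DPI, one screen inch (96 pixels) thus covers 12,000 ground inches, as the FAQ states.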

* ESRI Support
* WIKI: Edit Lock Schema

Written by Harsh

January 6th, 2005 at 1:43 pm

Posted in Education,GIS,IMS,Technology,Web


Digital conversion of Flood Insurance Rate Maps (DFIRMs): Summary

without comments


My notes reflect procedural changes brought about by the integration of DFIRM Production Tools.


  • Request jurisdiction(s) for existing geodata like new political boundaries and road names for use as base map. Base map geodata must NOT be older than 7 years.
  • Request GEOPOP from the MOD team and use it to create an empty DFIRM geodatabase. Use existing political boundaries for its geographic extent.
  • GEOPOP creates three table types: S (Spatial), L (Lookup) and D (Domain). Edit the main lookup tables:

L_COMM_INFO (community information)
L_SOURCE_CIT (source citation)
L_WTR_NM (hydrographic feature information: stream names, etc.)
L_STN_START (properties of starting points for stream distance measurements)

  • Create panel index and data catalogs
  • Scan, georeference and rectify geodata at its recommended scale to capture the required floodplain features. Refer to the FEMA MSC for full-sized PDFs of FIRM panels.


* HEC-RAS Online Help
* WIKI: Edit Lock Schema

Written by Harsh

December 27th, 2004 at 8:12 pm