[ Ø ] Harsh Prakash – GIS Blog

Quiet Musings On Applied Spatial (Health, Disaster, Technology, Planning et al.)

Archive for the ‘GIS’ Category

Esri WebGIS Platform

without comments

Customer Need, Deployment Option, Authority to Operate (ATO), Challenges, Solutions, Lessons

Esri WebGIS Platform

Written by Harsh

September 13th, 2017 at 4:11 pm

Posted in GIS,Health,Management,Technology


Esri in AWS Cloud

without comments

Background, Options, Opportunities

Esri in AWS Cloud

Written by Harsh

December 17th, 2016 at 11:48 pm

Posted in GIS,Management,Programming,Technology


Geodata Based Decisions

without comments

How to Use Location Analytics to Improve the Effectiveness of Public-Facing Sites

Geodata Based Decisions

Written by Harsh

March 17th, 2016 at 11:47 pm

HowTo: Run ‘ArcGIS for Server Advanced Enterprise’ (10.3.1) on Amazon EC2 Red Hat Enterprise Linux (7)

without comments

The talks on ArcGIS Server at ESRI Health GIS were fun, but I wanted more – specifically, to install and administer its latest release on Amazon Web Services, all via the trusted command line. Here’s how I did that:

To follow along, get an EDN license and an AWS account. Especially if you have been in the industry for long, there's no good excuse not to have accounts with the biggest companies in GIS and da Cloud (and while you are at it, get MapBox and CartoDB accounts too).


### Setup the stage ###
# Downloaded its AWS key from //aws.amazon.com/console/ and connected to my instance (ensured it matched the min. system requirements) using its public DNS (if you restart your instance, this will change). Note I SSHed using Cygwin instead of PuTTy.
$ ssh -i "key.pem" ec2-user@#.#.#.#.compute.amazonaws.com
$ cat /etc/redhat-release
> Red Hat Enterprise Linux Server release 7.1 (Maipo) # Even though I used RHEL-7.0_HVM_GA-20141017-x86_64-1-Hourly2-GP2 by Red Hat (I later found out that ESRI provides its own AMI)
$ sudo yum upgrade
$ sudo yum update
$ sudo yum install emacs # For that college-dorm smell, no offense Nano/Vi
$ sudo emacs ~/.bashrc
    force_color_prompt=yes # If you haven't already... (Ignored the embedded rant and uncommented this line to make the prompt colored so it was easier to read in-between)

### Setup the instance ###
# I used an m4.large instance with a 20GB EBS volume (in the same Availability Zone, of course) - ensured it didn't go away if I were to terminate the instance. Then, I extended the partition to exceed the min. space requirements (took a snapshot first) - unfortunately, AWS docs didn't help much with that.
$ df -h
> ...
$ lsblk # Listed block partitions attached to the device. Since there was a gap in sizes between the partition and the device (and there were no other partitions), I resized the child partition "XVDA2" (the root file system where I would finally install ArcGIS Server) to use up the surplus space on its parent disk "XVDA".
> NAME SIZE TYPE MOUNTPOINT
> xvda 20G disk
> |_xvda2 6G part /
# First, updated its metadata in the partition table
$ sudo yum install gdisk # Since disk label was GPT
$ sudo gdisk /dev/xvda
$     print # Noted the start sector
$     delete
$     new
$     #### # Used the same start sector so that data is preserved
$     \r # For the max. last sector
$     # # Used the same partition code
$     print
$     write
$     y
# Next, updated the actual XFS file system
$ sudo xfs_growfs / # This is the actual change for XFS. If 'df -T' reveals the older EXT4, use 'resize2fs'.
# Then, confirmed to see if the boot sector was present so that stop-start will work
$ sudo file -s /dev/xvda # Bootloader
# Finally, rebooted the instance to reflect the new size
$ sudo reboot

### Onto GIStuff ###
# WinSCPed and untarred the fresh-off-the-press 1GB release
$ tar -xvf ArcGIS_for_server_linux_1031_145870.gz
# Got the right ECP#########?
$ ./Setup # Started headless installation - try "--verbose" if you run into other issues
# Hit a diagnostics roadblock: File handle limits for the install user were required to be set to 65535 and the number of processes limits to 25059. So...
$ sudo emacs /etc/security/limits.conf
$     ec2-user soft nofile 65535
$     ec2-user hard nofile 65535
$     ec2-user soft nproc 25059
$     ec2-user hard nproc 25059
# Logged out, logged back in, verified
$ ulimit -Hn -Hu
$ ulimit -Sn -Su
$ ./Setup

### Authorize, authorize, authorize! ###
# Created and uploaded authorization.txt, and downloaded authorization.ecp from //my.esri.com/ -> "My Organization" -> "Licensing" -> "Secure Site Operations"
$ locate -i authorization.ecp
$ readlink -f authorization.ecp
$ ./authorizeSoftware -f /path/authorization.ecp
$ ./authorizeSoftware -s # s=status, not silent
$ ./startserver.sh
$ netstat -lnp | grep "6080" # Confirmed owned processes - that it was listening on the default TCP@6080 (port is only required if you don't have the Web Adapter)
# Ensured IP and domain were listed correctly in the hosts file (e.g. Single IP may be mapped to multiple hosts, both IPv4 and IPv6 may be mapped to a single host, etc.)
$ hostname
$ emacs /etc/hosts
$     127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
$     ::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
$     #.#.#.# localhost localhost.localdomain localhost4 localhost4.localdomain4
# But wait, before I could browse to my site from a public browser, I needed to add this Inbound Rule to the Security Group attached to the instance
Custom TCP rule TCP 6080 0.0.0.0/0
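# The same rule can be scripted - a minimal sketch with the AWS CLI, assuming a hypothetical Security Group ID (look yours up in the console or via 'aws ec2 describe-security-groups')
$ aws ec2 authorize-security-group-ingress --group-id sg-12345678 --protocol tcp --port 6080 --cidr 0.0.0.0/0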

### Browser ahoy! ###
//#.#.#.#:6080/arcgis/manager (or //machinename:6080/arcgis/manager)
ArcGIS Server Setup Wizard -> Create New Site
Primary Site Administrator -> Create Account # Stored with the site, not the OS
# Must be local and accessible from every machine in your site
    Root Server Directory: /home/username/arcgis/server/usr/directories # To store output images, etc.
    Configuration Store: /home/username/arcgis/server/usr/config-store # To hold info about the server's machines, services, directories, etc.
# This is when I ran into "0x80040154 - Could not create object 'ConfigurationFactory'". So, went digging through the logs...
$ cat /home/ec2-user/arcgis/server/usr/logs/EC2/server/server-...log
> ...
> Cluster 'default' successfully created.
> Failed to create the site. com.esri.arcgis.discovery.servicelib.AGSException: java.lang.Exception: AutomationException: 0x80040154 - Could not create object 'ConfigurationFactory'.
> Disconnecting the site from the configuration store.
> ...
# Back to the server: File/directory permission issue? Nope. The issue turned out to be missing packages, even though the pre-installation dependencies check had passed. All 15 listed below:
$ sudo yum list installed
$ sudo yum install wget
$ wget http://vault.centos.org/6.2/os/x86_64/Packages/xorg-x11-server-Xvfb-1.10.4-6.el6.x86_64.rpm
$ sudo yum localinstall xorg-x11-server-Xvfb-1.10.4-6.el6.x86_64.rpm
$ sudo yum install Xvfb # Else "Unable to start Xvfb on any port in the range 6600-6619"
$ sudo yum install freetype
$ sudo yum install fontconfig
$ sudo yum install mesa-libGL
$ sudo yum install mesa-libGLU
$ sudo yum install redhat-lsb
$ sudo yum install glibc
$ sudo yum install libXtst
$ sudo yum install libXext
$ sudo yum install libX11
$ sudo yum install libXi
$ sudo yum install libXdmcp
$ sudo yum install libXrender
$ sudo yum install libXau
# Cleanliness is next to godliness, or so my Catholic school nuns would say
$ sudo yum clean all
$ cd /tmp/
$ sudo rm -r *
$ logout

### Back to the browser ###
//#.#.#.#:6080/arcgis/manager/
# At the end, added SSL using a self-signed certificate
//#.#.#.#:6080/arcgis/admin/
Custom TCP rule TCP 6443 0.0.0.0/0 # Added this rule to the group on AWS first
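# A quick sanity check that the HTTPS endpoint is answering - a sketch against the standard REST info endpoint; -k skips verification since the certificate is self-signed
$ curl -k "https://#.#.#.#:6443/arcgis/rest/info?f=json"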

### Uninstall? ###
$ ./stopserver.sh
$ ./uninstall_ArcGISServer
# rm folders after done

Conclusion: 6443 or 8443?

After years of doing this with first ESRI (PROD), then MapServer (PROD) and GeoServer (DEV), I went back to the dark ahem ESRI side. And what do I keep finding? That the big two are blending together in terms of looks. E.g. the console of GeoServer, the other Java-powered mapping server, now looks similar to that of its big brother on steroids. The third, MapServer, has paradoxically both come a long way (MapCache and ScribeUI, yay!) and still lost ground.

Next up, testing Tippecanoe.

PS:
* I tried both 10.3.1 and 10.0 on Ubuntu (15.04), unsupported. While both installed, site creation didn’t work because of missing packages – searching through apt-cache didn’t help either. On Windows, there is always their CloudBuilder.

Related:
* GeoNet
* Landsat on AWS in ArcGIS

Written by Harsh

September 28th, 2015 at 7:43 pm

Posted in GIS,IMS,Programming


#HealthGIS: Notable links and final thoughts on the conference

without comments

Health websites using ESRI ++

    * With ArcGIS JavaScript
        • CDC’s Division for Heart Disease and Stroke Prevention (DHDSP) Atlas

    * With ArcGIS Server / ArcGIS Online (via Apache Flex)
        • HealthLandscape’s Accountable Care Organization (ACO) Explorer
        • Dartmouth’s Atlas (try generate KML)
        • NMQF’s Methicillin-resistant Staphylococcus aureus (MRSA) mapping
        • HRSA’s Datawarehouse

Health websites whose global participants have trouble with software licenses ++

    * With OpenLayers and DHIS2 (~ an opensource InstantAtlas)
        • PEPFAR’s (a president’s best legacy) Data for Accountability, Transparency and Impact (DATIM) – coming soon to GeoServer + MapLoom and OpenLayers

    * Even Highmaps(!)
        • NCHS’s Health Indicators Warehouse (HIW)

    * Many More…

Dots

Clearly, there’s no shortage of health data or technologies, esp. following ACA’s requirements of uniform data collection standards, just a continuing kerfuffle with overlaying disparate JSON/OGC tiles from their many data owners and manifold service endpoints. Unfortunately, only part of this problem is technical. Take Flu mapping, for instance. CDC, WHO, WebMD (with MapBox) and Google, even Walgreens does it. Or take HIV mapping where you can choose from CDC and NMQF, among others. Even anonymized private claims data is available for a couple of Ks a month. I think a bigger part of the problem is the misalignment between vendors’ business interests and mandates of various agencies and goals of the health research community at large.

Connect

At some point, researchers and epidemiologists would want to see how these data tiles correlate to each other. And GIS professionals would want a quicker way to 'overlay this layer' without having to dig through Firebug. And compress it over the wire, while you are at it (when our users in remote Africa were asked to switch off their smartphones to view desktop maps, we understood data compression a little differently).
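On the compression point, a minimal sketch for anyone fronting services with Apache httpd on RHEL (assumes the stock build with mod_deflate loaded; the conf file name is my own):

# Compress JSON responses over the wire; deflate.conf is an arbitrary name
$ echo 'AddOutputFilterByType DEFLATE application/json text/javascript' | sudo tee /etc/httpd/conf.d/deflate.conf
$ sudo systemctl restart httpd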

Crunch

And then they would want to analyze them, be it on the server with Big Data or in the client with smaller ones. On analyses, your favorite GIS continues to take heat from tools like Tableau among conference attendees.




Mapping Visible Human with Deep Zoom

Overall, a growing use of ArcGIS Server's publisher functionalities and a compelling body of story map templates leveraging its narrative text capabilities. E.g. Atlas for Geographic Variation within Medicare. On publishing, I suspect some researchers would like to see a Mapbox plugin for QGIS. Yes, you can render and upload maps from TileMill to your Mapbox account, but CartoDB has QgisCartoDB, where you can view, create, edit or delete data from QGIS to your CartoDB account (I needn't add that Python-powered QGIS remains a favorite among matplotlib-loving researchers).

PS: My ranking of how easy it is to connect to federal health datasets –
1. CDC (E.g. NCHS, Wonder, Health Indicators)
2. CMS (E.g. DNAV, Medicare – try Hospital Compare – Info, Spreadsheet, JSON)
3. HRSA (E.g. Datawarehouse).

Related:
* CDC’s GIS Resources
* CDC’s Submit Maps
* Hospital Referral Region (HRR) – A regional market area for tertiary medical care
* Health Savings Account (HSA) – A tax-advantaged medical savings account available to some taxpayers

++ While log analyses attest that mono-themed web maps provide a better user experience, given the nature of health data and the costs behind spinning off another mapp (yup, blended words to make a portmanteau), sometimes you just have to combine themes.

Written by Harsh

September 21st, 2015 at 8:10 pm

How We Balanced Proprietary With Opensource Software And Saved Tax Dollars, And You Can Too

without comments

It all began with a question – “Can we do without?”

GIS@NIH

Enterprise Architecture > Technology Architecture > Geographic Information System (GIS):
* Geographic Information System (GIS) Pattern
* GIS Desktop Brick
* GIS Virtual Globe Brick
* GIS IMS Brick
* GIS Web Service Brick

Related:
* GIS Market Study of Internet Mapping Server (IMS) – Summary – Requirements and Comparison Matrix (2006)

Meanwhile, Thirteen Years Later…

without comments

So, does it hold up?

The Map (GIS Growth Study) v. The Thing Mapped (Demographics, Plan)

PS: I smell a decentralist –

“A Caveat (from 2001)

Such a planning methodology of data collection and projection does have some intrinsic faults: it relies heavily on knowledge-based skills. It assumes that ‘correct solutions’ to social problems can be obtained from a scientific analysis of various data. It must be noted that a solution-driven approach and heavy reliance on physical sciences as opposed to social sciences, is inherently inaccurate since the ‘best planning answer’ is a non-existent variable, changing with time, society, culture, resource availability, etc. And there is always a danger of being consumed by this technique, and confusing the result for a solution.

The nature of this study involved making some basic assumptions about the way our study-area could evolve in the not-so-distant future. There have been doubts raised about the correctness of such a clinical technique wherein an urban settlement is ‘stripped’ of its various attributes, and these attributes then individually graded. Appreciation of the intricate complexity of human society, where each individual is a separate factor, is absent. Lack of importance to these inter-relationships is a flaw of such an analysis.

For example: in the current study, if we were to discover one other attribute, say a desert, how would it affect the final map? We would, using this approach, simply grade each cell one more time. Then we would add this new map to our list of maps, and calculate the new final map. However, we would fail to evaluate how the addition of a desert affects each of the other attributes individually.

But this flaw may not be as aggravated as it seems. Each cell gains its final value from all attributes. If in a hypothetical case, one could gather a ‘complete list of attributes’ that would impact future growth, and assign them ‘correct values’ (without even breaking them into distance-bands which are only for convenience), finally adding them in the ‘right equation’, one would come up with a case-specific fairly accurate growth forecast (however, even then, any sudden future changes would still get missed).

There have also been some other approximations:

* The integer weights assigned to attributes.

* Or, areas outside the study-area that exert significant impact on urban growth, but were ignored because of study limitations.

* Also, on examining the Cultural Points table, it is found that Cemetery was included as a row category. Cultural Points have been considered as having positive influence on future growth. But a cemetery would not have an entirely positive influence on urban growth. Furthermore, parts of UVA were used as cultural points. The university was also used as a major employer. Thus, there has been some overlapping. This results in disproportionate values for some cells.

But this study is an illustration more of a proactive planning approach, than an accurate projection of urban growth for an area. And even though limited in its effectiveness, any attempt to administer planning remedies would have to include some such non-arbitrary problem-solving technique.”

Conference Presentation: GIS TECH 201 – Mapping Mashups

without comments

New Media

without comments

Written by Harsh

June 25th, 2011 at 12:28 pm

Verizon iPhone or iNot?

without comments

Back in the summer of 2010, as one of the million proud owners of iPhone 4, I noticed a certain setting to switch phone carrier. That setting then portended the change we will see tomorrow. But should you take the bait? Assuming CDMA and GSM don't matter, here's part 1 of my guide:

There is a lot of spin around Apple's flagship cash cow, or as we have come to know it- the iPhone, which only recently represented about 43% of its overall sales. Not all of the coverage is positive (remember Foxconn?). Apple's growing pains also include a big lawsuit fight. But for those without a blind searing faith in Steve Jobs, the genius patriarch, the iPhone may very well be suffocating. If true, could Jobs be repeating his original sin? And if so, should your phone follow his sin to the grave?

iOS works better than Android out-of-the-box. To better understand the genesis of its famed usability and cool minimalism, watch Jobs' 2005 Stanford Commencement Address. If you decide to switch, be prepared to shell out monies on cool apps and media. From a quick glance, I paid around $750 over 2 years. To Apple. Not AT&T (that averaged around $2,400 for the same time). And remember that MP3s from Amazon, something you can't buy on your iPhone, tend to be less expensive and redownloadable – a big plus for some. And all that precious data would cost even more to put into MobileMe, Apple's own cloud solution, never mind the naysayers. So more additions to your ever burgeoning monthly bill (Tethering, Personal Hotspot, …).

iPhone’s Mythical Advantage: Apps

Apple still disallows Adobe Flash (or Oracle Java) from iOS. It appears to be more a business decision than a technology constraint, designed to control the sprawl of Flash-based mobile gaming websites where you could buy outside of Apple's walled garden. How this affects HTML5 gaming websites is still unfolding, but it certainly helps the lagging QuickTime in the meantime. In any case, it goes against the customer's best interests by taking away her choice to enjoy multimedia content in one of the industry's most prolific formats. But Apple has you covered with the most commonly used app: the browser. Mobile Safari, hands down, is the best mobile browser among the platforms that I tested, namely iOS, Android and Windows Mobile. For the GIS pros among you, Joben blogs about GIS apps for the iPhone. You can always find an increasing number at the App Store, like the iGIS.

Jailbreaking Folsom

So you switch and finally get that toy you were waiting for? Why jailbreak it? Jailbreaking the iPhone isn't worth the effort, even if it is legal. And even if not upgrading to the latest and greatest release (something that iTunes would handle seamlessly for you, but something that you can't always do with Cydia because Cydia often trots a step behind) is an acceptable risk, ask yourself if your precious data is too important to jailbreak. After all, you could brick your iPhone and quite possibly leave no way for iTunes to restore it. But if your phone data is not critical ahem, then you can add some developer functionalities by jailbreaking and escape the infamous iTunes bloat. Now jailbreaking could also introduce your spanking-new iOS to new viruses, but if you must, hop over to Cydia. If you need a copy of the old firmware during jailbreak, grab it from here. Once you jailbreak, remember to download a file browser or explorer, like iFunBox or iPhoneBrowser. You may also want to jailbreak if you want to install a phone firewall out of privacy concerns. After all, Apple did confess to collecting GPS data from iOS 3 and iOS 4 daily. Then again, if that is what propels you, why share your payment info with Cydia's marketplace (just asking)?

Some quick notes on iFunBox or iPhoneBrowser – You can’t watch your uploaded pics or videos, or play your uploaded songs in their native app, even if you upload them to the folders that the iPhone looks under, say //var/mobile/Media/DCIM/100APPLE/. This is because the iPhone, much like the Android, extensively uses SQLite as its Swiss Army database, and all your uploads need to be first registered in the database, say //private/var/mobile/Media/PhotoData/Photos.sqlite which links your IMG_0001.JPG or IMG_0002.MOV. Now there are Cydia apps like iFile that help add your photos, but videos are still no go. But if you are brave enough to try, download the SQLite Manager add-on for Firefox and test your luck.
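If you want to poke at that database yourself from the command line, here's a sketch – table names vary by iOS release, so dump the table list and schema before assuming anything:

# Run against the on-device file (or a copy pulled over with iFunBox)
$ sqlite3 /private/var/mobile/Media/PhotoData/Photos.sqlite ".tables"
$ sqlite3 /private/var/mobile/Media/PhotoData/Photos.sqlite ".schema"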

PS: More

Written by Harsh

February 9th, 2011 at 7:44 pm

Mashup on iPad

with 7 comments

OK, so I tested Google, Bing, Yahoo, ESRI, Openlayers and MapServer mashups on the iPad, and much like on the iPhone, the slippy drag-and-droll interface doesn't work. Except for one mashup. Take a guess?

Related:
* Safari
* WebKit

Written by Harsh

April 15th, 2010 at 10:50 pm

Webinar Series: GIS TECH 101 – Mapping Mashups

with 6 comments

Technology Division of the American Planning Association (APA) Awards for 2010

with 2 comments

Category 1: The award for the ‘Best Use of Technology to Improve a Plan or Planning Process’ goes to Marc Schlossberg‘s (University of Oregon) ‘Engaging Citizens in Active Transportation Planning with Mobile GIS‘ for its creative use of technology in improving planning processes.

Category 2: The award for the ‘Best Use of Technology for Public Participation’ goes to Michael Baker Jr.‘s ‘More For 1604 Social Media Program‘ for its good use of technology to enhance public involvement and participation in planning and decision making processes.

Category 3: The award for the ‘Best Use of Technology for a University Urban and Regional Planning Program’ goes to the School of Policy Planning and Development‘s (University of Southern California) ‘Multimedia Boot Camps‘ for its effective use of teaching with technology in preparing future planners for professional work.

Our Award Committee comprised elected members from the Division Leadership, namely Jennifer Evans-Cowley, Amiy Varma and yours truly. Join us at the award distribution ceremony at our Division Business meeting (National Planning Conference) on Monday, April the 12th (7 AM) in the Hilton New Orleans Trafalgar Room. Congratulations again to all our award winners!

Related:
* Technology Division of APA
* Planning & Technology Today

Written by Harsh

March 30th, 2010 at 3:48 pm

Interview: “Geographic Information Systems (GIS) – It’s Much More Than Google Maps – A Chat With GIS Experts”

without comments

Written by Harsh

March 18th, 2010 at 9:27 am

Neogeography 101: Word Association

without comments

Question:
‘Genre Books’ is to ‘Writer’
as
‘Web Maps’ is to …?

Choices:
• [a] iPhone […since the buzz is about it- the Paris Hilton of the technorati]
• [b] Paris Hilton […since the buzz is about her- the iPhone of the glitterati]
• [c] Geographer […since ESRI Press said so]
• [d] Programmer/Developer

Answer:
• If you answered [c], you have spent a lot of time around ESRI-championed web maps with 8 direction tags, a dogged insistence on not exploiting browser cache and a ridiculous north arrow on every map- never mind that so far no one has turned a browser upside down.

–π

Related:
• A Rose by Any Other Name
• Web Mapping
• The New Yorker

Written by Harsh

July 7th, 2007 at 11:30 am

Posted in Geography,GIS,Service


My Pick of FOSS4G 2007 Presentation Submissions

with one comment

An impressive summary of presentations, but my professional favorite would be ‘IBM DB2 Express-C: A Free Database for Open Source Spatial and XML Development’. Although something tells me that something else might be the crowd favorite.

Pi: Quiet Musing

On DB2 Express-C: It went free soon after its counterweights, Oracle XE and SQL Server Express, did last year, but its press “news” release has not found its way into major SIS publications. DB2’s continued advancements in the free spatial database market could only make things tighter for PostgreSQL+PostGIS.

–π

Related:
• Free and Open Source Software for Geospatial [FOSS4G] 2007
• ‘DB2 Express-C, the developer-friendly alternative’
• ‘Oracle XE and Geospatial Information Systems: An Interview with
Dennis Wuthrich of Farallon Geographics’

Written by Harsh

May 5th, 2007 at 11:12 am

Posted in GIS,OSGeo


Elite Systems Research Institute, Inc. [ESRI] et al

with 2 comments

This GCN article titled ‘Geospatial and the elite: Old-school geographic information systems still dig deep on mapping and analyses’ points to a tortuous debate within the traditional GIS industry, and the new industry push to remodel itself as solely an “enterprise class” industry while it continues to lose ground to an increasing domestication or democratization of GIS services.

Pi: Quiet Musing
ESRI: Elitist or Commonplace?

But this new industry push is not without some strategy confusion as old-school GIS faces its mid-life identity crisis without the “cool factor” spouse.

–π

Related:
• More

Written by Harsh

April 22nd, 2007 at 8:49 pm

Posted in GIS,Mashup


Google Earth [GE] @ Work

with one comment

This week I had the opportunity to listen to the Google Guys. Having earlier missed a similar opportunity for Jack Dangermond due to schedule conflicts, I made sure I was present at this seminar.

Pi: Quiet Musing

On display were the GE Enterprise solutions- Fusion, Server and Enterprise Client. With GE Enterprise, you can sign into multiple servers, grab the most accurate data from each and roll everything into one seamless experience. You may even squeeze your private globe onto a pocket-sized device and strut it out in the field. For a private domain, GE Enterprise can scale up to a healthy 250 concurrent users, or a little less than those supported by a default PostgreSQL 8.X on Windows.

One astounding statistic quoted was the vast number of users GE has been able to accumulate over its short life- approximately 200 million; reportedly many more than those by Google Maps, with nearly 80% for casual uses. And a surprising number, or so we are told, falls in the 45+ age group.

Approximations aside, here’s my take:

When you try to fathom the 200 million number, you are reminded yet again how badly ESRI, Intergraph, MapInfo, Autodesk et al missed the globe software bandwagon. And the traditional SIS companies still do not have a clear winner when it comes to 3D buildings and surface textures, despite counting 3DS Max and Maya. All that information is what users now expect from any cutting-edge globe software.

From the looks of it and the high-end price tag of over $100,000, Google has smelled blood- the fat inside some governments; ESRI and Intergraph can attest to that. If Google succeeds in this aggressive push, the traditional SIS companies will cede further into the background on data visualization; they are anyway planted firmly in the backseat with regards to a lot of casual uses.

So when you combine this push with GE user groups, the KML offer to OGC, KML-based searches and other enterprise solutions, then you can see why some traditions may be feeling nervous. Add to that the general perception about Google’s speed-of-innovation- ‘when you use a Google product, Google would innovate faster than the traditional SIS companies to support it’.

As I see it, that growing perception should be the biggest reason for the traditional industry’s nervousness.

–π

Related:
• Application: PortlandMaps
• Ogle Earth
• More

Written by Harsh

February 28th, 2007 at 10:17 pm

Posted in GIS,Virtual Globe


Follow Up [1]: ESRI Ketchup!

without comments

Following on the heels of E2, Google recently consolidated GE’s usergroups through some interesting collaborations with Wikipedia and Panoramio. These follow earlier deals with UNEP, NASA, USGS, ESA, Discovery, National Geographic et al.

These steps slowly push one other piece of software- ESRI’s ArcGlobe, part of the ArcGIS 3D Analyst extension- further away from all that is important. ArcGlobe was useful in that it eventually led to E2, but ESRI had much bigger plans- it was promoted for wide adoption in 3D data mapping and visualization.

Then Google came along, and ArcGlobe and all the shabby flyby animations and painstaking multipatches in ArcScene, also part of 3D Analyst, suddenly became embarrassing.

That leads me to my prediction of the week: all this will force ESRI to either lower the inflation-adjusted cost of its pricey 3D Analyst- currently marked at $2500, or absorb some of it into E2 or the desktop. Note that Google Earth Pro today costs a fraction at $400.

Pi: Quiet Musing
Fortius One‘s GeoIQ: A free simple Spatial Analyst?

–π

Related:
• ArcGIS Extensions
• More via Google Earth Links
• More

Written by Harsh

December 16th, 2006 at 10:01 pm

Posted in GIS,Mashup


ESRI Ketchup!

with one comment

After months of wild speculations and foot-dragging, ESRI finally released ArcGIS Explorer– twice as big as Google Earth and a shade shy. Here is why:

Google Earth [googleearth.exe]
+ Searches better
– Does not offer native support for popular spatial data types

ESRI ArcGIS Explorer [E2.exe]
+ Offers native support for popular spatial data types
– Clunkier navigation and interface

• Both show comparable spatial data displays and memory usages. I am pleasantly surprised by how consenting NASA, of World Wind fame, has been to all such uses, given the murky legal waters of the future when others start using this precedent to demand equal treatment.

Pi: Quiet Musing
ESRI ArcGIS Explorer: Adding content

Being true to the misplaced compulsions of most commercial companies, ESRI only lets you export your layers in E2’s markup language [*.nmf]. However, to piggy-back on the growing user community around GE and because ESRI has no current alternative to Google SketchUp, E2 allows you to import *.kml and *.kmz files. GE, on the other hand, also imports *.gpx and *.loc GPS files in its commercial flavor.
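Incidentally, if your data starts out as a shapefile and you want it in GE rather than E2, one route outside both products is GDAL/OGR's KML driver (a sketch; assumes a GDAL build with KML write support, and uses the sample feature class named in the Related list below):

# Convert a shapefile to KML with GDAL/OGR; e2.shp is the sample feature class mentioned below
$ ogr2ogr -f KML e2.kml e2.shp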

E2 can also create geoprocessing tasks, and styles and symbologies; export identification results; display attribute tables.

So what is the bottom-line: GE is better suited for consumers of spatial data, while E2 is targeted more at the creators and editors. And how close does E2 come to following the “if you are late, you better be better” mantra? Not quite, but then again, it is just a beta.

Now the waiting game begins for arguably the most innovative internet company in recent times, notwithstanding the acquired nature of GE and SketchUp- Google, to hit back after losing ground to Yahoo Maps– better driving directions planning, and Microsoft Virtual Earth– ability to add and save shapes, and browser-based GE-esque 3D and street level views.

–π

PS:
• I wonder how the good folks at Arc2Earth and Shape2Earth would maintain their rates of innovation in response?

Related:
• ArcGIS Explorer Overview Podcast
• ArcGIS Online Services
• Server Object Manager [SOM] Setup
• Sample *.nmf containing 1 point feature derived from feature class [e2.shp] in GCS_North_American_1983 coordinate system
• TerrainView
• Follow Up [4]: Graphic Software
• Follow Up [2]: Map Viewer and Google

Written by Harsh

November 29th, 2006 at 10:04 pm

Posted in GIS,Mashup


Follow Up [4]: Graphic Software

with 2 comments

Yet more evidence of acceptance of Google Maps and through it, of spatial relevance, by established publications:

• A Guide to Commuting and Readers’ Stories
• How Much Is Gas In Jersey?

In a related development, Microsoft continues to play catch-up with Google by acquiring GeoTango. However, with its “3D Internet Visualization- a truly open and web services-oriented solution”, GeoTango may just be the partner Microsoft needs for a tango.

–π

Related:
• ESRI ArcWeb Services
• NASA World Wind

Written by Harsh

December 28th, 2005 at 6:00 pm

Posted in GIS,Mashup


Links

without comments

It’s time to move these to del.icio.us:

• http://labs.google.com/ Google’s showcase
• http://next.yahoo.com/ Yahoo’s showcase
• http://research.microsoft.com/ Microsoft Research

• http://geoportal.kgs.ku.edu/googlemaps/ks_gm.cfm SDE+GMap
• http://traffic.poly9.com/ Traffic, weather and news glues for Google Maps

• http://opensource.nokia.com/ Nokia in opensource WAP

• http://www.webstyleguide.com/ Web style guide
• http://jibbering.com/faq/ comp.lang.javascript FAQ

• http://robin.sourceforge.net/ Browser-based desktop
• http://www.writely.com/ Browser-based word processor
• http://www.ktdms.com/ Document management system
• http://www.openfiler.org/ Browser-based network storage software distribution
• http://www.debugmode.com/wink/ Tutorial and presentation creation software

• http://www.lexisnexis.com/sourcelists/ Legal and public records
• http://www.issues2000.org/ Candidates on issues

• http://senseable.mit.edu/grazrealtime/ Mobile Landscape
• http://www.cbsnews.com/stories/2005/10/08/tech/main927858.shtml “Could cell phones stop traffic?”

–π

Written by Harsh

November 7th, 2005 at 6:02 pm

Posted in GIS,Web


Follow Up [3]: Graphic Software

with 3 comments

This week Yahoo released its own take on online mapping. Its new service includes both Flash and AJAX APIs coupled with the ability to geocode.

If you think about it, sooner or later this had to happen- developers finally mustering the courage to embrace arty Macromedia Flash for distributing spatial information in a big way, like Geocentric. Actually, Google has been using Flash for a different distribution for quite some time now. But this release by Yahoo and its under-1000 dollar price-tag should help Flash emerge as a more visible player in the online mapping game.

Did the earlier musings portend this?

–π

Related:
• Yahoo Developer Network
• GeoCool! Tutorial
• Google Local, MSN Virtual Earth, Amazon A9, AOL MapQuest
• Application: Google Earth
• Discussion Forum

Written by Harsh

November 3rd, 2005 at 6:32 pm

Posted in GIS,Mashup


Digital conversion of Flood Insurance Rate Maps (DFIRMs): White Paper

without comments

I have also added this post to this Wiki, in case you want to expound and guide those who follow – the post just helps me ensure the data doesn’t get spammed-out that easily:

Parent document copied with permission from the original white paper at the GIS Technical Center. The objective was to add notes reflecting procedural changes brought about by the integration of CITRIX WISE Tools. The initial notes were created during a 2005 DFIRM Production.

INTRODUCTION

In August 2003, the GIS Technical Center (WVGISTC) became a Cooperating Technical Partner with the Federal Emergency Management Agency. Our mission: to create digital flood themes from paper Flood Insurance Rate Map (FIRM) and Floodway Boundary and Floodway Map (FBFM) panels and to deliver the data in specified formats and with appropriate documentation. FEMA prepares Mapping Activity Statements (MAS) that outline the scope of work and deliverables for each county-based project. Final products are primarily seamless, countywide geospatial data files in the ESRI shapefile format, along with associated metadata.

According to FEMA (Michael Craghan, pers. comm.), the final vector products will have the following qualities:

1. A seamless county-wide dataset, with no gaps or overlaps
2. The lines and polygons end up in their real-world locations
3. There is no scale distortion (i.e. spatial relationships are maintained; if the paper map is 1”=500’, the digital version should be too).

The FIRM/FBFM features collected by WVGISTC are:

1. Base Flood Elevations (BFE-lines)
2. Cross Sections (Xsection-lines)
3. Flood Hazard Areas (polygons in final format)

The current Mapping Activity Statement for conversion of Jefferson and Berkeley counties specifies these deliverables:

1. Written certification that the digital base data meet the minimum standards and specifications.
2. DFIRM database and mapping files, prepared in accordance with the requirements in Guidelines and Specifications for Flood Hazard Mapping Partners (see references for citation); (S_Base_Index, S_Fld_Haz_Ar, S_BFE, S_XS, S_FIRM_Pan).
3. Metadata files describing the DFIRM data, including all required information shown in Guidelines and Specifications for Flood Hazard Mapping Partners.
4. Printed work maps showing the 1- and 0.2-percent-annual-chance floodplain boundary delineations, regulatory floodway boundary delineations, cross sections, and BFEs at a scale of 1:100,000 or larger.
5. A Summary Report that describes and provides the results of all automated or manual QA/QC review steps taken during the preparation of the DFIRM.
6. An ESRI shape file showing points where mapping problems are discovered during the digitizing process.

The following sections describe the procedures we follow to (1) prepare the base material for digitizing, (2) digitize features, (3) perform quality control, and (4) prepare final files using ESRI Arcmap 8.x software. This document assumes the user is skilled with ESRI Arcmap 8.x GIS software and has the ability to use reference materials. For help using ESRI Arcmap consult the help files or ESRI on-line support.

DATA COLLECTION PROCEDURES

Source Material (Source Material Inspection)
In the MAS cost estimation phase it is advantageous to become familiar with the FIRM and FBFM panels that cover the geographic extent of the county. In the back of our FEMA binder, there are 3 CDs with scanned panels for 10 high priority counties. The scanned or paper FIRM and FBFM panels should be visually inspected to check for insets and other format issues that may impact the amount of time it takes to digitize and attribute. At the on-line FEMA Flood Map Store search for FEMA issued flood maps. Follow the prompts for state, county, and community. This is one way to become familiar with the number of panels in a county and also to gather information on the effective date. The effective date on-line may be compared to the effective date on the paper panels to determine if we have the newest source. This is important because FEMA may have done some digital conversion in the counties we are digitizing; in Berkeley County, for instance, 2 of the panels were available in a digital CAD format. We received the CAD files (DLG) and copied the line vectors into our Arcmap project.

Base Layer Compilation
As part of the MAS, a ‘base map’ is obtained for georeferencing the FIRM and FBFM panels in a county. The MAS states: “the base map is to be the USGS digital orthophoto 3.75-minute quarter-quadrangles (DOQQs), or other digital orthophotography that meets FEMA standards.” Currently, we use the DOQQs to georeference the panels; when it becomes available, we will use the Statewide Addressing and Mapping photography. Countywide mosaics of the DOQQs are available either from CDs in our office or from the NRCS geospatial data gateway. Before beginning panel georeferencing, gather all the base map photography to cover the geographic extent of the county. Check DOQQ tiles and the ortho mosaic, if used, for agreement with each other. Also check the individual DOQQ tiles against the quarter quadrangle index to make sure that they are NAD83 and not NAD27. Finally, check to make sure that the spatial properties (coordinate system and projection) are defined for each quarter quad.

Georeferencing
FEMA provides scanned (TIFF) images of the paper FIRMs and FBFMs. Not all counties have separate floodway panels (FBFMs).

You can download county FIRMs and FBFMs from the FEMA Map Store. For Summers and Fayette Counties WV, aerial photographs from the SAMB were reprojected on-the-fly and used as base.

“ArcMap will not project data on-the-fly if the coordinate system for the dataset has not been defined. The coordinate system for any dataset can be defined using ArcCatalog” [ESRI Help].

It is advisable to load the aerials, FIRMs and FBFMs in different Raster Catalogs for quicker refreshes. It is best to start off with georeferencing the index and then nailing each semi-transparent panel in its approximate location through corner points [“spreading in all the right directions”]. Again, it is best to concentrate around your area of interest, in this case, the floodplain. It is also advisable to adjust the visible scale for the aerials for easier navigation.

Also, try to keep the clipboard empty since on aging systems that may cause incomplete raster refreshes. To avoid related spikes in CPU usage, you may adjust the display settings, page file size and Task Manager priorities accordingly. Also, if you have upgraded to ArcGIS 9.1 minus the patch and are having raster display problems, consult the following ESRI thread.

“In general, if your raster dataset needs to be stretched, scaled, and rotated, use a first-order transformation. If, however, the raster dataset must be bent or curved, use a second- or third-order transformation. Add enough links for the transformation order. You need a minimum of 3 links for a first-order transformation, 6 links for a second-order, and 10 links for a third-order” [ESRI Help].

Priority should be given to georeferencing individual panels over interlocking adjacent panels. Once satisfied with the adjustments and associated RMS Error, you may either update if using first-order transformation, or rectify if using higher-order transformation.

Note that first-order transformations update the *.TFW files; higher-order transformations also update the *.AUX files.

Once the groundwork is done, it takes less than half an hour per panel on a machine with the following specifications:

MS Windows 2000 SP4, Dell PWS 340, Pentium 4 CPU 1.7 GHz, 1.05 GB RAM

The steps taken to georeference the scanned FIRMs/FBFMs using Arcmap are:

1. Start an Arcmap project in the desired coordinate system. When using West Virginia DOQQs that will primarily be UTM 83 zone 17 (although Jefferson County was zone 18).
2. Add the DOQQs for the area of interest to the project.
3. Add the scanned TIFF to the project. The first panel to be georeferenced is the most difficult, because locating the correct spot on the base map photographs using the landmarks on the panel can be frustrating without a good reference system. One way to do this is to warp the panel index first—hence giving a rough estimate of panel location on the photographs. Alternatively, after warping one panel, work with adjacent panels to make landmark location easier.
4. Use “fit to display” on the georeferencing toolbar pull-down menu to move the TIFF to the current extent.
5. Use the georeferencing toolbar to create control points on the DOQQs and the scanned TIFF, using roads and other major features appearing on the FIRM.
6. It is recommended that “Auto Adjust” be checked on the georeferencing dropdown and that the layer being georeferenced is partially transparent. As control point links are added the scanned TIFF will be shifted over the DOQQs, making finding and adding additional links easier.
7. As you are adding control points, check the residual values and total RMS value in the link table. The goal is a total RMS value of 10 or less (units are mapping units, meters); see the sketch after this list for the arithmetic. After adding as many control points as possible it is sometimes useful to remove links that have very high residual values to improve the overall RMS value of the warp. Sometimes it is not possible to get an RMS below 10.
8. Concentrate control points around areas with flood features to improve the fit of areas that will be digitized. We recommend adding at least 10 sets of control points, although in some cases we used over 20 sets to improve fit.
9. Record the total RMS value of the transformation for each panel in a spreadsheet for the county.
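The total RMS value in step 7 is just the root of the mean of the squared link residuals. A rough sketch to recompute it from a link table saved as text – the residual column position is hypothetical, so adjust $NF to your layout:

# Assumes the residual is the last field on each line of links.txt (hypothetical layout)
$ awk '{ sum += $NF * $NF; n++ } END { printf "Total RMS = %.2f m\n", sqrt(sum / n) }' links.txt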

Vertical Datum Conversion [optional]
The estimate of the basic shape of the earth was inconsistent under the National Geodetic Vertical Datum [NGVD] 1929. This resulted in less accurate vertical data computations. Hence, it was decided to shift to the North American Vertical Datum [NAVD] 1988 that uses more reliable means for this estimation. Vertical Datum is required for DFIRM panels and the D_V_Datum table. Note that Vertical Datum conversion will not result in any change in flood depths.

Begin with 7.5-minute USGS Quadrangles. For Summers and Fayette Counties WV, this data was downloaded from the WV GIS Technical Center. Next, buffer your county by 2.5 miles to select all the Quad corners that fall inside the buffer. Then reproject the corner points thus selected to GCS_North_American_1983 and add XY coordinates. Now you have all the latitude/longitude coordinates required for orthometric height-difference computations using the National Geodetic Survey’s VERTCON software. Alternatively, you may use the Corps of Engineers’ CORPSCON software.

In VERTCON, if you have generated an input data file for your latitude/longitude coordinates, you would typically select the ‘Free Format Type 2’ option. Else, you would simply enter individual Station Names and associated latitude/longitude coordinates. VERTCON generates an output data file for use in the following calculations [Sample Worksheet].

Once Conversion Factors for all points have been determined, calculate the Average, Range and Maximum Offset for the Conversion Factors. If the Average is less than 0.1 foot, only a “passive” Vertical Datum conversion may be applied. Typically, when the Maximum Offset is <= 0.25 feet, a single Conversion Factor can be applied. Else, stream-by-stream Conversion Factors need to be applied.
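That bookkeeping is easy to script. A rough sketch, assuming (hypothetically) a whitespace-delimited extract of the VERTCON output with the Conversion Factor in feet as the last field of each line:

# vertcon_out.txt is a hypothetical extract; "Maximum Offset" is read here as the largest magnitude
$ awk '{ f = $NF; sum += f; n++
    if (n == 1 || f < min) min = f
    if (n == 1 || f > max) max = f }
  END {
    avg = sum / n; off = (-min > max ? -min : max)
    printf "Average=%.3f Range=%.3f MaxOffset=%.3f\n", avg, max - min, off
    if (avg < 0.1) print "passive conversion may be applied"
    else if (off <= 0.25) print "single Conversion Factor"
    else print "stream-by-stream Conversion Factors"
  }' vertcon_out.txt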

Reference:

1.
dd = 37.87511679110 degrees ~ 37 degrees
mm = .87511679110*60 = 52.50700746600 ~ 52 minutes
ss = .50700746600*60 = 30.42044796 ~ 30 seconds
==> 37 degrees 52 minutes 30 seconds
2. Appendix B: Guidance for Converting to the North American Vertical Datum of 1988
3. FIA-20 June 1992, Ada County OH
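The degree-minute-second arithmetic in reference 1 is mechanical enough to script; a one-liner that reproduces the worked example (positive coordinates assumed):

# Decimal degrees to DMS; reproduces the numbers in reference 1 above
$ awk -v dd=37.87511679110 'BEGIN { d = int(dd); m = int((dd - d) * 60); s = ((dd - d) * 60 - m) * 60
    printf "%d degrees %d minutes %.8f seconds\n", d, m, s }'
> 37 degrees 52 minutes 30.42044796 seconds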

Digitizing and Attributing Flood Features (Arcmap Project and File Specifications)
The UTM NAD83 projection, zone 17 is used for all West Virginia countywide flood mapping projects, with the exception of Jefferson County, which is zone 18. All features are initially collected as lines, although special flood hazard areas (e.g., Zone A, AE) are later converted to polygons. All features are drawn in one line shapefile and are later separated into the separate files required to meet MAS deliverables. For the purposes of drawing the flood feature lines we are using a line shapefile with the following attribute fields: Type (text, 10), Letter (text 2), Elev (long integer, precision 5). A description of the values we use in those fields is given below with each different feature type. In the first round of digitizing the shapefile was named All_Lines.shp, although in the future we may switch to using a county name in combination with employee name. Save edits frequently while digitizing, both by using the save edits button in Arcmap and by making backup copies of the file with Arccatalog.

Snapping
Begin an edit session and set up the snapping environment. Having snapping turned on is important to allow snapping of BFEs to the edges of flood hazard areas and for snapping the flood zone line segments together. We generally use a snapping tolerance between 7 and 10 pixels; this is a personal drawing preference and may vary from person to person. Use the appropriate snapping mode for each type of feature, i.e. ‘vertex’ for closing zone boundaries, ‘end’ for snapping arc ends together and ‘edge’ for snapping BFE lines to zone boundaries. Note that having ‘vertex’ snapping on can make it more difficult to accurately place BFE endpoints. The goal is clean intersections and BFEs that are snapped to flood hazard area boundaries.

Feature Collection
We generally draw flood map features in this order: floodway, flood zone, BFE, and cross-sections. Some counties have floodway features on a separate map (FBFM) from the FIRM. When working with two maps, collect floodways and cross sections from the FBFM and collect flood hazard zones, BFEs, and streams and channels from the FIRM maps. When working with a FIRM and a FBFM for a panel, it is recommended that lines are drawn from the FBFM first and the FIRM second. Features are to be seamless across panel boundaries, meaning when the same feature type occurs on both sides of a panel boundary, it should be drawn with no interruption. Adjacent panels digitized by different people should have the endpoints of flood feature lines snapped together in the final line shapefile. Be sure to check panel edges carefully for small flood zone polygons.

Panel Index and Base Index
Collection and attribution of flood features will be discussed in detail below. In addition to the flood features, we also submit 2 polygon index shapefiles to FEMA for each county. One of the shapefiles is called S_FIRM_Pan and is an index of the FIRM panels for a county. It is created by digitizing the lines on the scanned and warped county FIRM index. Only unincorporated areas are included in the panel index, not the incorporated areas. Secondly, an index of the “base” data for a county is to be provided in a polygon shapefile called S_Base_Index. In our case, the base data is the DOQQs. The S_Base_Index shapefile can be generated by clipping out the appropriate quarter quads from the DOQQ index, as sketched below. As with all other shapefiles we submit, both the S_FIRM_Pan and S_Base_Index shapefiles have a required attribute table format, discussed later in this document.
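For that clipping step, a GDAL/OGR equivalent sketch (not the Arcmap workflow the paper describes; the county boundary and DOQQ index file names are hypothetical):

# Clip the DOQQ index to the buffered county boundary to build S_Base_Index
$ ogr2ogr -clipsrc county_boundary.shp S_Base_Index.shp doqq_index.shp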

Flood Feature Symbology and Attributes

Floodways
The floodway is the channel of a river plus any adjacent floodplain areas. Floodways won’t be found on all panels. There are 2 different presentations of floodways on FEMA panels, which vary by county. In some counties, Berkeley for example, floodway symbology is included on the FIRM (Figure 1a). Other counties have separate floodway panels (FBFM, Figure 1b) and they must be added as a separate layer for floodway line collection.

In the initial drawing, lines defining the floodway are given the following attributes:

Type: floodway
Letter:
Elev:

Flood Hazard Areas
Flood hazard areas will also be referred to as ‘flood zones’ or ‘zones’ and they identify areas of different levels of flood risk. Flood zones are labeled on the FIRMs with letters; commonly used zone names are A, AE, B, C, X and they are shown on the paper maps with different densities of shading and text labels (Figure 2a). Zones are collected as lines, although later they will be converted to polygons. Digitizing proceeds from the inside out, i.e., collect the innermost zones first (In Figure 2a, the floodway would be collected first, and then AE, then X). Where an outer zone line flows into an interior zone line, they should be snapped (Figure 2c). Each line defining flood zones should be collected only ONCE. In areas where zone boundaries are coincident, only one line is collected (Figure 2c). There are zone division lines (Figure 2c and d, also referred to as gutter lines), which separate “special” flood hazard areas (generally zones A and AE). The zone division lines are thin white strips that are hard to see in the shaded zones. Gutter lines should be considered the border of those particular zones and treated as any zone boundary would be (i.e., collected once, continuous with other zone lines).

In the initial drawing, lines defining the flood hazard areas are given the following attributes:

Type: zone
Letter:
Elev:

Base Flood Elevations
Base Flood Elevation (BFE) is the height of the base (100-year) flood in relation to a specified datum. BFEs are symbolized on the FIRM panels with a wavy line (Figure 3a) but the feature is usually collected as a straight line (Figure 3b) that is snapped to the edge of the flood hazard area. If there is a significant bend in the BFE as drawn on the panel, then additional points may be added to follow the curve. Ends should always be snapped to the flood hazard area.

In the initial drawing, lines defining the BFEs are given the following attributes:

Type: bfe
Letter:
Elev: numeric elevation value on FIRM (e.g., 405)

Cross Sections
Cross sections (Figure 4a) show the location of floodplain cross sections used for computing base flood elevations. Cross sections are normally collected as a straight line, crossing and exiting the flood hazard area (Figure 4b). It is not necessary to follow bends in the cross section line that occur outside of the flood hazard area, nor is it necessary to extend the line through the hexagons at the end of the line symbol. If there are bends in the cross section within the flood hazard area, place only as many vertices as needed to maintain shape. Cross section lines should not be snapped to the flood hazard area lines, and instead should extend beyond them.

In the initial drawing, lines defining the cross sections are given the following attributes:

Type: xsection
Letter: letter of cross section, found in hexagon symbol (e.g., z)
Elev:

Channels and Streams
Channels and streams (Figure 5a and 5b) are collected in the flood hazard areas for QC purposes. No snapping is required and the stream or channel line should extend just beyond the flood hazard area when applicable. Streams are collected as single lines and both lines of a channel are collected.

In the initial drawing, lines defining the channels and streams are given the following attributes:

Type: channel or stream, as appropriate
Letter:
Elev:

POST-DRAWING QUALITY CONTROL AND ADJUSTMENTS

Visual QC Of Linework
After all lines are digitized and in a countywide, seamless file, a visual check is done to ensure that all features have been collected. The “Type” field in the line shapefile can be used to categorically symbolize the different feature types for the visual QC. Different colors and line styles can be used to represent separate feature types and the legend symbols can be saved as a layer file to preserve the symbol assignments. Turn on the labels for BFEs (elevation) and xsections (letter) and select a font style and color that allows them to be easily seen and checked in the visual QC process. Each person will probably have a different method of doing a systematic visual inspection. Some suggestions: a grid could be used to scan the linework, drainages can be followed, or the check can be done panel by panel. The important thing is to scan at a level such that all of the panel raster features can be identified and vectors examined. The person doing the QC should have a full understanding of what features are supposed to be collected and the symbology variations (e.g., floodways on FIRMs vs FBFMs). Any missed features should be digitized. This is also a good time to make note of any unusual problems or non-conformities in the scanned panels (e.g., zone type changes at panel or corporate boundary). This is the time to check that features are seamless across panel boundaries; BFEs and cross sections in particular should be checked at panel boundaries because there is no further geometric processing with these lines that will reveal continuity errors.

Spatial Adjustments (otherwise known as “Adjusting To The Real World”)
Post-drawing manipulation of lines to improve “fit” is hard-to-quantify and subjective. As stated in the introduction, FEMA requires the digital data to have a reasonably good fit to the “real world”. The “real world” in our case is the DOQQs. The scanned panels do not warp perfectly and in some areas the digitized lines will not overlay real world features very well. Current adjustment procedures involve these steps:

1. Compile the following layers in Arcmap:
a. DOQQs
b. Line shapefile with county-wide seamless flood features
c. 1:24,000-scale NHD centerline data layer (route.rch, in catalog unit coverages)
d. Problem point file (discussed in the next section)
2. Determine a systematic method for visually scanning the data (similar to that used in the visual QC) and adjust “Type” symbology for easy differentiation.
3. Begin a visual check of the linework, this time concentrating on how well the streams and channels drawn from the flood panels line up with the DOQQ and the NHD data. It is strongly recommended that you do not use the FIRM panels at this point, as they will increase confusion.
4. NHD data are a fairly good guide to where the flood panel waterways “should” be; however they are not perfect. While visually scanning the linework, check that the streams and channels collected from FEMA panels line up fairly well with the NHD data, while also checking to see that NHD data appears to overlay the hydrologic feature on the DOQQ. There is never going to be a perfect fit; the panel streams will wander back and forth over the NHD vectors. What you are looking for is areas of consistent difference that extend for a noticeable distance (again, hard to quantify). In Figure 6a, the blue dashed panel stream channel lines are not aligned with the DOQQ stream channel edges.
5. When areas of consistent difference are found, ALL the linework surrounding the area is shifted at the same time, until the panel stream has a better fit to the real-world stream. This is accomplished by first breaking all the continuous flood zone, floodway, and stream lines at about the same point on two imaginary lines that run perpendicular to the “flow,” one at each end of the area to be shifted. Then the cut lines are selected, along with any BFEs or cross sections in the area (Figure 6b), and all the selected features are moved until the streams are better aligned (Figure 6c). The adjustment is accomplished mostly with the move tool in Arcmap, although on occasion the rotate tool may be used to improve the fit of the selected lines to the DOQQ.
6. Lastly, snap the dangling ends together and smooth out the curves of the reattached lines by moving or adding vertices (Figure 6d). This is the only time lines should be moved or stretched individually, since doing so distorts proportions.

Mapping Problem File
One of the required deliverables is a point file indicating areas where certain “problem” situations arise. The problem point file can be edited at the same time as the adjustments are being performed. FEMA-defined mapping problems are outlined in the draft Technical Memo dated October 3, 2003, a copy of which is in the FEMA project notebook; they are also listed below for convenience. A point shapefile is created for each county with the following fields: Error_type (text, 10) and Descrip (text, 75).

Error_type Descrip
BFE Base Flood Elevation problem
XSECT Cross-section problem
SFHA-PAN Special Flood Hazard Area changes at map panel edge
SFHA-BDY Special Flood Hazard Area changes at a political boundary
SFHA-STR Special Flood Hazard Area different on each side of a stream
SFHA-OTH Other Special Flood Hazard Area problems
STR-FW Stream outside of floodway
STR-SFHA Stream outside of Special Flood Hazard Area

As of this writing, we have primarily found the STR-SFHA, STR-FW, and SFHA-BDY error types. Note that errors should be determined AFTER lines are adjusted in a given area, as the adjustment may correct the problem. Place a point in the shapefile at the location where the problem occurs; in Figure 7 the pink point indicates a location where the stream (orange) is outside of the flood hazard area (blue line).
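For anyone scripting this step rather than using the Arccatalog GUI, here is a minimal ArcPy sketch of creating the problem point file with the two fields described above and logging one point. The paths, the projection (UTM Zone 17N NAD83), and the sample coordinates are hypothetical:

import arcpy

# Create the county problem point shapefile (path and projection are
# examples only; use the UTM zone appropriate for your county).
pts = arcpy.CreateFeatureclass_management(
    r"C:\fema\jefferson", "problem_points.shp", "POINT",
    spatial_reference=arcpy.SpatialReference(26917)).getOutput(0)
arcpy.AddField_management(pts, "Error_type", "TEXT", field_length=10)
arcpy.AddField_management(pts, "Descrip", "TEXT", field_length=75)

# Log a problem point found during the adjustment pass.
with arcpy.da.InsertCursor(pts, ["SHAPE@XY", "Error_type", "Descrip"]) as cur:
    cur.insertRow([(545120.0, 4345560.0), "STR-SFHA",
                   "Stream outside of Special Flood Hazard Area"])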

POLYGON CREATION

The flood hazard zones and floodways must be converted to polygons for final processing. Select all lines with a “Type” of zone or floodway and export them to a separate line shapefile. Topological checks will be performed on the line file before polygons are built. Topology work can only be done in Arcmap via the geodatabase model, so import the line shapefile into a geodatabase feature class under a feature dataset (a feature dataset is required to create a topology). If you are starting with a geodatabase feature class, use Export | Geodatabase to Geodatabase in Arccatalog to transfer the feature class into the dataset.
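As a sketch of how this export/import could be scripted with ArcPy (the procedure above uses the Arcmap/Arccatalog GUIs; the paths, names, and “Type” values are hypothetical):

import arcpy

lines = r"C:\fema\jefferson\flood_lines.shp"  # countywide seamless line file

# File geodatabase and feature dataset (a feature dataset is required
# before a topology can be created); reuse the shapefile's spatial
# reference, which is UTM NAD83 in this project.
gdb = arcpy.CreateFileGDB_management(r"C:\fema\jefferson", "flood.gdb").getOutput(0)
sr = arcpy.Describe(lines).spatialReference
fds = arcpy.CreateFeatureDataset_management(gdb, "Flood", sr).getOutput(0)

# Pull only the flood zone and floodway lines into the dataset.
arcpy.Select_analysis(lines, fds + "\\zone_fw_lines",
                      "\"Type\" IN ('zone', 'floodway')")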

Add a new topology under the feature dataset. Set the cluster tolerance relatively high (0.1 was used in the first two MAS, which corresponds to 10 centimeters on the ground) to reduce the number of small pieces formed. Only the flood hazard zone lines feature class will participate in the topology. The topology rules used are: must not have pseudo-nodes, must not have dangles, and must not self-overlap. After creating the topology for the lines, validate it. Bring the validated topology layer into an Arcmap project to view the errors found. Use the topology tools to analyze and correct all errors before proceeding. See the Topology section in the ArcGIS book “Building a Geodatabase” for help.
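A sketch of the same topology setup in ArcPy, reusing the hypothetical paths from the snippet above; the rule-name strings follow the ArcGIS topology geoprocessing tools, so verify them against your version:

import arcpy

fds = r"C:\fema\jefferson\flood.gdb\Flood"   # feature dataset from above
fc = fds + "\\zone_fw_lines"                 # flood zone/floodway lines

# 0.1 cluster tolerance as described above; xy and z ranks of 1.
topo = arcpy.CreateTopology_management(fds, "flood_topo", 0.1).getOutput(0)
arcpy.AddFeatureClassToTopology_management(topo, fc, 1, 1)
for rule in ("Must Not Have Pseudo-Nodes (Line)",
             "Must Not Have Dangles (Line)",
             "Must Not Self-Overlap (Line)"):
    arcpy.AddRuleToTopology_management(topo, rule, fc)
arcpy.ValidateTopology_management(topo)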

After validating the topology and fixing all topological errors, convert the lines feature class to a polygon feature class. To do this, right click on the feature dataset in Arccatalog and select ‘new’ and then ‘polygon feature class from lines’. A wizard helps with the conversion; accept the default tolerance.
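The wizard's conversion can also be scripted with the Feature To Polygon geoprocessing tool; a hedged sketch, continuing the hypothetical paths used above:

import arcpy

fds = r"C:\fema\jefferson\flood.gdb\Flood"
fc = fds + "\\zone_fw_lines"   # validated flood zone/floodway lines

# Build polygons from the lines; omitting the tolerance argument
# accepts the default, as the wizard does.
arcpy.FeatureToPolygon_management(fc, fds + "\\flood_polys")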

Once the polygon layer is created, create a new topology for it. Use the default cluster tolerance, which is very small. Only the polygon feature class participates in the topology, and the rules are: must not overlap and must not have gaps. Bring the validated polygon topology into Arcmap as with the line topology. Ideally, there will be no errors in this topology. After checking for and fixing topological errors, another check should be done for sliver polygons. This can be done by viewing the polygon attribute table in Arcmap and sorting the table based on the shape_area attribute field in ascending order. Examine the smallest polygons to be sure they are not slivers.
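The sliver scan can be done with a quick cursor pass instead of sorting the attribute table by hand; a sketch with the same hypothetical names:

import arcpy

polys = r"C:\fema\jefferson\flood.gdb\Flood\flood_polys"

# List the ten smallest polygons so suspected slivers can be inspected
# in Arcmap before attribution.
rows = sorted(arcpy.da.SearchCursor(polys, ["OID@", "SHAPE@AREA"]),
              key=lambda row: row[1])
for oid, area in rows[:10]:
    print(oid, area)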

Next, the polygon flood hazard features need to be attributed. This can be done in the geodatabase by setting up a domain so that attributes can be chosen from a drop-down list. Overlay the flood hazard polygon layer with the FIRM/FBFM panels and attribute the polygons. It saves time if the file you are using to add attributes has the same column structure as the required final product (see Table 2). In the future we hope to have template files available for use, so that the required structure will already be in place. We have tried merging with a template file in the geodatabase, but that resulted in features shifting; this process is still being developed.
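A sketch of setting up such a coded-value domain in ArcPy; the zone list shown is a hypothetical subset of the D_Zone table, so use the full list from Appendix L (pg L-452), and the FLD_ZONE field from Table 2 is assumed to exist already:

import arcpy

gdb = r"C:\fema\jefferson\flood.gdb"
polys = gdb + "\\Flood\\flood_polys"

# Coded-value domain so FLD_ZONE values are picked from a drop-down
# list while editing.
arcpy.CreateDomain_management(gdb, "FloodZone", "FIRM flood zones",
                              "TEXT", "CODED")
for zone in ("A", "AE", "AH", "AO", "X"):
    arcpy.AddCodedValueToDomain_management(gdb, "FloodZone", zone, zone)

arcpy.AssignDomainToField_management(polys, "FLD_ZONE", "FloodZone")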

PREPARATION OF DELIVERABLES

For the final deliverables, the flood features collected in the line shapefile must be processed into separate shapefiles with specified fields. Table 1 gives an overview of the shapefile names and contents. Attribute fields have required field types (e.g., text, number) and sizes; details can be found on the pages of Guidelines & Specifications for Flood Hazard Mapping Partners Appendix L referred to in Table 1. These pages from Appendix L have been printed out and are in the guidelines/technical section of the FEMA project binder. Table 2 provides details on the required fields.

Table 1. Deliverable shapefile description
Shapefile Name Contents Pages in Appendix L
S_Base_Index Grid of base data, in our case, DOQQs. Polygons. L-270 to L-271
S_FIRM_Pan Grid of FEMA panels; digitized from county panel index. Polygons. L-286 to L-290
S_Fld_Haz_Ar Flood hazard zone polygons L-291 to L-293
S_BFE Base flood elevation lines collected from FEMA panel L-272 to L-273
S_XS Cross section lines collected from FEMA panel L-350 to L-354

Table 2. Shapefile attribute field requirements
Shapefile Field Name What Goes In It
S_Fld_Haz_Ar (polygon) FLD_AR_ID A unique feature number. Can be copied from FID field. [Text, 11]
FLD_ZONE Flood zone from FIRM. Use values in FLD_ZONE field of Table D_Zone on pg L-452 of Appendix L. [Text 55]
FLOODWAY “FLOODWAY” if polygon is a floodway. Null if not. [Text, 30]
SFHA_TF “T” if the zone begins with A; “F” for any other zone (true or false). [Text, 1]
SOURCE_CIT 11-digit FIRM panel number that majority of feature is on. If polygon crosses many panels, use downstream panel. [Text, 11]

S_XS (line) XS_LN_ID A unique feature number. Can be copied from FID field. [Text, 11]
XS_LTR Upper case letter(s) of cross-section from FIRM. [Text, 12]
XS_LN_TYP “LETTERED” in all cases. [Text, 20]
WTR_NM Name of water feature (stream) cross section is on. From FIRM or FIS. [Text, 100]
SOURCE_CIT 11-digit FIRM panel number cross section is on. If on two, list panel with majority. [Text, 11]

S_BFE (line) BFE_LN_ID A unique feature number. Can be copied from FID field. [Text, 11]
ELEV Numeric elevation of BFE, from FIRM [Double, Prec. 13, Scale 2]
LEN_UNIT “FEET” in all cases. [Text, 20]
V_DATUM Vertical datum of panel. Listed on panel, and values must come from the V_DATUM field of the D_V_Datum table on page L-444 of Appendix L. [Text, 6]
SOURCE_CIT 11-digit FIRM panel number BFE is on. If on two, list panel with majority. [Text, 11]

S_Base_Index (polygon) BASE_ID A unique feature number. Can be copied from FID field. [Text, 11]
FILENAME Name of DOQQ or other image file used as base map. [Text, 50]
BASE_DATE Date image was captured. For DOQQs can be found in header file. [Date]
SOURCE_CIT BASE1 or other abbreviation that corresponds to metadata [Text, 11]

S_FIRM_Pan (polygon) FIRM_ID A unique feature number. Can be copied from FID field. [Text, 11]
FIRM_PAN FIRM panel number. [Text, 11]
EFF_DATE Effective date on FIRM panel. [Date]
SCALE Scale of FIRM panel. If map scale on FIRM is 1” = 500’, then scale is 6000. Multiply feet by 12 to get true scale. [Text, 5]
SOURCE_CIT 11-digit FIRM panel number. [Text, 11]
BFE Shapefile Creation

From the line shapefile that was used for digitizing, use the Type field to select and export BFEs to a separate shapefile. Either modify the resulting shapefile to match the required format, or merge the digitized lines with a pre-formatted template. The output merge file can be a geodatabase feature class, which allows an attribute domain drop-down to be used for the SOURCE_CIT field. Use Arcmap editing tools to assign attributes to the fields shown in the preceding table. BFE lines are submitted in the S_BFE shapefile.

Cross-section Shapefile Creation
From the line shapefile that was used for digitizing, use the Type field to select and export cross sections to a separate shapefile. Either modify the resulting shapefile to match the required format, or merge the digitized lines with a pre-formatted template (a scripted sketch of this pattern for both deliverables follows below). The output, as with the BFEs, can be a geodatabase feature class. Attribute domains can be created for the XS_LTR, XS_LN_TYP, WTR_NM (a list of stream names is available in the county FIS book), and SOURCE_CIT fields. Cross-section lines are submitted in the S_XS shapefile.
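Both exports follow the same select-and-merge pattern; a hedged ArcPy sketch is below, with hypothetical paths, template locations, and “Type” values. Append with NO_TEST copies features into the template schema without strict field matching, so attributes are then filled in with the Arcmap editing tools:

import arcpy

lines = r"C:\fema\jefferson\flood_lines.shp"

# Export each feature type and append it into a copy of its
# pre-formatted template shapefile.
for type_value, template, out in (
        ("bfe", r"C:\fema\templates\S_BFE.shp", r"C:\fema\jefferson\S_BFE.shp"),
        ("xsection", r"C:\fema\templates\S_XS.shp", r"C:\fema\jefferson\S_XS.shp")):
    arcpy.Copy_management(template, out)
    selected = r"C:\fema\jefferson\tmp_" + type_value + ".shp"
    arcpy.Select_analysis(lines, selected, "\"Type\" = '%s'" % type_value)
    arcpy.Append_management(selected, out, "NO_TEST")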

Certification
One of the required deliverables relating to the base map (DOQQs in our case) is a “written certification that the digital data meet the minimum standards and specifications.” A text file with the following statement was created:

“This text file serves as written certification that the base map digital data meet the minimum standards and specifications in Guidelines and Specifications for Flood Hazard Mapping Partners Appendix K. On page K-42 (Section K.4.1.1) of that document it is written: ‘The most common form of raster image map is the digital orthophoto, especially the standard Digital Orthophoto Quadrangle (DOQ) produced by the U.S. Geological Survey.’ DOQQs were used as the base map for georeferencing scanned paper FIRMs and for visually locating features of interest.”

APPENDIX A. REFERENCE MATERIALS

• DOQQ: refer to the DOQQ metadata and the Digital Orthophoto Standards.
• Guidelines & Specifications for Flood Hazard Mapping Partners: Appendix L is the primary document of interest.
• NHD: refer to the NHD website. Retrieve the 1:24,000 NHD coverages to use as reference.
• FEMA flood documents in the black FEMA 3-ring binder.
• Arcmap editing and geodatabase manuals.
• Mapping Activity Statement documents – be sure to understand all deliverables.

APPENDIX B. SAMPLE METADATA FILE

Refer to the original DIGITAL CONVERSION OF FLOOD INSURANCE RATE MAPS.

APPENDIX C. SAMPLE QA/QC REPORT

Summary Report
QA/QC Review Steps During Digital Conversion of Flood Insurance Rate Maps
Mapping Activity Statement 2003-02, West Virginia GIS Technical Center
Prepared 4/15/04

The following QA/QC checks were performed during the digital conversion of Flood Insurance Rate Maps by the West Virginia GIS Technical Center (WVGISTC):

1) Source Material Inspection
a) Visually reviewed scanned panels received in .tif format; compared with printed paper maps to check for completeness

2) Base Layer Compilation/Verification
a) Used a vector quarter quad index certified by WVGISTC to confirm that the USGS Digital Ortho Quarter Quads (DOQQs) were in the UTM NAD83 projection; DOQQS were used for the georegistration base map
b) Checked the spatial integrity of a county-wide ortho mosaic (used as a reference; obtained from the NRCS Geospatial Data Gateway)

3) Georegistration of Scanned Panel Source Material
a) Ensured data were correctly referenced to the UTM coordinate system
i) Set Arcmap software data frame projection to UTM NAD83, Zone 17 or 18, as appropriate
ii) Georeferenced scanned panels to real-world coordinates using DOQQs to establish reference links
(1) The mean RMS value for warped panels was 5.63 meters (mapping units). This was the best attainable georeferencing that could be accomplished without stretching features and impacting length relationships
iii) Re-warped portions of scanned panels in areas of poor fit to attain a better visual real-world correlation
b) Checked that the scale of warped raster (.tif) and original paper maps were compatible
i) Plotted georeferenced FIRMs at the same scale as the paper maps; conducted manual ruler measurements on the paper maps in comparison to the plotted data to confirm accuracy of feature location and length relationships

4) Digitizing of Flood Features
a) Digitized SFHA, BFE, and cross section features from the georeferenced panels as line feature types
i) SFHAs and Floodways were digitized first; BFEs and Xsections were digitized next and BFEs were snapped to AE zone boundaries (Arcmap snapping tolerance set to 10 pixels)
ii) Streams and channel banks were partially digitized as additional reference features
b) Systematically visually scanned collected vectors and compared them with underlying georeferenced paper flood maps
i) Checked that character of features was maintained
ii) Checked that required features were collected
c) Edgematched features on adjacent panels
i) Checked that features were snapped seamlessly at panel boundaries

5) Spatial Adjustments
a) National Hydrography Dataset (NHD) vector stream centerlines were used to assist in identifying real-world (DOQQ) stream position
b) Proportional piecewise adjustments
i) Adjusted all features (SFHAs, BFEs, cross sections) in small sections of the floodplain when:
(1) the DOQQ stream was not located within the SFHA or
(2) there was a visibly constant difference between location of the DOQQ stream and location of the digitized stream
ii) Attempted to bring the digitized FIRM stream in line with the NHD stream or the stream on the ground, if it was visible on the DOQQ
iii) Used Arcmap editing functions such as line moving and rotating
c) Created a point shapefile to mark location of “mapping problems” as defined in the FEMA technical memo dated October 3, 2003. Examples of problems found:
i) Stream outside of SFHA
ii) Stream outside of floodway
iii) SFHA changes at political boundary

6) Topology
a) Used the ArcGIS geodatabase model and topology rules on SFHA and floodway line features
i) Corrected pseudo-nodes, dangles, and self-overlapping lines
b) Generated polygons from SFHA and floodway line features and used the ArcGIS geodatabase model and topology rules for polygons
i) Confirmed there were no polygon overlaps or gaps
ii) Removed sliver polygons

7) Feature Attribution
a) Reviewed technical memo and MAS to format the 5 required shapefiles (S_Base_Index, S_FIRM_Pan, S_Fld_Haz_Ar, S_BFE, S_XS)
i) Checked that file names, attribute names, types and sizes meet specs
b) Checked that correct attributes were assigned to digitized flood features
i) Completed a systematic visual scan of vector flood features overlaid with georeferenced panels; used symbology variation and labeling to confirm proper attributes had been applied
ii) Checked that valid domain values were used in attribute table columns

8) Map plot for final visual inspection and scale check

APPENDIX D. FILE BACKUP AND NAMING CONVENTIONS

File Backup
Everything pertaining to the current flood mapping project should be backed up to Vesta. This includes warped panels, line shapefiles, and other reference documents.

A FEMA backup folder is set up at this location:

\\Vesta\FEMA_BkUp

It is visible from the TechCenter network under Vesta and is shared openly. This is where all the files for a MAS in progress should be stored. Use sensible file and folder names to help everyone identify the pieces of the project.

A final backup of everything was kept in this location:

\\Ra\TechCenter\Projects\FEMA

It is recommended that drawing shapefiles be backed up every time they are changed; a file versioning system may be preferable to overwriting the same file each time.

Naming Conventions/Path Structure
FEMA has requested that we name the metadata files in this format:

metadata_countyname.txt

So, for example, the metadata files submitted for Jefferson County were named:

metadata_jefferson.txt
metadata_jefferson.html
metadata_jefferson.sgml

On the CD containing the final deliverable files, this is the requested structure:

\jefferson\Arcshape\
\jefferson\Ortho_photos\
\jefferson\Document\
\jefferson\RFIRM\

The county name behind the first backslash will change for each countywide project completed and submitted. The Arcshape folder contains the S_Base_Index, S_FIRM_Pan, S_Xs, S_BFE, and S_FLD_Haz_Ar shapefiles, plus the problem shapefile. The Ortho_photos subdirectory contains the DOQQs or other imagery used for the base map. The document subfolder contains the metadata, QA/QC report, and base map certification. I made subfolders for each of those items under the document folder. The RFIRM folder contains all the georeferenced panels.

Related:
* Digital conversion of Flood Insurance Rate Maps (FIRMs)
* WIKI: Edit Lock Schema

Written by Harsh

July 7th, 2005 at 10:03 am

A Rose by Any Other Name

with 4 comments

The definition of GIS has evolved from ‘Geographic Information System’ to ‘Geospatial Information System’. It is now time for it to take the next logical step, to ‘Spatial Information System’. My earlier post wrestled, well not quite, for a truer understanding of “GIS” given the advent of non-traditional spatial software. Since then, I have become convinced that spatial information is better understood by snapping the links that tie, and thus confine, it to geography.


Figure: Inside Space – An Unventured “GIS” Frontier? Magnetic Resonance Image (MRI) of my right wrist.

It is therefore disappointing that some professionals continue to look at spatial information from behind the narrow screens of geography. Hopefully, with the entry of non-traditional market forces, this viewpoint will be shaken to the point of abandonment. A truer appreciation of spatial information will require a visual mindset in which all spatial components of information are addressed.

Related:

• Front, Side and Top View: Construct two valid isometric projections


• Find the missing piece

Written by Harsh

June 10th, 2005 at 7:04 pm

Follow Up [2]: Map Viewer and Google

with one comment

Written by Harsh

May 27th, 2005 at 6:40 pm

Posted in GIS,Mashup

Tagged with ,

Follow Up [1]: Graphic Software

with one comment

It is good to know that some professionals concur with the views expressed in my earlier post on the potential of graphic software like Macromedia Flash. One comment links to an impressive demonstration of this largely untapped potential.

Anyway, two companies whose product GUIs I enjoy interfacing with, Adobe and Macromedia, announced their merger earlier this month.

Both their flagship products have become industry standards for exchanging documents and creating experience-rich applications across platforms. The largely unused spatial potential within Macromedia Flash, combined with the increasingly widespread use of Adobe PDF/SVG maps and the sprouting of exciting derivatives like geoPDF, pstoedit, and GSview, makes this merger important to how spatial information will be exchanged in the near future.

Written by Harsh

April 28th, 2005 at 6:01 pm

Posted in GIS,Mashup

Tagged with , ,

Follow Up [1]: Map Viewer and Google

with one comment

A quick note on the happenings at Google: yesterday, Google added satellite imagery to its mapping. For speedy display, 256px*256px JPEG image-tiles, prepared at different zoom levels and each weighing around 30 KB, come in handy, coupled with some nifty AJAX.

Such a drag-and-drool tiling paradigm, although practised for some time now by website developers to load large images, represents a refreshing out-of-the-box approach when applied to internet mapping. The GET HTTP request method uses a cryptic naming convention to fetch these image-tiles from a preexisting palette, like so:

http://kh.google.com/kh?v=1&t=TILE…

WHERE, in one instance, TILE zooms closer from [tqtsqr] to [tqtsqrtssssrq] and closer still to [tqtsqrtssssrtrttr].
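The key appears to be a quadtree address: each letter after the root ‘t’ picks one quadrant of the parent tile, so a longer string means a closer zoom. A sketch of the idea in Python is below; the quadrant letters (q = north-west, r = north-east, s = south-east, t = south-west) are an assumption based on how these keys were commonly reverse-engineered at the time, not a published spec:

def tile_to_key(x, y, zoom):
    """Convert a tile column/row at a zoom level to a 'kh' style key.
    Tile (0, 0) is the north-west corner; zoom 0 is the single root tile."""
    key = "t"  # root tile covering the whole world
    for i in range(zoom - 1, -1, -1):
        xbit, ybit = (x >> i) & 1, (y >> i) & 1
        # Pick the quadrant letter for this level (assumed mapping).
        key += {(0, 0): "q", (1, 0): "r", (1, 1): "s", (0, 1): "t"}[(xbit, ybit)]
    return key

print(tile_to_key(0, 0, 5))  # a 6-letter key, the same length as [tqtsqr]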

Unlike its regular mapping, where Google predictably uses GIF image-tiles sized at 128px*128px, Google’s preference for JPEG over the competing PNG format for its satellite imagery is worthy of a second glance: as is common knowledge, JPEG supports millions of colors but is infamous for its lossy compression. PNG, on the other hand, is lossless while supporting millions of colors. However, PNG is currently not supported by all browsers and, depending on compression settings, may end up weighing more.

–π

Written by Harsh

April 5th, 2005 at 7:29 pm

Posted in GIS,Mashup

Tagged with ,

ArcIMS FAQs

without comments

I have also added this post to this Wiki, in case you want to expound and guide those who follow – The post just helps me ensure the data doesn’t get spammed-out that easily:

  • I am getting a ‘jsForm.htm not found’ error? If you are using Internet Explorer, first make sure you have the latest version of that browser. Then remove the Arcims site from your browser favorites, reopen the browser and try again.

  • How do I import Arcims maps inside ESRI Arcmap? If you have Arcmap 9.x, you can import Arcims maps by connecting to the services of an Arcims server. In Arccatalog 9.x, simply click on ‘GIS Servers’ to add the Arcims server and type in its URL. Note that this does lead to a noticeable performance drop.

  • How do I accurately rescale the map when that functionality is provided? True scale depends on monitor resolution, the default being 96 DPI (Dots Per Inch). To make sure that your monitor is configured correctly in MS Windows, check Display Properties–>Settings–>Advanced–>General. Note that when the map is rescaled to, say, 1:12000, 1 inch on the map should represent 12,000 inches on the ground (see the sketch after this list). Also note that you can use the Esc key to stop the map from rescaling at any time. Refer to Map Scales for related information.

  • I click on the print button but nothing happens? Make sure pop-ups are allowed for your Arcims site, then try the Print Tool again.
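As promised above, a small sketch of the scale arithmetic (plain Python, nothing Arcims-specific): at 96 DPI one pixel spans 1/96 inch, so a 1:12000 display must cover 12000/96 = 125 ground inches per pixel.

def meters_per_pixel(scale_denominator, dpi=96.0):
    """Ground distance covered by one screen pixel at a given map scale."""
    ground_inches = scale_denominator / dpi  # inches on the ground per pixel
    return ground_inches * 0.0254            # convert inches to meters

print(meters_per_pixel(12000))  # ~3.175 m per pixel at 96 DPI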

Related:
* ESRI Support
* WIKI: Edit Lock Schema

Written by Harsh

January 6th, 2005 at 1:43 pm

Posted in Education,GIS,IMS,Technology,Web

Tagged with , ,

Digital conversion of Flood Insurance Rate Maps (DFIRMs): Summary

without comments

I have also added this post to this Wiki, in case you want to expound and guide those who follow – The post just helps me ensure the data doesn’t get spammed-out that easily:

My notes reflect procedural changes brought about by the integration of DFIRM Production Tools.

SCHEMA

  • Request existing geodata, like new political boundaries and road names, from the jurisdiction(s) for use as a base map. Base map geodata must NOT be older than 7 years.
  • Request GEOPOP from the MOD team and use it to create an empty DFIRM geodatabase. Use existing political boundaries for its geographic extent.
  • GEOPOP creates 3 table types: S (Spatial), L (Lookup), and D (Domain). Edit the main lookup tables:

L_COMM_INFO (community information)
L_SOURCE_CIT (source citation)
L_WTR_NM (hydrographic feature information: stream names, etc.)
L_STN_START (properties of starting points for stream distance measurements)

  • Create panel index and data catalogs
  • Scan, georeference, and rectify geodata at the recommended scale to capture the required floodplain features. Refer to the FEMA MSC for full-sized PDFs of FIRM panels.

LINKS

Related:
* HEC-RAS Online Help
* WIKI: Edit Lock Schema

Written by Harsh

December 27th, 2004 at 8:12 pm

Graphic Software

with one comment

The discussion “So …How About That Election Coverage?” at Directions Magazine makes you think about graphic software, like Macromedia Flash, that caters to small-time spatial needs.

Such graphic software, minus the topology and advanced query benefits, functions well as a basic spatial tool and comfortably serves data over the web with a “fair” amount of interactivity.

Does this make your overpriced IMS overhyped and overblown too?

[my comment]
“Macromedia Flash fills this niche quite well, as demonstrated [here]. And as the market seems to indicate, it does that [while] satisfying more customers than an overly fancy GIS would. [This] reminds me of the MapQuest survey in which polled customers expressed great contentment with their level of map detail, whereas cartographers were red with indignation. Akin to using an atomic clock to serve your wake-up call – not needed!”
[/my comment]

So is the complexity in Geospatial, better still Spatial, Information Systems (SIS) overblown too? Much of SIS requires common-sense logic arranged linearly. If a person can drive her car in rush-hour traffic as she deciphers vague directions off a schematic map, makes sense of rain-washed road signs, maintains a semblance of conversation with her passenger, and still manages to engage the kid in the back seat [read “multi-linear tasking”], then she can achieve a sound understanding of spatial databases with a little persistence, plus the eye for detail that comes with practice.

My point: SIS is non-complex, it is not at the cutting edge of technological change, and there is ample room for non-traditional spatial software!

PS:
• This rise of non-traditional spatial software challenges the accepted definition of SIS. If you were to follow the modernist’s approach to design, where in the end you remove everything you can without taking away from the essence of your creation, and apply it to defining an SIS, you wonder: what would such a conceptual SIS be in its simplest, stark-naked, Spartan form?

Written by Harsh

November 11th, 2004 at 7:35 pm

Posted in GIS,Mashup

Tagged with , ,

Map Viewer and Google

with one comment

Interesting web-based map viewer – very snazzy. Now if only the download were quicker.

In related news, Google has acquired Keyhole, a company promising a similar 3D interface. Right now, if you google an address, Google provides links to 2D maps from Yahoo!Maps and MapQuest. Google also provides possible address matches and map links if you type in a name, akin to what Switchboard does.

It would be better if you could click and drag on a map to limit the spatial extent of your search, although that might clutter the clean interface of Google Local, which, by the way, does show maps.

Note to self- invest in Google.

PS:
Pi: Quiet Musing
• Google acquires gbrowser.com, and moves into video search. And here‘s the Google Blog.

Written by Harsh

October 27th, 2004 at 6:15 pm

Posted in GIS,Mashup,Service

Tagged with ,