[ Ø ] Harsh Prakash

Quiet Musings on Cloud, Machine Learning, Big Data, Health, Disaster, et al.

S3 Multipart and Glacier Vault

without comments

{ "filepath": "filecontent" }

So you want to migrate large data to S3, and/or directly upload to Glacier (for faster archiving and lower storage fees, rather than go through lifecycle management)?

And you want all that in one place with examples?

Well, follow along here.

Remember, while the max object size in S3 is currently 5TB (40TB for Glacier Archive), max single PUT is only 5GB, so use multipart uploads for objects greater than 100MB. For the impatient, note S3 follows the BASE (Basically Available, Soft state, Eventually consistent) model. For Glacier, once uploaded, you can delete or overwrite your archive, but you can’t change it.

S3 Bucket:S3 Object::Glacier Vault:Glacier Archive

* Select part of an S3 object using SQL – S3 SELECT (S3 SELECT ~ Athena-lite (doesn’t directly support JOINs) ~ Redshift Spectrum (supports JOINs))
* Query a bucket to return a subset of its objects – S3 QUERY.
* While S3 is cheaper than EBS, which is cheaper than EFS (which doesn’t directly support snapshots), for lazy migrations look at Storage Gateway – File Gateway or Volume Gateway in cached mode (good for slower connections), mounted via NFS/SMB or iSCSI respectively.
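
The multipart guidance above can be sketched with a little part-count math plus boto3 (assumed installed; the bucket and key names are hypothetical, so the actual upload call is left as a comment):

```python
import math

MB = 1024 ** 2

def part_count(object_size, part_size=100 * MB):
    """How many parts a multipart upload of this size would use."""
    return math.ceil(object_size / part_size)

# boto3 handles the part bookkeeping once an object crosses the threshold:
#
#   import boto3
#   from boto3.s3.transfer import TransferConfig
#   config = TransferConfig(multipart_threshold=100 * MB, multipart_chunksize=100 * MB)
#   boto3.client("s3").upload_file("big.tar", "my-bucket", "archives/big.tar", Config=config)

print(part_count(1024 ** 3))  # a 1 GiB object in 100 MiB parts -> 11
```

Note S3 caps a multipart upload at 10,000 parts, so for a full 5TB object the chunk size has to grow to roughly 525MB or more.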

Written by Harsh

May 28th, 2020 at 1:30 am

Posted in Cloud

Once in a lifetime to do what is needed


▲ Selected places in the extracted NLP answers for Task 3 – “HELP US UNDERSTAND HOW GEOGRAPHY AFFECTS VIRALITY?”

Trying to make good use of some quarantine time (when not sketching participatory comics for my kiddo, that is), I submitted a Python notebook on Kaggle over the weekend.

It was in response to CORD-19 (COVID-19 Open Research Dataset Challenge), an AI challenge with AI2, CZI, MSR, Georgetown, NIH and The White House – Facebook was in, Microsoft was in, Google was in, but the elephant was not in the room.

The goal was to help better understand COVID-19, and specifically comb through 13,000+ scientific papers using NLP to best answer the 10 questions posed.

Potentially, if this works out, it can be expanded to help with other natural language projects, or even MODL below – our shallow irrigation borehole project, a GMA volunteer endeavor to model optimal drilling locations in Ethiopia with EWTI and GMU.

Full code here.


Walk the given path for papers. If conditions are met, write the paper paths to a main papers file.
This creates – papers.biorxiv_medrxiv.json, papers.comm_use_subset.json, papers.noncomm_use_subset.json, papers.pmc_custom_license.json.

Walk the given path for answer files. If conditions are met, merge all answers on a given path by task # and source type into a main answers file. The merged JSON is structured like the original papers, and contains pointers to the original papers for reference.
This creates – answers.task.0.biorxiv_medrxiv.json, answers.task.0.comm_use_subset.json, answers.task.0.noncomm_use_subset.json, answers.task.0.pmc_custom_license.json.
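
The walk-and-collect step above can be sketched like this (the directory layout and helper name are my assumptions, not the notebook’s exact code):

```python
import json
import os

def collect_paper_paths(root, source_type):
    """Walk root/<source_type> and write the JSON paper paths found
    to papers.<source_type>.json, mirroring the files listed above."""
    paths = sorted(
        os.path.join(dirpath, name)
        for dirpath, _, names in os.walk(os.path.join(root, source_type))
        for name in names
        if name.endswith(".json")
    )
    with open(os.path.join(root, f"papers.{source_type}.json"), "w") as f:
        json.dump(paths, f, indent=2)
    return paths
```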

For example, potential answers for Task 1 – “WHAT IS KNOWN ABOUT TRANSMISSION, INCUBATION, AND ENVIRONMENTAL STABILITY?” can be found here: (broken down by source type)
* bioRxiv-medRxiv (640 kb)
* Commercial Use (9.2 mb)
* Non-commercial Use (1.7 mb)
* PubMed Central (PMC) (1.2 mb)

Potential answers for Task 3 – “HELP US UNDERSTAND HOW GEOGRAPHY AFFECTS VIRALITY?” can be found here:
* bioRxiv-medRxiv (660 kb)
* Commercial Use (9.1 mb)
* Non-commercial Use (1.7 mb)
* PubMed Central (PMC) (1.2 mb)

We can also geoparse and geocode the extracted NLP answers for visualization.
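
As a toy illustration of that geoparse-and-geocode pass (the gazetteer below is made up for the example; the real pipeline would use a proper geocoder):

```python
# Tiny city -> (lat, lon) gazetteer; entries are illustrative only.
GAZETTEER = {
    "wuhan": (30.59, 114.31),
    "milan": (45.46, 9.19),
    "seattle": (47.61, -122.33),
}

def geocode_answer(text):
    """Return (city, (lat, lon)) for each gazetteer city mentioned in an answer."""
    tokens = text.lower().replace(",", " ").replace(".", " ").split()
    return [(t, GAZETTEER[t]) for t in tokens if t in GAZETTEER]

hits = geocode_answer("Early clusters were reported in Wuhan, then Milan.")
```

The resulting (city, coordinate) pairs are what feed the city and country-count files listed below.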


Cities –

* bioRxiv-medRxiv
* Commercial Use
* Non-commercial Use
* PubMed Central (PMC)

Country Codes and Counts –

* bioRxiv-medRxiv
* Commercial Use
* Non-commercial Use
* PubMed Central (PMC)


Pros and cons of using CountVectorizer over HashingVectorizer (or TfidfVectorizer) –
* CountVectorizer uses an in-memory vocabulary.
* HashingVectorizer doesn’t have a way to compute the inverse transform (from feature indices to string feature names), which can be a problem when trying to introspect which features are most important to a model. It also has no IDF (Inverse Document Frequency) weighting – IDF measures how important a word is to a doc in a collection of docs.
* TF-IDF increases with the # of times a word appears in a doc.
* TF-IDF decreases with the # of docs in the collection that contain the word.
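
Those two TF-IDF bullets show up directly in a toy, pure-Python scoring function (a sketch of the idea, not scikit-learn’s exact formula, which adds smoothing):

```python
import math

def tf_idf(term, doc, docs):
    """Term frequency in one doc, damped by how many docs in the collection contain the term."""
    tf = doc.count(term)
    df = sum(1 for d in docs if term in d)
    return tf * math.log(len(docs) / df) if df else 0.0

docs = [
    ["covid", "spread", "covid"],  # "covid" appears twice here...
    ["covid", "vaccine"],          # ...but also in 2 of 3 docs, so its IDF is low
    ["weather", "model"],
]
# "spread" occurs once but in only one doc, so per occurrence it outweighs "covid".
```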

Given the large unlabeled corpora, WORD2VEC – a group of models that represents each word in a large text as a [vector] in a space of N dimensions (or features), placing similar words closer to each other – was found to be more granular for the COVID19 use-case. It was used over DOC2VEC because the concepts or individual IDs of the docs/papers themselves weren’t the most important factors in finding the closest answers: what mattered more than their authors, sponsors or tags was whether the papers had the rightly-worded answers to the questions posed. DOC2VEC adds a doc/para [vector], and is generally more helpful when the papers have tags, e.g. to find duplicate papers, or papers by similar authors.

The Euclidean distance (or cosine similarity) between 2 word [vector]s provided an effective method for measuring the linguistic/semantic similarity of the 2 words. Nearest-neighbor search reveals relevant similarities outside an average vocabulary. The similarity metrics used in nearest-neighbor evaluations produce 1 scalar that quantifies the relatedness of 2 words. This simplicity can be an issue, since 2 words may exhibit other relationships; to capture those in a quantitative way, it was necessary to associate more than 1 number with a word pair. NGRAM_RANGE was used to determine context by weighing nearby words more heavily than distant words. BOW (Bag of Words) was found to be less accurate, as it ignores word ordering.
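
That single-scalar similarity is just this, shown in plain Python (in the notebook, gensim would supply the trained word vectors):

```python
import math

def cosine_similarity(u, v):
    """One scalar quantifying the relatedness of two word vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# With a trained gensim Word2Vec model (assumed), the nearest-neighbor check
# would look like:  model.wv.most_similar("transmission", topn=5)
```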

Populating GloVe required 1 pass through the entire COVID19 dataset – computationally expensive at this size, though subsequent training iterations would have been faster. Also, the pre-trained word [vector] datasets downloaded (e.g. Wikipedia 2014 + Gigaword 5) didn’t match the semantics of COVID19.

And a search engine-type approach returning a few top results was found to be more deterministic than expected, and sufficiently accurate. Therefore, the approach taken was to massively trim the set of papers the scientists would need to digest. Future enhancements could include clustering for visualization.

* https://www.instagram.com/gisblog/ (the participatory comics, if you’re wondering)
* https://arxiv.org/abs/1301.3781
* http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/
* https://radimrehurek.com/gensim/
* https://nlp.stanford.edu/projects/glove/
* https://github.com/stanleyfok/sentence2vec

Written by Harsh

April 13th, 2020 at 7:30 pm

Model Optimal Drilling Location (MODL)


▲ Model Optimal Drilling Location (MODL)

Following-up on our AI Scoping Mission for Association for the Advancement of Artificial Intelligence (AAAI), this presentation for Emerald Planet is about our shallow irrigation borehole project – A Global MapAid (GMA) volunteer endeavor to model optimal drilling locations in Ethiopia with Ethiopian Water Technology Institute (EWTI) and George Mason University (GMU).

See Slide 13 – From a quick ML stab, I found:
* SWL (depth to Static Water Level), or the level of water in a well under normal, undisturbed, no-pumping conditions – basically, the top of the water table, typically measured by lowering a tape into the well until a part is under water, then pulling it out to read the mark where the line is wet –
to have the strongest correlation to:
* Easting (Eastward-measured distance)
* Total Drill Depth (TDD) – the depth of the bottom of the well; usually, this is the depth where drilling stopped.

I computed the degree of correlation between the variables by the coefficient of correlation. My quick results indicate overall poor correlations for the sample boreholes in the target Bilate river basin.
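
The coefficient of correlation here is the standard Pearson r; a minimal version (SWL/TDD are the variables named above, but the sample numbers below are made up):

```python
import math

def pearson_r(xs, ys):
    """Pearson coefficient of correlation between two borehole variables."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up sample: if SWL tracked TDD perfectly, r would be 1.0.
swl = [12.0, 18.5, 25.0, 31.0]
tdd = [60.0, 92.5, 125.0, 155.0]
```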

Clearly, our boreholes data needs massive clean-up. But if a strong correlation is eventually found, then not much investigation/logging exercise would be needed to hit water – That should reduce the total cost of drilling. In areas where poor or no correlation is found, obviously more funding would be recommended for investigation/logging exercises.

Let the data lead the way!

In the meantime, check out Global Groundwater Maps and U.S. Drought Forecasts, Borehole Construction, Development and Maintenance Techniques for Nigeria and Pump HowTos.

Written by Harsh

April 4th, 2020 at 2:11 pm

Emerging Emerging Tech


Okay, so here’re some docs, part of our requirements, as food for thought for those of you strategizing around similar innovation topics. Ping me with your ideas!

* Emerging Technology Strategy and Implementation Plan
* Systems Engineering Integration and Implementation Requirements
* Emerging Technology Practice Definition
* Emerging Technology Practice Research Support
* Cloud Services and Support

Written by Harsh

March 31st, 2020 at 8:46 pm

Sunsetting Python 2


Written by Harsh

December 10th, 2019 at 6:01 pm

Posted in Cloud, Programming

SM Automation and Service Catalog: When to use what


Systems Manager Automation can be utilized in cases where an Automation workflow needs to be built for maintenance and deployment tasks. Pre-defined Automation Documents can be used, or an Automation Document can be created for the desired task to define the actions that Systems Manager should perform on the specified resources. The Automation Document includes one or more steps which run in sequential order, and each step is associated with a particular action that you specify.

E.g. You can have one step to launch an instance, and the next step in the Automation Document to do some actions on the newly created instance.

See user guides for Systems Manager Automation and Automation Document.

On the other hand, Service Catalog lets sysadmins manage a catalog of infrastructure Products, and organize them into Portfolios to which the end users can be granted access to. This way, your fav sysadmin can control the list of Products that the end users are allowed to deploy on AWS.

See sorting and FAQs.

When should I use Systems Manager Automation to launch an instance v. use Service Catalog to launch an instance?

Well, it depends on your individual use case.

Systems Manager Automation enables you to automate the administrative or management tasks performed on instances and other AWS resources. This helps you simplify complex tasks by performing the desired actions on large groups of instances.

Service Catalog is used to group your resources in CloudFormation templates, called Products, by IAM groups, users or roles. You can group and administer those Products in Portfolios, which can be shared with your accounts. Service Catalog also gives you an abstraction layer: even if your end users do not have IAM permissions to create EC2 instances, for example, if they have access to a Service Catalog Product that creates EC2 instances, then they can create instances by provisioning the Product. This allows Administrators to provision applications for end users by setting configurations within the Product.


For Systems Manager, you pay only for what you use and are charged based on the number and types of steps.

In Service Catalog, you pay a fixed fee of $5 per month for each portfolio of Products with assigned users.

In Systems Manager Automation, there are a few predefined Automation Documents which can be utilized for the tasks you would like to perform on your resources. If there is no predefined Automation Document available for the specific task that you would like to perform, you can build your custom Automation Document utilizing the Automation Actions.

In Service Catalog, you would have to create a custom CloudFormation template for each task you would like to perform and create the products in your Portfolio.
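
For a feel of the shape such a Product template takes, here is a minimal CloudFormation skeleton expressed as a Python dict (the AMI ID and instance type are placeholders, not values from any real Portfolio):

```python
import json

# Minimal CloudFormation template a Service Catalog Product could wrap.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Service Catalog Product: launch one EC2 instance",
    "Resources": {
        "Instance": {
            "Type": "AWS::EC2::Instance",
            "Properties": {
                "ImageId": "ami-12345678",   # placeholder
                "InstanceType": "t3.micro",  # placeholder
            },
        }
    },
}

print(json.dumps(template, indent=2))
```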

Differences between Systems Manager Automation Document and Service Catalog Template

For sample code for Actions that can be specified in a Systems Manager Automation Document, see this user guide.

You can view the predefined Automation Documents from your Console by following the steps below:
1. Open the Systems Manager Console.
2. Choose Documents from the navigation pane on the left.
3. In the search filter, choose Document Type and Automation as the value.
4. From the list of Automation Documents displayed, click on a Document to open the details page.
5. Choose the Content tab to view the Document content.

For a number of sample CloudFormation templates for Service Catalog, see these user guides – 01 and 02.

How would I use Systems Manager Automation and Service Catalog for each of the tasks below, i.e. what is best suited for each of the following: launching a new instance, patching an existing instance, terminating an instance, and scheduling the start and stop of an instance?

If you have a use case where an Administrator would like to grant the other IAM users who would be performing these tasks limited access to these services, then you can utilize Service Catalog to create a Portfolio and only allow certain IAM users/roles to use it to deploy only the Products configured in that Portfolio.

Otherwise, if you would only like to automate these tasks, Systems Manager Automation would be more suitable for performing the above tasks.

E.g. 1. Launching a new instance: To launch a new instance using Automation, you can use the action ‘aws:runInstances’ in your custom Automation Document to launch a new instance. See the user guide for run instance.

E.g. 2. Patching an existing instance: There is a predefined Automation Document which can be utilized. See the user guide for patch instance.

E.g. 3. Terminating an instance: For a predefined Automation Document that can be used to terminate EC2 instances, see the user guide for terminate instance.

E.g. 4. Schedule a start and stop of an instance: There are predefined Automation Documents for Starting and Stopping EC2 instances which can be scheduled to run at the desired time using a Maintenance window. See these user guides – 01 and 02.
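
To make the launch example concrete, a custom Automation Document is just structured content like this – sketched below as a Python dict (the AMI ID is a placeholder, and actually registering/running it requires boto3 plus credentials, so those calls are left as comments):

```python
import json

automation_doc = {
    "schemaVersion": "0.3",
    "description": "Launch an instance, then tag it",
    "mainSteps": [
        {
            "name": "launchInstance",
            "action": "aws:runInstances",
            "inputs": {
                "ImageId": "ami-12345678",  # placeholder
                "InstanceType": "t3.micro",
                "MinInstanceCount": 1,
                "MaxInstanceCount": 1,
            },
        },
        {
            "name": "tagInstance",
            "action": "aws:createTags",
            "inputs": {
                "ResourceType": "EC2",
                "ResourceIds": ["{{ launchInstance.InstanceIds }}"],
                "Tags": [{"Key": "Env", "Value": "dev"}],
            },
        },
    ],
}

# To register and run (requires credentials):
#   ssm = boto3.client("ssm")
#   ssm.create_document(Content=json.dumps(automation_doc), Name="LaunchAndTag",
#                       DocumentType="Automation")
#   ssm.start_automation_execution(DocumentName="LaunchAndTag")
```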

If you would like to perform all these tasks in an automated way on a schedule, you can utilize Systems Manager Maintenance window to achieve this. See the user guide for maintenance.

For Service Catalog, you would have to build your own CloudFormation template to perform each task. This template can be provided while creating a Product in your Portfolio. See these user guides – 01 and 02.

Written by Harsh

December 4th, 2019 at 6:11 pm

Posted in Cloud


without comments

This was announced today, so some guidance on what should be the preferred way(s) to create and share AMIs, going forward:

Via Cloud management like CloudTamer
You can share AMIs (and Service Catalog Portfolios) in CloudTamer. Specifically, you can create AMIs under “Cloud Management” > “AWS AMIs”. Note, a share is fully managed by CloudTamer so it’ll only be shared with AWS accounts that have a CloudRule containing it, and all other AWS accounts will lose access to it if it is shared with them separately. Once we have the permission to share these at the Project or OU level, AMIs won’t need to be re-shared with new accounts as they are on-boarded.

Via Service Catalog
I created a product for this in July – It creates an AMI from an EC2 instance (not just create an EC2 instance from an accessible AMI) by implementing a Custom Resource [Custom::AMI] that calls the CreateImage API on Create (and calls the DeregisterImage and DeleteSnapshot APIs on Delete). Custom Resources enable writing custom Lambda in templates.
Basically, it starts off with a base image to create an EC2, then patches it, creates an image out of it, and deletes the EC2 when all is done – Essentially, automating what most Cloud shops do anyway. With this approach, you can enable your customers to mint a new AMI on-demand. Wee!

Via AMI Factory/SM Automation/Resource Access Manager (RAM)/EC2 Image Builder (from re:Invent 2019)

Via the traditional workflow
You can share AMIs by adding new account numbers to the share. This approach should be scripted.

PS: Should add that AMI IDs can be “hidden” in Parameter Store $vars for all of the above.

Written by Harsh

December 2nd, 2019 at 6:06 pm

Posted in Cloud

From SSH to SSM (a.k.a. the demise of bastion host)


To avoid having to manage very many SSH keys, sysadmins used ssh-user. AWS solves the headache with ssm-user. It is only for administering existing infrastructure. To provision new infrastructure, as should be the end-goal in a Cloud shop anyway (think cattle, not pets – a shitty way to look at animals, but anyway…), you don’t need either SSM or SSH.

Traditionally, you find “what” someone does, like so –

$ sudo cat secure | tail -n 3
Aug 6 10:40:16 hostname sshd[12345]: pam_unix(sshd:session): session opened for user username by (uid=0)
Aug 6 10:40:40 hostname sudo: username : TTY=pts/3 ; PWD=/var/log ; USER=ssh-user 02 ; COMMAND=/usr/bin/bash
Aug 6 10:40:56 hostname sudo: ssh-user : TTY=pts/3 ; PWD=/var/log ; USER=root ; COMMAND=/bin/cat secure
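
That kind of trawl through /var/log/secure is easy to script; a sketch (the regex is tuned to sample lines like the ones above – real log formats vary):

```python
import re

# Pull (invoking user, target user, command) out of sudo lines.
SUDO_RE = re.compile(
    r"sudo:\s+(?P<user>\S+)\s+:.*USER=(?P<target>\S+)\s+;\s+COMMAND=(?P<cmd>.+)$"
)

def parse_sudo(line):
    """Return (invoking user, target user, command) from a sudo line, or None."""
    m = SUDO_RE.search(line)
    return (m.group("user"), m.group("target"), m.group("cmd")) if m else None

line = ("Aug 6 10:40:56 hostname sudo: ssh-user : TTY=pts/3 ; "
        "PWD=/var/log ; USER=root ; COMMAND=/bin/cat secure")
```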

For tracking, the one change that can be implemented stat is to require all teams to start using TOKENs, not keys, for GitHub check-ins, so traceability of their provisioning code can be maintained.

With these, if teams need to log in to existing systems, you can trace who logged in, who invoked ssh-user, and what ssh-user did. So, you can narrow down a misstep to the group that last invoked ssh-user, but not to a specific individual. (Note, while SSM still logs everyone in as ssm-user, it separates sessions by user and can narrow activity down to an individual.)

Another workaround you can put in place is to manage keys via .ssh/config, at least in the interim. But eventually, you should move to SSM in an AWS shop.

And I’d rather have teams working in an IDE from their local laptops than in a text editor on a server instance (admit it, VI or EMACS isn’t everyone’s cup of tea). For that, since we’d need Linux running on Windows for Ansible etc., take your pick between WSL or VirtualBox. Either way should be a-okay – In the past, I used WSL for access (with temp keys via STS), and found a sub-system less bloated than a full-fledged sys box. And there was this sweet thing.

This is important – Especially as we treat infrastructure = code and engineers/sysadmins = developers/builders, we should allow engineers/sysadmins the same DevOps tools – an IDE to work in locally, etc. That’d mean each engineer/sysadmin forks her own repo and pushes changes out in true Git fashion (with retention/archiving handled by versioning).

With that, you pretty much kill the need for most intermediate server hosts.

Written by Harsh

August 6th, 2019 at 3:30 pm

Posted in Cloud


without comments

An Ansible vault file can be in any format, just as long as you can retrieve from it –
If YML (recommended), it’s easier to parse – msg: "{{ key }}"
If JSON, then you’ll have to parse it like so – msg: "{{ vault_var['key'] }}"

For Ansible older than version 2.4 | Version 2.4 or newer | Description

* --ask-vault-pass | --vault-id @prompt | It’ll prompt for your vault password when you run the playbook
* --vault-password-file | --vault-id file.yml | It’ll look for your vault password in file.yml

See SOP.

Written by Harsh

August 5th, 2019 at 2:33 am

Posted in Cloud

Giving ML the RT


So, how does Machine Learning fare against the Rorschach Test?

Rorschach Blot | Popular Responses | Labels Found by AWS Rekognition
1 | bat, butterfly, moth | Stain, Weaponry, Weapon
2 | humans, animal | Art, Modern Art, Heart, Paint Container
3 | human | Art, Animal, Bird, Modern Art, Drawing, Painting, Leisure Activities, Dance, Dance Pose, Stain
4 | animal hide, skin, rug | Art
5 | bat, butterfly, moth | Plant, Leaf, Silhouette, Symbol, Dog, Mammal, Canine, Pet, Animal, Bird, Arrow
6 | animal hide, skin, rug | Arrowhead, Bird, Animal
7 | human, head | Nature, Outdoors, Animal, Bird, Chicken, Poultry, Fowl, Weather, Ground, Stain
8 | animal | Stain, Art, Painting
9 | human | Stain, Painting, Art, Modern Art, Canvas, Graphics, Paint Container, Advertisement, Poster, Plot
10 | crab, spider | Art, Painting, Graphics, Modern Art, Pattern

▲ Applied ML (Machine Learning)

* “MIT researchers: Amazon’s Rekognition shows gender and ethnic bias (updated)”
* Solar Dynamics Observatory (SDO)
* Global MapAid

Written by Harsh

July 10th, 2019 at 7:10 pm

How much does the humanitarian sector have to do to invest into innovation to produce effective results?


In response to this question at Invest in Humanity – Aid & Trade London:

I think a lot – i.e. the humanitarian sector has to invest a lot in order to produce effective innovation, because that is the price of good innovation. If that were not the case, good innovation would be easier to come by; that is why in the private sector – where you find arguably the most effective innovation – a lot of money is put towards research and development.

Basically, the higher the risk, the higher the reward. Conversely, the greater the expected results from innovation, the higher the risk or stakes, and therefore the greater the investment betting on it.

So now the real question is: How can the humanitarian sector – where resources are limited – then pursue effective innovation?

And I think the answer to that lies in focusing on the other side of capital i.e. the human capital. The humanitarian sector is certainly short on funds, but it is full of passionate people. So it should invest a lot in supporting teams of self-driven individuals who can produce result-driven innovations.

Written by Harsh

April 22nd, 2018 at 10:33 pm

Posted in Education, Social

In Retrospect: Lessons & Tips from a Large Federal Implementation


Last year, I presented on “Esri WebGIS Platform – How we implemented ArcGIS, and you can too” at FedGIS. This year, I shared another summary – lessons and tips from that implementation. That is especially helpful if you are dealing with the unique security responsibilities of the federal government around high-value PII/PHI-based data assets and Expedited Life Cycle (XLC) processes.

From a technical perspective, I shared how we implemented a hybrid and disconnected ArcGIS design inside a 3-zone architecture with multi-VPN and multi-NIC networks on Red Hat Enterprise Linux.

From a high-level management perspective, I shared how that played out inside the federal environment.

Esri ArcGIS Federal

* Mapping and Location Analytics for Fraud Prevention at CMS.gov
* Geocoding ETL/ELT Workflow for 1/4 Billion Addresses

Written by Harsh

March 26th, 2018 at 9:10 pm

Esri WebGIS Platform


Customer Need, Deployment Option, Authority to Operate (ATO), Challenges, Solutions, Lessons

Esri WebGIS Platform

Written by Harsh

September 13th, 2017 at 4:11 pm

Esri in AWS Cloud


Background, Options, Opportunities

Esri in AWS Cloud

Written by Harsh

December 17th, 2016 at 11:48 pm

Geodata Based Decisions


How to Use Location Analytics to Improve the Effectiveness of Public-Facing Sites

Geodata Based Decisions

Written by Harsh

March 17th, 2016 at 11:47 pm

Posted in GIS, Health, Management, Technology

HowTo: Run ‘ArcGIS for Server Advanced Enterprise’ (10.3.1) on Amazon EC2 Red Hat Enterprise Linux (7)


The talks on ArcGIS Server at ESRI Health GIS were fun, but I wanted more – specifically, to install and administer its latest release on Amazon Web Services, all via the trusted command line. Here’s how I did that:

To follow along, get an EDN license and an AWS account. Especially, if you have been in the industry for long, there’s no good excuse to not have those with the biggest companies in GIS and da Cloud (and while you are at it, get MapBox and CartoDB accounts too).

### Setup the stage ###
# Downloaded its AWS key from //aws.amazon.com/console/ and connected to my instance (ensured it matched the min. system requirements) using its public DNS (if you restart your instance, this will change). Note I SSHed using Cygwin instead of PuTTy.
$ ssh -i "key.pem" ec2-user@#.#.#.#.compute.amazonaws.com
$ cat /etc/redhat-release
> Red Hat Enterprise Linux Server release 7.1 (Maipo) # Even though I used RHEL-7.0_HVM_GA-20141017-x86_64-1-Hourly2-GP2 by Red Hat (I later found out that ESRI provides its own AMI)
$ sudo yum upgrade
$ sudo yum update
$ sudo yum install emacs # For that college-dorm smell, no offense Nano/Vi
$ sudo emacs ~/.bashrc
    force_color_prompt=yes # If you haven't already... (Ignored the embedded rant and uncommented this line to make the prompt colored so it was easier to read in-between)

### Setup the instance ###
# I used a M4.LARGE instance with a 20GB EBS volume (in the same Availability Zone, of course) - ensured it didn't go away if I were to terminate the instance. Then, I extended the partition to exceed the min. space requirements (took a snapshot first) - unfortunately, AWS docs didn't help much with that.
$ df -h
> ...
$ lsblk # Listed block partitions attached to the device. Since there was a gap in sizes between the partition and the device (and there were no other partitions), I resized the child partition "XVDA2" (the root file system where I would finally install ArcGIS Server) to use up the surplus space on its parent disk "XVDA".
> xvda 20G disk
> |_xvda2 6G part /
# First, updated its metadata in the partition table
$ sudo yum install gdisk # Since disk label was GPT
$ sudo gdisk /dev/xvda
$     print # Noted the start sector
$     delete
$     new
$     #### # Used the same start sector so that data is preserved
$     \r # For the max. last sector
$     # # Used the same partition code
$     print
$     write
$     y
# Next, updated the actual XFS file system
$ sudo xfs_growfs / # This is the actual change for XFS. If 'df -T' reveals the older EXT4, use 'resize2fs'.
# Then, confirmed to see if the boot sector was present so that stop-start will work
$ sudo file -s /dev/xvda # Bootloader
# Finally, rebooted the instance to reflect the new size
$ sudo reboot

### Onto GIStuff ###
# WinSCPed and untarred the fresh-off-the-press 1GB release
$ tar -xvf ArcGIS_for_server_linux_1031_145870.gz
# Got the right ECP#########?
$ ./Setup # Started headless installation - try "--verbose" if you run into other issues
# Hit a diagnostics roadblock: File handle limits for the install user were required to be set to 65535 and the number of processes limits to 25059. So...
$ sudo emacs /etc/security/limits.conf
$     ec2-user soft nofile 65535
$     ec2-user hard nofile 65535
$     ec2-user soft nproc 25059
$     ec2-user hard nproc 25059
# Logged out, logged back in, verified
$ ulimit -Hn -Hu
$ ulimit -Sn -Su
$ ./Setup

### Authorize, authorize, authorize! ###
# Created and uploaded authorization.txt, and downloaded authorization.ecp from //my.esri.com/ -> "My Organization" -> "Licensing" -> "Secure Site Operations"
$ locate -i authorization.ecp
$ readlink -f authorization.ecp
$ ./authorizeSoftware -f /path/authorization.ecp
$ ./authorizeSoftware -s # s=status, not silent
$ ./startserver.sh
$ netstat -lnp | grep "6080" # Confirmed owned processes - that it was listening on the default TCP@6080 (port is only required if you don't have the Web Adapter)
# Ensured IP and domain were listed correctly in the hosts file (e.g. Single IP may be mapped to multiple hosts, both IPv4 and IPv6 may be mapped to a single host, etc.)
$ hostname
$ emacs /etc/hosts
$     127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
$     ::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
$     #.#.#.# localhost localhost.localdomain localhost4 localhost4.localdomain4
# But wait, before I could browse to my site from a public browser, I needed to add this Inbound Rule to the Security Group attached to the instance
Custom TCP rule TCP 6080

### Browser ahoy! ###
//#.#.#.# or machinename:6080/arcgis/manager
ArcGIS Server Setup Wizard -> Create New Site
Primary Site Administrator -> Create Account # Stored with the site, not the OS
# Must be local and accessible from every machine in your site
    Root Server Directory: /home/username/arcgis/server/usr/directories # To store output images, etc.
    Configuration Store: /home/username/arcgis/server/usr/config-store # To hold info about the server's machines, services, directories, etc.
# This is when I ran into "0x80040154 - Could not create object 'ConfigurationFactory'". So, went digging through the logs...
$ cat /home/ec2-user/arcgis/server/usr/logs/EC2/server/server-...log
> ...
> Cluster 'default' successfully created.
> Failed to create the site. com.esri.arcgis.discovery.servicelib.AGSException: java.lang.Exception: AutomationException: 0x80040154 - Could not create object 'ConfigurationFactory'.
> Disconnecting the site from the configuration store.
# Back to the server: File/directory permission issue? Nope. The issue turned out to be missing packages, even though the pre-installation dependencies check had passed. All 15 listed below:
$ sudo yum list installed
$ sudo yum install wget
$ wget http://vault.centos.org/6.2/os/x86_64/Packages/xorg-x11-server-Xvfb-1.10.4-6.el6.x86_64.rpm
$ sudo yum localinstall xorg-x11-server-Xvfb-1.10.4-6.el6.x86_64.rpm
$ sudo yum install Xvfb # Else "Unable to start Xvfb on any port in the range 6600-6619"
$ sudo yum install freetype
$ sudo yum install fontconfig
$ sudo yum install mesa-libGL
$ sudo yum install mesa-libGLU
$ sudo yum install redhat-lsb
$ sudo yum install glibc
$ sudo yum install libXtst
$ sudo yum install libXext
$ sudo yum install libX11
$ sudo yum install libXi
$ sudo yum install libXdmcp
$ sudo yum install libXrender
$ sudo yum install libXau
# Cleanliness is next to godliness, or so my Catholic school nuns would say
$ sudo yum clean all
$ cd /tmp/
$ sudo rm -r *
$ logout

### Back to the browser ###
# At the end, added SSL using a self-signed certificate
Custom TCP rule TCP 6443 # Added this rule to the group on AWS first

### Uninstall? ###
$ ./stopserver.sh
$ ./uninstall_ArcGISServer
# rm folders after done

Conclusion: 6443 or 8443?

After years of doing this with first ESRI (PROD), then MapServer (PROD) and GeoServer (DEV), I went back to the dark ahem ESRI side. And what do I keep finding? That the big two are blending together in terms of looks. E.g. The console of the other Java-powered mapping server, GeoServer, is looking similar to that of its big brother on-steroids. The third, MapServer, somewhat paradoxically on the other hand, has both come a long way (MapCache and ScribeUI, yay!) and still lost ground.

Next up, testing Tippecanoe.

* I tried both 10.3.1 and 10.0 on Ubuntu (15.04), unsupported. While both installed, site creation didn’t work because of missing packages – searching through apt-cache didn’t help either. On Windows, there is always their CloudBuilder.

* GeoNet
* Landsat on AWS in ArcGIS

Written by Harsh

September 28th, 2015 at 7:43 pm

#HealthGIS: Notable links and final thoughts on the conference


Health websites using ESRI ++

    * With ArcGIS JavaScript
        • CDC’s Division for Heart Disease and Stroke Prevention (DHDSP) Atlas

    * With ArcGIS Server / ArcGIS Online (via Apache Flex)
        • HealthLandscape’s Accountable Care Organization (ACO) Explorer
        • Dartmouth’s Atlas (try generate KML)
        • NMQF’s Methicillin-resistant Staphylococcus aureus (MRSA) mapping
        • HRSA’s Datawarehouse

Health websites whose global participants have trouble with software licenses ++

    * With OpenLayers and DHIS2 (~ an opensource InstantAtlas)
        • PEPFAR’s (a president’s best legacy) Data for Accountability, Transparency and Impact (DATIM) – coming soon to GeoServer + MapLoom and OpenLayers

    * Even Highmaps(!)
        • NCHS’s Health Indicators Warehouse (HIW)

    * Many More…


Clearly, there’s no shortage of health data or technologies, esp. following ACA’s requirements of uniform data collection standards, just a continuing kerfuffle with overlaying disparate JSON/OGC tiles from their many data owners and manifold service endpoints. Unfortunately, only part of this problem is technical. Take Flu mapping, for instance. CDC, WHO, WebMD (with MapBox) and Google, even Walgreens does it. Or take HIV mapping where you can choose from CDC and NMQF, among others. Even anonymized private claims data is available for a couple of Ks a month. I think a bigger part of the problem is the misalignment between vendors’ business interests and mandates of various agencies and goals of the health research community at large.


At some point, researchers and epidemiologists would want to see how these data tiles correlate to each other. And GIS professionals would want a quicker way to ‘overlay this layer’ without having to dig through Firebug. And compress it over the wire, while you are at it (when our users in remote Africa were asked to switch off their smartphones to view desktop maps, we understood data compression a little differently).
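The wire-compression point is easy to demonstrate: repetitive GeoJSON payloads, like tile attribute data, shrink dramatically under gzip. A minimal sketch (the feature names and coordinates are made up):

```python
import gzip
import json

# Build an illustrative FeatureCollection of repeated point features,
# the kind of repetitive JSON a web map ships to the browser.
feature = {"type": "Feature",
           "geometry": {"type": "Point", "coordinates": [36.8, -1.3]},
           "properties": {"clinic": "Example Health Post"}}
payload = json.dumps({"type": "FeatureCollection",
                      "features": [feature] * 500}).encode()

# gzip is what mod_deflate or nginx's gzip module would apply on the wire.
compressed = gzip.compress(payload)
print(len(payload), len(compressed))  # the compressed size is far smaller
```

In practice you would let the web server (e.g. Apache mod_deflate) do this transparently rather than compressing in application code.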


And then they would want to analyze them, be it on the server with Big Data or in the client with smaller ones. On analyses, your favorite GIS continues to take heat from tools like Tableau among conference attendees.

Mapping Visible Human

Overall, a growing use of ArcGIS Server’s publisher functionalities and a compelling body of story map templates leveraging its narrative text capabilities. E.g. Atlas for Geographic Variation within Medicare. On publishing, I suspect some researchers would like to see a Mapbox plugin for QGIS. Yes, you can render and upload maps from TileMill to your Mapbox account, but CartoDB has QgisCartoDB where you can view, create, edit or delete data from QGIS to your CartoDB account (I needn’t add that Python-powered QGIS remains a favorite among matplotlib-loving researchers).

PS: My ranking of how easy it is to connect to federal health datasets –
1. CDC (E.g. NCHS, Wonder, Health Indicators)
2. CMS (E.g. DNAV, Medicare – try Hospital Compare – Info, Spreadsheet, JSON)
3. HRSA (E.g. Datawarehouse).

* CDC’s GIS Resources
* CDC’s Submit Maps
* Hospital Referral Region (HRR) – A regional market area for tertiary medical care
* Health Savings Account (HSA) – A tax-advantaged medical savings account available to some taxpayers

++ While log analyses attest that mono-themed web maps provide a better user experience, given the nature of health data and the costs behind spinning off another mapp (yup, blended words to make a portmanteau), sometimes you just have to combine themes.

Written by Harsh

September 21st, 2015 at 8:10 pm

Federal Contracting 101

without comments

How to write a S.O.W. for a G.I.S. in a G.O.V.?

Mapping APPlication (MAPP) v#.#
Statement of Work (SOW)

Project Goals/Scope:
Project Responsibilities:
The project has to take into consideration the following -
Project Steering Committee:
Meeting and Reporting Requirements:
Contractor Requirements:
* Security -
Data Requirements:
Tasks/Activities and Deliverables:
Services Requested -
1. Kick-off -
2. Stakeholder Coordination -
3. Usability Analyses and Staff Review - Collect High-Level Business Requirements (Use-Cases) -
Responsive Design
4. Prototype Development -
Install and Config GIS Server
Install and Config GIS Widgets
Finalize GIS Service
5. Collect Detailed Functional Requirements, Define Scope, Create WBS -
Define Activities, Develop Schedule, Identify Risks
Verify Quality
Validate Scope
Procure Software
6. Collect Cloudware/Hardware Requirements -
Procure Cloudware/Hardware
7. Data Model -
Test Extraction and Off-loading
Setup Backend (Data Mart)
Test Backend (Data Mart)
Procure Data License
8. Design/Development -
Develop MAPP
9. Implementation and Testing -
Install and Config GIS Server
Install and Config GIS Widgets
Final Testing - MAPP, Load
Finalize GIS Service
Update ETL
Update Backend (Data Mart)
Update MAPP
Verify Quality
Validate Scope
Schedule ETL
Inspection and Acceptance
Tool Administration - Operations and Maintenance
10. Training/Travel -
Selection Scoring and Proposal Guidelines:
Pricing and Payment Schedule:
Project Timeline:
Project Resources:
Criteria for Performance:
Place of Performance:
Key Personnel
Period of Performance
Hours of Operation
Terms of Resolution

* Sample Technology Statements of Work (SOWs) – GWAC, Connections II, BPA, Schedule 70
* Enterprise Systems Development (ESD) SOW Template
* Federal Business Opportunities (FBO), GovWin IQ (Deltek) – E.g. Search for your NAICS codes, say, 518000,540000,541500,541510,541511,541512,541519,541611,541720,611400; forecast/pre-RFP; civilian; projected award date; etc.
* System for Award Management (SAM)
* USA spending
* 10 Things to Know about Managing GIS Projects (ESRI)

Written by Harsh

August 21st, 2015 at 4:12 pm

How We Balanced Proprietary With Opensource Software And Saved Tax Dollars, And You Can Too

without comments

It all began with a question – “Can we do without?”.


Enterprise Architecture > Technology Architecture > Geographic Information System (GIS):
* Geographic Information System (GIS) Pattern
* GIS Desktop Brick
* GIS Virtual Globe Brick
* GIS IMS Brick
* GIS Web Service Brick

* GIS Market Study of Internet Mapping Server (IMS) – Summary – Requirements and Comparison Matrix (2006)

Meanwhile, Thirteen Years Later…

without comments

So, does it hold up?

The Map (GIS Growth Study) v. The Thing Mapped (Demographics, Plan)

PS: I smell a decentralist –

“A Caveat (from 2001)

Such a planning methodology of data collection and projection does have some intrinsic faults: it relies heavily on knowledge-based skills. It assumes that ‘correct solutions’ to social problems can be obtained from a scientific analysis of various data. It must be noted that a solution-driven approach and heavy reliance on physical sciences as opposed to social sciences, is inherently inaccurate since the ‘best planning answer’ is a non-existent variable, changing with time, society, culture, resource availability, etc. And there is always a danger of being consumed by this technique, and confusing the result for a solution.

The nature of this study involved making some basic assumptions about the way our study-area could evolve in the not-so-distant future. There have been doubts raised about the correctness of such a clinical technique wherein an urban settlement is ‘stripped’ of its various attributes, and these attributes then individually graded. Appreciation of the intricate complexity of human society, where each individual is a separate factor, is absent. Lack of importance to these inter-relationships is a flaw of such an analysis.

For E.g. In the current study, if we were to discover one other attribute, say a desert, how would it affect the final map? We would, using this approach, simply grade each cell one more time. Then we would add this new map to our list of maps, and calculate the new final map. However, we would fail to evaluate how the addition of a desert affects each of the other attributes individually.

But this flaw may not be as aggravated as it seems. Each cell gains its final value from all attributes. If in a hypothetical case, one could gather a ‘complete list of attributes’ that would impact future growth, and assign them ‘correct values’ (without even breaking them into distance-bands which are only for convenience), finally adding them in the ‘right equation’, one would come up with a case-specific fairly accurate growth forecast (however, even then, any sudden future changes would still get missed).

There have also been some other approximations:

* The integer weights assigned to attributes.

* Or, areas outside the study-area that exert significant impact on urban growth, but were ignored because of study limitations.

* Also, on examining the Cultural Points table, it is found that Cemetery was included as a row category. Cultural Points have been considered as having positive influence on future growth. But a cemetery would not have an entirely positive influence on urban growth. Furthermore, parts of UVA were used as cultural points. The university was also used as a major employer. Thus, there has been some overlapping. This results in disproportionate values for some cells.

But this study is an illustration more of a proactive planning approach, than an accurate projection of urban growth for an area. And even though limited in its effectiveness, any attempt to administer planning remedies would have to include some such non-arbitrary problem-solving technique.”

To Tile Or Not To Tile (Windows 8 App Store: Making the business case)

without comments

Top 5 Apps by Popularity (free)
1 Xbox SmartGlass
2 Skype
3 Photo Editor
4 Google Search
5 Twitter
* MetroStore Scanner. 04/28th/2013.
Top 5 Apps by Popularity (paid)
1 Crash Course Go
2 Fruit Ninja
3 Bejeweled LIVE
4 Plex
5 Angry Birds Star Wars
* MetroStore Scanner. 04/28th/2013.
Top 5 Categories by Popularity
1 Education
2 Books & Reference
3 Entertainment
4 Games
5 Tools
* MetroStore Scanner. 04/30th/2013.




* ArcGIS app

Written by Harsh

May 5th, 2013 at 7:07 pm

Conference Presentation: GIS TECH 201 – Mapping Mashups

without comments

Technology Division of the American Planning Association (APA) Newsletter: Spring 2013 (Conference Edition)

without comments

Planning & Technology Today: Spring 2013 (Conference Edition) (alternate link)

Letter From Your Chair

The National Planning Conference starts this week in Chicago at the Hyatt Regency. The theme of APA 2013 is “Plan Big”, and focuses on creating a big shift in planning via big opportunities and big projects to better plan for our future (See related questions about Big Poverty on page 7). While that theme may seem at odds with the current sequestration and other economic cuts, it is a call to get bold with the planning challenges of our times. You can follow the conference on Twitter via the hashtag #apa2013. You can also follow our division’s activities at that conference via the hashtag #apa2013tech.

The division’s Business Meeting and Facilitated Discussion will be held on Sunday at 10.30 am – 11.45 am in the Riverside Exhibition Center. The agenda for our Business Meeting includes sharing of our last Performance Report, adoption of our Bylaws, and transition to the new leadership. Our Facilitated Discussion will focus on broadband infrastructure, policy and sustainability. We have also tentatively planned a social event on Monday at 5 pm. Don’t forget to check the roster on page 4 for the other technology-related events being held at the conference – I will be discussing mapping mashups on Monday at 2:30 pm – 3:45 pm.

If you want to further coordinate activities with other members, I encourage you to enter the days you are planning to attend at http://doodle.com/9fa62e5zfywcy727.

Finally, I want to congratulate those who got elected and thank others for participating in our elections held last month. Please join me in welcoming the new leadership: Katherine McMahon, incoming Chair; Nader Afzalan (@NaderAfzalan), incoming Vice-Chair; and Karen Quinn Fung (@counti8), incoming Secretary/Treasurer. Kate has some interesting ideas on planning strategies for broadband, and I wish her success in defining the division’s focus on broadband planning activities (See the notice on page 3 for a related survey at http://www.surveymonkey.com/s/72GN8KZ).

I also want to thank Corey Proctor and Joni Graves for their contributions of time and talent as the interim Vice-Chair and Secretary/Treasurer respectively, and especially Steve Chiaramonte for a splendid job as the Newsletter Editor.

I hope to meet you in Chicago. As always, you can contact me directly at harshATgisblog.org (@GISblog).

Written by Harsh

April 8th, 2013 at 11:38 pm

Posted in Planning,Technology

Tagged with

Technology Division of the American Planning Association (APA) Newsletter: Winter 2013

without comments

Written by Harsh

February 19th, 2013 at 7:51 pm

Posted in Planning,Technology

Tagged with

Monty Hall

without comments

Written by Harsh

January 31st, 2013 at 10:48 pm

Technology Division of the American Planning Association (APA) Newsletter: Summer/Fall 2012

with one comment

Written by Harsh

October 5th, 2012 at 7:01 pm

Posted in Planning,Technology

Tagged with

Technology Division of the American Planning Association (APA) Newsletter: Spring 2012

with one comment

Written by Harsh

April 12th, 2012 at 8:30 pm

Posted in Planning,Technology

Tagged with

Technology Division of the American Planning Association (APA) Newsletter: Winter 2011

without comments

Written by Harsh

April 11th, 2012 at 8:27 pm

Posted in Planning,Technology

Tagged with

Technology Division of the American Planning Association (APA) Report: Summer 2011

without comments

This report provides a description of existing services, both external and in-house, available to APA divisions for hosting and broadcasting webcasts to their members and other interested professionals, and specifically looks at the external Planning Webcast series. In addition, it includes an analysis of options for expanding these services. The report was produced in response to a request from the APA Divisions Council (DC).

Options for Division Webinars: Summer 2011 (PDF, DOC)

* Planning Webcast series
* APA Audio/Web Conference series

Written by Harsh

September 30th, 2011 at 12:16 pm

Posted in Education,Planning,Technology

Tagged with

My Pick of Steve Jobs’ Technology Quotes

without comments


“The most compelling reason for most people to buy a computer for the home will be to link it to a nationwide communications network.” (1985)

“The desktop metaphor was invented because one, you were a stand-alone device, and two, you had to manage your own storage. And that may go away.” (1996)

* Letter
* Patents
* Bio

Written by Harsh

August 26th, 2011 at 12:13 pm

Posted in Technology

Tagged with ,

Technology Division of the American Planning Association (APA) Newsletter: Summer 2011

without comments

Written by Harsh

August 22nd, 2011 at 6:18 pm

Posted in Planning,Technology

Tagged with

New Media

without comments

Written by Harsh

June 25th, 2011 at 12:28 pm


with 2 comments

So it seems that some companies attach more importance to making the “right” decision than to executing the better follow-through. I suspect more often than not, you can make a “wrong” decision “right” in its follow-through. Doing so, along with setting up a few arbitrary constraints, lessens the often debilitating and sometimes paralyzing effects of decision-making in an over-abundance of choices, no matter how much you break the process down. And it increases the chances of a “command presence”. It also opens up the possibility of more than one decision being right (or wrong, but don’t think about that too much because you’d never know until later anyway), especially since information relevant to decision-making is almost always trickling in.

Quite unlike in STEM academia, where often all relevant information can be known via some analytical processes and the “correctness” of decision-making therefore mostly rests on how you apply those processes and put their results together, in actuality you can often never fully know all that is relevant. Therefore, you have to act on incomplete, imprecise, implicit data points and shifting goal posts or evolving requirements. But by making the follow-through important, you end up taking more shots. And while that decreases your batting average, it increases the probability of more home runs.

* Facebook
* How…

Updated: Aug 2011
* The Problem with Perfection

Written by Harsh

June 10th, 2011 at 4:55 pm

Technology Division of the American Planning Association (APA) Awards for 2011

without comments

Category 4: The award for the ‘Best Paper on Technology in Planning’ goes to Omar J. Peters’ (University at Albany, SUNY) ‘Why-Fi: A Look at Information Technology as a Strategy for Urban Development’ for the outstanding paper on the use of technology in planning.

Our Award Committee comprised elected members from the Division Leadership, namely Jennifer Evans-Cowley, Amiy Varma and yours truly. Join us at the award distribution ceremony at our Division Business meeting (National Planning Conference) on Sunday, April the 10th (11:45 AM – 1:00 PM) in Beacon G, Sheraton Boston Hotel. Congratulations again to our award winner!

* Technology Division of the American Planning Association (APA) Awards for 2010

Written by Harsh

March 30th, 2011 at 6:30 pm

Posted in Education,Planning,Technology

Tagged with

Interview: Senator Cardin, Maryland, speaks on transportation (2009)

without comments

This is a little fresh air for an old post that was collecting cobwebs as a draft:

During a visit to Capitol Hill, I got an opportunity to interview Senator Cardin on changing federal policies that affect planning. This is an excerpt from our interview. The full interview can be found at the Division’s website.

Harsh – What are some of your main expectations from the next federal surface transportation bill?

Senator – We face three fundamental challenges with the new transportation bill –

With bridges failing, congested roadways, and transit systems strained to the limit, we need to make a major new investment in the nation’s transportation infrastructure. According to the US DOT, the average annual cost to maintain both highways and bridges at their current level for the next 20 years could reach $78.8 billion, while it would take approximately $131.7 billion per year to improve the condition of both highways and bridges. Those figures don’t include the billions more needed for our transit systems and their needed expansions. We must act to make a major new investment in a system that is under extreme stress.

Our transportation policy needs to be reoriented to the nation’s needs in the new century. We need to better integrate our various modes of transportation for handling the nation’s commercial goods. That includes freight rail, harbors, and highway trucking routes, including their interconnection to air freight facilities. Our current system for moving people to and from their work, schools, and recreation also will need to be fundamentally rethought. That will mean a much greater focus on mass transit, alternative modes of transportation, smart growth, reduction in the number of vehicle miles traveled as a policy goal, and so much more. We need a transportation policy that supports our goal of reducing our dependence on foreign oil and reduces the generation of greenhouse gases. The new surface transportation law will not accomplish all of these changes overnight, but the new bill should put us on a fundamentally different path than we have taken in the past.

We will need to explore new ways to fund our national transportation programs. Our current reliance on a static “gas tax” is already coming up short: $8 billion in the current fiscal year. If we are successful in moving more commuters out of their cars and into buses and subways, we will see those gas tax revenues decline, not increase. If we are successful in encouraging people to live where they work and to telecommute, gas tax receipts will fall even further.

Harsh – Given the bridge tragedy in Minneapolis last year and the subsequent findings of the National Transportation Safety Board, do you support in principle the National Plan for Infrastructure Investment, and also as a way to stimulate our economy in a time of financial uncertainty?

Senator – The collapse of the I-35 Bridge was a tragedy for Minnesota and for the nation. The bridge failure resulted in 13 deaths. The accident has already spurred the nation into action.

There are approximately 600,000 bridges on highways throughout the United States. About 51 percent of bridges are state owned, 47 percent are locally owned, and less than two percent are owned by the Federal government or private entities. National surveys indicate that nearly one-quarter of all these bridges are structurally deficient.

In addition to the funds provided directly for the repair of the I-35 Bridge, the Congress provided $1 billion in special funding to address our structurally deficient bridges. Of the 2,584 bridges along the Maryland State highway system, 411 (16 percent) are classified as functionally obsolete.

The American Society of Civil Engineers, the Nation, and others are calling for major infrastructure investments. I support a sustained effort to rebuild our national infrastructure. Doing so will provide an immediate stimulus to our economy and give us the network we need to restore the health of our commercial sector.

PS: Thanks to Mike Burke for arranging this!

* Senator Benjamin Cardin (Wikipedia)
* US Department of Transportation
* US National Transportation Safety Board
* Planning & Technology Today (2009)
* US Green Building Council
* Data 360

Written by Harsh

February 18th, 2011 at 6:05 pm

Verizon iPhone or iNot?

without comments

Back in the summer of 2010, as one of the million proud owners of iPhone 4, I noticed a certain setting to switch phone carrier. That setting then portended the change we will see tomorrow. But should you bite the bait? Assuming CDMA and GSM don’t matter, here’s part 1 of my guide:

There is a lot of spin around Apple’s flagship cash cow, or as we have come to know it, the iPhone, which only recently represented about 43% of its overall sales. Not all of the coverage is positive (remember Foxconn?). Apple’s growing pains also include a big lawsuit fight. But for those without a blind searing faith in Steve Jobs, the genius patriarch, the iPhone may very well be suffocating. If true, could Jobs be repeating his original sin? And if so, should your phone follow his sin to the grave?

iOS works better than Android out-of-the-box. To better understand the genesis of its famed usability and cool minimalism, watch Jobs’ 2005 Stanford Commencement Address. If you decide to switch, be prepared to shell out money on cool apps and media. From a quick glance, I paid around $750 over 2 years. To Apple. Not AT&T (that averaged around $2,400 for the same time). And remember that MP3s from Amazon, something you can’t buy on your iPhone, tend to be less expensive and redownloadable – a big plus for some. And all that precious data would cost even more to put into MobileMe, Apple’s own cloud solution, never mind the naysayers. So more additions to your ever burgeoning monthly bill (Tethering, Personal Hotspot, …).

iPhone’s Mythical Advantage: Apps

Apple still disallows Adobe Flash (or Oracle Java) from iOS. It appears to be more a business decision than a technology constraint, designed to control the sprawl of Flash-based gaming mobile websites where you could buy outside of Apple’s walled-garden. How this affects HTML5 gaming websites is still unfolding, but it certainly helps the lagging QuickTime in the meantime. In any case, it goes against the customer’s best interests by taking away her choice to enjoy multimedia content in one of the industry’s most prolific formats. But Apple has you covered with the most commonly used app: the browser. Mobile Safari, hands down, is the best mobile browser out there among the platforms that I tested, namely iOS, Android and Windows Mobile. For the GIS pros among you, Joben blogs about GIS apps for the iPhone. You can always find an increasing number at the App Store, like the iGIS.

Jailbreaking Folsom

So you switch and finally get that toy you were waiting for? Why jailbreak it? Jailbreaking the iPhone isn’t worth the effort, even if it is legal. And even if not upgrading to the latest and greatest release (something that iTunes would handle seamlessly for you, but something that you can’t always do with Cydia because Cydia often trots a step behind) is an acceptable risk, ask yourself if your precious data is too important to jailbreak. After all, you could brick your iPhone and quite possibly provide no way for iTunes to restore it. But if your phone data is not critical ahem, then you can add some developer functionalities by jailbreaking and escape the infamous iTunes bloat. Now jailbreaking could also introduce your spanking iOS to new viruses, but if you must, hop over to Cydia. If you need a copy of the old firmware during jailbreak, grab it from here. Once you jailbreak, remember to download a file browser or explorer, like iFunBox or iPhoneBrowser. You may also want to jailbreak if you want to install a phone firewall out of privacy concerns. After all, Apple did confess to collecting GPS data from iOS 3 and iOS 4 daily. Then again, if that is what propels you, why share your payment info with Cydia’s marketplace (just asking)?

Some quick notes on iFunBox or iPhoneBrowser – You can’t watch your uploaded pics or videos, or play your uploaded songs in their native app, even if you upload them to the folders that the iPhone looks under, say //var/mobile/Media/DCIM/100APPLE/. This is because the iPhone, much like the Android, extensively uses SQLite as its Swiss Army database, and all your uploads need to be first registered in the database, say //private/var/mobile/Media/PhotoData/Photos.sqlite which links your IMG_0001.JPG or IMG_0002.MOV. Now there are Cydia apps like iFile that help add your photos, but videos are still no go. But if you are brave enough to try, download the SQLite Manager add-on for Firefox and test your luck.
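The registration step described above can be sketched with Python’s sqlite3 module. The table and column names below are illustrative stand-ins, not the real Photos.sqlite schema; the point is only that copying a file into DCIM isn’t enough, because the media must also appear as a row in the database:

```python
import sqlite3

# Hedged sketch: an in-memory stand-in for .../PhotoData/Photos.sqlite
# (the actual iOS schema differs; "photos" and its columns are made up).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE photos (filename TEXT, directory TEXT)")

# "Registering" an upload: the file on disk gets a matching row.
conn.execute("INSERT INTO photos VALUES (?, ?)",
             ("IMG_0001.JPG", "DCIM/100APPLE"))
conn.commit()

rows = conn.execute("SELECT filename FROM photos").fetchall()
print(rows)  # a file with no such row would never show up in the native app
```

This is the same inspect-and-insert workflow the SQLite Manager add-on for Firefox lets you try by hand against the real database.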

PS: More

Written by Harsh

February 9th, 2011 at 7:44 pm

Mashup on iPad

with 7 comments

OK, so tested Google, Bing, Yahoo, ESRI, Openlayers and MapServer mashups on the iPad, and much like on the iPhone, the slippy drag-and-droll interface doesn’t work. Except for one mashup. Take a guess?

* Safari
* WebKit

Written by Harsh

April 15th, 2010 at 10:50 pm

Webinar Series: GIS TECH 101 – Mapping Mashups

with 6 comments

Technology Division of the American Planning Association (APA) Awards for 2010

with 2 comments

Category 1: The award for the ‘Best Use of Technology to Improve a Plan or Planning Process’ goes to Marc Schlossberg‘s (University of Oregon) ‘Engaging Citizens in Active Transportation Planning with Mobile GIS‘ for its creative use of technology in improving planning processes.

Category 2: The award for the ‘Best Use of Technology for Public Participation’ goes to Michael Baker Jr.‘s ‘More For 1604 Social Media Program‘ for its good use of technology to enhance public involvement and participation in planning and decision making processes.

Category 3: The award for the ‘Best Use of Technology for a University Urban and Regional Planning Program’ goes to the School of Policy Planning and Development‘s (University of Southern California) ‘Multimedia Boot Camps‘ for its effective use of teaching with technology in preparing future professionals.

Our Award Committee comprised elected members from the Division Leadership, namely Jennifer Evans-Cowley, Amiy Varma and yours truly. Join us at the award distribution ceremony at our Division Business meeting (National Planning Conference) on Monday, April the 12th (7 AM) in the Hilton New Orleans Trafalgar Room. Congratulations again to all our award winners!

* Technology Division of APA
* Planning & Technology Today

Written by Harsh

March 30th, 2010 at 3:48 pm

Interview: “Geographic Information Systems (GIS) – It’s Much More Than Google Maps – A Chat With GIS Experts”

without comments

Written by Harsh

March 18th, 2010 at 9:27 am

Follow Up [1]: Rural Clusters and Relative Rurality

with 3 comments

Written by Harsh

February 25th, 2010 at 2:23 pm

Les Misérables

with 4 comments

America’s 10 Most Miserable Cities
1 Cleveland, Ohio
2 Stockton, Calif.
3 Memphis, Tennessee
4 Detroit, Mich.
5 Flint, Mich.
6 Miami, Fla.
7 St. Louis, Mo.
8 Buffalo, N.Y.
9 Canton, Ohio
10 Chicago, Ill.

* Cost of Living and Higher Education
* Rural Clusters and Relative Rurality
* Les Misérables by Victor Hugo – Project Gutenberg

Written by Harsh

February 19th, 2010 at 12:07 pm

Posted in Planning,Social

Tagged with , , , , ,

Follow Up [2]: Unshared Sacrifice

without comments

CO2 emissions per capita: Carbon dioxide emissions in metric tons per capita

Population: Midyear estimates of the resident population

* Total Area:
1 Russia 1
2 Canada 2
3 United States 3
4 China 4
5 Brazil 5
6 India 7
7 France 43
7 Japan 61
8 Germany 62
9 United Kingdom 79
* Follow Up [1]: Unshared Sacrifice

Written by Harsh

February 3rd, 2010 at 2:48 pm

Posted in Planning,Social

Tagged with , , ,

Follow Up [1]: A Touch of Play

with 3 comments

Nearly 6 months after starting work on a Touch mapping project for kiosk deployment running Windows 7 RC on an HP TouchSmart, it sure is good to see touch taking off. Slowly but surely:

* Gateway One ZX6810-01

* Sony next with Windows 7 multitouch all-in-one

And yes, I used 3rd-party x64 drivers to turn it from single-touch to double-touch.

So any of you guys working with touch?

Written by Harsh

November 17th, 2009 at 9:44 pm


without comments

Those investors who are rushing to their brokers for a piece of TeleNav’s IPO (TeleNav GPS Navigator needs extra cash to fight Google Map Navigation, or prep itself for a buyout), note that TeleNav (read LBS) has nothing to do with TeleAtlas of TomTom (read data). Yet.

* LBS’ Halloween – Interesting post @ Google Redefines Disruption: The “Less Than Free” Business Model.

Written by Harsh

November 2nd, 2009 at 4:10 pm

Taking Wolfram|Alpha on an Alpha Run

without comments

Wolfram|Alpha is being billed as an Answer Engine for the scientifically-minded, as opposed to a Search Engine: It takes your query, implied or otherwise, that critical step further by selecting from its list of matches, the one objective description, image etc, and lays them out in context. Not that Google never attempts definitive answers [chord], but when it does, Wolfram|Alpha [note] handily beats it to it with background information. START, on the other hand, is sometimes embarrassing. Note that it may not know what to do, but it does not give the wrong answer. Yet.

So Wolfram|Alpha dares to do more than say, Google or Yahoo or Microsoft, and impresses despite its alpha status.

There are inherent risks in such an approach in that it hopes our queries are frequently specific enough, which in some cases they will not be, because that is how we generally are. There is also that small issue of assigning culpability to its user for a dumb query. But through consistent performance and by avoiding curation, link-fraud and other pitfalls, Wolfram|Alpha has the potential to wean away some of the Google fan-base, notwithstanding Google Squared. And by targeting the scientific community, it has the potential to emerge as a niche Answer Engine despite semantic ambiguity or crowd-sourcing.

Bookmark it now. And keep checking.

Here are some stumpers:
* What is the elevation above sea level at 38.889483,-77.035254? Wolfram|Alpha v Google v START
* What was the annual revenue of the state of Maryland for Fiscal Year 2007?
Wolfram|Alpha v Google v START
* What is the maximum height of the Guggenheim Museum NY? Wolfram|Alpha v Google v START
* How many symphonies did Sergei Rachmaninoff compose? Wolfram|Alpha v Google v START

* Developers

Written by Harsh

May 17th, 2009 at 7:52 pm

Swine Flu

without comments

Written by Harsh

May 3rd, 2009 at 2:57 pm

Posted in Mashup,Social

Tagged with , , , , , , ,

Follow Up [1]: Job

without comments

We are looking for a Senior Mobile Developer, GIS or otherwise, in the Washington DC Metro. Given the niche, pass it along to qualified professionals or contact me with your resume.

* Job

Written by Harsh

April 24th, 2009 at 12:45 pm

Posted in Job,LBS


Backdoor Buyer

with 3 comments

Oracle -> Sun -> MySQL

Positioning Timeline

* Oracle buys PeopleSoft (2004)
Ending a long-running and bitter battle: “We won’t do any other major mergers ($200 million plus) until it’s clear to us we have integrated this one to our satisfaction.” (Larry Ellison, CEO, Oracle)

* Oracle buys Siebel (2005)
Customer Relationship Management: “Oracle becomes CRM applications company.” (Larry Ellison)

* Oracle buys InnoDB used by MySQL (2005)
Oracle buying Innobase: “If Oracle thought it was threatened by MySQL, this was a very easy move.” (Paola Lubet, Vice President, Marketing and Business Development, Solid Information Technology)

* Oracle tries to buy MySQL (2006)
Why he turned down Oracle’s offer: the desire to keep his company’s independence: “They’re obviously entrenched in different areas of the market. But is there overlap in the middle? Sure.” (Stephen O’Grady, Analyst, Redmonk)

* Oracle buys opensource embedded Sleepycat (2006)
Linux and BSD UNIX operating systems, and Apache web server, embed Berkeley DB: Embedded databases also include Oracle Lite (mobile) and Oracle TimesTen (in-memory). (Oracle)

* Sun buys MySQL (2008)
We’re acquiring MySQL: “The world’s most popular opensource database.” (Jonathan Schwartz, CEO and President, Sun)

* Oracle buys Sun (2009)
Solaris is the leading platform for the Oracle database, Oracle’s largest business: While Oracle Fusion Middleware, Oracle’s fastest growing business, is built on top of Java. (Sun)

* Gartner (2007)
Oracle’s database etc sales: $8.3 billion (up 14.9%) | Market share: 47.9%->48.6%
IBM’s DB2 etc sales: $3.5 billion (up 10%) | Market share: 21.3%->20.7%
Microsoft’s SQL Server etc sales: $3.1 billion (up 16.5%) | Market share: 17.6%->18.1%
Total database sales: $17.1 billion
* Softwar: An Intimate Portrait of Larry Ellison and Oracle
* Jonathan Schwartz (conspicuously MIA from his blog in recent days)
* MySQL Resurrection?
* PostgreSQL > MySQL > Drizzle > SQLite
* My Pick of FOSS4G 2007 Presentation Submissions

Written by Harsh

April 20th, 2009 at 5:56 pm

Posted in Database,Technology


Technology Division of the American Planning Association (APA) Webinar Series – TECH 101: Mashups for Planning

with 2 comments

Written by Harsh

February 18th, 2009 at 7:30 am

A Touch of Play

with 3 comments

First impressions after testing Microsoft’s Surface Table:

It is a coffee-table sized hardware running Windows Vista and allowing collaborative interaction from up to 4-6 participants. The number of hand-gestures it can recognize is obviously higher than that of a standard touch-screen which can typically handle only a single tap and drag, and maybe multi-touch. On the other hand, the Surface Table can recognize multiple taps, imprecise flicks and resizes, and touch-intensity. Actually, much like a TouchSmart, it can even detect movement just above its surface. Simply put, it is like a giant iPhone.


So how does it lend itself to GIS/Planning application development? Well, it is more eye-candy than useful for its cost at this point and appropriate application ideas may not come readily. If you try to recreate a similar collaborative environment with a series of Tablet PCs, TouchSmarts and Windows 7, you might just be successful. Note that it can’t be detached from its base and wall-mounted since it has a projector underneath.

The Surface Table’s biggest strength lies in its enabling a collaborative environment, and therefore, it is more suited towards “playful infotainment”-type applications. If you develop GIS/Planning applications for the Surface Table, note this: It would be a lot of fun, but maybe not very useful. Also, it doesn’t carry any browser application (!), so you can’t simply start using your planning mash-up, and development would present its own WPF learning curve for the web-savvy. For an elegant GUI design, remember that fat shaky fingers need big buttons. In terms of pricing, Microsoft is currently also charging for its SDK (approx. $3K): Not sure of their pricing model, but it doesn’t seem like a smart idea if their goal is to also encourage the Viral Phenomenon. And although they don’t yet come pre-installed (!), a wireless card and wheels can easily be mounted to turn your Surface Table into a self-contained unit to enhance its portability.


There are already some creative applications in-use: Soldiers returning from a patrol dump their head gears onto the Surface Table, and its docking corner instantly syncs their captured data with their sync folder- no fumbling there! Special ID tags can “identify” themselves to the Surface Table, but cell phones running Windows Mobile require a download before they can sync. Selected Omni Sheraton hotels and others are currently showcasing Surface Tables.


So how does it work? Well, conventional technologies detect touch-location by interrupting:
* Infrared
* Optical Field
* Surface Acoustic Wave
This interception happens just above the screen substrate, and its grid coordinates are then translated to screen position. Alternatively, you can do a makeover of your current display using Dispersive Signal Technology (DST). DST integrates chemically-strengthened glass onto the existing display. It detects the bending wave within the glass radiating to the 4 corners, where it gets converted to electric signals. This approach also filters out “noise”, making it ideal for heavy-duty use, say outdoors, or amid glass spills and crumbs in a snack-rich community planning meeting. Then there is Proximity Capacitive Resistance (PCR) for touch-across-surface.
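The grid-coordinates-to-screen-position translation mentioned above can be sketched as a simple linear mapping. This is only an illustration: the 16×12 sensor grid and 1024×768 screen below are hypothetical, and real touch controllers layer calibration and noise filtering on top of this.

```python
def grid_to_screen(gx: int, gy: int,
                   grid_w: int, grid_h: int,
                   screen_w: int, screen_h: int) -> tuple:
    """Map an interception point on the sensor grid to a screen pixel.

    Assumes a uniform sensor grid aligned with the display; corner
    points of the grid map to corner pixels of the screen.
    """
    x = round(gx / (grid_w - 1) * (screen_w - 1))
    y = round(gy / (grid_h - 1) * (screen_h - 1))
    return (x, y)

# A touch at the far corner of a hypothetical 16x12 grid lands on
# the last pixel of a 1024x768 display:
print(grid_to_screen(15, 11, 16, 12, 1024, 768))  # (1023, 767)
```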

Written by Harsh

January 20th, 2009 at 4:54 am


with 2 comments

Given the niche, pass it along to qualified professionals or contact me with your resume:
“The project objectives are to develop Virtual World applications to study how people acquire, organize and apply information. The ideal candidate will have a strong background in Virtual World development, and a demonstrable interest in Social/Bio Sciences and/or Communication/Media.”

Written by Harsh

August 24th, 2008 at 9:34 pm

b2evolution 2 wordpress

with 3 comments

Well, I have switched from b2evolution to WordPress CMS. And thanks to Apache’s mod_rewrite, I was able to keep all my old links intact. Here’s how:

### wordpress:
<IfModule mod_rewrite.c>
# basic:
RewriteEngine On
RewriteBase /gistools/discuss/weblogs/blogs/
# file:
RewriteCond %{REQUEST_FILENAME} !-f
# dir:
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /gistools/discuss/weblogs/blogs/index.php [L]
# [R] Redirect [L] Last rule
# post:
Options +FollowSymLinks
RewriteCond %{QUERY_STRING} title=([^&]+)
RewriteRule ^index\.php /gistools/discuss/weblogs/blogs/%1\.html? [r=301,nc]
# archive – monthly:
RewriteCond %{QUERY_STRING} m=([0-9]{4})([0-9]{2})
RewriteRule ^index\.php /gistools/discuss/weblogs/blogs/%1/%2? [r=301,nc]
# archive – category:
RewriteCond %{QUERY_STRING} cat=15
RewriteRule ^index\.php /blog/category/aspatial [r=301,nc]
RewriteCond %{QUERY_STRING} cat=14
RewriteRule ^index\.php /blog/category/spatial [r=301,nc]
### end wordpress

This is what the old URL looked like: http://www.spatiallink.org/gistools/discuss/weblogs/blogs/?title=gisp_and_aicp. Note that there were limitations to the permalink, since %year%, %day% or %category% were unknown from the old URL. Fortunately, I had only 2 categories, so this was a cinch.
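The title rule can be sanity-checked against that old URL with a few lines of Python. This is a sketch of the rewrite logic only, not the mod_rewrite engine; it mimics what the `title=([^&]+)` capture and `%1\.html` substitution produce.

```python
from urllib.parse import urlsplit, parse_qs

BASE = "/gistools/discuss/weblogs/blogs"

def rewrite_title(old_url: str) -> str:
    """Mimic the 'post' rule: ?title=foo -> BASE/foo.html (301 target)."""
    query = parse_qs(urlsplit(old_url).query)
    title = query["title"][0]          # the ([^&]+) capture
    return f"{BASE}/{title}.html"      # the %1\.html substitution

old = "http://www.spatiallink.org/gistools/discuss/weblogs/blogs/?title=gisp_and_aicp"
print(rewrite_title(old))  # /gistools/discuss/weblogs/blogs/gisp_and_aicp.html
```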

Written by Harsh

July 28th, 2008 at 11:22 pm


with 2 comments

Although I am still on the fence on GISP given its relative lackluster showing, what APA has done with AICP‘s CM could give it some shine when it comes to creating a provider ecosystem.

To quickly fill you in: Last year at its Leadership Meetings, APA launched the CM program for AICP. In short, it required professionals to continuously seek training in order to maintain their certifications, and allowed 3rd-party providers to offer that training.

For SIS, adopting a similar approach would require forsaking a fee-centric approach, letting someone like OGC bite a bigger share and sinking deeper into some sort of GIS accreditation, far beyond ESRI Authorized Training Program, before the “Surveyor Usurp” (see below).


The Status of Professional Certification in GIS – Conclusion:
“GIS application areas range from engineering to computer and information sciences, geography, business, logistics, forestry, and many other academic and professional preparation fields. Because GIS professionals come from a wide variety of backgrounds and academic preparation, no one group can claim to represent all approaches and applications within the GIS community. Also, given the volatile nature of the field, and the rapid change currently underway in software development and application deployment, adequate preparation today does not guarantee competency in the future. For these reasons, an overarching program to ensure appropriate professional preparation and competency must be developed by those parties interested in safeguarding the viability of the field and the competency of those claiming professional status.

It is unlikely that voluntary certification can assure competency across the profession if most practitioners choose not to be certified or if employers don’t insist that their employees be certified. Therefore, it is essential that benefits of certification be clearly articulated. By including a wide range of professional organizations within the certification development process, and working to include the interests of all GIS professionals by developing both a reasonable core set of competencies and appropriate specialized evaluations within the certification process, all groups will benefit from certification.”

• Groups – “The Usurp“: Spatial Sciences Institute (SSI), Australia and Board of Surveying and Spatial Information, New South Wales, Australia

• Degrees: University of Southern Queensland, Australia – Bachelor of Spatial Science (BSPS) and Bachelor of Spatial Science Technology (BSST)

• Ideas: For ideas on what required trainings could entail, here’re some courses (article) and my GIS suggestions for AICP’s CM:

— PLAN TECH 101 – Desktop GIS (vendor-neutral) —
– QGIS (opensource)
– ArcGIS (proprietary, $)
– MapInfo (proprietary, $)
* Intro to GIS data
– Vector
– Raster
– KML, GML, WMS etc
* How to acquire GIS data – Resources
* How to work with maps – Common tasks
– Geocoding/Geoprocessing
– Spatial analyses
– Editing
– Printing, publishing
* Intro to spatial databases
* Best practices
* What lies ahead – Industry trends
* Other notable resources – handy tools and hacks

— PLAN TECH 102 – webGIS (vendor-neutral) —
– Mash-Up APIs
– Google Maps (proprietary, free)
– ArcWebServices (proprietary, $)
– Virtual Globes
– NASA World Wind (opensource)
– Google Earth and SketchUp (proprietary, free versions)
– MapServer (opensource)
* Intro to webGIS
* How to mash-up
– Text to maps etc
– How to use MapMaker, MyMaps and Charts
– License considerations
* How to use Virtual Globes
– How to add placemarks, polygons, photographs etc
– How to georeference photographs
– How to create network links
– How to create tours
– License considerations
– Other presentation considerations
– 3D models
* Intro to in-house interactive mapping
– How to set-up and serve
* Best practices
* What lies ahead – Industry trends
* Other notable resources – handy tools and hacks

— PLAN TECH 103 – Web 2.0 (vendor-neutral) —
* Intro to Web 2.0
* How to set-up
– CMSs
– Blogs and forums
– Mailing lists
– webGIS
– Mash-Ups
* How to use Social Networking
– YouTube
– MySpace
– Facebook
* License considerations
* Intro to Section 508
– Guidelines
– Resources
– Tips
* Best practices
* What lies ahead – Industry trends
* Other notable resources – handy tools and hacks

Written by Harsh

July 11th, 2008 at 11:36 pm

Posted in Geography,Planning


Google Earth [GE] Hacks

without comments

GEMMO is a massively multiplayer online game [MMOG or MMO] for Google Earth that allows you to “explore the world as you collect gold, fight evil monsters and try to collect the crystals that are guarded in major cities [19 so far] across the planet” without any additional software to download.
Given the gathering whispers of our rumor-mill, some morph of this could make it big- a la Scrabulous. Good job Mickey of MickMel Inc!

— π

• Google Earth [GE] @ Work
• Google Earth in Second Life!
• Metaverse

Written by Harsh

March 2nd, 2008 at 11:05 pm

Posted in Virtual Globe


Follow Up [2]: Debating Net Neutrality: A Nutshell

without comments

Quotes from the recent Net Neutrality Hearings:

David L. Cohen, Vice-President, Comcast– ‘…on a “very limited basis” Comcast was delaying traffic in limited areas when there is heavy traffic.'”Don’t let the rhetoric of some of the critics scare you- there is nothing wrong with network management. Every network is managed.”

Tim Wu, Professor, Columbia Law School– “I have this terrible fear we are going to have an exam after this on what is reasonable network management. And we are all going to fail.”

FCC to Act on Delaying of Broadband Traffic [NYT]
Network Management

Written by Harsh

February 25th, 2008 at 8:34 pm

Posted in Social,Technology,Web



without comments

I started the year with this ‘Time Management‘ video by Randy Pausch. You may know him from ‘The Last Lecture‘. His introduction is by Gabe– my website mentor at UVA Computer Science Web Team. A must-watch if you haven’t already.

— π

The Legacy of Randy Pausch

Written by Harsh

January 31st, 2008 at 9:35 pm

Posted in Technology,Web


The Power of Ten

without comments


Medium Maximization: “A medium, for example, points or money, is a token people receive as the immediate reward of their effort. It has no value in and of itself, but it can be traded for a desired outcome. Experiments demonstrate that, when people are faced with options entailing different outcomes, the presence of a medium can alter what option they choose. This effect occurs because the medium presents an illusion of advantage to an otherwise not so advantageous option, an illusion of certainty to an otherwise uncertain option, or an illusion of linearity to an otherwise concave effort-outcome return relationship. This work has implications for how points influence consumer choice and how money influences human behavior.”

• “With the lure of points added to the mix, more than half of students chose the longer task and the less desirable pistachio prize that went with it. Independent of their actual value, ‘points’ apparently give people some satisfaction. That’s just one reason that frequent-flier programs have been so successful for so long.” [NYT]

* “It is claimed that a satisfied customer tells an average of three people about a product or service he/she likes, and eleven people about a product or service which he/she did not like [Silverman, George. Secrets of Word Of Mouth Marketing. 2001]. Viral marketing is based on this natural human behavior.
* Bizsum Book Summary [Amazon]

Written by Harsh

December 28th, 2007 at 7:20 pm

Posted in Technology,Web


Mash-ups as Planning Tools

with 5 comments

Planning departments, especially those of smaller cities, have long hesitated, because of technology, budgetary and other constraints, to engage their constituents through web-based mapping tools. Part of the reason is simply an uneasiness with Web 2.0-esque mapping technologies.

Well, these days they have less to worry about. That is, if they don’t mind piggy-backing on corporate giants.

Recently, the BurbankLeader reported on how the City of Burbank, Los Angeles County, California, the not-so-undisputed “Media Capital of the World” with a comfortable population of 104,317 (2006), has trusted some online service providers and their armies of 24/7 network-support staff to host part of its mapping data. Not a mash-up feat by today’s standards, but the City has invited public input by publishing its planning project status using Google Maps‘s free Application Programming Interface (API).

According to the City’s Principal Planner Michael Forbes, AICP, “the planning projects map, run by Google, is an interactive list of all residential, commercial and industrial projects throughout Burbank that are being processed or have been recently approved or denied. Each project icon on the map includes information about the project and a link to its current status.”

Pre-computed KMLs load faster than dynamic KMLs for obvious reasons, but even with clusters, loading a lot of data can sometimes stretch mash-ups beyond their user's patience.

A note of caution for the impatient GIS professional: While nowadays a mash-up is more than a hack, most public map APIs, albeit smart, are still constrained by their ask-coordinates-get-flat-tile design when it comes to geometry-aware mapping that requires ‘queriable geometry’. Consequently, despite the established familiarity of mash-ups, the appropriateness of such mash-ups to enterprise GIS for large-scale custom mapping is still debated.

Then there is that question of commercial advertisements on publicly-funded maps. Note that there are ways around it: Google Maps for Enterprise, for one, allows the option to disable location-based advertising for an annual fee. The free Google Maps also requires map and custom data to be publicly-accessible. But as far as the cause of community’s access to information is concerned, it is well-served by such mash-ups.

So nearly two years after chicagocrime.org– the seminal Google mash-up that won the 2005 Batten Award for Innovations in Journalism and was named by the New York Times as one of 2005’s best ideas (“It turns out that the best way to organize much of the information online is geographically.” – Do-It-Yourself Cartography, NYT), arrived at the mapping scene followed by hordes of Google Earth KMLs; At a time when some Elite Systems Research Institutes have already tried similar approaches and not quite succeeded; At a time when companies have been successfully built from mash-ups; At a time when real-estate mash-ups have become stale and foreclosure mash-ups have become hot; ‘smallish’ planning departments are warming up to the idea of neogeographic mash-ups as planning tools. Finally.
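The ask-coordinates-get-flat-tile design mentioned above is easy to make concrete: most public map APIs serve square raster tiles addressed by zoom/x/y in the spherical-Mercator scheme. Here is a minimal sketch of that addressing math, using the standard slippy-map formula rather than any particular vendor's API:

```python
import math

def latlon_to_tile(lat: float, lon: float, zoom: int) -> tuple:
    """Return the (x, y) slippy-map tile containing a WGS84 coordinate."""
    n = 2 ** zoom                                   # tiles per axis at this zoom
    x = int((lon + 180.0) / 360.0 * n)              # linear in longitude
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return (x, y)

# At zoom 0 the whole world is one tile:
print(latlon_to_tile(38.889483, -77.035254, 0))  # (0, 0)
```

The mash-up asks for such tiles by coordinate and composites them; nothing in the tile itself knows about feature geometry, which is exactly why ‘queriable geometry’ needs a different kind of service.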

— π

Online Tool Spotlight: Mash-Ups as Planning Tools (Summary)
Planning and Technology Today: Technology in Public Participation (Issue 90, Fall 2007) – A Publication of the Technology Division of the American Planning Association
Neogeography 101: Word Association
Google Earth [GE] @ Work
Follow Up [1]: ESRI Ketchup!
Follow Up [4]: Graphic Software
Follow Up [2]: Map Viewer and Google
Virtual Earth For Government
ESRI ArcWeb Services: Pricing Guide
* Find sample region by geometry – $0.02
* Get map of region – $0.02
* Zoom in/out of above map – $0.02
* Find places – $0.02
* Measure distance on map – $0.00
� Other Examples: OpenLayers – Web Processing and Routing

Written by Harsh

December 4th, 2007 at 11:39 pm

Posted in Mashup,Planning,Service


Follow Up [1]: Unshared Sacrifice

with one comment

Written by Harsh

November 25th, 2007 at 12:36 am

Posted in Planning,Social


Why Contribute

without comments

Paul Ramsey points to Danny de Vries‘s take on Free and Open Source Software for Geospatial [FOSS4G] 2007:

“What we saw was a young and passionate movement not-so-subtly showcasing their dedication for open-source as a tool by which to challenge corporate, or closed-source, IT monopolies in the geospatial domain.”

I want to underline the ‘showcasing’ part. It is important not to ignore why that matters for contributing to opensource, which, as some would have you believe, often lacks direction and profit and is not the best use of your time. It can be summarized like so:

            CONTRIBUTE —+—[IN]——> LEARN
                        +—[OUT]—> SHOWCASE —> GET WORK


• My Pick of FOSS4G 2007 Presentation Submissions
• Contribute

Written by Harsh

November 22nd, 2007 at 12:07 am

Posted in OSGeo,Programming


Follow Up [1]: Never the Twain Shall Meet

without comments

Written by Harsh

November 21st, 2007 at 6:30 pm

Posted in Technology,Web


The OpenHandset Alliance and the Mozilla Foundation

without comments

As far as the OpenHandset Alliance SDK is concerned, in spite of how Jonathan Schwartz feels about it and the $10 million that Google is giving away in developer prizes, the SDK could become an albatross around Google’s neck, courtesy Java.

Google appears to also have successfully convinced the opensource Mozilla Foundation to promote its own services above and before other compelling interests. This may be akin to special interest groups’ manoeuvrings on Capitol Hill, and certainly begs the question – did Google push the Foundation to go slow on mobile? Certainly, Minimo with its XUL environment and many extensions could have made for a speedier development cycle.


* Back in 2005, realizing the potential of WAP, I tested XHTML/WML/WMLscript v HTML/Javascript on Nokia emulators, and wondered how best to balance the 2 different development requirements. After all, you want the many more people who own a mobile but not a computer to be able to access your services.

* Symbian Python

Written by Harsh

November 14th, 2007 at 10:50 pm

Posted in LBS,Mobile,Technology


Mobile Browsers

without comments

As the Google-backed Open Handset Alliance takes shape, I have been testing dominant WAP browsers on my 2-year old touchscreen PocketPC. This resulting post should narrow down the choices for those who follow:

• Deep Fish by Microsoft appears to be the most promising of the lot. Unfortunately, it is in a strict testing phase and no longer accepting registrations. Until then, you can always make do with Internet Explorer for Mobile.
• Opera, arguably the slimmest desktop browser out there, has a paid version- Opera Mobile for $24. But if you do not have a smartphone and/or do not wish to spend any money, try Opera Mini.
• The Mozilla Foundation has the amusingly named Minimo.

Opera Mobile offers tab-browsing like Minimo, and does a better job at handling pop-ups and JavaScript than Internet Explorer. And like Minimo, it offers ‘grab and drag’ navigation, thus eliminating scrollbars. Opera Mobile also offers subtle other improvements, like allowing you to change your User Agent– a must-have for those websites that recognize mobile browsers, but remain inexplicably unprepared for them. On the other hand, Minimo features XUL [try this in Firefox – chrome://browser/content/browser.xul] that has impressively found its way into Mozilla Amazon Browser etc, and is the most customizable.
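Changing the User Agent matters because many sites sniff the UA string server-side to decide whether to serve a mobile layout. A naive sketch of that detection follows; the token list is purely illustrative, not any real site's logic:

```python
# Tokens that commonly appeared in mobile UA strings of the era (illustrative).
MOBILE_TOKENS = ("Opera Mini", "Opera Mobi", "Windows CE", "Symbian", "Minimo")

def is_mobile(user_agent: str) -> bool:
    """Naive server-side check: does the UA carry a known mobile token?"""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in MOBILE_TOKENS)

print(is_mobile("Mozilla/4.0 (compatible; MSIE 6.0; Windows CE; PPC)"))  # True
```

Spoofing a desktop UA in Opera Mobile simply makes this kind of check come back False, so the site serves its full layout.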

Absent from all these is the Nokia Web Browser– the sometime favorite of opensource mobile development. After all, its early emulators are what helped a lot of programmers/developers gain a handle on mobile development long before Google.


• Follow Up [1]: Wireless Application Protocol
• Wanted: Proactive Policies
• >> WAP
• News:: Spatial
• News:: Science & Technology
• Sample *.xul
• xda-developers Forum
• Picsel Browser
• Zumobi
• Proxy Server
• Mini-Me

Written by Harsh

November 13th, 2007 at 7:22 pm

Posted in LBS,Web


A Tale of Two Languages

with 2 comments

Try this page to compare Ruby‘s and Python‘s language elegance side-by-side. Spoiler Warning: There is a winner!

To get you started:
Ruby – string.method [“String”.reverse or “String”.length]
Python – string[slice] or function(string) [“String”[::-1] or len(“String”)]
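The Python half of that comparison runs as-is, and the contrast in idiom is easy to see in a couple of lines:

```python
s = "String"

# Python leans on slices and built-in functions where Ruby uses methods.
print(s[::-1])  # slice with step -1 reverses: gnirtS
print(len(s))   # built-in length function: 6

# The closest Python analogue to Ruby's "String".reverse method call:
print("".join(reversed(s)))  # gnirtS
```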


• Python Interpreter
• Cold Fusion
• Perl [ActivePerl]
• [ActivePython]
• Tcl [ActiveTcl]
• A Tale of Two Cities

Written by Harsh

November 11th, 2007 at 10:36 pm

Posted in Programming,Web


MapServer’s Claim to Fame?

with one comment

I was a little surprised to find MapServer listed on Nessus– the network vulnerability scanner website chugging along on Apache/PHP: its mention points to greater usage than earlier anticipated. So if even AGG– the Google-esque rendering backend in 5.0– is not enough, here‘s another reason for pre-4.10.3 users to upgrade:

The remote web server contains CGI scripts that are prone to arbitrary remote command execution and cross-site scripting attacks.

The remote host is running MapServer, an opensource internet map server.

The installed version of MapServer is vulnerable to multiple cross-site scripting vulnerabilities and to a buffer overflow vulnerability. To exploit those flaws an attacker needs to send specially crafted requests to the mapserv CGI.

By exploiting the buffer overflow vulnerability an attacker would be able to execute code on the remote host with the privileges of the web server.

Upgrade to MapServer 4.10.3.

Notice how their solutions are always short and sweet. Savvy programmers/developers would know of a couple of other ways to fail such automatic scanning.

On Nessus, MapServer shares the company of the spatial heavy-weight: Google Earth– ‘heap overflow in the KML engine [FreeBSD]‘. Given Nessus’s reputation in the enterprise class, ESRI’s ArcGIS Server and ArcIMS are both conspicuous by their absence. Impossibly secure? Less likely. Less widespread, and not sufficient to warrant a mention, at least in the enterprise community? Quite possible.


US-CERT Vulnerability Notes Database

Written by Harsh

November 10th, 2007 at 10:46 pm

Posted in IMS,OSGeo


Neogeography 101: Word Association

without comments

‘Genre Books’ is to ‘Writer’
‘Web Maps’ is to …?

• [a] iPhone […since the buzz is about it- the Paris Hilton of the technorati]
• [b] Paris Hilton […since the buzz is about her- the iPhone of the glitterati]
• [c] Geographer […since ESRI Press said so]
• [d] Programmer/Developer

• If you answered [c], you have spent a lot of time around ESRI-championed web maps with 8 direction tags, a dogged insistence on not exploiting browser cache and a ridiculous north arrow on every map- never mind that so far no one has turned a browser upside down.


• A Rose by Any Other Name
• Web Mapping
• The New Yorker

Written by Harsh

July 7th, 2007 at 11:30 am

Posted in Geography,GIS,Service


Follow Up [1]: Debating Net Neutrality: A Nutshell

without comments

Written by Harsh

June 24th, 2007 at 4:30 am

Posted in Technology,Web


Technology Leaders and Political Bent, 2007

with one comment

Top 3 Contributions Over $2,000 from the Big 3:
• Chen, Ling | Bellevue WA 98006 | – | $4,600 | Hillary Clinton
• Giblett, Leslie | Seattle WA 98119 | Microsoft Visual C++ Box Program Manager | $4,600 | John Edwards
• Gonzalez, Christopher | Glen Ellyn IL 60137 | – | $2,300 | Barack Obama

• Lee, Alissa | San Francisco CA 94114 | Senior Corporate Counsel, International Affairs | $4,600 | Barack Obama
• Merrill, Douglas | Danville CA 94526 | Vice President, Engineering | $4,600 | Barack Obama
• Cerf, Vinton | Mc Lean VA 22102 | Vice President and Chief Internet Evangelist | $4,200 | Hillary Clinton

• Goldberg, David | Atherton CA 94027 | Vice President and General Manager of Music | $4,600 | Hillary Clinton
• Semel, Terry | Beverly Hills CA 90212 | Chairman and Chief Executive Officer | $4,600 | Hillary Clinton
• Garlinghouse, Brad | Menlo Park CA 94025 | Senior Vice President, Communications, Communities, and Front Doors | $2,300 | Barack Obama
• Map of Contributions to Presidential Campaigns, New York Times [NYT]
• Presidential Campaign Finance Map, Federal Election Commission [FEC]


• Political Equilibrium
• Godin, Seth | Irvington NY 10533 | Author, Speaker and Blogger | $999 | John Edwards
• US Technology Administration

Written by Harsh

June 16th, 2007 at 10:30 pm

Unshared Sacrifice

without comments

Written by Harsh

May 27th, 2007 at 12:45 pm

Posted in Planning,Social


Debating Net Neutrality: A Nutshell

without comments


• [my comment]
The Coming Internet Traffic Jam: “…argument on government legislation. It is a false argument that some proponents of non-neutrality wish to spread. Surely, in this age of war-profiteers turning in record-breaking quarters, loose monopolies of mergers and bundles, debatable price gouging etc, it is a little naive to want to believe that all the companies involved will toe some good line on the other side of short-term profits for the greater common good.

If anything, some private companies interfere with day-to-day governance through unabashed lobbying and kickback offerings, creating grossly unfair access to government.

If a government legislation has caused long-term damage in the past, the legislation must be refined or redone and the legislators should be unelected, not have the people’s say through ‘smart legislation’ be silenced.”
[/my comment]
• Making Public Policy: A Nutshell
• Wanted: Proactive Policies

Written by Harsh

May 13th, 2007 at 11:05 pm

Posted in Technology,Web


My Pick of FOSS4G 2007 Presentation Submissions

with one comment

An impressive summary of presentations, but my professional favorite would be ‘IBM DB2 Express-C: A Free Database for Open Source Spatial and XML Development’. Although something tells me that something else might be the crowd favorite.


On DB2 Express-C: It went free soon after its counter-weights Oracle XE and SQL Server XE last year, but its press “news” release has not found its way into major SIS publications. DB2’s continued advancements in the free spatial database market could only make things tighter for PostgreSQL+PostGIS.


• Free and Open Source Software for Geospatial [FOSS4G] 2007
• ‘DB2 Express-C, the developer-friendly alternative’
• ‘Oracle XE and Geospatial Information Systems: An Interview with
Dennis Wuthrich of Farallon Geographics’

Written by Harsh

May 5th, 2007 at 11:12 am

Posted in GIS,OSGeo


Elite Systems Research Institute, Inc. [ESRI] et al

with 2 comments

This GCN article titled ‘Geospatial and the elite: Old-school geographic information systems still dig deep on mapping and analyses’ points to a tortuous debate within the traditional GIS industry, and the new industry push to remodel itself as solely an “enterprise class” industry while it continues to lose ground to an increasing domestication or democratization of GIS services.

ESRI: Elitist or Commonplace?

But this new industry push is not without some strategy confusion as old-school GIS faces its mid-life identity crisis without the “cool factor” spouse.


• More

Written by Harsh

April 22nd, 2007 at 8:49 pm

Posted in GIS,Mashup


Cost of Living and Higher Education

with one comment

As I returned from the American Planning Association‘s 2007 National Conference in Philadelphia, I rummaged through some past papers and chanced upon a letter.

Thomas Jefferson or William Penn?

When I look back to why I chose UVA over UPenn, the cost of living at Charlottesville v. Philadelphia, not Public Ivy v. Ivy League, proved to be the determining factor, given finances. Although Charlottesville’s small-town vibe didn’t reconcile well with the “urban” in GIS, and UVA did not play to my love of data analytics, it was an enriching ride.

So, as some of you may be deciding on which offer letter to accept this fall, here is a little advice – focus on the one you really want and everything else might just fall in place.

Good luck!


GIS at UPenn

PS: Compared to UPenn, UVA has smaller graduate programs and endowments. And it feeds the Washington DC metro’s job market. UPenn, on the other hand, has a stronger focus on spatial analytics and feeds the New York metropolitan region. So spare a thought to where you would like to spend, or at least start, your professional career. A note for foreign students – UVA has a good number of, for lack of a better word, “southern aristocracy” flocking to its classes, while UPenn has a larger international student population. So stay north of the Mason-Dixon line, if you have a choice. Well, you always have a choice – choose wisely.

* USATODAY – ‘Mr. Jefferson would be proud’: Charlottesville is No. 1
* Rural Clusters and Relative Rurality (Place | Income | Relative Rurality | Income × Rurality):
Albemarle VA | $37,638 | 0.358 | $13,474.40
Philadelphia PA | $29,755 | 0.037 | $1,100.935
* Roughly, the higher the Relative Rurality, the further the dollar would go
Cities Ranked & Rated: ‘The Ten Best Places to Live [2005]’ and ‘2005 Best Places to Live’
1 | Charlottesville VA
76 | Philadelphia PA-NJ
* Frost, Robert. The Road Not Taken. http://www.poets.org/viewmedia.php/prmMID/15717
* More
* Ways to give
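The Relative Rurality figures above can be reproduced with a quick back-of-envelope sketch. One assumption of mine, not the study’s: the adjusted dollar column appears to be simply income multiplied by the Relative Rurality index, which matches the numbers listed.

```python
# Back-of-envelope "how far would the dollar go" calculation.
# Assumption: adjusted figure = income * Relative Rurality index,
# inferred from the figures quoted in the post above.

def adjusted_dollar(income, relative_rurality):
    """Return income scaled by the Relative Rurality index, to 3 decimals."""
    return round(income * relative_rurality, 3)

places = {
    "Albemarle VA": (37638, 0.358),
    "Philadelphia PA": (29755, 0.037),
}

for place, (income, rurality) in places.items():
    print(f"{place}: ${adjusted_dollar(income, rurality):,.2f}")
```

Run as-is, this reproduces the two adjusted figures quoted above, which is why the higher-rurality Albemarle dollar “goes further” on this measure.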

Written by Harsh

April 22nd, 2007 at 12:55 pm

Posted in Education,Social

Tagged with , ,

Rural Clusters and Relative Rurality

with one comment

The US Economic Development Administration [EDA], in conjunction with the State of Indiana, has recently released an interesting study titled “The Role of Regional Clusters: Unlocking Rural Competitiveness” [2007] on the benefits of regionalism in rural America.

One of the primary objectives of this research is to help rural America find its competitive edge in our rapidly globalizing world. It accompanies another study in a similar vein, “Rural Clusters of Innovation: Berkshires Strategy Project- Driving a Long-Term Economic Strategy” [2006]– a public-private study funded in part by the US Department of Commerce. These two studies follow an earlier report titled “Competitiveness in Rural US Regions: Learning and Research Agenda” [2004] led by the Harvard Business School. That report arrived at two main conclusions:

• ‘Capacity for regional innovation is often driven by industry “clusters”’.
• ‘Clusters also significantly enhance the ability of regional economies to build prosperity’.

First, some quick background:

Clusters– both industry and regional– have been defined as ‘broad networks of companies, suppliers, service providers, institutions and organizations in related industries that, together, bring new products or services to a market’. A cluster-based approach provides an effective planning tool for economic development in the rural countryside. Graphically, I can summarize a rural cluster like so:

Pi: Quiet Musing [© Imagezoo/Images.com/Corbis]

The research’s findings, lessons, conclusions, recommendations and directions that I found relevant are:

• ‘Labeling a region around a single cluster or economic activity is too simplistic due to considerable co-location of clusters’
• ‘Clusters most strongly associated with higher levels of economic performance are business and financial services; IT and Telecom; and printing and publishing’.
• ‘Human capital, as measured by educational attainment, is the primary factor related to differences in income growth among counties’.

This research also underlines the importance of spatial technologies as follows:

• ‘Much of the analysis of rural America has been overly simplistic. GIS tools and advanced spatial analyses are not commonly used. It is important that greater use of these powerful approaches be applied to a wide range of issues facing rural America’.
• ‘Mapping is particularly helpful to illustrate and communicate data on clusters’.

Some of the maps coming out of this research can be found here.

Anyway, as I see it, an uneasy socio-cultural issue remains unexamined, and that is…

When you take rural America or for that matter rural Anywhere, and strip it of all its social stereotypes and negatives, you are left with something or end up attracting something that is far from rural- something that will jump, skip and run to the New Yorks of our world in time.

Pi: Quiet Musing [© DLILLC/Corbis] Rural Anything does not clamor for riches; it does not yearn for the hustle-and-bustle of urban life, or for its smog-filled jam-packed commute traffic, or for that neck-breaking workday; it is not awed by the many skyscrapers of the City on whom it conveniently blames all social ills; none of the multi-cultural nightlife or rebellious ways.

Pi: Quiet Musing [© Imageplus/Corbis] Rural Anything simply desires simplicity- a dog yawning in the backyard farm; a winding trail to work; free parking; quiet and quaint neighborhoods topped by the clichéd church tucked away inside the folds of its countryside; fishing expeditions on weekends; just yearning to stretch on a summery afternoon without having to worry about city-like pollutions and crimes; content only to drift and conform to its tightly-knit value-system.

It is a different “make” of people.

How then do you convince it to join the rat-race?


• As I see it, Relative Rurality- a measure used in this research, helps answer the age-old question: How far would the dollar go? Roughly, the higher the Relative Rurality, the further the dollar would go
Pi: Quiet Musing

• More
• Even More
• A Lot More

Written by Harsh

March 21st, 2007 at 10:03 pm

Posted in Planning,Social

Tagged with , , ,

Google Earth [GE] @ Work

with one comment

This week I had the opportunity to listen to the Google Guys. Having earlier missed a similar opportunity for Jack Dangermond due to schedule conflicts, I made sure I was present at this seminar.

Pi: Quiet Musing On display were the GE Enterprise solutions- Fusion, Server and Enterprise Client. With GE Enterprise, you can sign into multiple servers, grab the most accurate data from each and roll everything into one seamless experience. You may even squeeze your private globe onto a pocket-sized device and strut it out on a field. For a private domain, GE Enterprise can scale up to a healthy 250 concurrent users, or a little less than those supported by a default PostgreSQL 8.X on Windows.

One astounding statistic quoted was the vast number of users GE has been able to accumulate over its short life- approximately 200 million; reportedly many more than those by Google Maps, with nearly 80% for casual uses. And a surprising number, or so we are told, falls in the 45+ age group.

Approximations aside, here’s my take:

When you try to fathom the 200 million number, you are reminded yet again how badly ESRI, Intergraph, MapInfo, Autodesk et al missed the globe software bandwagon. And the traditional SIS companies still do not have a clear winner when it comes to 3D buildings and surface textures, despite counting 3DS Max and Maya. Such content is what users now expect from any cutting-edge globe software.

From the looks of it and the high-end price tag of over $100,000, Google has smelled blood- the fat inside some governments; ESRI and Intergraph can attest to that. If Google succeeds in this aggressive push, the traditional SIS companies will cede further into the background on data visualization; they are anyway planted firmly in the backseat with regards to a lot of casual uses.

So when you combine this push with GE user groups, the KML offer to OGC, KML-based searchesPi: Quiet Musing and other enterprise solutions, then you can see why some traditions may be feeling nervous. Add to that the general perception about Google’s speed-of-innovation- ‘when you use a Google product, Google would innovate faster than the traditional SIS companies to support it’.

As I see it, that growing perception should be the biggest reason for the traditional industry’s nervousness.


• Application: PortlandMaps
• Ogle Earth
• More

Written by Harsh

February 28th, 2007 at 10:17 pm

Posted in GIS,Virtual Globe

Tagged with ,


without comments

Here are four “events” from 2006 that I consider as evolutionary milestones of our burgeoning SIS industry:

• E2– ESRI finally catches up to GE. Almost
• Virtual Earth– Microsoft adds the ability to add and save shapes, and browser-based GE-esque 3D views
• GE– Google gulps SketchUp and consolidates GE’s usergroups by jumping head-first in collaborations
• Spatial Web Services- Be it ESRI’s ArcWeb Services with GlobeXplorer, or DM Solutions Group‘s MapSherpa Spatial Web Services and Mapgears, spatial web services gain a firmer footing at the enterprise level.


Written by Harsh

December 24th, 2006 at 10:05 pm

Posted in Technology,Web

Tagged with , ,

Follow Up [2]: Katrina Links

with one comment

Former senator Stafford, of the Robert T. Stafford Disaster Relief and Emergency Assistance Act- familiar to anyone requesting, managing and mapping disaster grants under the Hazard Mitigation Grant Program [HMGP] AKA the Buyout Program- dies at 93.

“FEMA Told to Resume Storm Aid”
• Blogs about this article
• “Katrina Victims in Limbo as FEMA Appeals Aid Order”
• Government Accountability Office [GAO] Report: Abstract– Hurricanes Katrina and Rita Disaster Relief. Continued Findings of Fraud, Waste and Abuse. GAO-07-252T. December 6, 2006
• Video: Reactions from the Grassroots– Effects of Flood Map Modernization [Map Mod] Program’s Digital Flood Insurance Rate Maps [DFIRMs] on National Flood Insurance Program’s [NFIP’s] Ordinance Updates
• Pre-Disaster Mitigation [PDM] Grant Program

Written by Harsh

December 23rd, 2006 at 1:30 pm

Follow Up [1]: ESRI Ketchup!

without comments

Following on the heels of E2, Google recently consolidated GE’s usergroups through some interesting collaborations with Wikipedia and Panoramio. These follow earlier deals with UNEP, NASA, USGS, ESA, Discovery, National Geographic et al.

These steps slowly push one other software- ESRI’s ArcGlobe, part of the ArcGIS 3D Analyst extension, further away from all that is important. ArcGlobe was useful in that it eventually led to E2, but ESRI had much bigger plans for it- ArcGlobe was promoted for wide adoption in 3D data mapping and visualization.

Then Google came along, and ArcGlobe and all the shabby flyby animations and painstaking multipatches in ArcScene, also part of 3D Analyst, suddenly became embarrassing.

That leads me to my prediction of the week: all this will force ESRI to either lower the inflation-adjusted cost of its pricey 3D Analyst- currently marked at $2500, or absorb some of it into E2 or the desktop. Note that Google Earth Pro today costs a fraction at $400.

Pi: Quiet Musing
Fortius One‘s GeoIQ: A free simple Spatial Analyst?


• ArcGIS Extensions
• More via Google Earth Links
• More

Written by Harsh

December 16th, 2006 at 10:01 pm

Posted in GIS,Mashup

Tagged with ,

ESRI Ketchup!

with one comment

After months of wild speculation and foot-dragging, ESRI finally released ArcGIS Explorer– twice as big as Google Earth and a shade shy. Here is why:

Google Earth [googleearth.exe]
+ Searches better
– Does not offer native support for popular spatial data types

ESRI ArcGIS Explorer [E2.exe]
+ Offers native support for popular spatial data types
– Clunkier navigation and interface

• Both show comparable spatial data displays and memory usages. I am pleasantly surprised by how consenting NASA, of World Wind fame, has been to all such uses, given the murky legal waters of the future when others start using this precedent to demand equal treatment.

Pi: Quiet Musing
ESRI ArcGIS Explorer: Adding content

Being true to the misplaced compulsions of most commercial companies, ESRI only lets you export your layers in E2’s markup language [*.nmf]. However, to piggy-back on the growing user community around GE, and because ESRI has no current alternative to Google SketchUp, E2 allows you to import *.kml and *.kmz files. GE, on the other hand, also imports *.gpz and *.loc GPS files in its commercial flavor.
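For readers curious what such an importable file actually looks like: a minimal KML point placemark is just a few lines of XML. Here is a hedged sketch that builds one with Python’s standard library; the placemark name and coordinates are made-up sample values, not anything from E2 or GE.

```python
# Generate a minimal KML file containing one point placemark,
# of the kind E2 and GE can import. Sample name/coordinates are my own.
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def make_kml_point(name, lon, lat):
    """Return a KML document string with a single point placemark."""
    ET.register_namespace("", KML_NS)
    kml = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
    pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
    ET.SubElement(pm, f"{{{KML_NS}}}name").text = name
    point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
    # KML lists coordinates as longitude,latitude (x,y order)
    ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")

print(make_kml_point("Sample point", -77.46, 37.54))
```

Note the longitude-before-latitude ordering- a perennial gotcha when hand-rolling KML.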

E2 can also create geoprocessing tasks, styles and symbologies; export identification results; and display attribute tables.

So what is the bottom line? GE is better suited for consumers of spatial data, while E2 is targeted more at the creators and editors. And how close does E2 come to following the “if you are late, you better be better” mantra? Not quite, but then again, it is just a beta.

Now the waiting game begins for Google- arguably the most innovative internet company in recent times, notwithstanding the acquired nature of GE and SketchUp- to hit back after losing ground to Yahoo Maps, with its better driving-directions planning, and Microsoft Virtual Earth, with its ability to add and save shapes, and browser-based GE-esque 3D and street-level views.


I wonder how the good folks at Arc2Earth and Shape2Earth would maintain their rates of innovation in response?

• ArcGIS Explorer Overview Podcast
• ArcGIS Online Services
• Server Object Manager [SOM] Setup
• Sample *.nmf containing 1 point feature derived from feature class [e2.shp] in GCS_North_American_1983 coordinate system
• TerrainView
• Follow Up [4]: Graphic Software
• Follow Up [2]: Map Viewer and Google

Written by Harsh

November 29th, 2006 at 10:04 pm

Posted in GIS,Mashup

Tagged with ,

Interview: Ric Stephens, Immediate Past Editor, Technology Division of the American Planning Association [APA]

without comments

As the Secretary/Treasurer of the Technology Division of APA, I recently had the opportunity to interview Ric Stephens, our Immediate Past Editor:

Harsh: So what got you into planning and publishing/editing?
Ric: I worked as a cartographer/German language translator for USAID during college and was hired by a civil engineering firm to prepare maps during summer break.

After school, the firm offered me a job in their planning department and …voila! There are still some plat maps on file from the late 70s with elaborate compass roses for north arrows. I began helping with a local APA section newsletter out of curiosity. A quarter of a century and thousands of newsletters later, I am still interested in desktop publishing.

InfoTEXT began as a paste-up effort ten years ago and is now ‘completely digital’. I’m still helping with two APA newsletters, ‘Private Practice Perspectives’ and ‘Mountains and Shores’. I’ve also published two books: ‘Plannerese Dictionary’ and ‘International Planning Organizations’ and am working on a third, ‘Dark and Stormy Planning Prose’.

Harsh: Any favorite planning story that you edited?
Ric: There are three unique stories-

Pi: Quiet Musing
Ric Stephens at the Street of Dreams

For several years, I organized the ‘Dark and Stormy Planning Prose Contest’ to collect and share humorous planning stories. One of my favorites is the 2002 Winner, ‘Zone Noir’ by Michael Young who merged the feel of a 50s detective novel with current planning issues. It’s hard to imagine, but Dr. Seuss wrote a humorous poem on regulating signage for the city of La Jolla, California!

Lastly, while living in California, I received ‘The Story of Sexton Mountain Meadows‘. It revolves around the continuous removal of the ‘t’ from ‘Sexton’. I now live a few miles from this very street in Beaverton, Oregon and am a Planning Commissioner for the City. I found the listed author, but he denies writing the story and referred me to a blog author who remembers the incident, but also denies writing the story. The mystery continues to this day.

I am still collecting stories and if you have a ‘hearing from hell’, ‘purple planning prose’ or other contributions, please email a copy to ric@alphacommunity.com.

Harsh: Any thoughts on the New Media?
Ric: We are far from reaching a paperless office environment, but we are clearly moving towards digital information and communication technologies.

For planning in particular, it is an exciting time to expand GIS with numerous databases including satellite imagery. The REAL CORP 007 event will showcase some of these outstanding IT innovations. Our firm, Alpha Community Development, is developing software to link our projects with these databases. We are also developing project-specific websites and looking for new ways to provide online project management.

Harsh: Any thoughts on increasing readership for the Technology Division?
Ric: InfoTEXT contributors have provided outstanding content that is very relevant to practicing planners, agency officials, educators and students. I believe the missing element is visibility.

It would also be helpful for APA to actively promote the Divisions, and for the Divisions to have programs to promote the newsletters to planning departments, governmental agencies, universities and other institutions.

Harsh: And finally, any advice to the new editor[s] of the Technology Division?
Ric: It’s very difficult to find contributors for articles- I’m several weeks late in responding to this interview.

Having a large group of people to help gather material would be ideal. As the newsletter migrates to the web, the publication should probably adapt a monitor-friendly format and be rich in hyperlinks. I enjoyed editing InfoTEXT and am indebted to all who helped make this a memorable experience.

Harsh: Thank you and good luck!
Ric: Thanks!

• Planning Publications Directory
• What’s New: Books and Documents

Written by Harsh

October 29th, 2006 at 10:03 pm

Why do you like Geography?

with one comment

Here’s one of many reasons:

“… And then the strange people of Asia- the Tartars, who are such splendid horsemen; the Arabs, who travel over the deserts upon camels, and at night stop and tell stories to each other; and the Hindoos, who burn their widows and drown their children, thinking these things are pleasing to God; and the Chinese, who eat puppies and rats, and furnish all the world with tea; and the Turks, with their big turbans- what a wonderful thing it is that in one little book we may learn all about these queer [sic] people.

Perhaps I like geography the more for this reason: Uncle Ben has a great many pictures of different countries, with the people who live there; and when I am studying about a country I look over these pictures…”

[Goodrich, Samuel G [Peter Parley]. pp 45. Chapter V- Geography. The Adventures of Billy Bump on the Pacific Coast- A tale of ’49. 1793-1860. http://www.openlibrary.org/details/billybump00goodarch ]


Written by Harsh

August 26th, 2006 at 10:09 pm

Posted in Geography,Social

Tagged with ,

Follow Up [1]: Katrina Links

without comments

Written by Harsh

August 16th, 2006 at 10:11 pm

International Outreach

without comments

One of the pleasures of my current job is the annual opportunity to interact with professionals from around the world, thanks to the International Visitor Leadership Program. During these interactions, I share with the visiting delegations how regional government works in the Virginias.

Pi: Quiet Musing
Mayoral Delegation from the Republic of Tajikistan, 2006

Pi: Quiet Musing
Public and Private Sector Delegation from the Russian Federation, 2005

I always end my presentation on regional governance and SIS with a quick display of Google Earth when we try to locate the remote places the delegation members come from. As can be deduced from these pictures, the members stand in rapt attention of how one private enterprise gives back to the greater common good.

* Theories and Approaches in Local Government Studies

Written by Harsh

January 24th, 2006 at 8:09 pm

Top 10 Technology Trends for 2006 [“comment”]

without comments

1. First there were WiFi hotspots, then hot zones [“even more so”]
2. Cell phones do everything [“right-on”]
3. Internet phone calls become more popular now that major Web companies are making it easier [“about time”]
4. The [MS] Office moves to the Web. Documents, e-mail and spreadsheets move off your desktop computer to the Web [“about time”]
5. Stem-cell research advances despite legal challenges [“right-on”]
6. Biotechs target flu vaccines [“right-on, same for other vaccines”]
7. Even small start-ups go global [“even more so”]
8. Video comes to the blog [“refer to 9”]
9. On-demand video everywhere [“refer to 2”]
10. Clean technologies [“even more so”]

More crystal ball gazing:

• A tough year ahead for Sony [“fate deserved, although XBox would probably hurt more”]
• AJAX cleans up the Web [“impressive”]
• Cracks appear in Apple’s iTunes shiny armor [“would take more, but also refer to hymn“]
• Telco companies get ensnared in a domestic eavesdropping scandal [“a very tight-rope”]
• A video search company is acquired by a major player [“iFilm?”]
• Municipal Wi-Fi [“refer to South Korea and Japan“]
• Silicon Photonics [ ~ ‘integrating light with silicon’]
• Social Machines [ ~ ‘social web’]
• Search [“Google“!]
• Feeds [“RSS and podcasting and videos, need I say more?”]

Technology Review

• Gates on Vista
• Directions Magazine takes a swing

Written by Harsh

January 6th, 2006 at 6:04 pm

Posted in Technology,Web

Tagged with , ,

Follow Up [4]: Graphic Software

with 2 comments

Yet more evidence of acceptance of Google Maps and through it, of spatial relevance, by established publications:

• A Guide to Commuting and Readers’ Stories
• How Much Is Gas In Jersey?

In a related development, Microsoft continues to play catch-up with Google by acquiring GeoTango. However, with its “3D Internet Visualization- a truly open and web services-oriented solution”, GeoTango may just be the partner Microsoft needs for a tango.


• ESRI ArcWeb Services
• NASA World Wind

Written by Harsh

December 28th, 2005 at 6:00 pm

Posted in GIS,Mashup

Tagged with , , ,

Brain Hypnosis

without comments

An intriguing article that may help those interested in best meeting project expectations in a team-setting. Here is my take on that- for rewards, it is often best if expectations are lower than the actual; for punishments, it is often best if expectations are higher than the actual; so that in both cases, the resulting momentum is kept pointing upward. The old adage of “under-promise over-deliver” follows along the same line.


“… The probe, called the Stroop Test, presents words in block letters in the colors red, blue, green and yellow. The subject has to press a button identifying the color of the letters. The difficulty is that sometimes the word ‘Red’ is colored green. Or the word ‘Yellow’ is colored blue.

For people who are literate, reading is so deeply ingrained that it invariably takes them a little bit longer to override the automatic reading of a word like ‘Red’ and press a button that says green. This is called the Stroop effect.

Sixteen people, half highly hypnotizable and half resistant, went into Dr. Raz‘s lab after having been covertly tested for hypnotizability. The purpose of the study, they were told, was to investigate the effects of suggestion on cognitive performance. After each person underwent a hypnotic induction, Dr. Raz said:

‘Very soon you will be playing a computer game inside a brain scanner. Every time you hear my voice over the intercom, you will immediately realize that meaningless symbols are going to appear in the middle of the screen. They will feel like characters in a foreign language that you do not know, and you will not attempt to attribute any meaning to them.

This gibberish will be printed in one of four ink colors: red, blue, green or yellow. Although you will only attend to color, you will see all the scrambled signs crisply. Your job is to quickly and accurately depress the key that corresponds to the color shown. You can play this game effortlessly. As soon as the scanning noise stops, you will relax back to your regular reading self’…

In highly hypnotizables, when Dr. Raz’s instructions came over the intercom, the Stroop effect was obliterated, he said. The subjects saw English words as gibberish and named colors instantly. But for those who were resistant to hypnosis, the Stroop effect prevailed, rendering them significantly slower in naming the colors.

When the brain scans of the two groups were compared, a distinct pattern appeared. Among the hypnotizables, Dr. Raz said, the visual area of the brain that usually decodes written words did not become active. And a region in the front of the brain that usually detects conflict was similarly dampened.

Top-down processes overrode brain circuits devoted to reading and detecting conflict, Dr. Raz said, although he did not know exactly how that happened. Those results appeared in July in The Proceedings of the National Academy of Sciences…”

Sandra Blakeslee

• NYT Article

Written by Harsh

November 22nd, 2005 at 7:10 pm

Posted in Education,Social

Tagged with , ,

Memorandum Excerpt, Alleged

without comments

From: Bill Gates
Sent: Sunday, October 30, 2005 9:56 PM
To: Executive Staff and Direct Reports; Distinguished Engineers
Subject: Internet Software Services

“… Ten years ago this December, I wrote a memo entitled The Internet Tidal Wave which described how the internet was going to forever change the landscape of computing… Five years ago we focused our strategy on .NET making a huge bet on XML and web services… We will build our strategies around internet services and we will provide a broad set of service APIs and use them in all of our key applications… This coming ‘services wave’ will be very disruptive… This next generation of the internet is being shaped by its ‘grassroots’ adoption and popularization model, and the cost-effective ‘seamless experiences’ delivered through the intentional fusion of services, software and sometimes hardware… I’ve attached a memo from Ray which I feel sure we will look back on as being as critical as The Internet Tidal Wave memo was when it came out…”


From: Ray Ozzie
Date: October 28, 2005
To: Executive Staff and direct reports
Subject: The Internet Services Disruption

“… This isn’t the first time of such great change: we’ve needed to reflect upon our core strategy and direction just about every five years… In 1990, there was actually a question about whether the graphical-user-interface had merit… When we reflected upon our dreams just five years later in 1995, the impetus for our new center of gravity came from the then-nascent web… In 2000, in the waning days of the dot com bubble, we yet again reflected on our strategy and refined our direction… It is now 2005, and the environment has changed yet again- this time around services…

The Landscape:

… In the US, there are more than 100MM broadband users, 190MM mobile phone subscribers, and WiFi networks blanket the urban landscape… We should’ve been leaders with all our web properties in harnessing the potential of AJAX, following our pioneering work in OWA [Outlook Web Access]. We knew search would be important, but through Google’s focus they’ve gained a tremendously strong position. RSS is the internet’s answer to the notification scenarios we’ve discussed and worked on for some time, and is filling a role as ‘the UNIX pipe of the internet’ as people use it to connect data and systems in unanticipated ways. For all its tremendous innovation and its embracing of HTML and XML, Office is not yet the source of key web data formats- surely not to the level of PDF. While we’ve led with great capabilities in Messenger and Communicator, it was Skype, not us, who made VoIP broadly popular and created a new category. We have long understood the importance of mobile messaging scenarios and have made significant investment in device software, yet only now are we surpassing the Blackberry… The same is true of Apple, which has done an enviable job integrating hardware, software and services into a seamless experience with .Mac, iPod and iTunes, but seems less focused on enabling developers to build substantial products and businesses.

… Only a few years ago I’d have pointed to the Weblog and the Wiki as significant emerging trends; by now they’re mainstream and have moved into the enterprise. Flickr and others have done innovative work around community sharing and tagging based on simple data formats and metadata. GoToMyPC and GoToMeeting are very popular low-end solutions to remote PC access and online meetings… VoIP seems on the verge of exploding- not just in Skype, but also as indicated by things such as the Asterisk soft-PBX. Innovations abound from small developers- from RAD frameworks to lightweight project management services and solutions…

Key Tenets:

… 1. The power of the advertising-supported economic model… 2. The effectiveness of a new delivery and adoption model… 3. The demand for compelling, integrated user experiences that ‘just work’…

The Opportunities:

Seamless OS… Seamless Communications… Seamless Productivity… Seamless Entertainment… Seamless Marketplace… Seamless Solutions… Seamless IT…

Moving Forward:

… Platform Products and Services Division- a. Base v. Additive Experiences… b. Services Platform… c. Service/Server Synergy… d. Lightweight Development- The rapid growth of application assembly using things such as REST, JavaScript and PHP suggests that many developers gravitate toward very rapid, lightweight ways to create and compose solutions. We have always appreciated the need for lightweight development by power users in the form of products such as Access and SharePoint… e. Responsible Competition…

Business Division- a. Connected Office… Should PowerPoint directly ‘broadcast to the web’, or let the audience take notes and respond?… b. Telecom Transformation… c. Rapid Solutions- How can we utilize our extant products and our knowledge of the broad historical adoption of forms-based applications to jump-start an effort that could dramatically surpass offerings from Quickbase to Salesforce.com?…

Entertainment and Devices Division- a. Connected Entertainment… b. Grassroots Mobile Services… c. Device/Service Fusion…

What’s Different?:

… Complexity kills… Another simple tool I’ve used involves attracting developers to use common physical workspaces to naturally catalyze ad hoc face-time between those who need to coordinate, rather than relying solely upon meetings and streams of email and document reviews for such interaction…”


* “Building a Better Boom: …The Internet is exciting again, and once again folks are rushing in. In some categories – like search or social networking, for example – there are scores of start-ups vying for pretty much the same market, and it’s certain that, just like last time, most of them will fail.

But regardless of all this déjà vu, we are not in a bubble. Instead we are witnessing the Web’s second coming, and it’s even got a name- ‘Web 2.0’, although exactly what that moniker stands for is the topic of debate in the technology industry. For most it signifies a new way of starting and running companies – with less capital, more focus on the customer and a far more open business model when it comes to working with others. Archetypal Web 2.0 companies include Flickr– a photo sharing site; Bloglines– a blog reading service; and MySpace– a music and social networking site…

Start-ups are leveraging nearly a decade’s worth of work on technologies that are now not only proven, but also free, or very nearly so. Open-source software can now do nearly everything that Oracle, I.B.M. and Microsoft specialized in back in the 90’s. And the cost of computing and bandwidth? You can now lease a platform that can handle millions of customers for less than $500 a month. In the 90’s, such a platform would have run tens of thousands of dollars or more a month…

Or just ask Joe Kraus– a founder of the once high-flying Excite portal. Excite ran through millions in venture capital, then tens of millions of I.P.O. money, before its spectacular demise [Mr. Kraus had left before then]. His latest start-up- JotSpot, is built on open-source software, and cost less than $200,000 to begin.

Mr. Kraus exemplifies the second reason I believe we are not in a bubble: this time, the financiers aren’t driving. Instead, the entrepreneurs and geeks – often one and the same – are. The lessons of Web 1.0 are never far from their minds, and the desire to create something cool that might foster some good in the world is often equally paramount with the desire to make money. The culture of Web 2.0 is, in fact, decidedly missionary – from the communitarian ethos of Craigslist to Google‘s informal motto- ‘don’t be evil’.

Ah, yes, Google. That brings us to the third reason we are not in a bubble: vastly improved search technologies. Recall that the demise of Web 1.0 was predicated in large part on the collapse of the Internet advertising business – people were spending millions buying billboard-like ads that, it turns out, nobody was paying attention to…”

John Battelle; Co-producer, Web 2.0 conference; Author, “The Search: How Google and Its Rivals Reinvented Business and Transformed Our Culture”
* “What is Web 2.0”: Design Patterns and Business Models for the Next Generation of Software
* NYT Article
* Memorandum Excerpt, Alleged

Written by Harsh

November 18th, 2005 at 7:01 pm

Posted in Technology,Web

Tagged with


without comments

It’s time to move these to del.icio.us:

• http://labs.google.com/ Google’s showcase
• http://next.yahoo.com/ Yahoo’s showcase
• http://research.microsoft.com/ Microsoft Research

• http://geoportal.kgs.ku.edu/googlemaps/ks_gm.cfm SDE+GMap
• http://traffic.poly9.com/ Traffic, weather and news glues for Google Maps

• http://opensource.nokia.com/ Nokia in opensource WAP

• http://www.webstyleguide.com/ Web style guide
• http://jibbering.com/faq/ comp.lang.javascript FAQ

• http://robin.sourceforge.net/ Browser-based desktop
• http://www.writely.com/ Browser-based word processor
• http://www.ktdms.com/ Document management system
• http://www.openfiler.org/ Browser-based network storage software distribution
• http://www.debugmode.com/wink/ Tutorial and presentation creation software

• http://www.lexisnexis.com/sourcelists/ Legal and public records
• http://www.issues2000.org/ Candidates on issues

• http://senseable.mit.edu/grazrealtime/ Mobile Landscape
• http://www.cbsnews.com/stories/2005/10/08/tech/main927858.shtml “Could cell phones stop traffic?”


Written by Harsh

November 7th, 2005 at 6:02 pm

Posted in GIS,Web

Tagged with

Follow Up [3]: Graphic Software

with 3 comments

This week Yahoo released its own take on online mapping. Its new service includes both Flash and AJAX APIs coupled with the ability to geocode.

If you think about it, sooner or later this had to happen: developers finally mustering the courage to embrace arty Macromedia Flash for distributing spatial information in a big way, like Geocentric. Actually, Google has been using Flash for a different distribution for quite some time now. But this release by Yahoo and its under-$1,000 price tag should help Flash emerge as a more visible player in the online mapping game.

Did the earlier musings portend this?


• Yahoo Developer Network
• GeoCool! Tutorial
• Google Local, MSN Virtual Earth, Amazon A9, AOL MapQuest
• Application: Google Earth
• Discussion Forum

Written by Harsh

November 3rd, 2005 at 6:32 pm

Posted in GIS,Mashup

Tagged with

Katrina Links

without comments

“Rethinking Flood Insurance” [09/21/2005]: A timely but poorly researched editorial in The Washington Post on the levee problems plaguing the National Flood Insurance Program.

As much as some may cringe at what they see as their tax dollars being spent on a bailout, the often-omitted fact remains that many New Orleanians were not required by the National Flood Insurance Program to purchase flood insurance because they enjoyed the protection of levees. So the federal government, through the Corps of Engineers, is at least partly responsible for creating a false sense of security by failing to repair levees in a timely manner. Bear in mind that the State of California has been asked by its court to shoulder responsibility for damages from the failure of levees for which it is a sponsor. And if we did not cry “welfare state” when the federal government stepped in to bail out the airline industry after 2001, surely we can hush our moans now.

For more discussion points, refer to this white paper by the Association of State Floodplain Managers.

While on this disaster: as one watches events unfold, it becomes clear that an infuriating, “hands-off” management style, prone to making excuses for ignored red flags, can only get rewarded for ideological and rhetorical reasons rather than merit. And such a management style finds a willing bed-partner in a “let’s-eat-at-a-steakhouse-since-the-proceeds-go-towards-relief-efforts” empathy response. In itself, such a response cannot be right all the time, for it is primarily detached and “feel good”.

The Disaster Mitigation Act of 2000 had laid down clear requirements to plan for such events. And as I understand, the National Incident Management System laid down a similar framework with regards to response-coordination. But no amount of planning [State of Louisiana Hazard Mitigation Plan, State of Alabama Hazard Mitigation Plan] could prevent the failure from happening.

Having observed this breakdown in leadership, and with some benefit of experience, I cannot stress enough that planners should restrain their impulse to pen a plan for every problem, and should also focus on becoming “political actors”, for one cannot write a plan that accounts for the failure to carry out the plan itself.

On another note, many of the residents of New Orleans were not required by the National Flood Insurance Program to purchase flood insurance since they were protected by levees. Although non-discriminatory exceptions can always be made, this further complicates relief efforts as it currently limits the amount of disaster assistance available through certain agencies.

• Blogs about this editorial
• Katrina
• Craigslist: Lost and Found- New Orleans LA, Baton Rouge LA
• Red Cross: Family Links Registry
• Lycos: Missing Persons Search
• Housing Information Gateway
• Shelter Map
• Information Map
• ESRI: Katrina Disaster Viewer
• Google Earth: Imagery
• NYT: Draining New Orleans Map
• Contact: Mitigation Planners and Substantial Damage Assessors

Written by Harsh

September 21st, 2005 at 7:09 pm

Never the Twain Shall Meet

without comments

On the eve of the launch of Virtual Earth, as Microsoft plays catch-up with Google‘s high rate of innovation, here’s a transcript of some tête-à-tête:

[Sometime before 2000]
Bill Gates: Now that we are in the email business with Hotmail, we need to think of ways to fatten the bottom-line.
Steve Ballmer: Online marketing is the way to go Bill! Let’s just create ahem ahem unnecessary page-views when the user logs-in and put as many graphic-intensive ads on each one of them as possible.
Bill Gates: …something like that SNL skit about advertisements on MSNBC flooding the screen and blocking the anchor’s face?!
Steve Ballmer: …hehehe, something like that! Hey, it’s a free service- the user might as well pay for it through ad views. You’ve got to market these goodies aggressively!
Bill Gates: Yeah, the bottom-line is the key!

[Sometime before 2004]
Larry Page: We need to get into the email business with a Google mail. The current services aren’t up to par.
Sergey Brin: Yeah, but given our relative size we must offer something that is significantly superior to what the market currently offers to make any reasonable in-roads.
Larry Page: OK, let’s start with a clean slate- how do we offer a better email service?
Sergey Brin: It’s all about the user-experience. At the end, if the user likes it, she will come back for more.
Larry Page: So we don’t flood the page with pop-ups and such junk??
Sergey Brin: That’s right! Advertisements should be useful but as unobtrusive as possible.
Larry Page: Agreed, the user-experience is the key!

• Follow Up [2]: Map Viewer and Google


• Rudyard Kipling

Written by Harsh

July 24th, 2005 at 7:37 pm

Posted in Technology,Web

Tagged with

Digital conversion of Flood Insurance Rate Maps (DFIRMs): White Paper

without comments

I have also added this post to this Wiki, in case you want to expound on it and guide those who follow – the copy here just helps ensure the data doesn’t get spammed out that easily:

Parent document copied with permission from the original white paper at the GIS Technical Center. The objective was to add notes reflecting procedural changes brought about by the integration of CITRIX WISE Tools. The initial notes were created during a 2005 DFIRM Production.


In August 2003, the GIS Technical Center (WVGISTC) became a Cooperating Technical Partner with the Federal Emergency Management Agency. Our mission is to create digital flood themes from paper Flood Insurance Rate Map (FIRM) and Floodway Boundary and Floodway Map (FBFM) panels and to deliver the data in specified formats with appropriate documentation. FEMA prepares Mapping Activity Statements (MAS) that outline the scope of work and deliverables for each county-based project. Final products are primarily seamless, countywide geospatial data files in the ESRI shapefile format, along with associated metadata.

According to FEMA (Michael Craghan, pers. comm.), the final vector products will have the following qualities:

1. A seamless county-wide dataset, with no gaps or overlaps
2. The lines and polygons end up in their real-world locations
3. There is no scale distortion (i.e. spatial relationships are maintained; if paper map is 1”=500’, digital version should be too).

The FIRM/FBFM features collected by WVGISTC are:

1. Base Flood Elevations (BFE-lines)
2. Cross Sections (Xsection-lines)
3. Flood Hazard Areas (polygons in final format)

The current Mapping Activity Statement for conversion of Jefferson and Berkeley counties specifies these deliverables:

1. Written certification that the digital base data meet the minimum standards and specifications.
2. DFIRM database and mapping files, prepared in accordance with the requirements in Guidelines and Specifications for Flood Hazard Mapping Partners (see references for citation); (S_ Base_Index, S_Fld_Haz_Ar, S_BFE, S_XS, S_FIRM_Pan).
3. Metadata files describing the DFIRM data, including all required information shown in Guidelines and Specifications for Flood Hazard Mapping Partners.
4. Printed work maps showing the 1- and 0.2-percent-annual-chance floodplain boundary delineations, regulatory floodway boundary delineations, cross sections, and BFEs at a scale of 1:100,000 or larger.
5. A Summary Report that describes and provides the results of all automated or manual QA/QC review steps taken during the preparation of the DFIRM.
6. An ESRI shape file showing points where mapping problems are discovered during the digitizing process.

The following sections describe the procedures we follow to (1) prepare the base material for digitizing, (2) digitize features, (3) perform quality control, and (4) prepare final files using ESRI Arcmap 8.x software. This document assumes the user is skilled with ESRI Arcmap 8.x GIS software and has the ability to use reference materials. For help using ESRI Arcmap consult the help files or ESRI on-line support.


Source Material (Source Material Inspection)
In the MAS cost estimation phase it is advantageous to become familiar with the FIRM and FBFM panels that cover the geographic extent of the county. In the back of our FEMA binder, there are 3 CDs with scanned panels for 10 high-priority counties. The scanned or paper FIRM and FBFM panels should be visually inspected to check for insets and other format issues that may impact the amount of time it takes to digitize and attribute. At the on-line FEMA Flood Map Store, search for FEMA-issued flood maps. Follow the prompts for state, county, and community. This is one way to become familiar with the number of panels in a county and also to gather information on the effective date. The effective date on-line may be compared to the effective date on the paper panels to determine if we have the newest source. This is important because FEMA may have done some digital conversion in the counties we are digitizing; in Berkeley County, for instance, 2 of the panels were available in a digital CAD format. We received the CAD files (DLG) and copied the line vectors into our Arcmap project.

Base Layer Compilation
As part of the MAS, a ‘base map’ is obtained for georeferencing the FIRM and FBFM panels in a county. The MAS states: “the base map is to be the USGS digital orthophoto 3.75-minute quarter-quadrangles (DOQQs), or other digital orthophotography that meets FEMA standards.” Currently, we use the DOQQs to georeference the panels; when it becomes available, we will use the Statewide Addressing and Mapping photography. Countywide mosaics of the DOQQs are available either from CDs in our office or from the NRCS geospatial data gateway. Before beginning panel georeferencing, gather all the base map photography to cover the geographic extent of the county. Check DOQQ tiles and the ortho mosaic, if used, for agreement with each other. Also check the individual DOQQ tiles against the quarter quadrangle index to make sure that they are NAD83 and not NAD27. Finally, check to make sure that the spatial properties (coordinate system and projection) are defined for each quarter quad.

FEMA provides scanned (TIFF) images of the paper FIRMs and FBFMs. Not all counties have separate floodway panels (FBFMs).

You can download county FIRMs and FBFMs from the FEMA Map Store. For Summers and Fayette Counties WV, aerial photographs from the SAMB were reprojected on-the-fly and used as base.

“ArcMap will not project data on-the-fly if the coordinate system for the dataset has not been defined. The coordinate system for any dataset can be defined using ArcCatalog” [ESRI Help].

It is advisable to load the aerials, FIRMs and FBFMs in different Raster Catalogs for quicker refreshes. It is best to start off by georeferencing the index and then nailing each semi-transparent panel in its approximate location through corner points [“spreading in all the right directions”]. Again, it is best to concentrate around your area of interest, in this case the floodplain. It is also advisable to adjust the visible scale for the aerials for easier navigation.

Also, try to keep the clipboard empty, since on aging systems it may cause incomplete raster refreshes. To avoid related spikes in CPU usage, you may adjust the display settings, page-file size and Task Manager priorities accordingly. Also, if you have upgraded to ArcGIS 9.1 without the patch and are having raster display problems, consult the following ESRI thread.

“In general, if your raster dataset needs to be stretched, scaled, and rotated, use a first-order transformation. If, however, the raster dataset must be bent or curved, use a second- or third-order transformation. Add enough links for the transformation order. You need a minimum of 3 links for a first-order transformation, 6 links for a second-order, and 10 links for a third-order” [ESRI Help].

Priority should be given to georeferencing individual panels over interlocking adjacent panels. Once satisfied with the adjustments and associated RMS Error, you may either update if using first-order transformation, or rectify if using higher-order transformation.

Note that first-order transformations update the *.TFW files; higher-order transformations also update the *.AUX files.
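The minimum-link counts quoted from the ESRI help (3, 6 and 10) follow directly from the number of coefficients in a 2-D polynomial of each order. A minimal sketch (the function name is illustrative, not part of any ArcGIS API):

```python
def min_links(order):
    """Minimum control-point links required for a 2-D polynomial
    transformation of the given order: the polynomial has
    (order + 1) * (order + 2) / 2 coefficient pairs, and each
    link constrains one pair."""
    if order < 1:
        raise ValueError("transformation order must be >= 1")
    return (order + 1) * (order + 2) // 2

# First-, second- and third-order transformations need
# 3, 6 and 10 links respectively, matching the ESRI help.
```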

Once the groundwork is done, it takes less than 1/2 an hour per panel on a machine with the following specifications:

MS Windows 2000 SP4, Dell PWS 340, Pentium 4 CPU @ 1700 MHz, 1.05 GB RAM

The steps taken to georeference the scanned FIRMs/FBFMs using Arcmap are:

1. Start an Arcmap project in the desired coordinate system. When using West Virginia DOQQs that will primarily be UTM 83 zone 17 (although Jefferson County was zone 18).
2. Add the DOQQs for the area of interest to the project.
3. Add the scanned TIFF to the project. The first panel to be georeferenced is the most difficult, because locating the correct spot on the base map photographs using the landmarks on the panel can be frustrating without a good reference system. One way to do this is to warp the panel index first—hence giving a rough estimate of panel location on the photographs. Alternatively, after warping one panel, work with adjacent panels to make landmark location easier.
4. Use “fit to display” on the georeferencing toolbar pull-down menu to move the TIFF to the current extent.
5. Use the georeferencing toolbar to create control points on the DOQQs and the scanned TIFF, using roads and other major features appearing on the FIRM.
6. It is recommended that “Auto Adjust” be checked on the georeferencing dropdown and that the layer being georeferenced is partially transparent. As control point links are added the scanned TIFF will be shifted over the DOQQs, making finding and adding additional links easier.
7. As you are adding control points, check the residual values and total RMS value in the link table. The goal is for a total RMS value of 10 or less (units are mapping units, meters). After adding as many control points as possible it is sometimes useful to remove links that have very high residual values to improve the overall RMS value of the warp. Sometimes it is not possible to get an RMS below 10.
8. Concentrate control points around areas with flood features to improve the fit of areas that will be digitized. We recommend adding at least 10 sets of control points, although in some cases we used over 20 sets to improve fit.
9. Record the total RMS value of the transformation for each panel in a spreadsheet for the county.
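The total RMS value checked in step 7 is the square root of the mean squared residual over all links, which is how the link table summarizes fit. A hypothetical pure-Python helper (not an ArcMap API; names are illustrative):

```python
import math

def total_rms(residuals):
    """Total RMS error for a set of control-point links, given each
    link's residual in mapping units (meters here): the square root
    of the mean of the squared residuals."""
    if not residuals:
        raise ValueError("need at least one link")
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

def worst_links(residuals, threshold=10.0):
    """Indices of links whose residual exceeds the threshold;
    candidates for removal when the total RMS stays above 10."""
    return [i for i, r in enumerate(residuals) if r > threshold]
```

For example, two links with residuals of 3 and 4 meters give a total RMS of about 3.54, comfortably under the 10-meter goal.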

Vertical Datum Conversion [optional]
The estimate of the basic shape of the earth was inconsistent under the National Geodetic Vertical Datum [NGVD] 1929. This resulted in less accurate vertical computations. Hence, it was decided to shift to the North American Vertical Datum [NAVD] 1988, which uses more reliable means for this estimation. A Vertical Datum is required for DFIRM panels and the D_V_Datum table. Note that Vertical Datum conversion will not result in any change in flood depths.

Begin with 7.5-minute USGS Quadrangles. For Summers and Fayette Counties, WV, this data was downloaded from the WV GIS Technical Center. Next, buffer your county by 2.5 miles to select all the quad corners that fall inside the buffer. Then reproject the corner points thus selected to GCS_North_American_1983 and add XY coordinates. Now you have all the latitude/longitude coordinates required for orthometric height-difference computations using the National Geodetic Survey’s VERTCON software. Alternatively, you may use the Corps of Engineers’ CORPSCON software.

In VERTCON, if you have generated an input data file for your latitude/longitude coordinates, you would typically select the ‘Free Format Type 2’ option. Else, you would simply enter individual Station Names and associated latitude/longitude coordinates. VERTCON generates an output data file for use in the following calculations [Sample Worksheet].

Once Conversion Factors for all points have been determined, calculate the Average, Range and Maximum Offset for the Conversion Factors. If the Average is less than 0.1 foot, only a “passive” Vertical Datum conversion may be applied. Typically, when the Maximum Offset is <= 0.25 feet, a single Conversion Factor can be applied. Else, stream-by-stream Conversion Factors need to be applied.
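The decision rule above can be sketched as a small function. This is a hedged illustration only: the function name is invented, the 0.1-foot and 0.25-foot thresholds are taken from the text, and the exact definition of “Maximum Offset” should be checked against the sample worksheet.

```python
def datum_strategy(factors):
    """Pick a NAVD 88 conversion approach from per-point VERTCON
    conversion factors (in feet), following the rules in the text:
    average under 0.1 ft -> passive conversion only; otherwise a
    maximum offset of 0.25 ft or less -> one county-wide factor;
    otherwise -> stream-by-stream factors."""
    avg = sum(factors) / len(factors)
    max_offset = max(abs(f) for f in factors)  # assumed definition
    if abs(avg) < 0.1:
        return "passive conversion"
    if max_offset <= 0.25:
        return "single county-wide factor"
    return "stream-by-stream factors"
```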


dd = 37.87511679110 degrees ~ 37 degrees
mm = .87511679110 * 60 = 52.50700746600 ~ 52 minutes
ss = .50700746600 * 60 = 30.42044796 ~ 30 seconds
==> 37 degrees 52 minutes 30 seconds
2. Appendix B: Guidance for Converting to the North American Vertical Datum of 1988
3. FIA-20 June 1992, Ada County OH
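The decimal-degree to degrees-minutes-seconds arithmetic in the worked example generalizes to a short function (a sketch for positive coordinates only; the name is illustrative):

```python
def dd_to_dms(dd):
    """Convert positive decimal degrees to (degrees, minutes,
    seconds), following the worked example above: take the whole
    degrees, multiply the remainder by 60 for minutes, and the
    remainder of that by 60 again for seconds."""
    degrees = int(dd)
    minutes_float = (dd - degrees) * 60
    minutes = int(minutes_float)
    seconds = (minutes_float - minutes) * 60
    return degrees, minutes, seconds

# dd_to_dms(37.87511679110) -> 37 degrees, 52 minutes, ~30.42 seconds
```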

Digitizing and Attributing Flood Features (Arcmap Project and File Specifications)
The UTM NAD83 projection, zone 17 is used for all West Virginia countywide flood mapping projects, with the exception of Jefferson County, which is zone 18. All features are initially collected as lines, although special flood hazard areas (e.g., Zone A, AE) are later converted to polygons. All features are drawn in one line shapefile and are later separated into the separate files required to meet MAS deliverables. For the purposes of drawing the flood feature lines we are using a line shapefile with the following attribute fields: Type (text, 10), Letter (text 2), Elev (long integer, precision 5). A description of the values we use in those fields is given below with each different feature type. In the first round of digitizing the shapefile was named All_Lines.shp, although in the future we may switch to using a county name in combination with employee name. Save edits frequently while digitizing, both by using the save edits button in Arcmap and by making backup copies of the file with Arccatalog.

Begin an edit session and set up the snapping environment. Having snapping turned on is important to allow snapping of BFEs to the edges of flood hazard areas and for snapping the flood zone line segments together. We generally use a snapping tolerance between 7 and 10 pixels; this is a personal drawing preference and may vary from person to person. Use the appropriate snapping mode for each type of feature, i.e. ‘vertex’ for closing zone boundaries, ‘end’ for snapping arc ends together and ‘edge’ for snapping BFE lines to zone boundaries. Note that having ‘vertex’ snapping on can make it more difficult to accurately place BFE endpoints. The goal is clean intersections and BFEs that are snapped to flood hazard area boundaries.

Feature Collection
We generally draw flood map features in this order: floodway, flood zone, BFE, and cross-sections. Some counties have floodway features on a separate map (FBFM) from the FIRM. When working with two maps, collect floodways and cross sections from the FBFM and collect flood hazard zones, BFEs, and streams and channels from the FIRM maps. When working with a FIRM and a FBFM for a panel, it is recommended that lines are drawn from the FBFM first and the FIRM second. Features are to be seamless across panel boundaries, meaning when the same feature type occurs on both sides of a panel boundary, it should be drawn with no interruption. Adjacent panels digitized by different people should have the endpoints of flood feature lines snapped together in the final line shapefile. Be sure to check panel edges carefully for small flood zone polygons.

Panel Index and Base Index
Collection and attribution of flood features will be discussed in detail below. In addition to the flood features, we also submit 2 polygon index shapefiles to FEMA for each county. One of the shapefiles is called S_FIRM_Pan and is an index of the FIRM panels for a county. It is created by digitizing the lines on the scanned and warped county FIRM index. Only unincorporated areas are included in the panel index, not the incorporated areas. Secondly, an index of the “base” data for a county is to be provided in a polygon shapefile called S_Base_Index. In our case, the base data is the DOQQs. The S_Base_Index shapefile can be generated by clipping out the appropriate quarter quads from the DOQQ index. As with all other shapefiles we submit, both the S_FIRM_Pan and S_Base_Index shapefiles have a required attribute table format, discussed later in this document.

Flood Feature Symbology and Attributes

The floodway is the channel of a river plus any adjacent floodplain areas. Floodways won’t be found on all panels. There are 2 different presentations of floodways on FEMA panels, which vary by county. In some counties, Berkeley for example, floodway symbology is included on the FIRM (Figure 1a). Other counties have separate floodway panels (FBFM, Figure 1b) and they must be added as a separate layer for floodway line collection.

In the initial drawing, lines defining the floodway are given the following attributes:

Type: floodway

Flood Hazard Areas
Flood hazard areas will also be referred to as ‘flood zones’ or ‘zones’ and they identify areas of different levels of flood risk. Flood zones are labeled on the FIRMs with letters; commonly used zone names are A, AE, B, C, X and they are shown on the paper maps with different densities of shading and text labels (Figure 2a). Zones are collected as lines, although later they will be converted to polygons. Digitizing proceeds from the inside out, i.e., collect the innermost zones first (In Figure 2a, the floodway would be collected first, and then AE, then X). Where an outer zone line flows into an interior zone line, they should be snapped (Figure 2c). Each line defining flood zones should be collected only ONCE. In areas where zone boundaries are coincident, only one line is collected (Figure 2c). There are zone division lines (Figure 2c and d, also referred to as gutter lines), which separate “special” flood hazard areas (generally zones A and AE). The zone division lines are thin white strips that are hard to see in the shaded zones. Gutter lines should be considered the border of those particular zones and treated as any zone boundary would be (i.e., collected once, continuous with other zone lines).

In the initial drawing, lines defining the flood hazard areas are given the following attributes:

Type: zone

Base Flood Elevations
Base Flood Elevation (BFE) is the height of the base (100-year) flood in relation to a specified datum. BFEs are symbolized on the FIRM panels with a wavy line (Figure 3a) but the feature is usually collected as a straight line (Figure 3b) that is snapped to the edge of the flood hazard area. If there is a significant bend in the BFE as drawn on the panel, then additional points may be added to follow the curve. Ends should always be snapped to the flood hazard area.

In the initial drawing, lines defining the BFEs are given the following attributes:

Type: bfe
Elev: numeric elevation value on FIRM (e.g., 405)

Cross Sections
Cross sections (Figure 4a) show the location of floodplain cross sections used for computing base flood elevations. Cross sections are normally collected as a straight line, crossing and exiting the flood hazard area (Figure 4b). It is not necessary to follow bends in the cross section line that occur outside of the flood hazard area, nor is it necessary to extend the line through the hexagons at the end of the line symbol. If there are bends in the cross section within the flood hazard area, place only as many vertices as needed to maintain shape. Cross section lines should not be snapped to the flood hazard area lines, and instead should extend beyond them.

In the initial drawing, lines defining the cross sections are given the following attributes:

Type: xsection
Letter: letter of cross section, found in hexagon symbol (e.g., z)

Channels and Streams
Channels and streams (Figure 5a and 5b) are collected in the flood hazard areas for QC purposes. No snapping is required and the stream or channel line should extend just beyond the flood hazard area when applicable. Streams are collected as single lines and both lines of a channel are collected.

In the initial drawing, lines defining the channels and streams are given the following attributes:

Type: channel or stream, as appropriate


Visual QC Of Linework
After all lines are digitized and in a countywide, seamless file, a visual check is done to ensure that all features have been collected. The “Type” field in the line shapefile can be used to categorically symbolize the different feature types for the visual QC. Different colors and line styles can be used to represent separate feature types and the legend symbols can be saved as a layer file to preserve the symbol assignments. Turn on the labels for BFEs (elevation) and xsections (letter) and select a font style and color that allows them to be easily seen and checked in the visual QC process. Each person will probably have a different method of doing a systematic visual inspection. Some suggestions: a grid could be used to scan the linework, drainages can be followed, or the check can be done panel by panel. The important thing is to scan at a level such that all of the panel raster features can be identified and vectors examined. The person doing the QC should have a full understanding of what features are supposed to be collected and the symbology variations (e.g., floodways on FIRMs vs FBFMs). Any missed features should be digitized. This is also a good time to make note of any unusual problems or non-conformities in the scanned panels (e.g., zone type changes at panel or corporate boundary). This is the time to check that features are seamless across panel boundaries; BFEs and cross sections in particular should be checked at panel boundaries because there is no further geometric processing with these lines that will reveal continuity errors.

Spatial Adjustments (otherwise known as “Adjusting To The Real World”)
Post-drawing manipulation of lines to improve “fit” is hard-to-quantify and subjective. As stated in the introduction, FEMA requires the digital data to have a reasonably good fit to the “real world”. The “real world” in our case is the DOQQs. The scanned panels do not warp perfectly and in some areas the digitized lines will not overlay real world features very well. Current adjustment procedures involve these steps:

1. Compile the following layers in Arcmap:
a. DOQQs
b. Line shapefile with county-wide seamless flood features
c. 1:24,000-scale NHD centerline data layer (route.rch, in catalog unit coverages)
d. Problem point file (discussed in the next section)
2. Determine a systematic method for visually scanning the data (similar to that used in the visual QC) and adjust “Type” symbology for easy differentiation.
3. Begin a visual check of the linework, this time concentrating on how well the streams and channels drawn from the flood panels line up with the DOQQ and the NHD data. It is strongly recommended that you do not use the FIRM panels at this point, as they will increase confusion.
4. NHD data are a fairly good guide to where the flood panel waterways “should” be; however they are not perfect. While visually scanning the linework, check that the streams and channels collected from FEMA panels line up fairly well with the NHD data, while also checking to see that NHD data appears to overlay the hydrologic feature on the DOQQ. There is never going to be a perfect fit; the panel streams will wander back and forth over the NHD vectors. What you are looking for are areas of consistent difference that extend for a noticeable distance (again, hard to quantify). In Figure 6a, the blue dashed panel stream channel lines are not aligned with the DOQQ stream channel edges.
5. When areas of consistent difference are found, ALL the linework surrounding the area is shifted at the same time, until the panel stream has a better fit to the real world stream. This is accomplished by first breaking all the continuous flood zone, floodway, and stream lines at about the same point on 2 imaginary lines that run perpendicular to the “flow,” one at each end of the area to be shifted. Then, the cut lines are selected, along with any BFEs or cross sections that are in the area (Figure 6b), and all the selected features are moved until the streams are better aligned (Figure 6c). The adjustment is accomplished mostly with the move tool in Arcmap, although on occasion the rotate tool may be used to improve the fit of the selected lines with the DOQQ.
6. Lastly, snap the dangling ends together and smooth out the curves of the reattached lines by moving or adding vertices (Figure 6d). This is the only time lines should be moved or stretched individually, as it distorts proportions.

Mapping Problem File
One of the required deliverables is a point file indicating areas where certain “problem” situations arise. At the same time as adjustments are being performed, the problem point file can be edited. FEMA defined mapping problems are outlined in the draft Technical Memo, dated October 3, 2003, a copy of which is found in the FEMA project notebook; they have also been listed below for convenience. A point shapefile is created for each county with the following fields: Error_type (text, 10) and Descrip (text, 75).

Error_type Descrip
BFE Base Flood Elevation problem
XSECT Cross-section problem
SFHA-PAN Special Flood Hazard Area changes at map panel edge
SFHA-BDY Special Flood Hazard Area changes at a political boundary
SFHA-STR Special Flood Hazard Area different on each side of a stream
SFHA-OTH Other Special Flood Hazard Area problems
STR-FW Stream outside of floodway
STR-SFHA Stream outside of Special Flood Hazard Area

As of this writing, we have primarily found the STR-SFHA, STR-FW, and SFHA-BDY types of errors. Note: errors should be determined AFTER lines are adjusted in a given area, as the adjustment may correct the problem. Place a point in the shapefile at the location where the problem occurs. In Figure 7 the pink point indicates a location where the stream (orange) is outside of the flood hazard area (blue line).
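A record for the problem point file can be built with a small helper that enforces the two field widths above. The `ERROR_TYPES` codes come straight from the FEMA table; the function itself is a hypothetical convenience, not part of the deliverable spec.

```python
# Codes and descriptions follow the FEMA technical memo table above
ERROR_TYPES = {
    "BFE": "Base Flood Elevation problem",
    "XSECT": "Cross-section problem",
    "SFHA-PAN": "Special Flood Hazard Area changes at map panel edge",
    "SFHA-BDY": "Special Flood Hazard Area changes at a political boundary",
    "SFHA-STR": "Special Flood Hazard Area different on each side of a stream",
    "SFHA-OTH": "Other Special Flood Hazard Area problems",
    "STR-FW": "Stream outside of floodway",
    "STR-SFHA": "Stream outside of Special Flood Hazard Area",
}

def problem_point(x, y, error_type, descrip=None):
    """Build one record for the problem shapefile, enforcing the
    Error_type (text, 10) and Descrip (text, 75) field widths."""
    if error_type not in ERROR_TYPES:
        raise ValueError(f"unknown Error_type: {error_type}")
    return {
        "x": x,
        "y": y,
        "Error_type": error_type[:10],
        "Descrip": (descrip or ERROR_TYPES[error_type])[:75],
    }
```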


The flood hazard zones and floodways must be converted to polygons for final processing. Select all lines with a “Type” of zone or floodway and export to a separate line shapefile. Topological checks will be performed on the line file before polygons are built. Topology work can only be done in Arcmap via the geodatabase model. Import the line shapefile into a geodatabase feature class that is under a feature dataset (must have a feature dataset to create a topology). If you are starting with a geodatabase / feature class, then use Export | Geodatabase to Geodatabase in Arccatalog to transfer the feature class into the dataset.

Add a new topology under the feature dataset. Set the cluster tolerance relatively high (0.1 was used in the first 2 MAS, which corresponds to 10 centimeters on the ground) to reduce the number of small pieces formed. Only the flood hazard zone lines feature class will participate in the topology. The topology rules used are: must not have pseudos, must not have dangles, and must not self-overlap. After creating the topology for the lines, validate it. Bring the validated topology layer into an Arcmap project to view the errors found. Use the topology tools to analyze and correct all errors before proceeding. See the Topology section in the ArcGIS book “Building a Geodatabase” for help.
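Two of the three line-topology rules (no pseudos, no dangles) can be approximated outside Arcmap with a quick endpoint-degree check — a rough sketch for intuition, not a substitute for the geodatabase validator:

```python
from collections import Counter

def node_errors(lines, ndigits=1):
    """Classify line endpoints by how many lines meet there: degree 1 is a
    dangle, degree 2 is (usually) a pseudo node -- two lines that should be
    one. Coordinates are rounded to mimic a cluster tolerance (0.1 map
    units at ndigits=1). Closed loops and true junctions need extra care."""
    degree = Counter()
    for line in lines:
        for end in (line[0], line[-1]):
            degree[(round(end[0], ndigits), round(end[1], ndigits))] += 1
    dangles = [n for n, d in degree.items() if d == 1]
    pseudos = [n for n, d in degree.items() if d == 2]
    return dangles, pseudos
```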

After validating the topology and fixing all topological errors, convert the lines feature class to a polygon feature class. To do this, right click on the feature dataset in Arccatalog and select ‘new’ and then ‘polygon feature class from lines’. A wizard helps with the conversion; accept the default tolerance.

Once the polygon layer is created, create a new topology for it. Use the default cluster tolerance, which is very small. Only the polygon feature class participates in the topology, and the rules are: must not overlap and must not have gaps. Bring the validated polygon topology into Arcmap as with the line topology. Ideally, there will be no errors in this topology. After checking for and fixing topological errors, another check should be done for sliver polygons. This can be done by viewing the polygon attribute table in Arcmap and sorting the table based on the shape_area attribute field in ascending order. Examine the smallest polygons to be sure they are not slivers.
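The sliver check — sort by shape_area ascending, eyeball the smallest — can be prototyped with the shoelace formula; a sketch, not the Arcmap workflow itself:

```python
def polygon_area(ring):
    """Shoelace formula for a simple ring [(x, y), ...]; the closing
    point need not be repeated."""
    n = len(ring)
    s = 0.0
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def smallest_polygons(rings, count=5):
    """Indices of the smallest polygons, i.e. the shape_area-ascending
    sort done in the attribute table, returned for sliver inspection."""
    return sorted(range(len(rings)), key=lambda i: polygon_area(rings[i]))[:count]
```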

Next, the polygon flood hazard features need to be attributed. This can be done in the geodatabase, setting up a domain so that attributes can be chosen from a drop-down list. Overlay the flood hazard polygon layer with the FIRM/FBFM panels and attribute the polygons. It saves time if the shapefile you are using to add attributes has the same column structure as the required final product (see Table 2). In the future we hope to have template files available for use, so that the required structure will already be in place. We have tried merging with a template file in the geodatabase, but that resulted in features shifting. This process is still being developed.


For the final deliverables, the flood features collected in the line shapefile must be processed into separate shapefiles with specified fields. Table 1 gives an overview of the shapefile names and contents. Attribute fields have required field types (e.g., text, number) and sizes; details can be found on the pages of Guidelines & Specifications for Flood Hazard Mapping Partners Appendix L referred to in Table 1. These pages from Appendix L have been printed out and are in the guidelines/technical section of the FEMA project binder. Table 2 provides details on the required fields.

Table 1. Deliverable shapefile description
Shapefile Name Contents Pages in Appendix L
S_Base_Index Grid of base data, in our case, DOQQs. Polygons. L-270 to L-271
S_FIRM_Pan Grid of FEMA panels; digitized from county panel index. Polygons. L-286 to L-290
S_Fld_Haz_Ar Flood hazard zone polygons L-291 to L-293
S_BFE Base flood elevation lines collected from FEMA panel L-272 to L-273
S_XS Cross section lines collected from FEMA panel L-350 to L-354

Table 2. Shapefile attribute field requirements
Shapefile Field Name What Goes In It
S_Fld_Haz_Ar (polygon) FLD_AR_ID A unique feature number. Can be copied from FID field. [Text, 11]
FLD_ZONE Flood zone from FIRM. Use values in FLD_ZONE field of Table D_Zone on pg L-452 of Appendix L. [Text 55]
FLOODWAY “FLOODWAY” if polygon is a floodway. Null if not. [Text, 30]
SFHA_TF “T” if any zone beginning with A. “F” for any other zone. True or false. [Text, 1]
SOURCE_CIT 11-digit FIRM panel number that majority of feature is on. If polygon crosses many panels, use downstream panel. [Text, 11]

S_XS (line) XS_LN_ID A unique feature number. Can be copied from FID field. [Text, 11]
XS_LTR Upper case letter(s) of cross-section from FIRM. [Text, 12]
XS_LN_TYP “LETTERED” in all cases. [Text, 20]
WTR_NM Name of water feature (stream) cross section is on. From FIRM or FIS. [Text, 100]
SOURCE_CIT 11-digit FIRM panel number cross section is on. If on two, list panel with majority. [Text, 11]

S_BFE (line) BFE_LN_ID A unique feature number. Can be copied from FID field. [Text, 11]
ELEV Numeric elevation of BFE, from FIRM [Double, Prec. 13, Scale 2]
LEN_UNIT “FEET” in all cases. [Text, 20]
V_DATUM Vertical datum of panel. Listed on panel, and values must come from the V_DATUM field of the D_V_Datum table on page L-444 of Appendix L. [Text, 6]
SOURCE_CIT 11-digit FIRM panel number BFE is on. If on two, list panel with majority. [Text, 11]

S_Base_Index (polygon) BASE_ID A unique feature number. Can be copied from FID field. [Text, 11]
FILENAME Name of DOQQ or other image file used as base map. [Text, 50]
BASE_DATE Date image was captured. For DOQQs can be found in header file. [Date]
SOURCE_CIT BASE1 or other abbreviation that corresponds to metadata [Text, 11]

S_FIRM_Pan (polygon) FIRM_ID A unique feature number. Can be copied from FID field. [Text, 11]
FIRM_PAN FIRM panel number. [Text, 11]
EFF_DATE Effective date on FIRM panel. [Date]
SCALE Scale of FIRM panel. If map scale on FIRM is 1” = 500’, then scale is 6000. Multiply feet by 12 to get true scale. [Text, 5]
SOURCE_CIT 11-digit FIRM panel number. [Text, 11]
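Two attributes in Table 2 are derivable rather than transcribed, and lend themselves to tiny helpers. This is a sketch assuming the rules exactly as stated above (SFHA_TF: “T” for any zone beginning with A; SCALE: feet per map inch times 12):

```python
def sfha_tf(fld_zone):
    """'T' if the zone begins with A (Special Flood Hazard Area per the
    Table 2 rule), else 'F'."""
    return "T" if fld_zone.strip().upper().startswith("A") else "F"

def panel_scale(inches_on_map, feet_on_ground):
    """SCALE field from the FIRM bar scale: 1 inch = 500 feet gives
    500 * 12 = 6000, matching the Table 2 example."""
    return str(int(feet_on_ground * 12 / inches_on_map))
```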
BFE Shapefile Creation

From the line shapefile that was used for digitizing, use the Type field to select and export BFEs to a separate shapefile. Either modify the resulting shapefile to match the required format, or, merge the digitized lines with a pre-formatted template. The output merge file can be a geodatabase feature class, which allows for the use of an attribute domain drop-down for the SOURCE_CIT field. Use Arcmap editing tools to assign attributes to the fields shown in the preceding table. BFE lines are submitted in the S_BFE shapefile.

Cross-section Shapefile Creation
From the line shapefile that was used for digitizing, use the Type field to select and export cross-sections to a separate shapefile. Either modify the resulting shapefile to match the required format, or, merge the digitized lines with a pre-formatted template. The output, as with BFE, can be a geodatabase feature class. Attribute domains can be created for the XS_LTR, XS_LN_TYP, WTR_NM (a list of stream names is available in the county FIS book) and SOURCE_CIT fields. Cross-section lines are submitted in the S_Xs shapefile.

One of the required deliverables relating to the base map (DOQQs in our case) is a “written certification that the digital data meet the minimum standards and specifications.” A text file with the following statement was created:

“This text file serves as written certification that the base map digital data meet the minimum standards and specifications in Guidelines and Specifications for Flood Hazard Mapping Partners Appendix K. On page K-42 (Section K.4.1.1) of that document it is written “The most common form of raster image map is the digital orthophoto, especially the standard Digital Orthophoto Quadrangle (DOQ) produced by the U.S. Geological Survey.” DOQQ’s were used as the base map for georeferencing scanned paper FIRMs and for visually locating features of interest.”


Refer to the DOQQ Metadata and the Digital Orthophoto Standards. Appendix L is the primary document of interest.

Refer to the NHD website.

Retrieve the 1:24,000 NHD coverages to use as reference.

FEMA flood documents in the black FEMA 3 ring binder

Arcmap editing and geodatabase manuals.

Mapping Activity Statement documents – be sure to understand all deliverables.




Summary Report
QA/QC Review Steps During Digital Conversion of Flood Insurance Rate Maps
Mapping Activity Statement 2003-02, West Virginia GIS Technical Center
Prepared 4/15/04

The following QA/QC checks were performed during the digital conversion of Flood Insurance Rate Maps by the West Virginia GIS Technical Center (WVGISTC):

1) Source Material Inspection
a) Visually reviewed scanned panels received in .tif format; compared with printed paper maps to check for completeness

2) Base Layer Compilation/Verification
a) Used a vector quarter quad index certified by WVGISTC to confirm that the USGS Digital Ortho Quarter Quads (DOQQs) were in the UTM NAD83 projection; DOQQs were used for the georegistration base map
b) Checked the spatial integrity of a county-wide ortho mosaic (used as a reference; obtained from the NRCS Geospatial Data Gateway)

3) Georegistration of Scanned Panel Source Material
a) Ensured data were correctly referenced to the UTM coordinate system
i) Set Arcmap software data frame projection to UTM NAD83, Zone 17 or 18, as appropriate
ii) Georeferenced scanned panels to real-world coordinates using DOQQs to establish reference links
(1) The mean RMS value for warped panels was 5.63 meters (mapping units). This was the best attainable georeferencing that could be accomplished without stretching features and impacting length relationships
iii) Re-warped portions of scanned panels in areas of poor fit to attain a better visual real-world correlation
b) Checked that the scale of warped raster (.tif) and original paper maps were compatible
i) Plotted georeferenced FIRMs at the same scale as paper maps; conducted manual ruler measurements on the paper map in comparison to plotted data to confirm accuracy of feature location and length relationships

4) Digitizing of Flood Features
a) Digitized SFHA, BFE, and cross section features from the georeferenced panels as line feature types
i) SFHAs and Floodways were digitized first; BFEs and Xsections were digitized next and BFEs were snapped to AE zone boundaries (Arcmap snapping tolerance set to 10 pixels)
ii) Streams and channel banks were partially digitized as additional reference features
b) Systematically visually scanned collected vectors and compared them with underlying georeferenced paper flood maps
i) Checked that character of features was maintained
ii) Checked that required features were collected
c) Edgematched features on adjacent panels
i) Checked that features were snapped seamlessly at panel boundaries

5) Spatial Adjustments
a) National Hydrography Dataset (NHD) vector stream centerlines were used to assist in identifying real-world (DOQQ) stream position
b) Proportional piecewise adjustments
i) Adjusted all features (SFHAs, BFEs, cross sections) in small sections of the floodplain when:
(1) the DOQQ stream was not located within the SFHA or
(2) there was a visibly constant difference between location of the DOQQ stream and location of the digitized stream
ii) Attempted to bring the digitized FIRM stream in line with the NHD stream or the stream on the ground, if it was visible on the DOQQ
iii) Used Arcmap editing functions such as line moving and rotating
c) Created a point shapefile to mark location of “mapping problems” as defined in the FEMA technical memo dated October 3, 2003. Examples of problems found:
i) Stream outside of SFHA
ii) Stream outside of floodway
iii) SFHA changes at political boundary

6) Topology
a) Used the ArcGIS geodatabase model and topology rules on SFHA and floodway line features
i) Corrected pseudo-nodes, dangles, and self-overlapping lines
b) Generated polygons from SFHA and floodway line features and used the ArcGIS geodatabase model and topology rules for polygons
i) Confirmed there were no polygon overlaps or gaps
ii) Removed sliver polygons

7) Feature Attribution
a) Reviewed technical memo and MAS to format the 5 required shapefiles (S_Base_Index, S_FIRM_Pan, S_Fld_Haz_ar, S_BFE, S_Xs)
i) Checked that file names, attribute names, types and sizes meet specs
b) Checked that correct attributes were assigned to digitized flood features
i) Completed a systematic visual scan of vector flood features overlaid with georeferenced panels; used symbology variation and labeling to confirm proper attributes had been applied
ii) Checked that valid domain values were used in attribute table columns

8) Map plot for final visual inspection and scale check


File Backup
Everything pertaining to the current flood mapping project should be backed up to Vesta. This includes warped panels, line shapefiles, and other reference documents.

A FEMA backup folder is set up at this location:


It is visible from the TechCenter network under Vesta and is shared openly. This is where all the files for a MAS in progress should be stored. Use sensible file and folder names to help everyone identify the pieces of the project.

A final backup of everything was kept in this location:


It is recommended that drawing shapefiles be backed up every time they are changed; a file versioning system may be preferable to overwriting the same file each time.

Naming Conventions/Path Structure
FEMA has requested that we name the metadata files in this format:


So, for example, the metadata files submitted for Jefferson County were named:


On the CD containing the final deliverable files, this is the requested structure:


The county name behind the first backslash will change for each countywide project completed and submitted. The Arcshape folder contains the S_Base_Index, S_FIRM_Pan, S_Xs, S_BFE, and S_FLD_Haz_Ar shapefiles, plus the problem shapefile. The Ortho_photos subdirectory contains the DOQQs or other imagery used for the base map. The document subfolder contains the metadata, QA/QC report, and base map certification. I made subfolders for each of those items under the document folder. The RFIRM folder contains all the georeferenced panels.

* Digital conversion of Flood Insurance Rate Maps (FIRMs)
* WIKI: Edit Lock Schema

Written by Harsh

July 7th, 2005 at 10:03 am

WIKI: Edit Lock Schema

with 3 comments

Now that I update the DFIRM WIKI more frequently, I added a lock this past weekend to prevent simultaneous editing. And after being hit by automated comment spam, basic verification was also added, while still allowing relatively hassle-free editing.
At some point, I may submit these improvements back to TipiWiki.


Written by Harsh

June 17th, 2005 at 7:34 pm

Posted in Programming,Web

Tagged with , , ,

A Rose by Any Other Name

with 4 comments

The definition of GIS has evolved from ‘Geographic Information System’ to ‘Geospatial Information System’. It is now time for it to take the next logical step to ‘Spatial Information System’. My earlier post wrestled, well, not quite, for a truer understanding of “GIS” given the advent of non-traditional spatial software. Since then I have been convinced that spatial information is better understood by snapping links that tie, and thus confine, it to geography.

Inside Space- An Unventured “GIS” Frontier? Magnetic Resonance Image [MRI] of my right-wrist

It is therefore disappointing that some professionals continue to look at spatial information from behind the narrow screens of geography. Hopefully, with the entry of non-traditional market forces, this viewpoint will be shaken to the point of abandonment. A truer appreciation of spatial information will require a visual mindset where all spatial components of information are addressed.


• Front, Side and Top View: Construct two valid isometric projections

• Find the missing piece

Written by Harsh

June 10th, 2005 at 7:04 pm

Declaration of Interdependence

without comments

“As we become aware of the ethical implications of design, not only with respect to buildings, but in every aspect of human endeavour, they reflect changes in the historical concept of who or what has rights. When you study the history of rights, you begin with the Magna Carta which was about the rights of white, English, noble males. With the Declaration of Independence, rights were expanded to all landowning white males. Nearly a century later, we moved to the emancipation of slaves and during the beginnings of this century, to suffrage, giving the right to women to vote. Then the pace picks up with the Civil Rights Act in 1964, and then in 1973, the Endangered Species Act. For the first time, the right of other species and organisms to exist was recognised. We have essentially “declared” that Homo Sapiens are part of the web of life. Thus, if Thomas Jefferson were with us today, he would be calling for a Declaration of Interdependence which recognises this. This Declaration of Interdependence comes hard on the heels of realising that the world has become vastly complex, both in its workings and in our ability to perceive and comprehend those complexities. In this complicated world, prior modes of domination have essentially lost their ability to maintain control. The sovereign, whether in the form of a king or nation, no longer seems to reign”.

William McDonough [WIKI]; Architect, William McDonough Architects; Centennial Sermon On the 100th Anniversary of the Cathedral of St. John the Divine, New York City

• 2004 National Design Awards: Environment Design Finalists
• Virginia Association for Mapping and Land Information Systems 2001 Annual Scholarship: Growth Study for Charlottesville VA for 2000-2030- Analysis and Possible Energy-Conscious Applications

Written by Harsh

June 5th, 2005 at 6:33 pm

Posted in Planning,Social

Tagged with , ,

Follow Up [2]: Map Viewer and Google

with one comment

Written by Harsh

May 27th, 2005 at 6:40 pm

Posted in GIS,Mashup

Tagged with ,

Follow Up [1]: Wireless Application Protocol

without comments

Written by Harsh

May 19th, 2005 at 6:54 pm

Posted in LBS,Technology

Tagged with , , ,

Half-life of a Website

without comments

The primary objective of this blog is to mull over industry trends and abstract ideas relevant to the profession, not to regurgitate “operational details”. However, this post may bend that rule.

For those not in the know, a webpage does a lot of behind-the-scenes work before it spits out text on the screen. Here’s a summary of what a page on this site does:

• The very first thing it does is send out a header depending on the client-browser. This is recommended when, say, different protocols are used to access the webpage. Note that this step gets initiated only after the Apache Webserver has finished running through its configuration directives. The webpage then marks the start-time for script download and execution. Measuring script download and execution time helps in diagnostics. The webpage also goes down a list of red-flags checking for browser compatibility and permission-settings. Later, it establishes connections with MySQL databases and fetches or defines client and script variables.

• Only then does the layout begin to emerge with some CSS, XHTML and plenty of include files. Care is taken to separate presentation, which has been kept to a minimum given the volunteer nature of the website, from content and function, and to make it easier to reuse data. To display news feeds, as is the case here, the webpage fetches the feed URL and slices its content into nodes. Sometimes feed URLs do not provide information as desired. For example, this feed URL does not provide a direct hyperlink to its article. Sometimes a feed URL includes an image-path in its description that needs to be dropped. For such cases, scripting languages like PHP offer a wide array of string-manipulation functions. It is advisable to ensure that the webpage continues to get parsed in a timely manner even if the fetching fails.

• The webpage then wraps up logging of relevant variables and closes open database connections. If script execution has generated any errors, a summary gets emailed to the administrator. The webpage then spits out the footer. Its decay into dead text is finally complete […well, unless you use AJAX to monitor client-behavior, as is the case here].

• A quick note on the website maintenance: Given its volunteer nature, it is maintained in small nudges i.e. “minor increments made frequently”, with the emphasis being on function over form.
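The feed handling described above — fetch, slice into nodes, drop embedded image paths — can be sketched as follows. The site used PHP; this is a Python stand-in with a made-up sample feed (example.org is a placeholder, not the real feed URL):

```python
import re
import xml.etree.ElementTree as ET

SAMPLE_RSS = """<rss><channel>
<item><title>Quake update</title>
<description>&lt;img src='map.gif'/&gt;Relief maps posted.</description>
<link>http://example.org/a</link></item>
</channel></rss>"""

def parse_feed(xml_text):
    """Slice a fetched feed into item nodes and drop any image tag
    embedded in the description, as discussed above."""
    items = []
    for item in ET.fromstring(xml_text).iter("item"):
        desc = item.findtext("description", default="")
        desc = re.sub(r"<img[^>]*>", "", desc).strip()  # drop embedded images
        items.append({"title": item.findtext("title", default=""),
                      "link": item.findtext("link", default=""),
                      "description": desc})
    return items
```

A real deployment would wrap the fetch in a timeout and fall back gracefully, so the page still parses in a timely manner when the feed is down.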

• World Wide Web Consortium
• Web Style Guide
• Interesting Website
  º http://www.nyas.org/
  º http://news.google.com/
  º http://www.cancer.gov/
  º http://www.nobodyhere.com/
• Website Theme: A lot of experiences came together to start and shape the evolving theme of this website- During the 2002 Colorado/Arizona wildfire disaster, I received an email from the FGDC list serve requesting volunteers for assistance; Then at the 2003 ESRI Annual Conference, I learnt how volunteering is not easy- how the volunteer is not always in control; The omnipresence of mature opensource software not getting enough attention from the general public was a cause for concern; Also, a need was felt to enhance the functionality of my cellphone by connecting it with custom online applications; Additionally, there was a personal need to digest vast amounts of professional information from anywhere.


Written by Harsh

May 8th, 2005 at 10:55 pm

Posted in Programming,Web

Tagged with ,

Follow Up [1]: Graphic Software

with one comment

It is good to know that some professionals concur with the views expressed in my earlier post on the potential for graphic software, like Macromedia Flash. One comment links to an impressive demonstration of this largely untapped potential.

Anyway, two companies whose product GUI I enjoy interfacing with- Adobe and Macromedia, announced their merger earlier this month.

Both their flagship products have become industry-standards in exchanging documents and creating experience-rich applications across platforms. The largely unused spatial potential within Macromedia Flash combined with the increasingly widespread use of Adobe PDF/SVG maps and the sprouting of some exciting derivatives like geoPDF, pstoedit and GSview, make this merger important to how spatial information is exchanged in the near future.

Written by Harsh

April 28th, 2005 at 6:01 pm

Posted in GIS,Mashup

Tagged with , , ,

Follow Up [1]: Map Viewer and Google

with one comment

A quick note on the happenings at Google: Yesterday, Google added satellite imagery to its mapping. For speedy displays, 256px*256px JPEG image-tiles rendered at different zoom-levels and each weighing around 30 KB, coupled with some nifty AJAX, come in handy.

Such a drag-and-drool tiling paradigm, although practised for some time now by website developers to load large images, when applied to internet mapping represents a refreshing out-of-the-box approach. The GET HTTP request method uses a cryptic naming convention to fetch these image-tiles from a preexisting palette, like so:


WHERE in one instance, TILE zooms closer from [tqtsqr] to [tqtsqrtssssrq] and still closer to [tqtsqrtssssrtrttr].
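Those names read like a quadtree path: ‘t’ is the whole-earth root tile, and each further letter selects one quadrant of the parent, so one extra character means one more zoom level. A sketch of such a scheme follows; the letter-to-quadrant mapping (q = NW, r = NE, s = SE, t = SW) is an assumption inferred from the names above, not anything Google documents.

```python
def quadtree_key(x, y, zoom):
    """Build a Keyhole-style tile name from tile column x, row y (row 0 at
    the top), and zoom level. Each letter narrows to one quadrant of the
    parent tile; the quadrant letters are assumed, per the lead-in."""
    key = "t"
    for z in range(zoom - 1, -1, -1):
        east = (x >> z) & 1   # right half of the parent tile?
        south = (y >> z) & 1  # bottom half of the parent tile?
        key += {(0, 0): "q", (1, 0): "r", (1, 1): "s", (0, 1): "t"}[(east, south)]
    return key
```

Note the prefix property: a tile’s key always begins with its parent’s key, which is what makes cache lookups on a preexisting palette so cheap.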

Unlike its regular mapping, where Google predictably uses GIF image-tiles each sized at 128px*128px, for its satellite imagery Google’s preference for JPEG over another competitive format, PNG, is worthy of a second glance: As is common knowledge, JPEG supports millions of colors, but is infamous for its lossy compression. PNG, on the other hand, is lossless while supporting millions of colors. However, PNG is currently not supported by all browsers and, depending on compression settings, may end up weighing more.


Written by Harsh

April 5th, 2005 at 7:29 pm

Posted in GIS,Mashup

Tagged with ,

Tech One

with one comment

My pick of technology-related headlines from The New York Times Page One 1851-2002:

• [10/18/1907] Signalizing the opening of the Marconi Service to the public, and conveying a message of congratulation from Privy Councillor Baron Avebury, formerly Sir John Lubbock
• [01/08/1927] Opening new radiophone service; First private call to The New York Times
• [10/05/1957] The Naval Research Laboratory announced early today that it had recorded four crossings of the Soviet earth satellite over the United States
• [07/21/1969] Astronauts land on plain; Collect rocks; Plant flag

Since the most important technological developments in the time period covered occurred in the western world, and since The New York Times can safely be assumed to best mirror these developments, notwithstanding the selective sample included in Page One, I consider these to be our most important technology-related headlines from 1851 to 2002. Although, sometimes technological change can seep in without so much as a loud knock or one bold headline [think Internet].

For those wondering about a headline that may seem conspicuous by its absence, say one that heralds the omnipresent automobile, keep in mind the time period covered. The automobile, for example, is widely credited to France’s Nicolas-Joseph Cugnot (1725-1804), whose steam-powered vehicle predates 1851.

• LoC: Auto
• Wikipedia: Automobile
• Encyclopedia: Automobile
• Encarta: Automobile
• About: Automobile History
• Mercedes-Benz: History


• From the same source, my pick of highly arguable socio-political turning points important to a broad American psyche:

[12/08/1941] Japan wars on US and Britain
[08/07/1945] First atomic bomb dropped on Japan
[05/18/1954] High Court bans school segregation
[04/30/1975] Minh surrenders, Vietcong in Saigon
[09/12/2001] US attacked

Written by Harsh

March 29th, 2005 at 10:12 pm

Posted in Social,Technology

Tagged with ,

Making Public Policy: A Nutshell

without comments

Nutshell: “‘Substituting tax-increase with state lottery’ [Policy – Director/Manager/Planner] as a means to generate additional revenue. Here, it becomes important to first find the ‘percentage of non-gamblers/gamblers/disinterested in the affected constituency’ [Information – Spatial Analyst] because ‘opposition to such a move is more likely to come from non-gamblers’ [Theory – Planner]”.

Nutshell adapted from [Skinner, B. Beyond Freedom and Dignity. 1971].

Such a policy-decision can then be supported by any of the many preferred values for its successful adoption: Religious Value- ‘Scriptures say lottery is a sin, but taxing is a bigger sin. Hence…’; Nerdy Value- ‘People who are weak in probability must pay for it. Hence…’; and so on.

By similarly lopsiding options and obfuscating issues, policy-makers often nudge the intellectually lethargic mass along a preferred course.

• “There is no subjugation as perfect as the one which keeps the appearance of freedom, for in that way, it captures volition itself” [Rousseau, Jean-Jacques].
• Political Equilibrium
• Operant Conditioning
SOL = Standard of Life
c = Consumers
l = Labor
p = Producers
e = Environmental Resources

Written by Harsh

February 22nd, 2005 at 7:44 pm

Posted in Planning,Social

Tagged with , ,


without comments

I have also added this post to this Wiki, in case you want to expound and guide those who follow – The post just helps me ensure the data doesn’t get spammed-out that easily:

  • I am getting a ‘jsForm.htm not found’ error? If you are using Internet Explorer, first make sure you have the latest version of that browser. Then remove the Arcims site from your browser favorites, reopen the browser and try again.

  • How do I import Arcims maps inside ESRI Arcmap? If you have Arcmap 9.x, you can import Arcims maps by connecting to the services of an Arcims server. In Arccatalog 9.x, simply click on ‘GIS Servers’ to add the Arcims server and type-in its URL. Note that this does lead to a noticeable performance drop.

  • How do I accurately rescale the map when that functionality is provided? True scale depends on monitor resolution, the default being 96 DPI (Dots Per Inch). To make sure that your monitor is configured correctly, for MS Windows, check Display Properties–>Settings–>Advanced–>General. Note that when the map is rescaled to, say 1:12000, 1 inch on the map should represent 12,000 inches. Also note that you can use the Esc button on your keyboard to stop the map from rescaling at any time. Refer to Map Scales for related information.

  • I click on the print button but nothing happens? Make sure pop-ups are allowed for your Arcims site, then try the Print Tool again.
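The scale/DPI arithmetic in the rescaling answer can be made concrete. A minimal sketch, assuming the stated defaults (96 DPI, and that at 1:12000 one map inch represents 12,000 ground inches):

```python
def ground_units_per_pixel(map_scale, dpi=96):
    """Ground inches represented by one screen pixel. At 1:12000 and
    96 DPI, one screen inch (96 pixels) stands for 12,000 ground inches,
    so each pixel covers 125 ground inches."""
    return map_scale / dpi

def screen_inches_for(ground_feet, map_scale):
    """On-screen inches depicting a ground distance, handy for a manual
    ruler check against the paper map."""
    return ground_feet * 12 / map_scale
```

This is also why a mis-set monitor DPI breaks “accurate” rescaling: the map server can only honor 1:12000 if the 96-pixels-per-inch assumption actually holds on your display.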

* ESRI Support
* WIKI: Edit Lock Schema

Written by Harsh

January 6th, 2005 at 1:43 pm

Posted in Education,GIS,IMS,Technology,Web

Tagged with , ,

Tsunami Links

with one comment

Written by Harsh

January 5th, 2005 at 6:10 pm

Digital conversion of Flood Insurance Rate Maps (DFIRMs): Summary

without comments

I have also added this post to this Wiki, in case you want to expound and guide those who follow – The post just helps me ensure the data doesn’t get spammed-out that easily:

My notes reflect procedural changes brought about by the integration of DFIRM Production Tools.


  • Request jurisdiction(s) for existing geodata like new political boundaries and road names for use as base map. Base map geodata must NOT be older than 7 years.
  • Request GEOPOP from the MOD team and use it to create an empty DFIRM geodatabase. Use existing political boundaries for its geographic extent.
  • GEOPOP creates 3 table types- S (Spatial), L (Lookup) and D (Domain). Edit the main lookup tables:

L_COMM_INFO (community information)
L_SOURCE_CIT (source citation)
L_WTR_NM (hydrographic feature information- stream names etc)
L_STN_START (properties of starting points for stream distance measurements)

  • Create panel index and data catalogs
  • Scan, georeference, and rectify geodata at its recommended scale to capture required floodplain features. Refer to FEMA MSC for full-sized PDFs of FIRM panels.
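The four main lookup tables can be stubbed out before GEOPOP data entry begins. In the sketch below the table names are from the list above, but the field lists are illustrative placeholders, not the actual GEOPOP schema:

```python
import csv
import io

# Table names per the DFIRM notes above; fields are placeholders only
LOOKUP_TABLES = {
    "L_COMM_INFO": ["COM_NFO_ID", "COMM_NM"],
    "L_SOURCE_CIT": ["SOURCE_CIT", "CITATION"],
    "L_WTR_NM": ["WTR_NM_ID", "WTR_NM"],
    "L_STN_START": ["STN_START_ID", "START_DESC"],
}

def empty_lookup_csvs():
    """Write a header-only CSV per lookup table (in-memory here) -- a
    stand-in for the empty L tables GEOPOP creates in the geodatabase."""
    out = {}
    for name, fields in LOOKUP_TABLES.items():
        buf = io.StringIO()
        csv.writer(buf).writerow(fields)
        out[name] = buf.getvalue()
    return out
```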


* HEC-RAS Online Help
* WIKI: Edit Lock Schema

Written by Harsh

December 27th, 2004 at 8:12 pm

Wanted: Proactive Policies

with 2 comments

What is the most effective method to spread the digital wave, especially of the spatial kind, in rural communities and developing countries? The following links offer some fodder, although Korea left the company of developing nations some time ago. A lot of talk has centered around the potential of wireless to bridge the digital chasm between the Knows and the Know-nots in places lacking adequate infrastructure.

• “Broadband Korea”
• “Broadband Wonderland”
• “South Korea leads the way”

More musing on this topic with time.

• Political Equilibrium

Written by Harsh

November 14th, 2004 at 6:40 pm

Posted in LBS,Planning,Social

Tagged with , , ,

Graphic Software

with one comment

The discussion “So …How About That Election Coverage?” at Directions Magazine makes you think about graphic software, like Macromedia Flash, that cater to small-time spatial needs.

Such graphic software, minus the topology and advanced query benefits, function well as basic spatial tools and comfortably serve data over the web with a “fair” amount of interactivity.

Does this make your overpriced IMS overhyped and overblown too?

[my comment]
“Macromedia Flash fills this niche quite well as demonstrated [here]. And as the market seems to indicate, it does that [while] satisfying more customers than what an overly fancy GIS would. [This] reminds me of the MapQuest survey when polled customers had expressed great contentment with their level of map detail, whereas cartographers were red with indignation. Akin to using an atomic clock to serve your wake-up call: not needed!”
[/my comment]

So is the complexity in Geospatial, better still Spatial, Information Systems (SIS) overblown too? Much of SIS requires common-sense logic arranged linearly. If a person can drive her car in rush-hour traffic as she deciphers vague directions off a schematic map while trying to make sense of rain-washed road signs and maintain a semblance of conversation with her passenger, and still manage to engage the kid in the back seat [read “multi-linear tasking”], she can achieve a sound understanding of spatial databases with a little persistence, except for the eye for detail that comes with practice.

My point: SIS is non-complex and not at the cutting-edge of technological change, and there is ample room for non-traditional spatial software!

• This rise of non-traditional spatial software challenges the accepted definition of SIS. If you were to follow the modernist’s approach to design where in the end you remove everything you can without taking away from the essence of your creation and apply it to defining a SIS, you wonder what such a conceptual SIS would be in its simplest stark-naked Spartan form?

Written by Harsh

November 11th, 2004 at 7:35 pm

Posted in GIS,Mashup

Tagged with

Wireless Application Protocol

with one comment

As the year-end inches closer, let us look at one significant industry trend:
A potential increase in location-based wireless services [“Where are my kids …no really, WHERE are my kids …and give me that in Lat/Long”]. This could be brought about by the spread of handy ‘location-aware’ productivity tools, such as a GPS-enabled, internet-ready Blackberry phone that also functions as a TV. Such tools could tell you when your family members or selected friends move into your vicinity. Based on industry reports, this might already be old news in parts of Japan.
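Under the hood, a vicinity alert like that boils down to a point-to-point distance check. A minimal sketch, assuming a plain haversine great-circle distance and hypothetical function names (no real carrier or handset API is implied):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/long points, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def in_vicinity(me, friend, radius_km=1.0):
    """True if `friend` is within `radius_km` of `me`; both are (lat, lon) tuples."""
    return haversine_km(*me, *friend) <= radius_km
```

A friend a few dozen metres away trips a 1 km alert; one a degree of latitude away (roughly 111 km) does not.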

• SmartPhlow: Real-time Traffic Monitoring
• Real-time Mobile Mapping

• Social Software

The earliest benefit could be in emergency response, which just might be the area most likely to get heavy government funding. For example: Volunteer Fire Departments being able to access the critical layout and hydrant information they need for machine placement and egress-route planning as they respond to a distress call, or first responders being able to retrieve medical history on the go. Check out an earlier National Incident Management System memo. Also, take a look at the developments at the WV Statewide Addressing and Mapping Board, which plans to implement a statewide Spatial Information System [SIS] using aerial photography, etc. The project, funded in part by Verizon, aims to help emergency response by integrating mapping with E911, postal and public utility services, and telephone companies. It was initially started to provide city-style addresses for rural areas so that all areas receive the same level of emergency services; with this broadened scope, it could serve as a guide for other states.

• ESRI Library: Challenges for GIS in Emergency Preparedness
and Response

• “Efficient Operations and Emergency Response”

• Google: SMS, Froogle, [http://www.google.com/wml]
• “U.S. launches a new Global Positioning Satellite”

Written by Harsh

November 6th, 2004 at 7:30 pm

Posted in LBS,Technology

Tagged with

Social Software

without comments

Interesting blog on Life With Alacrity about Social Software. For the ignoramus: crudely put, Social Software (or Groupware, or Collaborative Software) is software that facilitates group interaction. Often, there is “no overt coordination with the group functioning as an aggregation of interested individuals” rather than as a cohesive unit.

Two intriguing perspectives on the internet from the blog:
• “By ‘augmenting human intellect’ we mean increasing the capability of a man to approach a complex problem situation, to gain comprehension to suit his particular needs, and to derive solutions to problems” [Engelbart. Augmenting Human Intellect: A Conceptual Framework. 1962].
• “To appreciate the importance the new computer-aided communication can have, one must consider the dynamics of ‘critical mass,’ as it applies to cooperation in creative endeavor. Take any problem worthy of the name, and you find only a few people who can contribute effectively to its solution. Those people must be brought into close intellectual partnership so that their ideas can come into contact with one another. But bring these people together physically in one place to form a team, and you have trouble, for the most creative people are often not the best team players, and there are not enough top positions in a single organization to keep them all happy. Let them go their separate ways, and each creates his own empire, large or small, and devotes more time to the role of emperor than to the role of problem solver” [Licklider. The Computer as a Communication Device. 1968].

• Center for Computational Analysis of Social and Organizational Systems
• Open Groupware
• “A group is its own worst enemy”
• Friend of a Friend
• Applications: E-voting, WAP, Blogging […of course!]

• “Friendly foxes are cleverer”

Written by Harsh

October 30th, 2004 at 7:02 pm

Map Viewer and Google

with one comment

Interesting web-based map viewer: very snazzy. If only the download were quicker.

In related news, Google acquires Keyhole, a company promising a similar 3D interface. Right now, if you google an address, Google provides links to its 2D maps from Yahoo!Maps and MapQuest. Google also provides possible address matches and map links if you type in a name, akin to what Switchboard does.

It would be better if you could click and drag on a map to limit the spatial extent of your search, although that would clutter the clean interface of Google Local, which, by the way, does show maps.
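The click-and-drag idea amounts to filtering search results by a bounding box. A minimal sketch, with hypothetical function names and a simple (lat, lon) result format of my own choosing, not any Google API:

```python
def in_bbox(point, bbox):
    """True if (lat, lon) `point` falls inside bbox = (min_lat, min_lon, max_lat, max_lon)."""
    lat, lon = point
    min_lat, min_lon, max_lat, max_lon = bbox
    return min_lat <= lat <= max_lat and min_lon <= lon <= max_lon

def filter_results(results, bbox):
    """Keep only search results whose coordinates fall inside the dragged box."""
    return [r for r in results if in_bbox(r["coords"], bbox)]
```

Dragging a box around, say, Philadelphia would keep only the results whose coordinates land inside it.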

Note to self- invest in Google.

Pi: Quiet Musing
• Google acquires gbrowser.com and moves into video search. And here's the Google Blog.

Written by Harsh

October 27th, 2004 at 6:15 pm

Posted in GIS,Mashup,Service

Tagged with