What’s A Map Maker Look Like?

Mrs. OldTopographer reminds me regularly that I’m a geek and that she lives in the House of Useless Knowledge. I do retort that I’m not the one with the master’s degree in math education, but to no avail.

Her accusations are clearly unfair. Yes I like maps. And mapping, photomapping, field sketching, surveying, navigation, compasses, GPS, GIS, geology, geography, geodesy, landform analysis…

OK, I’m a geek.


But not that kind of geek.

I’m more of a collaborative geek, someone who understands the value of teamwork. I like working with others. As long as they do what I tell ’em to do. So maybe I look like one of these folks.

Mapmakers Image


None of these photos reflect my true adventurous spirit, my willingness to court danger, my aggressive Type A map making nature that has me living life on the edge.

This is who I am!


And that’s Mrs. OldTopographer, letting me know that I’ve missed the turn. Again.

– Brian

The 2014 ESRI UC

Well kiddies, I’m back from San Diego and the 2014 ESRI International User Conference. This is my third conference in five years, and it’s starting to feel like Groundhog Day.

Now please, do not get me wrong – it was a very good conference and in some respects a great conference. My team and I learned a lot and picked up plenty of new information.  But in the plenary and in the technical sessions and presentations it was the same drumbeat we’ve been hearing for the past several years – ‘rich’ web maps and apps (could someone at ESRI please tell me just what ‘rich’ means?), ArcGIS as an integrated system and not just a bunch of parts, ArcGIS Online, ArcGIS Online, ArcGIS Online, yaddah, yaddah, yaddah.  In the Solutions Expo (i.e., vendor displays) it was the same vendors, many in the same locations, showing the same stuff, giving the same spiel, etc.

You know, Groundhog Day. C’mon ESRI, mix it up a bit. It’s getting a little stale.

OK, that’s out of the way.  Let’s change tack. If you listened closely to the presentations and have been paying attention to what ESRI’s been doing in the past few months, you were able to tease out some great information regarding new and emerging capabilities. Let’s start with one of ESRI’s current flagship products, ArcGIS Online.

If anyone reading this blog still doubts that ESRI considers ArcGIS Online and web mapping a central part of ESRI’s GIS universe then this UC would have set you straight. The message was obvious and unmistakable, like a croquet mallet to the head. ArcGIS Online is here to stay, is only going to get bigger, and if you are going to play in the ESRI sandbox you need to know (and buy into) ArcGIS Online. I didn’t attend a single ESRI session, whether it was the plenary or a one-on-one discussion with a product expert, where the topic of ArcGIS Online integration didn’t pop up early and often. Most vendors I talked to – and certainly all of those that ‘got it’ – had ArcGIS Online integration as a key selling point for their product or service. Heck, even IBM with their painfully complex work order management program called Maximo ‘got it’ and touted how I could now ‘easily and seamlessly’ integrate ArcGIS Online feature services with Maximo. Anybody who knows Maximo knows it doesn’t do anything ‘easily and seamlessly’. I don’t really think Maximo can use hosted feature services from ArcGIS Online, at least not yet. The REST endpoints I saw Maximo consuming looked like dynamic map services. But at least the IBM sales team took the time to read the memo from Redlands.
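For readers who want to make the same spot check I did: ArcGIS REST endpoints advertise their type in the URL itself – hosted feature services end in a `/FeatureServer` segment, dynamic map services in `/MapServer`. A minimal sketch (the hosts and service names below are invented for illustration; only the URL convention is real):

```python
# Classify an ArcGIS REST endpoint by its service-type URL segment.
# Feature services serve queryable/editable features; map services
# serve server-rendered map images.

def service_type(url: str) -> str:
    if "/FeatureServer" in url:
        return "hosted feature service (queryable, editable features)"
    if "/MapServer" in url:
        return "dynamic map service (server-rendered map images)"
    return "unknown"

# Hypothetical endpoints, shaped like real ArcGIS REST URLs:
print(service_type(
    "https://services.arcgis.com/abc123/arcgis/rest/services/Parcels/FeatureServer/0"))
print(service_type(
    "https://gis.example.com/arcgis/rest/services/WorkOrders/MapServer"))
```

A glance at the endpoint a client is consuming is often enough to tell whether it is really using hosted feature services or just pulling map images.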

ArcGIS Online

The ArcGIS Online product space was the single biggest product presence ESRI had set up in the Expo. It was huge, and a reflection of the importance ESRI places on the product.

ESRI’s incessant chatter about ArcGIS Online would have fallen flat with those who are long time users of the product if ESRI had not done a product update just a few weeks ago. The July update of ArcGIS Online included a number of significant improvements and new features that signaled to those who know the product that ESRI is serious about ArcGIS Online being more than just a toy for making simple web maps. The upgrades in system security certification, administration tools, data management, data integration, analysis and cartographic tools show ESRI has full confidence in ArcGIS Online as a serious enterprise tool.  I’ll admit that a few years ago I was having doubts that ESRI would be able to pull this off. Today I’m convinced that ArcGIS Online and web mapping is the most significant development in geographic content delivery since the invention of the printing press.

This year I spent more time wandering the Solutions Expo hall than I did attending the technical sessions. In past years there were sessions I felt I just couldn’t miss, but this year my technical needs were somewhat less well defined and I wanted to spend more time speaking with the vendors and visiting the ESRI product islands. It was time well spent.

One of the focuses (foci?) of this year’s plenary presentation was the issue of ‘open data’. Open data is nothing more than data that is available free to any user. Open data can take any format (though it is understood that for data to be truly ‘open’ it needs to be available in a non-proprietary format). For decades the federal and state governments have made GIS data available in a variety of GIS formats. A good example of this is census data. The data for most censuses held in the last 40 years or so is freely available in GIS format from the US government. It’s easy to pull that data into a GIS system and do all kinds of analysis against it. In fact, census data is one of the first data types that new GIS students learn to analyze in their core classes. In the same vein, many states make state-specific GIS data available from freely accessible data servers. Things like elevation data, transportation network data, hydrology, landcover and more have been commonly available for years.

However, it was often difficult for smaller government entities – cities, counties, or regional authorities – to share out their public data because of the IT and GIS management overhead involved. Let’s face it, nobody makes money sharing out free data so there’s little incentive to put a lot of resources behind the effort. As a result a lot of currently available open GIS data is pretty stale. ESRI is backing a push to pump new vitality into the sharing of open data via the new Open Data tools embedded in ArcGIS Online (see, there it is again). OK, I admit that ArcGIS Online isn’t exactly free to the organization looking to share out data, but if you do happen to be an ArcGIS Online subscriber then setting up an Open Data site is fast and easy. One of the great concepts behind ESRI’s effort is that the organization is really sharing a feature service from which an Open Data user can extract the data. This means that the data should not suffer from ‘shelf life’ issues; as long as the data behind the feature service is regularly updated the Open Data user will have the latest and greatest representation of what’s being shared.

On one of my laps around the Expo floor I stopped at the Open Data demonstration kiosks set up in the ArcGIS Online area and talked through the concept and implementation with one of the ESRI technical reps. At first I didn’t think my organization would have much use for this feature, but after thinking about the types of data we routinely pass out to anyone that asks – road centerlines, jurisdictional boundaries, parcels, etc. – I began to think this might be of some value to us. In about 15 minutes she helped me set up my organization’s Open Data site and share some common use data out to the public. If for no other purpose, an Open Data site could lift some of the data distribution burden off of us.


The new Open Data tab in ArcGIS Online allows the administrator to configure an open data page from which the organization can share data with the public

Another lap took me to the US Geological Survey information tables. The USGS table was set up in the Federal Government area and while most of the agencies suffered from a serious lack of attendee interest (and I pity the poor souls who had to man the Veterans Administration table), the USGS tables were doing a good business. The USGS reps were stirring the pot a bit. It seems that there’s a move afoot in the USGS to do away with the National Atlas. I’m not sure yet how I feel about this move. Clearly elimination of the National Atlas is a cost cutting move (and the USGS makes no bones about it on their website), but if the same digital data can be made available via other portals, like the National Map portal, then this may all be a moot point.  Still, this is the National Atlas and as such should be a point of pride not just for the USGS but for the nation. If for no other reason than that I’d keep it alive. The USGS reps working the tables were clearly pro-National Atlas and were running a petition campaign to garner support to keep the program going.

I also spent some time discussing the new US Topo series of maps with the USGS reps. If you’ve read any of my posts on the US Topo maps you know that from a cartographic perspective I think they stink. The map base – imagery – is poorly selected and processed and the maps look like crap when printed out. Here’s the basic problem: the US Topo series are compiled as though the intent is to print them out full scale for use in the field. They carry full legends and marginal data. However, it’s clear they were compiled specifically to look best on a back-lit computer screen. When printed out the maps are dark, muddy and the image data is difficult to discern. When I brought this up to one of the USGS reps she turned her badge around to indicate she was speaking for herself and said, “I agree completely, and we get a lot of complaints about the visual and cartographic quality of these maps.” Here’s hoping the USGS doesn’t go tone-deaf on this issue and takes steps to improve the quality of the US Topo series. She also let me know that there’s growing support within the USGS to provide the US Topo series maps not just in GeoPDF format but also in GeoTIFF. This would be a great move, especially if the USGS provided them in a collarless format for use in systems like ArcGIS for Desktop.

I took the time to mosey over to the Trimble display area and talk to a rep about my favorite Trimble issue – the lack of a Google certified version of Android on their very capable (and very expensive) Juno 5-series of handheld data collectors. I’ve bugged Trimble so much about this that I have to assume my picture is hanging on a dartboard in the executive conference room at Trimble’s headquarters. I got the same response out of the Trimble rep that I’ve been getting for about a year now, “We hear it’s coming but we don’t know when”. Yeah right.

After I left the Trimble area I found myself a few rows over at the table of a company I’d never heard of before, Cedar Tree Technologies. It was just a couple of guys with a couple of pieces of hardware, but my eye caught something that looked a lot like a beefed up smartphone and the guys at the booth were eager to give me their story. It seems that Cedar Tree Technologies is a brand new spin-off of Juniper Systems, a company that’s been making rugged handheld systems for the surveying and GIS community since the 1990’s. Cedar Tree’s specific focus is on the Android OS, and each of the devices on display was running a Google certified version of Android 4.2. The device that caught my eye was the CT4. The CT4 is what it looked like – a ruggedized smartphone that runs on Android. It looked like an OK product with very good specs – a quad core processor running at 1.2 GHz, a 4.3″ Gorilla Glass display, an 8 MP camera, a 3000 mAh battery, Bluetooth and an IP68 rating.  It did have a few drawbacks – only 16 gig of system memory and a 3G (not 4G or LTE) cell radio, and I forgot to ask if it was fully GNSS capable. But here’s the kicker – this damned thing is only $489! Roughly one third the price of the baseline Juno 5, yet it looks like it offers three-quarters or more of the Juno’s capability. You can bet I’ll be contacting Cedar Tree about borrowing one of these for an evaluation.

Cedar Tree1

 He’s smiling because he thinks he’s got Trimble beat in the Android space. I think he might be right!


Cedar Tree2

 The Cedar Tree Technologies CT4. Perhaps the first truly usable Android-based field data collector

OK, I don’t want to get too far into the weeds on other topics of interest, so let me just do some quick summaries:

  • I talked to Trimble, Leica, Carlson, Juniper and Topcon reps about their software offerings. All plan to remain tightly wedded to the Windows Mobile 6.5 OS (a.k.a., Windows Embedded Handheld), which hasn’t had any significant updates for over 2 years. Many of the reps indicated that the mobile version of Windows 8 still has some issues and they are very reluctant to move in that direction. So it looks like the industry will be stuck with an archaic and moribund OS for some time yet
  • What the world needs, in addition to a good 5¢ cigar, is a good spatially based document management system. Lord knows my organization is in desperate need of something like this. I saw only one document management system vendor at the show, and their system has a strong dependency on ArcGIS Online (there it is again). I think this is a market area that is ripe for exploitation. The tools are now in place with ArcGIS Online and reliable cloud services to bring this type of functionality quickly and cheaply to an enterprise and I’d love to see some new developments in this area. Pleeeeze!
  • I attended a very interesting working session where the GIS team from Pierce County, WA discussed their adoption of enterprise GIS and ArcGIS Online. I felt like I was sitting through a presentation I had written about my own team’s struggles and experiences. Like us, Pierce County faced a lot of push-back and foot dragging from their IT department on implementing IT-dependent GIS initiatives, and productivity among the county’s field maintenance crews suffered. Here’s my point – for every GIS/IT success story I’ve heard or read about I’ve heard an equal number of stories where thick-headed IT departments get in the way of a successful GIS initiative. If you are IT and don’t fully support the GIS initiatives in your organization then watch out. You will wake up one day soon to find you’ve been replaced by a cloud based service. It’s happened in my organization and it’s happening across the industry.
  • How come I never heard of the Association of American Geographers? I’m not joking. I’ve been in this industry for over 30 years and have been attending trade shows for all of that time. I’ve heard of the ASPRS, the American Society of Photogrammetry and others, but never the Association of American Geographers. Seems like a good organization. May have to join!
  • Like a good 5¢ cigar, the world also needs more quality geospatial sciences master’s program options. I talked to a number of the universities set up at the conference and while they all seemed to be offering quality programs, too many of them are targeted at the professional student, someone who heads into a master’s program directly from a bachelor’s program. For example, here in Atlanta Georgia State University offers what looks like a cracking good geosciences master’s program with a focus on geospatial science, but it’s structured so that all of the coursework is classroom focused and only offered during working hours. For someone making a living in the real world this type of program really isn’t feasible. We need more fully on-line options and more local colleges and universities to offer evening and weekend programs.
  • Let’s get back on the ArcGIS Online horse and discuss a very interesting service that the developers tell me is under serious consideration. One of the gripes that users of Collector for ArcGIS have is the lousy positions that are provided by the GPS/GNSS receivers on handheld units. Keep in mind that this is not a Collector issue, but a hardware issue. One of the improvements ESRI is looking at is a subscription based correction service for use with Collector. It will probably work like this – collect a point or a series of vertices and when they are synced with the ArcGIS Online server the points first pass through a correction service before being passed on to ArcGIS Online. This will likely be a single base station correction solution, but it could offer sub-meter accuracy if using a data collector with a more advanced GPS/GNSS receiver (sorry, this will not work with your iPhone or Android smartphone because of the low quality receivers they use). Sort of like on-the-fly post processing. A very interesting concept, and it could move a lot of hardware manufacturers like Trimble, Topcon and Leica to put out low(er) cost Android-based field data collectors with improved receivers
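The sync-then-correct idea in that last bullet is easy to sketch. The snippet below is a position-domain simplification of single-base-station differential correction (real DGPS services correct per-satellite pseudoranges, and every coordinate here is invented), but it captures the core idea: measure the error at a base station whose true position is known, then apply the inverse of that error to the rover’s collected vertices:

```python
# Position-domain sketch of single-base differential correction.
# All coordinates are made up for illustration.

def correction_offset(base_known, base_observed):
    """Offset = surveyed truth minus GPS observation at the base station."""
    return (base_known[0] - base_observed[0],
            base_known[1] - base_observed[1])

def correct_points(points, offset):
    """Apply the base-station offset to each collected vertex."""
    dx, dy = offset
    return [(x + dx, y + dy) for x, y in points]

# Base station surveyed at (1000.0, 2000.0); its receiver reads (1000.8, 1999.4).
off = correction_offset((1000.0, 2000.0), (1000.8, 1999.4))

# Vertices collected by the rover during the same session:
rover = [(1500.2, 2100.9), (1501.1, 2101.3)]
print(correct_points(rover, off))
```

Because the base and rover see nearly the same atmospheric and orbit errors over short baselines, subtracting the base station’s error removes most of the rover’s error too, which is why a more capable receiver plus a correction service can reach sub-meter results.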

Before I go, some kudos:

  • To the City of San Diego. I can’t think of a better place to hold this conference
  • To GIS, Inc for a wonderful dinner cruise with NO sales pressure (Mrs. OldTopographer just loved it!)
  • To Andrew Stauffer from ESRI and fellow BGSU grad. Andrew provided invaluable support to our organization over the past few years while we worked through our ArcGIS Online implementation issues. I finally got to meet him in person and thank him
  • To Pat Wallis from ESRI who proved you can hold down a serious job and still be a 1990’s era ‘dude’
  • To Courtney Claessens and Lauri Dafner from ESRI who entertained all of my dumb questions about Open Data
  • To Kurt Schowppe from ESRI. I’m sure my pool party invite got lost in the mail <grin>
  • To Adam Carnow, for putting up with all of my other dumb questions and requests
  • To all the great people I bumped into at random and had wonderful conversations with

And finally, it was good to see my alma mater making a showing at the Map Gallery exhibition. Go Falcons!

BGSU poster

– Brian



Off To San Diego!

It’s off to San Diego for the annual gathering of the faithful, also known as the 2014 ESRI International User Conference.


Looking forward to hearing what Uncle Jack has in store for us and picking up mountains of geo-swag from all the exhibitors (I’m bringing along an extra large suitcase just to hold all the cool stuff).

And maybe, just maybe, I’ll be able to find a Trimble rep who is willing to give me a straight answer on why, almost a year on, Trimble still hasn’t released a certified version of Android for their very expensive and potentially very capable, yet unnecessarily crippled, Juno 5 data collectors.

So we’ll see you back here in a week with a report on all the neat stuff I discovered!

– Brian


An Interesting Challenge

I caught this posting the other day on the C4ISR & Networks page:

Geospatial Sensors

This challenge got me thinking. Most of the requirements could easily be met using something like LiDAR – flood the target with enough laser sensors and you could track all the way down to the crickets chirping in the grass, in 4D. But the real challenge in this requirement is for a passive sensor that does not give away the user or sensor location. LiDAR and other systems that could tackle this challenge are all active systems – continuously bathing the target area with active signals in order to collect data. This makes them easy to detect with fairly rudimentary electromagnetic spectrum receivers (i.e., radios tuned to the right frequency).

So I started thinking. How could this problem be tackled with a passive collector system? For an old topographer like me the answer is simple – photogrammetric technology!  A pair of video cameras that collect high resolution imagery in a variety of spectrums (visible and passive IR) and have modern night vision capability could easily collect real time stereo still imagery (3D) and video (4D). They’d have to sit along a calibrated baseline, but that’s easy to establish using military grade GPS. All of this technology could be easily squeezed into fairly small, man-portable units. The data would have to be uploaded to computers for post-processing and visual display, and for real or near-real time use that implies some sort of a communications link. But still, the data collection part seems to me to be a simple application of available technology.
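The stereo-camera idea above rests on the classic photogrammetric parallax relation: with a calibrated baseline B, camera focal length f, and a measured disparity d (the shift of a feature between the two images), the range to that feature is Z = f·B/d. A quick sketch with invented numbers (nothing here comes from the actual challenge posting):

```python
# Classic stereo-photogrammetry range equation: Z = f * B / d,
# with f in pixels, B in meters, d in pixels. Numbers are illustrative only.

def range_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Range to a feature from a calibrated stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A 4000 px focal length, a 50 m calibrated baseline, and a feature that
# shifts 8 px between the two frames:
print(range_from_disparity(4000, 50, 8), "meters")
```

The same relation also shows why the baseline must be calibrated: any error in B scales directly into the computed range, which is where the military grade GPS comes in.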

Hmm… maybe it’s time I hang out my shingle as a defense industry consultant!

– Brian



The Littlest Geo-Geek

So I came home from work yesterday, sat down at my computer and was surprised to find this:


No less than eight instances of ArcCatalog running. Huh? When I left for work in the morning the only thing open on the computer was Chrome. Did some eeeevil ESRI trojan take control of my desktop?!

Then I remembered – this little two year old cutie spent the day with Grandma:


And this cute little tornado takes every opportunity to search every computer, tablet or iPhone she gets her hands on for this:


I applaud her early interest in investigating the mysteries of the geosciences and exploring ESRI’s software offerings. It warms the heart of this old geo-geek grandfather.

…or could it be she just made a series of random mouse clicks while singing (for about the 20th time that morning, according to Grandma) “Let it go, let it goooooo”?

Grandpa prefers to think she’s a budding geo-genius.

– Brian

Welcome to the new(ish) Northing & Easting!

Welcome!  For several years I’ve been blogging under the title Northing & Easting using Google’s blogging service but lately I’ve become dissatisfied with Google’s offerings. Now mind you, what Google offers is a free service so I’m not going to complain too loudly, but as a free service its features are limited.  My hope is that WordPress will offer more options not just for blogging, but also for setting up and running a website. We’ll see how it goes.

So what’s the philosophy here? I’m a professional Topographer. That’s a job title you don’t hear any more, but it accurately describes what I do in my daily work. A Topographer, or Topographic Engineer, is someone who studies and describes the shape of the earth and the features on it. The discipline of Topographic Engineering traditionally incorporates surveying, geodesy, photogrammetry, cartography, geography, geology, soils science, forestry, hydrology, landform analysis, cultural analysis and astronomy. A good Topographer is a true polymath.

Topographic Engineering was not an obscure discipline. Many of our greatest historical figures – men like George Washington, Thomas Jefferson, Meriwether Lewis, Robert E. Lee, George C. Marshall, Douglas MacArthur and Herbert Hoover – worked as Topographers at some point during their professional lives. Topographers working in the service of the US government or US Army mapped the coastlines, rivers, interior lands and natural wealth of the USA, laid out the land survey grid system that facilitated the opening and settling of the vast expanses of the American west, mapped extensive regions of the Pacific, Far East, Europe and Asia, and even mapped the Moon. Topographic Engineering is an old and honorable profession and I’m proud to be a part of it.

Today the discipline has a much fancier high tech 21st Century title: Geospatial Engineer. The term Geospatial Engineer was coined by the US government to describe the role of the traditional Topographer brought forward into the world of computers, GPS, satellite imagery, remote sensors, digital data and the Internet. The biggest difference is that the Topographer walked the land to study, describe and map while the Geospatial Engineer sits at a desktop computer and leverages a world of digital data to accomplish the same mission. But are they the same, these old ground walkers and the wiz kids at the computer terminals? Conceptually, yes. Both are leveraging a broad range of data to study and describe the earth and its features.  The difference is in the how.  To do their job the old Topographers actually had to leave the office, travel to the ground they were going to study, walk it, measure it, sample it, map it and describe it based on their personal observations. This always resulted in a much more intimate and, I would argue, more accurate and precise evaluation. Today’s Geospatial Engineer has, in most cases, lost that direct contact with the very ground they are studying and describing. While they have complex digital analysis tools and vast digital data holdings at their disposal, the final product tends to be detached, clinical, and often imprecise and inaccurate.

I’m no Luddite. I love the technology. I’m lucky to work for an organization that provides me access to the very latest in geospatial information systems (GIS) software, web mapping technology and digital field data collection devices. We maintain vast digital data holdings that cover a very compact space on the Earth’s surface (a large airport) so we are able to develop a wide range of analytic scenarios and quickly field check the results. We are able to effectively evaluate what works, what doesn’t work, and we maintain a direct dialog with industry leading software and hardware developers. This puts me in a perfect position to evaluate current trends against historical precedent.

The central mission of this blog is to celebrate the history of Topographic Engineering and to capture and discuss the fast disappearing technologies and methods used by generations of old Topographers. But we’ll also  keep an eye on the current state and trends in the Geospatial Engineering field. Since this is my blog and I get to call the shots I may take occasional forays off to explore other topics, but we’ll always quickly return to the central theme. One of my primary objectives is to keep it interesting, and that’s where you come in. I’ll need your honest feedback on what I post here and any recommendations you may have regarding topics, style or any other features of this blog.

Thanks for your time, and I hope you come back often!

– Brian

Around The World On A Cup Of Coffee

My Grandmother was never one to let a good product promotion pass her by.  I think it was part of her fascination with her adopted country.  Where else but in America could you save up a few box tops or clip a few coupons, mail them off and the sponsoring company would send you something free in the mail.  Often these little give-aways would be handed down to one of her grandchildren.  It was always fun to visit Grandma – you never knew what she had waiting for you.  Usually it was just some small trinket from a cereal or soap company.  Occasionally it was something really neat.  One year she gave me and my brother miniature Civil War cannons she got from the Quaker Oats people – you know, the ones that shoot puffed rice out of a cannon.  Those cannons ended up shooting thousands of pretend cannon balls in hundreds of pretend battles we held on our living room floor.  Oh, the pretend carnage!  Even after I was all grown up and in the Army I’d still occasionally get an odd item or two from her.  Bless her heart, she never stopped thinking about us even long after we had left home and struck out on our own.

In the late 70’s the Nestle company was running an advertising campaign highlighting the fact that their instant coffee, Nescafe, was the #1 selling coffee product world wide.  It was no idle boast.  Nescafe was (and still is) extremely popular in Europe, and Nescafe is commonly listed as a separate drink option on many restaurant menus.  A lot of Europeans prefer it over brewed coffee.  While I’m no fan of instant coffee I do have to admit that Nescafe is the least objectionable of the bunch.

As part of the promotion Nestle produced a series of small glass coffee mugs emblazoned with a world map.  Nothing fancy, just a highly simplified small scale map with a grid.  My Grandmother got a set and passed them on to my parents.  One day while home on leave my Mom passed along a couple of these mugs to me. At the time I was working as a topographer for the Army and maps were my business, so I thought it was a neat coincidence.  I kept one of the cups on my desk at work and even occasionally drank coffee out of it.  Over time the world map wore off and in a few years I was left with just a bare glass coffee mug so I pitched it.  Some where along the way I lost its twin.  It probably got chipped or broken during one of our many moves while I was in the Army.

Being a lover of all things topographic, even kitschy little glass coffee mugs with world map appliques, I always kept my eye open for replacements.  Last week I was cruising around eBay and decided to do a quick search for ‘nestle coffee cup’.  I was surprised at the number of listings that came up for my long lost little mug.  Apparently Nestle had millions of them made and most are still available through eBay sellers.  I found a dealer who gave me a good price on a set of them and a few days later I was the proud owner of four gen-u-ine 1970s vintage cheap cast glass coffee mugs sporting world maps.

I love ’em!

So let’s take a quick world tour courtesy of Nestle…

Eastern Hemisphere.  Hey, where’s the British Isles?  Interestingly, they included Lake Baikal.

Western Hemisphere.  Florida’s been squinched into a vestigial bump, but at least they included Puerto Rico.

Being a topographic geek I note that the far north and far south polar regions are cut off and the grid appears to be square, so I’m guessing it’s a Universal Transverse Mercator projection.
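Whatever the mug’s exact projection, the clipped polar regions are the signature of the Mercator family: the projected northing grows without bound as latitude approaches the poles, so any finite map sheet has to cut off somewhere. A sketch of the standard spherical Mercator forward equations (unit-sphere radius; this is the whole-world cousin of the transverse Mercator used in UTM):

```python
import math

# Spherical Mercator forward projection on a unit sphere:
#   x = lon (radians),  y = ln(tan(pi/4 + lat/2))
# y diverges as lat -> +/-90 degrees, which is why Mercator-family
# world maps always clip the far north and south.

def mercator(lat_deg: float, lon_deg: float):
    lat = math.radians(lat_deg)
    x = math.radians(lon_deg)
    y = math.log(math.tan(math.pi / 4 + lat / 2))
    return x, y

for lat in (0, 60, 80, 85):
    print(f"lat {lat:2d}: northing {mercator(lat, 0)[1]:.3f}")
```

The northing roughly doubles between 60° and 85°, so a mug-sized map that kept the poles would be almost all empty ocean and ice.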

Hands Off Data Collecting With Huey, Dewey and Louie

A few weeks ago I posted a blog discussing how my organization is testing the newest release of ArcGIS for Windows Mobile 10.1.1.  I’ve started to use the combination of the ArcGIS mobile software and my Trimble Juno to collect walking trail data on a fairly regular basis.  I call it ‘development’.  My wife calls it ‘playing around.’

Anyway there are three occupants of my house that think my only job in life is to take them outside and entertain them.  They shall remain nameless, but they each have four legs and a tail.  Whenever they see me putting on a jacket or lacing up my boots or tossing things into a backpack they go nuts jumping around, barking and causing general mayhem.  “It’s walkie time!”

Now, I don’t mind having them along on a walk or hike.  They are generally well behaved.  Unless they see a cat.  Or a deer, squirrel, rabbit, duck, goose, bird, grasshopper, snail, ant or, heaven forbid, another dog.  Other than that they’re fine.  Great company.

But managing three dogs and a data collector is all but impossible unless you’re an octopus.  I was in a quandary; how do I take these three amigos along on a data collection hike and get everything done?  Well today I had one of those ‘duh’ moments, as in ‘duh, why didn’t I think of this before?’  Just launch the collection job on the Juno, toss it into the zippered compartment in the lid of the rucksack (where it sits up high and the internal GPS receiver should have a fairly good ‘view’ of the sky), hitch up the dogs and start walking!

Launch the data collection job on the Juno and go!

When I started today’s walk I wasn’t 100% sure how this would work out.  I’ve carried other GPS receivers in pockets on the shoulder straps of rucksacks before, but I wasn’t collecting data with those units.  As I walked the trail my mind was working, thinking up all sorts of configurations and contraptions I might use to improve signal reception.  By the end of the walk I had myself convinced that what I needed was some sort of pole mounted external antenna.  I had it all figured out in my mind – about 4′ of quarter inch PVC with a metal plate epoxied to one end and a magnetic patch antenna (which Trimble makes for the Juno) stuck to that, with the antenna cord running down the tubing and connecting to the Juno inside my rucksack.  I could lash the PVC tubing to the side of the pack using the compression straps.  Absolutely, perfectly Rube Goldberg-esque!  Why it was so clever I’d be the envy of all the neighborhood GPS data collecting kids!

Rides nice and high in the pack where signal reception is pretty good!

At the end of the hike I pulled the Juno out of the bag and realized I didn’t need all the fancy gadgetry I was dreaming up.  The Juno did just fine collecting data while sitting snugly inside the rucksack lid.  It’s not a perfect setup by any means; the GPS signal still has to penetrate the bag material and my big fat noggin’ blocks a lot of the signals, and there’s no way to pause the streaming GPS collection to capture point data.  But for hikes like these where I’m just after trail alignments it works fine.  The Juno isn’t a survey-grade instrument anyway so some GPS track shift is to be expected, particularly under heavy tree canopy.

“Hey, we really like this data collection thing!”

So how did we do?  Not bad.  I need to clean the data up somewhat and I know that in the future if I’m set up for GPS data post-processing I’ll get better accuracies, but for now it’ll work.  As was said in that classic movie about chronic over-achievement, “That’ll do, pig.”


My Data Is More Accurate Because It Got Here First

Earlier this month Eric Gakstatter wrote a great little article for Geospatial Solutions Monthly titled “Nightmare on GIS Street: Accuracy, Datums, and Geospatial Data”.  Anybody who’s worked in the GIS field for more than a week has experienced the kind of issues Eric discusses.  Simply put, it is a rare event when data pulled from multiple sources fits together with any semblance of accuracy or precision.

For a small scale project (let’s say 1:20,000 or smaller) data fit is less important – at those smaller scales ‘eyeball close’ is often good enough.  The problem we face is that with modern GIS software the user is not stuck with a fixed scale like they were when everything was based on paper maps.  We live in the era of Google Earth, the era of high resolution satellite imagery, where everybody expects to be able to read the address number on their mailbox from space.  This newfound ability to zoom to any scale with just the scroll of a mouse wheel has highlighted a problem that the general public and, to be frank, many Geospatial and civil engineering professionals, were not aware of: the data doesn’t fit.

Eric highlights the most important factor driving this issue – the emergence of high-precision GPS-based field data.  In the past 10 years or so GPS data – data collected by survey-grade or SBAS*-augmented GPS units – has dramatically exposed the errors embedded in decades of historical geospatial data.

It’s not that this old data was collected badly – most of it was collected to established standards using the best resources and techniques available at the time.  In the old days it was paper maps, scaled aerial photos, compass headings, pace counts (or odometer readings for really long distances) and field notebooks.  Mapping grade accuracy was the accepted norm.  When you were using 1:24,000 USGS topo sheets as your project base an error of +/- 12 meters (the roughly 40-foot National Map Accuracy Standard for those map sheets) was good enough.  Formal surveys were expensive and time consuming, and only done if there was a strong business justification – usually to establish legal boundary definitions, accurately map out small project areas, or precisely position critical features.
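For anyone curious where those map-accuracy numbers come from, the old National Map Accuracy Standard is defined at publication scale, so it converts directly to a ground distance.  A quick back-of-the-envelope sketch (assuming the 1/50-inch-at-scale tolerance the standard applies to scales smaller than 1:20,000):

```python
# Convert a map-scale accuracy standard to a ground-distance tolerance.
# NMAS horizontal tolerance for scales smaller than 1:20,000 is 1/50 inch
# measured at publication scale.
def nmas_ground_error(scale_denominator: float, map_tolerance_in: float = 1 / 50) -> float:
    """Return the ground-distance tolerance in meters."""
    inches_on_ground = scale_denominator * map_tolerance_in
    return inches_on_ground * 0.0254  # inches -> meters

err_m = nmas_ground_error(24_000)
print(f"1:24,000 NMAS tolerance = {err_m:.1f} m ({err_m / 0.3048:.0f} ft)")
# prints: 1:24,000 NMAS tolerance = 12.2 m (40 ft)
```

The same arithmetic explains why 'eyeball close' was perfectly acceptable for decades – the published map itself couldn’t promise anything better.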

Today a Geospatial professional can collect data with handheld GPS units that easily achieve accuracies of +/- 15 feet with just SBAS augmentation, and centimeter-level accuracies with survey-grade RTK (real time kinematic) equipment.  Accuracy has improved by several orders of magnitude and the cost of acquiring that data has dropped dramatically.

While Eric focuses on the issues of datums and datum transformations, my experience is a little different.  I work at a major airport that has terabytes of historical CAD data and a warehouse full of historical project plans on paper, mylar or linen that go back to the early 1940s.  Virtually all of this data is referenced to a local grid system that was first established as a construction grid back in 1948.  At the time this grid was established it was never formally defined in reference to the local State Plane coordinate system.  In fact, the surveyors who laid it out committed the cardinal sin of not establishing a central meridian that is oriented to true north.  The entire grid is rotated a few degrees off of true north and that angle of rotation was never defined when the grid was established.  For years this was not a problem.  The airport was happy to exist as an ‘island’, floating on the face of the earth within its own little grid system.  However, when the airport started to expand dramatically in the 1960s the engineers realized they needed to start tying into properly defined coordinate systems like State Plane.  USGS and USC&GS survey control was extended onto the airport and several monuments were defined in both the local grid system and State Plane.  This allowed project engineers and surveyors to ‘extend’ State Plane control onto their project sites if required, but all design and construction work was continued in the local grid system.  To this point all design work was done using old manual drafting methods, so the levels of error inherent in these processes were acceptable for the time.

In the 1980s CAD (computer aided design and drafting) systems started to be used on more and more projects at the airport. Since our local grid is a simple x,y grid based on distances in feet measured from an origin point it was easy to lay out in CAD.  No need to worry about that pesky rotation.  Or, for that matter, grid-to-ground mismatches over long distances (like say on a 9,000′ runway).  But very soon some serious folks with serious money, like the Federal government, began asking for airport data in a ‘real’ coordinate system like State Plane.  A number of attempts were made to try to define the local grid as a true spatial coordinate system (with a tie to a known coordinate system, an origin point and a rotation and scale factor) but with no success.  As a result some very sloppy work-arounds were developed, most using a ‘local fit’ method – an engineer or CAD technician would snap local project data from one coordinate system to known good data in the other coordinate system; building corners, grid ticks, manholes, whatever they could find.  The problem was that a lot of ‘known good’ data turned out to be not so good.  Errors propagated and started to become uncontrollable.  The engineering staff worked around this by referencing in local project data (for example, a new taxiway segment) against a small subset of the overall CAD basemap for the airport.  This method tended to keep the coordinate system shift error within acceptable limits for the small project area, but when the data was added to the larger CAD basemap grid shift errors of up to 15′ were common.
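The ad-hoc ‘local fit’ snapping described above is really an informal version of a formal two-dimensional similarity (four-parameter Helmert) transformation: a rotation, a scale factor and a translation that can be solved by least squares from monuments observed in both grid systems.  Here’s a minimal sketch of the idea – all the tie-point coordinates, rotation and translation values below are invented for illustration, not our airport’s actual parameters:

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares fit of a 2-D similarity (4-parameter Helmert)
    transform:  X = a*x - b*y + tx,  Y = b*x + a*y + ty."""
    n = len(src)
    A = np.zeros((2 * n, 4))
    A[0::2, 0] = src[:, 0]   # X-equation rows
    A[0::2, 1] = -src[:, 1]
    A[0::2, 2] = 1.0
    A[1::2, 0] = src[:, 1]   # Y-equation rows
    A[1::2, 1] = src[:, 0]
    A[1::2, 3] = 1.0
    params, *_ = np.linalg.lstsq(A, dst.reshape(-1), rcond=None)
    return params  # a, b, tx, ty

# Hypothetical tie points in the local construction grid (feet).
local = np.array([[1000.0, 1000.0],
                  [9000.0, 1200.0],
                  [5000.0, 8000.0],
                  [1500.0, 7500.0]])

# Simulate their State Plane coordinates: a grid rotated a few degrees
# off true north, slightly scaled, and translated (made-up values).
theta, scale = np.radians(-2.5), 0.999985
a, b = scale * np.cos(theta), scale * np.sin(theta)
tx, ty = 2_200_000.0, 1_350_000.0
state_plane = np.column_stack([
    a * local[:, 0] - b * local[:, 1] + tx,
    b * local[:, 0] + a * local[:, 1] + ty,
])

# Recover the rotation and scale from the tie points alone.
a_, b_, tx_, ty_ = fit_similarity(local, state_plane)
rot_deg = np.degrees(np.arctan2(b_, a_))
scale_ = np.hypot(a_, b_)
print(f"rotation {rot_deg:.4f} deg, scale {scale_:.6f}")
```

With four or more well-surveyed tie points you get a single consistent set of parameters – plus residuals that immediately flag which ‘known good’ monuments aren’t actually good – instead of a different ad-hoc fit on every project.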

When my Geospatial group came on board in 2007 the coordinate system transformation issue quickly became one of our biggest headaches.  We were faced with creating an enterprise geospatial database from scratch using this legacy CAD data.  We desperately needed a proper spatial definition for this local grid system, something that would work in both our CAD and GIS systems.  Our engineering staff was happy to dump the issue in our lap.  In fact, when I interviewed for the job one of the senior engineers told me that if I was hired the first thing he wanted me to do was to “fix this damned State Plane thing.”

As we started talking with the engineering staff about the problem it became apparent they all had an institutional distrust of State Plane, or any spatial coordinate system for that matter.  They placed the entire blame for the data fit issues on ‘inaccuracies’ in the State Plane system – inaccuracies they couldn’t articulate.  In their minds all data prepared in their local grid system was golden.  After all, the local grid system was known.  It was proven.  It was simple.  They had built an entire world-class airport on it.  This goofy State Plane thing just got everybody confused and besides, when they did move their CAD data to State Plane it ‘got shifted all around’ and didn’t work anymore.  It might fit OK at one corner of the airport, but didn’t fit too well at the other.

We eventually got the grid transformation issue solved.  We attacked it from several directions and ended up with a very accurate local grid system projection file for use in both AutoCAD and ArcGIS, and a best-fit definition for use in Blue Marble (for bulk coordinate point conversions).  All of these definitions are based on the same survey data so errors are consistent and controllable from system to system.  We can hold transformation errors to about 0.2′ across the airport property.  And yet our engineering staff still retained a latent distrust of State Plane-based data.  The old institutional bias remained.  The perception ran deep that the old ‘known’ CAD data in the local coordinate system is somehow better, more accurate, than any newly collected GIS data.  There is a natural distrust of geospatial data; few civil engineers understand what geospatial data is, how it differs from CAD data and how geospatial data can be incorporated into planning and design projects.  If the data file doesn’t have a .dwg at the end of it they don’t like it.

We decided to approach the perception issue from two directions.  The first was a current high resolution, high accuracy orthophoto of the airport.  Using our newly developed projection file we were able to reproject the aerial from State Plane to the local grid system for use in AutoCAD.  For the first time ever the engineers and CAD staff had a single continuous coverage aerial image in their grid system that could be used as a base for project planning and drawing development.  Next, we acquired RTK-based data collectors that are capable of centimeter level accuracy.  We launched on an aggressive project to collect photo identifiable data – manholes, paint markings, slab joints, airfield lights – and provide the data in both grid systems as a tool to check current and historical data against.  From this we created a ‘trusted’ CAD file,  one the engineering group verified using their own sources.  Ever so slowly some of the doubters started to come around.  Once they started matching their legacy data against these new sources and saw the problems for themselves they began to do more aggressive data checks and not take CAD data, old or new, at face value.

Yet we continued to have perception problems.  The old-line engineering staff retained a deeply embedded distrust of GIS data in State Plane and our insistence that all legacy data be closely checked and adjusted if necessary.  Their reasoning actually sounded pretty good – “We spent decades building a world class airport with this CAD data and it all came together.  How can the data be wrong?”

Our GIS group didn’t really have a good response until some of the long time CAD staff complained that “it’s impossible to get as-builts around here.”  Our antennae went up and we started to do some digging on the issue.  Very quickly the problem revealed itself.  Our engineering staff rarely received true as-builts from the contractors that do construction on the airport.  The as-built delivery requirement is written into most contracts but is rarely enforced.  Contractors would regularly walk away from the as-built requirement and eat the contract penalty because they were too busy on other airport projects or the cost of developing the as-builts exceeded the monetary penalty.  If a contractor did deliver what they labeled as ‘as-built’ drawings they were seldom, if ever, checked for accuracy and completeness by the project manager.  The data was accepted at face value and often recycled for use on the next project.  Errors in spatial accuracy or attributes (pipe sizes, slab thicknesses, etc.) were unknowingly propagated from project to project as the project planners and designers used the same inaccurate data over and over again.  Down the line some errors became so glaringly obvious (like a stormwater line flowing uphill) that the engineering staff would hire engineering firms to go to the field and conduct existing condition surveys.  It was not unusual for the airport to hire the same firm that originally delivered the bad data to go back out and field verify what they should have originally delivered years before in the project as-builts!

But this only addresses half of the question.  The fact remains that this airport got built, and got built pretty darned well.  Was it all built on sloppy CAD data and it’s just a happy accident that everything fits?  Well, once we understood the as-built issue the rest of the story fell into place.  The engineering staff at this airport only does planning and initial design.  The final design work and construction drawings are done by contracted engineering firms.  Construction drawings are based on a number of sources – initial design, existing condition surveys and final design plans.  Construction drawings are what the project engineers and tradesmen take to the field to actually build against.  These are the drawings that get marked up as modifications are done in the field and it’s these drawings that should be used to generate the as-builts.  These engineering firms do a very good job of making sure everything fits within the designated project space, and any ties to existing systems – utility lines, roadways, buildings, etc. – are adjusted for in the final design or in the field.  But we are back to the old as-built issue.  Much of what was actually constructed in the field never makes it back into the airport’s master CAD drawing.

So the reality is that the airport got built, but the airport doesn’t have a complete and accurate record of what got built.

But I do get the sense that we are over the hump.  In the last two years we’ve noticed an improvement in the consistency of the spatial accuracy of the CAD data being delivered.  We still find a good number of attribute data issues (stormwater manholes labeled as sewer manholes, that sort of thing), but as far as spatial accuracy things seem to be greatly improved.  I put it down to our engineering staff’s increased use of known good data as a quality control check, increased emphasis on as-built delivery, a willingness to let us be part of the quality control check process, increased dialog between the CAD and GIS analysts and an increased dependence on our RTK data collectors to do quick field verification.  In fact, our engineering staff is now the #1 hands-on  user of our RTK systems.  The GIS group also has tight relationships with many of the major construction contractors doing work at the airport and we provide the coordinate system definition files and verified base data for use in project planning.  We also offer ourselves up as the data conversion experts and will help contractors verify that their data has been properly moved from one grid system to the other.  Over time our insistence on spatial accuracy has ‘leaked into’ the engineering business processes and workflows here at the airport.

We’ve shifted the paradigm just a bit and the momentum is in our favor.  Geospatial engineering 1, bad data 0.  That’s the way it should be.


*SBAS = Space Based Augmentation System.  There are several SBAS systems in use around the world.  These are satellites in geosynchronous orbit that transmit correction data for the US GPS satellite constellation.  The Wide Area Augmentation System (WAAS) is a set of satellites transmitting correction data over the US and the eastern Pacific and is maintained by the FAA and DOT.  If your GPS unit can receive and use these signals they will roughly double the accuracy of the position fix your unit provides.  You can read more about WAAS and SBAS on Wikipedia.

Busting Brush with ArcGIS for Windows Mobile

Huh?  What?

OK, at work we are testing a new software package – ArcGIS for Windows Mobile.  The name is a mashup of two software package names – ArcGIS (ESRI) and Windows Mobile (Microsoft).  Yes, it is a cumbersome name.  Really, really cumbersome.

It is nothing more than a lite version of ESRI’s ArcGIS software designed specifically to run on the Windows Mobile OS.  Never heard of Windows Mobile?  Don’t worry, you haven’t missed much.  It’s an operating system that saw its widest use on mobile phones.  Notice I didn’t say ‘smart’ phones, because Windows Mobile was (and still is) an awful operating system that made every piece of hardware it touched dumber.  Same for its users.  I’m sure when Steve Jobs was yelling at his software engineers during the early stages of Apple’s iOS development he forced them to use Windows Mobile phones so they clearly understood what  iOS would not end up looking like.

Microsoft has moved on and introduced their new Windows Phone OS that is based on the Windows 8 platform.  However, their Windows Mobile OS hangs on in a few interesting places.  It’s used a lot in ’embedded’ applications, computers running inside of other things that don’t look like computers.  For example, the Microsoft Sync system that controls just about everything in my new Ford F-150 is a version of Windows Mobile.

Another place Windows Mobile has achieved a lot of market penetration is in the surveying and GPS-based data collection market.  These are highly customized hardware systems that are more than mobile phones but less than full-fledged computers, robust devices dedicated to a narrow set of field data collection tasks.  Virtually all manufacturers of surveying and GPS-based field data collection systems use Windows Mobile – there’s really nothing else available today that meets the need.

Two of the devices in this picture run on Windows Mobile.
The rest are easy to use.

Because of this ESRI still develops a lot of software to run on the Windows Mobile OS (now up to version 6.5 and renamed Windows Embedded Handheld).  ArcPad, ESRI’s flagship field data collection package, has been running on the Windows Mobile platform for almost a decade.  A few years back ESRI released a version of its server software (called, naturally, ArcGIS for Server) that allowed the user to develop GIS applications that run on Windows Mobile handhelds and consume map services hosted on a local ArcGIS for Server instance.  We tested this at work but came away with the impression that the whole system required more care and feeding than we were able to provide.  In addition, our IT department was never willing to cooperate and provide a way to authenticate these mobile devices (Trimble Junos) on our organization’s domain so they could ‘see’ our internal GIS software servers.

Instead, we’ve spent the last year developing mobile GIS applications to run on Apple iOS devices – iPads and iPhones – and leveraging the new hosted map service concept available through ESRI’s ArcGIS Online cloud service.  The entire system works amazingly well and our users like the idea of collecting GIS data using an iPad (plus they can play Angry Birds in their down time).  But as we tested and developed in this environment we quickly bumped up against another roadblock – again, our IT department.  They repeatedly refuse to approve the purchase of iPads for the user base.  No real explanation – they say no just because they can.  We were stuck and desperate for an alternative.

This all changed in late February with the release of ArcGIS for Windows Mobile 10.1.1.  The #1 change with this new version is the ability of the software to connect to an ArcGIS Online subscription account and use a hosted (cloud) feature service as a data layer.  We are no longer forced to use our local instance of ArcGIS for Server as the data source.  Since the data is hosted in the cloud all we need is a wi-fi connection and an ArcGIS Online subscription account to get to our data.  The need to have our mobile devices authenticate on our organization’s network is gone.  Our dependency on our local IT department is severed.

The other big benefit that ArcGIS for Windows Mobile brings is the ability to do disconnected editing when there is no wi-fi signal available.  This was always a concern with the iOS devices, which require an ‘always on’ internet connection when being used to collect data.  ArcGIS for Windows Mobile works differently in that the application places a copy of the map layer’s database directly on the mobile device when it is first downloaded from ArcGIS Online.  In the field a wi-fi connection to the internet is not necessary – all the newly collected data and edits are stored on the mobile device.  When the user gets back under wi-fi coverage they can do a synchronization of this new data with the master database stored in the cloud on ArcGIS Online.  The new data is pushed up to the master database via a wi-fi connection to the internet – any wi-fi connection to the internet; in the office, in Starbucks, in McDonalds, wherever they can get a signal.  Simple, slick and robust.
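Conceptually, disconnected editing is just a local edit cache that replays against the master database once connectivity returns.  Here’s a toy sketch of that pattern – this is not ESRI’s actual API; the class, method names and feature fields are all invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class MobileCache:
    """Toy model of a disconnected-editing cache: edits accumulate
    locally on the device and are replayed on the next sync."""
    pending: list = field(default_factory=list)

    def collect(self, feature: dict) -> None:
        # In the field: store the edit locally, no network needed.
        self.pending.append(feature)

    def sync(self, master: list) -> int:
        # Back on wi-fi: push all queued edits to the master database,
        # then clear the local queue.  Returns the number pushed.
        pushed = len(self.pending)
        master.extend(self.pending)
        self.pending.clear()
        return pushed

master_db = []           # stands in for the hosted feature service
cache = MobileCache()
cache.collect({"type": "trail_point", "x": -84.36, "y": 34.02})
cache.collect({"type": "trail_point", "x": -84.35, "y": 34.03})
print(cache.sync(master_db))  # prints: 2
```

The nice property of this pattern is exactly what makes it work in the woods: the collect step has no network dependency at all, and the sync step can happen hours later from any connection.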

But why Trimble Junos?  While our IT department balks at the purchase of iPads, they have no problem with us purchasing Junos even though the current generation of Junos cost about twice as much as an iPad!  IT views them as dedicated field survey devices and allows us to buy as many as our budget can support.  Over the past two years we’ve purchased seven of them so we have plenty of hardware available to put into the hands of our users.

The Trimble fairy barfed on my desk

So like any good Geospatial weenie I figured I needed to test my applications before unleashing them on the unsuspecting public.  To acquaint myself with the workflows embedded in ArcGIS for Windows Mobile I set up a project to collect data during one of my favorite activities – hiking.  Then it was time to take ArcGIS for Windows Mobile to the wilds of suburban Atlanta for the ultimate test: can a befuddled 56 year old make sense of this mobile thing and actually collect useful data?

Everything I need to survive – water, food, first aid kit and
ArcGIS for Windows Mobile on a Trimble Juno

So how did everything work?  Pretty darned good.  In fact, better than I expected.  The simplified workflows in the ArcGIS for Windows Mobile interface make collecting data almost foolproof.  I only had a chance to capture about two miles of trail information and some points of interest, but it was enough to convince me this mobile GIS interface will work just fine for most of our user base.  Of course it’s an imperfect world, and so is this application.  I’d love the ability to collect photos while remaining in the GPS data streaming mode, and being the GPS geek that I am I’d like a better GPS performance interface (similar to what you get with ArcPad or Trimble’s TerraSync), but I also understand this package is designed for simplified data collection by non-GIS personnel, so I can live with the lack of GPS performance data.

Yellow SO clashes with my woods gear!

Of course all this simplification also serves as a straitjacket.  What you give up with the ArcGIS for Windows Mobile interface is the ability to make on-the-fly changes to your project – add new data types, change symbology, adjust GPS performance parameters, do complex searches, buffers, etc.  It’s a trade-off; reduced complexity = fewer options.  It’s a trade-off my organization can live with.

ArcGIS for Windows Mobile up close and personal.
A simple interface that works well on devices with small screens.

The REAL elephant in the room is the overall cost of this capability.  ArcGIS for Windows Mobile is an enterprise level solution for enterprise level projects.  As configured this project relies on an ArcGIS Online subscription account as the data hosting platform and ArcGIS Mobile deployment licenses tied to a very expensive ArcGIS enterprise license.  This equates to about $55,000 in licensing costs (toss in another $1,200 or so for the Trimble Juno).

Obviously this is not for the little guy.  But it should be, and it can be!  Let’s say you are a Geospatial geek like me.  Right now, today, if you participate in the ArcGIS for Home Use Program you get one free user license for  ArcGIS for Windows Mobile.  The software package includes a toolset that allows you to stage all your data on your local computer instead of on ArcGIS for Server or in the ArcGIS Online cloud.  You don’t get to wirelessly update your data – you have to connect the device to your computer via USB to do the synchronization process – but the rest of the workflows are the same.  A great (and cheap) way to test this capability for yourself.  You just have to go find a compatible Windows Mobile device (check eBay, there’s plenty for sale out there).

Looking into the future I see ESRI opening the ArcGIS Online subscription program up to market segments that don’t need and can’t afford to buy into the service at the current enterprise-like levels.  It’s almost an inevitability.  There’s a lot of emerging competition in the cloud GIS services arena, and companies leveraging some of the better Open Source GIS tools will start to provide low cost cloud hosting services in direct competition with ESRI.  Of course ESRI has the market share and clout, but the ‘cloud’ is a huge space, there’s a huge potential market, and ESRI can’t control it all.  Price competition will soon have its inevitable impact.  Plus, ESRI is pushing ‘the cloud’ like it’s the Second Coming and at some point they will realize their service availability will have to mesh with their message.  As they move more and more capability to the cloud ESRI will have to start offering low cost services for the little guy.

Maybe not this year, maybe not next year, but soon.

In the meantime use what’s available and get moving on Mobile!