Aviation GIS

Time for a little venting and praise. Our team at The World’s Busiest Airport does pioneering work on an amazingly broad range of geospatial issues and, frankly, we don’t get the recognition we deserve from our own airport leadership. It’s shameful that after over eight years of groundbreaking work and international recognition from the aviation GIS community, we still have senior management that thinks all we exist for is to make pretty paper maps. You can (repeatedly) lead a horse to water, but you can’t make it drink, even when it’s dying of thirst.

So let me toss an atta-boy to one of the best geospatial teams I’ve had the honor and pleasure to work with in over 30 years in the industry. You guys are outstanding, and I’m particularly pleased with how everyone on the team has grown over the past eight years. Nobody stays static – you are all pushing the boundaries and breaking through to find new and innovative ways to support the enterprise.

Here’s a (very) small sample of the kind of work our team does. None of this is pie-in-the-sky. We do this kind of stuff every day.


Great job team! This is one grumpy old topographer that truly appreciates you.

– Brian

Terrain Analysis, Soviet Style

Yesterday the folks over at Wired posted a really neat article on the Soviet Union’s military mapping program during the Cold War titled Inside the Secret World of Russia’s Cold War Mapmakers.

Russian Map makers

The author seems to imply that the Russians were better at this sort of thing than the US was. Oh pish-posh. I spent most of my 23-year career in the US Army doing this very same thing – compiling what we called terrain intelligence (now called geospatial intelligence) and placing it onto annotated maps or, more commonly, map overlays. This was GIS long before there was GIS.

But while the former Soviets have done a wholesale dump of their formerly classified terrain intelligence data onto the commercial market to make a few bucks, the US and NATO studies are likely still classified and remain under lock and key.

The Soviets and the US (and our NATO partners like the British and Germans) approached the task in the same way – use every available source, from readily available tourist maps to ‘technical intelligence’ (aka, spy satellites) and on-the-ground observers (aka, spies) to compile extremely detailed map-based studies. In my field we tended to concentrate on factors that would directly impact Army ground operations, things like soil conditions, vegetation types, ground slope, road and bridge capacities, building densities in cities and towns, airport and landing zone data, river and stream conditions and much more.

What made the Soviets’ job so much easier is that the West’s open societies gave them virtually unrestricted access to accurate, detailed mapping data compiled for civilian use. A Soviet military attaché assigned to their embassy in Washington, DC could simply walk out of his embassy compound and stroll a few blocks to one of several well-stocked map stores in the US capital. This included, I’m sure, the excellent map holdings over at the National Geographic Society. The US Geological Survey’s map store was just a short drive away in Reston, VA, and I’m sure the Soviet Embassy was one of its best customers.

This military attaché’s counterpart in the US Embassy in Moscow couldn’t do the same thing. The Soviets simply didn’t sell or give away maps of their territory. Most mapping data, even the most innocuous, was considered classified. That meant we had to get the data some other way. Of course I’m sure we did our share of bribing, cajoling, blackmailing and stealing to get copies of their maps (remember now, this was a cold war; we weren’t playing patty-cake), but we also very quickly developed our ‘technical intelligence’ capabilities – again, spy satellites – that allowed us to accurately map vast areas of the Soviet Union and her client states from space. It is said that the Defense Mapping Agency was the single biggest consumer of spy satellite imagery during the Cold War.

So dear reader, rest assured that while the Soviets were spying on us to develop highly accurate map and geographic intelligence data we were doing the exact same thing to them. We just haven’t seen the need to sell our intelligence on the open market to make a fast buck.

– Brian

Army Geospatial

My good friend Bill Farr sent me this YouTube link yesterday showing US Army Geospatial Engineers in what appears to be a planning exercise in preparation for deploying to Africa as part of the Ebola relief effort. The video was filmed at the Army Geospatial Center on Fort Belvoir, VA (the old Topographic Engineering Center) and it shows enlisted geospatial analysts running through different scenarios.

I post the video mainly for comparison. A lot of my civilian GIS colleagues really have no idea just what geospatial engineers in the US military do. This is a short but good snapshot of the kinds of projects Army geospatial engineers work on day-to-day.

– Brian

The 2014 ESRI UC

Well kiddies, I’m back from San Diego and the 2014 ESRI International User Conference. This is my third conference in five years, and it’s starting to feel like Groundhog Day.

Now please, do not get me wrong – it was a very good conference and in some respects it was a great conference. My team and I learned a lot and picked up plenty of new information.  But on the plenary and technical session side it was the same drumbeat we’ve been hearing for the past several years – ‘rich’ web maps and apps (could someone at ESRI please tell me just what ‘rich’ means?), ArcGIS as an integrated system and not just a bunch of parts, ArcGIS Online, ArcGIS Online, ArcGIS Online, yadda, yadda, yadda.  In the Solutions Expo (i.e., the vendor displays) it was the same vendors, many in the same locations, showing the same stuff, giving the same spiel, etc.

You know, Groundhog Day. C’mon ESRI, mix it up a bit. It’s getting a little stale.

OK, that’s out of the way.  Let’s change tack. If you listened closely to the presentations and have been paying attention to what ESRI’s been doing in the past few months you were able to tease out some great information regarding new and emerging capabilities. Let’s start with one of ESRI’s current flagship products, ArcGIS Online.

If anyone reading this blog still doubts that ESRI considers ArcGIS Online and web mapping a central part of its GIS universe, this UC would have set you straight. The message was obvious and unmistakable, like a croquet mallet to the head. ArcGIS Online is here to stay, is only going to get bigger, and if you are going to play in the ESRI sandbox you need to know (and buy into) ArcGIS Online. I didn’t attend a single ESRI session – whether it was the plenary or a one-on-one discussion with a product expert – where the topic of ArcGIS Online integration didn’t pop up early and often. Most vendors I talked to – and certainly all of those that ‘got it’ – had ArcGIS Online integration as a key selling point for their product or service. Heck, even IBM with their painfully complex work order management program called Maximo ‘got it’ and touted how I could now ‘easily and seamlessly’ integrate ArcGIS Online feature services with Maximo. Anybody who knows Maximo knows it doesn’t do anything ‘easily and seamlessly’. I don’t really think Maximo can use hosted feature services from ArcGIS Online, at least not yet. The REST endpoints I saw Maximo consuming looked like dynamic map services. But at least the IBM sales team took the time to read the memo from Redlands.
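(A quick aside for the GIS techies in the audience: if you’re ever unsure what kind of service an application is really consuming, the REST endpoint itself will tell you. Here’s a minimal Python sketch – the URL is a placeholder and I’m only relying on the standard ArcGIS REST JSON fields – that separates a hosted feature service from a dynamic map service.)

```python
# Minimal sketch: ask an ArcGIS REST endpoint what it is. The URL is a
# placeholder; any ArcGIS Online or ArcGIS Server service follows the
# same /FeatureServer or /MapServer pattern. Requires the requests library.
import requests

def describe_service(url):
    """Print whether a service URL is a hosted feature service or a
    dynamic map service, plus what it reports about itself."""
    meta = requests.get(url, params={"f": "json"}).json()

    path = url.rstrip("/").lower()
    if path.endswith("featureserver"):
        kind = "feature service (query/edit the actual features over REST)"
    elif path.endswith("mapserver"):
        kind = "dynamic map service (server-rendered map images)"
    else:
        kind = "something else entirely"

    print("Service type :", kind)
    print("Capabilities :", meta.get("capabilities", "n/a"))
    print("Layers       :", [lyr.get("name") for lyr in meta.get("layers", [])])

# Example call (hypothetical URL):
# describe_service("https://services.arcgis.com/<org>/arcgis/rest/services/Assets/FeatureServer")
```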

ArcGIS Online

The ArcGIS Online product space was the single biggest product presence ESRI had set up in the Expo. It was huge, and a reflection of the importance ESRI places on the product.

ESRI’s incessant chatter about ArcGIS Online would have fallen flat with long-time users of the product if ESRI had not done a product update just a few weeks ago. The July update of ArcGIS Online included a number of significant improvements and new features that signaled to those who know the product that ESRI is serious about ArcGIS Online being more than just a toy for making simple web maps. The upgrades in system security certification, administration tools, data management, data integration, analysis and cartographic tools show ESRI has full confidence in ArcGIS Online as a serious enterprise tool.  I’ll admit that a few years ago I was having doubts that ESRI would be able to pull this off. Today I’m convinced that ArcGIS Online and web mapping represent the most significant development in geographic content delivery since the invention of the printing press.

This year I spent more time wandering the Solutions Expo hall than I did attending the technical sessions. In past years there were sessions I felt I just couldn’t miss, but this year my technical needs were somewhat less well defined and I wanted to spend more time speaking with the vendors and visiting the ESRI product islands. It was time well spent.

One of the focuses (foci?) of this year’s plenary presentation was the issue of ‘open data’. Open data is nothing more than data that is available free to any user. Open data can take any format (though it is understood that for data to be truly ‘open’ it needs to be available in a non-proprietary format). For decades the federal and state governments have made GIS data available in a variety of GIS formats. A good example of this is census data. The data for most censuses held in the last 40 years or so is freely available in GIS format from the US government. It’s easy to pull that data into a GIS system and do all kinds of analysis against it. In fact, census data is one of the first data types that new GIS students learn to analyze in their core classes. In the same vein, many states make state-specific GIS data available from freely accessible data servers. Things like elevation data, transportation network data, hydrology, landcover and more have been commonly available for years.

However, it was often difficult for smaller government entities – cities, counties, or regional authorities – to share out their public data because of the IT and GIS management overhead involved. Let’s face it, nobody makes money sharing out free data so there’s little incentive to put a lot of resources behind the effort. As a result a lot of currently available open GIS data is pretty stale. ESRI is backing a push to pump new vitality into the sharing of open data via the new Open Data tools embedded in ArcGIS Online (see, there it is again). OK, I admit that ArcGIS Online isn’t exactly free to the organization looking to share out data, but if you do happen to be an ArcGIS Online subscriber then setting up an Open Data site is fast and easy. One of the great concepts behind ESRI’s effort is that the organization is really sharing a feature service from which an Open Data user can extract the data. This means that the data should not suffer from ‘shelf life’ issues; as long as the data behind the feature service is regularly updated the Open Data user will have the latest and greatest representation of what’s being shared.
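For the technically curious, here’s roughly what that looks like from the consumer’s side. An Open Data listing is just a hosted feature service underneath, so anyone can pull the live data straight from the layer’s REST query endpoint. This is a sketch only – the URL is made up, and a big layer would also require paging through the results.

```python
# Sketch of pulling the live data behind an Open Data listing directly from
# its hosted feature service. The URL is a made-up placeholder; substitute
# the layer endpoint shown on the Open Data item page. Requires requests.
import json
import requests

LAYER_URL = ("https://services.arcgis.com/<org-id>/arcgis/rest/services/"
             "RoadCenterlines/FeatureServer/0")

params = {
    "where": "1=1",       # no filter - return everything
    "outFields": "*",     # all attribute fields
    "outSR": 4326,        # WGS84 lat/lon
    "f": "geojson",       # GeoJSON straight out of the REST API
}

resp = requests.get(f"{LAYER_URL}/query", params=params)
resp.raise_for_status()
data = resp.json()
print(f"Pulled {len(data['features'])} features - as fresh as the service itself")

# Drop a local copy for use in a desktop GIS
with open("road_centerlines.geojson", "w") as f:
    json.dump(data, f)
```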

On one of my laps around the Expo floor I stopped at the Open Data demonstration kiosks set up in the ArcGIS Online area and talked through the concept and implementation with one of the ESRI technical reps. At first I didn’t think my organization would have much use for this feature, but after thinking about the types of data we routinely pass out to anyone that asks – road centerlines, jurisdictional boundaries, parcels, etc. – I began to think this might be of some value to us. In about 15 minutes she helped me set up my organization’s Open Data site and share some common use data out to the public. If for no other purpose, an Open Data site could lift some of the data distribution burden off of us.

OpenData1

The new Open Data tab in ArcGIS Online allows the administrator to configure an open data page from which the organization can share data with the public

Another lap took me to the US Geological Survey information tables. The USGS tables were set up in the Federal Government area, and while most of the agencies suffered from a serious lack of attendee interest (I pity the poor souls who had to man the Veterans Administration table), the USGS tables were doing a good business. The USGS reps were stirring the pot a bit. It seems that there’s a move afoot in the USGS to do away with the National Atlas. I’m not sure yet how I feel about this move. Clearly elimination of the National Atlas is a cost cutting move (and the USGS makes no bones about it on their website), but if the same digital data can be made available via other portals, like the National Map portal, then this may all be a moot point.  Still, this is the National Atlas and as such should be a point of pride not just for the USGS but for the nation. If for no other reason than that I’d keep it alive. The USGS reps working the tables were clearly pro-National Atlas and were running a petition campaign to garner support to keep the program going.

I also spent some time discussing the new US Topo series of maps with the USGS reps. If you’ve read any of my posts on the US Topo maps you know that from a cartographic perspective I think they stink. The map base – imagery – is poorly selected and processed, and the maps look like crap when printed out. That’s the basic problem; the US Topo series is compiled as though the intent is to print the maps out at full scale for use in the field. They carry full legends and marginal data. However, it’s clear they were compiled specifically to look best on a back-lit computer screen. When printed out the maps are dark and muddy, and the image data is difficult to discern. When I brought this up to one of the USGS reps she turned her badge around to indicate she was speaking for herself and said, “I agree completely, and we get a lot of complaints about the visual and cartographic quality of these maps.” Here’s hoping the USGS doesn’t go tone-deaf on this issue and takes steps to improve the quality of the US Topo series. She also let me know that there’s growing support within the USGS to provide the US Topo series maps not just in GeoPDF format but also in GeoTIFF. This would be a great move, especially if the USGS provided them in a collarless format for use in systems like ArcGIS for Desktop.

I took the time to mosey over to the Trimble display area and talk to a rep about my favorite Trimble issue – the lack of a Google certified version of Android on their very capable (and very expensive) Juno 5-series of handheld data collectors. I’ve bugged Trimble so much about this that I have to assume my picture is hanging on a dartboard in the executive conference room at Trimble’s headquarters. I got the same response out of the Trimble rep that I’ve been getting for about a year now, “We hear it’s coming but we don’t know when”. Yeah right.

After I left the Trimble area I found myself a few rows over at the table of a company I’d never heard of before, Cedar Tree Technologies. It was just a couple of guys with a couple of pieces of hardware, but my eye caught something that looked a lot like a beefed up smartphone and the guys at the booth were eager to give me their story. It seems that Cedar Tree Technologies is a brand new spin-off of Juniper Systems, a company that’s been making rugged handheld systems for the surveying and GIS community since the 1990s. Cedar Tree’s specific focus is on the Android OS, and each of the devices on display was running a Google certified version of Android 4.2. The device that caught my eye was the CT4. The CT4 is what it looked like – a ruggedized smartphone that runs on Android. It looked like an OK product with very good specs – a quad core processor running at 1.2 GHz, a 4.3″ Gorilla Glass display, an 8 MP camera, a 3000 mAh battery, Bluetooth and an IP68 rating.  It did have a few drawbacks – only 16 GB of system memory and a 3G (not 4G or LTE) cell radio, and I forgot to ask if it was fully GNSS capable. But here’s the kicker – this damned thing is only $489! That’s roughly one third the price of the baseline Juno 5, yet it looks like it offers three quarters or more of the Juno’s capability. You can bet I’ll be contacting Cedar Tree about borrowing one of these for an evaluation.

Cedar Tree1

 He’s smiling because he thinks he’s got Trimble beat in the Android space. I think he might be right!


Cedar Tree2

 The Cedar Tree Technologies CT4. Perhaps the first truly usable Android-based field data collector

OK, I don’t want to get too far into the weeds on other topics of interest, so let me just do some quick summaries:

  • I talked to Trimble, Leica, Carlson, Juniper and Topcon reps about their software offerings. All plan to remain tightly wedded to the Windows Mobile 6.5 OS (a.k.a. Windows Embedded Handheld), which hasn’t had any significant updates in over two years. Many of the reps indicated that the mobile version of Windows 8 still has some issues and they are very reluctant to move in that direction. So it looks like the industry will be stuck with an archaic and moribund OS for some time yet.
  • What the world needs, in addition to a good 5¢ cigar, is a good spatially based document management system. Lord knows my organization is in desperate need of something like this. I saw only one document management system vendor at the show, and their system has a strong dependency on ArcGIS Online (there it is again). I think this is a market area that is ripe for exploitation. The tools are now in place with ArcGIS Online and reliable cloud services to bring this type of functionality quickly and cheaply to an enterprise and I’d love to see some new developments in this area. Pleeeeze!
  • I attended a very interesting working session where the GIS team from Pierce County, WA discussed their adoption of enterprise GIS and ArcGIS Online. I felt like I was sitting through a presentation I had written about my own team’s struggles and experiences. Like us, Pierce County faced a lot of push-back and foot dragging from their IT department on implementing IT-dependent GIS initiatives, and productivity among the county’s field maintenance crews suffered. Here’s my point – for every GIS/IT success story I’ve heard or read about I’ve heard an equal number of stories where thick-headed IT departments get in the way of a successful GIS initiative. If you are IT and don’t fully support the GIS initiatives in your organization then watch out. You will wake up one day soon to find you’ve been replaced by a cloud based service. It’s happened in my organization and it’s happening across the industry.
  • How come I’d never heard of the Association of American Geographers? I’m not joking. I’ve been in this industry for over 30 years and have been attending trade shows for all of that time. I’ve heard of ASPRS (the American Society for Photogrammetry and Remote Sensing) and others, but never the Association of American Geographers. Seems like a good organization. May have to join!
  • Like a good 5¢ cigar, the world also needs more quality geospatial sciences master’s program options. I talked to a number of the universities set up at the conference and while they all seemed to be offering quality programs, too many of them are targeted at the traditional full-time student, someone who heads into a master’s program directly from a bachelor’s program. For example, here in Atlanta Georgia State University offers what looks like a cracking good geosciences master’s program with a focus on geospatial science, but it’s structured so that all of the coursework is classroom focused and only offered during working hours. For someone making a living in the real world this type of program really isn’t feasible. We need more fully online options and more local colleges and universities offering evening and weekend programs.
  • Let’s get back on the ArcGIS Online horse and discuss a very interesting service that the developers tell me is under serious consideration. One of the gripes that users of Collector for ArcGIS have is the lousy positions provided by the GPS/GNSS receivers on handheld units. Keep in mind that this is not a Collector issue, but a hardware issue. One of the improvements ESRI is looking at is a subscription-based correction service for use with Collector. It will probably work like this – collect a point or a series of vertices, and when they are synced with the ArcGIS Online server the points first pass through a correction service before being passed on to ArcGIS Online. This will likely be a single base station correction solution, but it could offer sub-meter accuracy if you’re using a data collector with a more advanced GPS/GNSS receiver (sorry, this will not work with your iPhone or Android smartphone because of the low-quality receivers they use). Sort of like on-the-fly post-processing (see the conceptual sketch just after this list). A very interesting concept, and it could move hardware manufacturers like Trimble, Topcon and Leica to put out low(er) cost Android-based field data collectors with improved receivers.
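Here’s a very rough conceptual sketch of that single base station idea. Real differential correction works on raw pseudorange and carrier phase observables, not finished positions, and ESRI hasn’t published any details, so treat this strictly as an illustration of the concept – every name in it is made up.

```python
# Conceptual sketch only: position-domain single-base-station correction.
# A real correction service would work on raw GNSS observables, but the
# idea is the same - measure the base station's error at each epoch and
# apply the opposite shift to the rover points collected at that time.
from dataclasses import dataclass

@dataclass
class Point:
    t: float   # epoch (seconds)
    e: float   # easting (feet)
    n: float   # northing (feet)

def correct(rover_points, base_measured, base_known):
    """Shift each rover point by the base station's apparent error at the
    nearest-in-time base epoch."""
    corrected = []
    for p in rover_points:
        b = min(base_measured, key=lambda m: abs(m.t - p.t))  # time match
        de, dn = base_known.e - b.e, base_known.n - b.n       # base error
        corrected.append(Point(p.t, p.e + de, p.n + dn))
    return corrected
```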

Before I go, some kudos:

  • To the City of San Diego. I can’t think of a better place to hold this conference
  • To GIS, Inc for a wonderful dinner cruise with NO sales pressure (Mrs. OldTopographer just loved it!)
  • To Andrew Stauffer from ESRI and fellow BGSU grad. Andrew provided invaluable support to our organization over the past few years while we worked through our ArcGIS Online implementation issues. I finally got to meet him in person and thank him
  • To Pat Wallis from ESRI who proved you can hold down a serious job and still be a 1990’s era ‘dude’
  • To Courtney Claessens and Lauri Dafner from ESRI who entertained all of my dumb questions about Open Data
  • To Kurt Schowppe from ESRI. I’m sure my pool party invite got lost in the mail <grin>
  • To Adam Carnow, for putting up with all of my other dumb questions and requests
  • To all the great people I bumped into at random and had wonderful conversations with

And finally, it was good to see my alma mater making a showing at the Map Gallery exhibition. Go Falcons!

BGSU poster

– Brian


The Littlest Geo-Geek

So I came home from work yesterday, sat down at my computer and was surprised to find this:

ArcCatalog

No less than eight instances of ArcCatalog running. Huh? When I left for work in the morning the only thing open on the computer was Chrome. Did some eeeevil ESRI trojan take control of my desktop?!

Then I remembered – this little two-year-old cutie spent the day with Grandma:

Helen

And this cute little tornado takes every opportunity to search every computer, tablet or iPhone she gets her hands on for this:

The Frozen movie poster

I applaud her early interest in investigating the mysteries of the geosciences and exploring ESRI’s software offerings. It warms the heart of this old geo-geek grandfather.

…or could it be she just made a series of random mouse clicks while singing (for about the 20th time that morning, according to Grandma) “Let it go, let it goooooo”?

Grandpa prefers to think she’s a budding geo-genius.

– Brian

Some GNSS Musings

First things first – GNSS is the new GPS.  Actually, GPS is a subset of GNSS. GPS stands for Global Positioning System; GNSS stands for Global Navigation Satellite System. For decades folks referred to any and all satellite navigation systems as GPS, and for good reason – the US Global Positioning System was the only game in town. However, the term ‘GPS’ properly describes just the global positioning system established and maintained by the United States. Now that the Russian GLONASS system is operational, and systems from the European Union, China and perhaps other players (India?) are coming on-line, the term for ALL space-based satellite navigation systems has shifted to GNSS.

OK, now that that’s out of the way.

I spent the last two days in training finally learning how to run Trimble’s TerraSync and Pathfinder Office software.  We’ve had TerraSync and Pathfinder Office software in our office for years, but never got any formal training on how to use either package.  The training was actually very good, and I can see now why a lot of surveying and engineering firms prefer TerraSync over GIS-centric packages like ESRI’s ArcPad.

The class was taught by one of the training and support personnel from our local vendor, NEI, and he did a great job.  Woven throughout the class were discussions about GPS, datums, coordinate systems and issues like unanticipated coordinate system shifts due to improper datum selection or datum mismatches between the software and virtual reference station (VRS) datums.  We spent a good deal of time in the field actually experiencing the impact of changing datum selections in the software (for example, the shift seen when selecting NAD83 vs. NAD83 HARN).
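If you want to see how big that kind of shift is at your own site, a few lines of pyproj will show you. This is a sketch with some assumptions baked in – EPSG:4269 for NAD83 and EPSG:4152 for NAD83 (HARN) – and the number you get depends entirely on which transformation and grid files your PROJ installation has; with none installed it can quietly fall back to a do-nothing ‘ballpark’ transformation.

```python
# Sketch: estimate the NAD83 -> NAD83 (HARN) shift at a point of interest.
# Assumes pyproj is installed along with the relevant transformation grids;
# check transformer.description to see which operation PROJ actually chose.
from pyproj import Geod, Transformer

lon, lat = -84.43, 33.64   # roughly the Atlanta airport

transformer = Transformer.from_crs("EPSG:4269", "EPSG:4152", always_xy=True)
lon2, lat2 = transformer.transform(lon, lat)

# Turn the coordinate difference into an approximate ground distance
geod = Geod(ellps="GRS80")
_, _, shift_m = geod.inv(lon, lat, lon2, lat2)
print(transformer.description)
print(f"Approximate horizontal shift: {shift_m:.3f} m ({shift_m * 3.28084:.2f} ft)")
```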

So this class got me thinking again about GNSS and data quality and accuracy…

In the olden days, like before the turn of the century, these datum shifts generally didn’t concern GIS folks.  The shifts introduced by any datum mismatch were well within most folks’ error budgets.  In most cases we were ecstatic when GPS got us within a few dozen feet of the features we were collecting.  When the accuracy standard of the 1:50,000 topographic map you were using as a base was +/- 50 meters, having GPS points a dozen or so feet off was no big deal.  In fact, we were tickled pink to be able to get that level of autonomous GPS accuracy.

Today things are much different. Improved GNSS software, better antenna designs, the open availability of reliable GPS and GLONASS signals and the wide availability of GPS augmentation services like WAAS and local virtual reference stations (VRS) mean that these systems are capable of sub-meter, often sub-foot, accuracies. That’s just for GIS data collection.  Survey-grade GNSS systems are capable of real-time accuracies to tenths of a foot. Suddenly datum shift errors of even one foot become very, very important for high precision data collection and surveying.

One of the biggest problems people in my line of work face is a general lack of understanding of GNSS in the GIS and civil engineering fields. In particular, many professionals lack up-to-date training and working knowledge of GNSS system capabilities, limitations and application to their line of work.  Evaluating and planning for the potential impact of things like datum shift on GNSS-based surveys or data collection projects is something they can’t comprehend, largely because they haven’t been trained on it and, perhaps most important, haven’t been forced to consider it when planning or managing a project.

Sadly, I’ve met far too many people with a GISP certificate hanging on their wall who couldn’t tell me the fundamental difference between the NAD 27 and NAD 83 datums, and I have yet to meet a single civil engineer who is not also a licensed surveyor who could explain to me the importance of knowing the datum his or her CAD drawing coordinate system is based on.  Yet both of these groups – the GIS professional and the civil engineer – have a fundamental interest in controlling the overall accuracy and precision of their work.  For the GIS professional it’s a matter of reputation and trust.  For the licensed civil engineer it could be a matter of putting his or her work at legal risk.

If you work in the GIS field you cannot call yourself a GIS professional unless you have a fundamental understanding of datums, coordinate systems and the importance of applying this knowledge to your workflows.  A strong knowledge of datums and coordinate systems is one of the foundational building blocks of our profession, and since so much of what we do these days is GNSS-based, it is equally important to have a strong understanding of the impact different datum selections can have on the spatial quality of our data.

I’ve said before in this blog that those GIS ‘professionals’ who consider GIS to be little more than making web maps are headed for extinction. Here in the Atlanta metro area it would take me about an hour to hire a busload of web developers who can make web maps (and this includes time out for a stop at Starbucks). If that bus accidentally rolls into the Chattahoochee River and everybody drowns I can get another busload just as fast. However, the number of GIS professionals I’ve run into who can tell me the anticipated shift between State Plane (NAD83) and State Plane (NAD83 HARN) coordinates wouldn’t fill the first row of that bus.

For the civil engineering community the issue is less obvious but just as critical. GNSS-based surveying and data collection is becoming the norm on many projects. It is faster, cheaper and just as accurate as conventional surveys under the right conditions. This means civil engineers will be incorporating more and more GNSS-based data into their designs and relying on GNSS for jobsite control, machine control and as-built data verification. While the task of establishing project control, setting up survey equipment configurations and managing project survey requirements will fall to the project surveyor, the project engineer still has overall responsibility for ensuring things are built to design.  If the project stakeout is a few feet out from the design drawings it may not be because the instrument operator has a hangover; it may be because the design work was done in one datum and the GNSS survey unit is set to another. Being able to identify a potential datum shift problem is a key skill for civil engineers working in today’s GNSS-based world.

– Brian

My Data Is More Accurate Because It Got Here First

Earlier this month Eric Gagstatter wrote a great little article for Geospatial Solutions Monthly titled “Nightmare on GIS Street: Accuracy, Datums, and Geospatial Data”.  Anybody who’s worked in the GIS field for more than a week has experienced the kind of issues Eric discusses.  Simply put, it is a rare event when data pulled from multiple sources fits together with any semblance of accuracy or precision.

For a small scale project (let’s say 1:20,000 or smaller) data fit is less important – at those smaller scales ‘eyeball close’ is often good enough.  The problem we face is that with modern GIS software the user is not stuck with a fixed scale like they were when everything was based on paper maps.  We live in the era of Google Earth, the era of high resolution satellite imagery, where everybody expects to be able to read the address number on their mailbox from space.  This newfound ability to zoom to any scale with just the scroll of a mouse wheel has highlighted a problem that the general public and, to be frank, many Geospatial and civil engineering professionals were not aware of: the data doesn’t fit.

Eric highlights the most important factor impacting this issue – the emergence of high precision GPS-based field data.  In the past 10 years or so GPS data, that data collected by survey grade or SBAS* augmented GPS units, has dramatically exposed the errors embedded in decades of historical geospatial data.

It’s not that this old data was collected badly – most of it was collected to established standards using the best resources and techniques available at the time.  In the old days it was paper maps, scaled aerial photos, compass headings, pace counts (or odometer readings for really long distances) and field notebooks.  Mapping grade accuracy was the accepted norm.  When you were using 1:24,000 USGS topo sheets as your project base an error of +/- 24 meters (the approximate National Map Accuracy Standard for those map sheets) was good enough.  Formal surveys were expensive and time consuming, and only done if there was a strong business justification – usually to establish legal boundary definitions, accurately map out small project areas, or precisely position critical features.

Today a Geospatial professional can collect data with handheld GPS units that easily achieve accuracies of +/- 15 feet with just SBAS augmentation, and centimeter-level accuracies with survey-grade RTK (real time kinematic) equipment.  Accuracy has improved by several orders of magnitude and the cost of acquiring that data has dropped dramatically.

While Eric focuses on the issues of datums and datum transformations, my experience is a little different.  I work at a major airport that has terabytes of historical CAD data and a warehouse full of historical project plans on paper, mylar or linen that go back to the early 1940s.  Virtually all of this data is referenced to a local grid system that was first established as a construction grid back in 1948.  At the time this grid was established it was never formally defined in reference to the local State Plane coordinate system.  In fact, the surveyors who laid it out committed the cardinal sin of not establishing a central meridian that is oriented to true north.  The entire grid is rotated a few degrees off of true north and that angle of rotation was never defined when the grid was established.  For years this was not a problem.  The airport was happy to exist as an ‘island’, floating on the face of the earth within its own little grid system.  However, when the airport started to expand dramatically in the 1960s the engineers realized they needed to start tying into properly defined coordinate systems like State Plane.  USGS and USC&GS survey control was extended onto the airport and several monuments were defined in both the local grid system and State Plane.  This allowed project engineers and surveyors to ‘extend’ State Plane control onto their project sites if required, but all design and construction work was continued in the local grid system.  To this point all design work was done using old manual drafting methods, so the levels of error inherent in these processes were acceptable for the time.

In the 1980s CAD (computer aided design and drafting) systems started to be used on more and more projects at the airport. Since our local grid is a simple x,y grid based on distances in feet measured from an origin point it was easy to lay out in CAD.  No need to worry about that pesky rotation.  Or, for that matter, grid-to-ground mismatches over long distances (like, say, on a 9,000′ runway).  But very soon some serious folks with serious money, like the Federal government, began asking for airport data in a ‘real’ coordinate system like State Plane.  A number of attempts were made to define the local grid as a true spatial coordinate system (with a tie to a known coordinate system, an origin point and a rotation and scale factor) but with no success.  As a result some very sloppy work-arounds were developed, most using a ‘local fit’ method – an engineer or CAD technician would snap local project data from one coordinate system to known good data in the other coordinate system; building corners, grid tics, manholes, whatever they could find.  The problem was that a lot of ‘known good’ data turned out to be not so good.  Errors propagated and started to become uncontrollable.  The engineering staff worked around this by referencing in local project data (for example, a new taxiway segment) against a small subset of the overall CAD basemap for the airport.  This method tended to keep the coordinate system shift error within acceptable limits for the small project area, but when the data was added to the larger CAD basemap grid shift errors of up to 15′ were common.

When my Geospatial group came on board in 2007 the coordinate system transformation issue quickly became one of our biggest headaches.  We were faced with creating an enterprise geospatial database from scratch using this legacy CAD data.  We desperately needed a proper spatial definition for this local grid system, something that would work in both our CAD and GIS systems.  Our engineering staff was happy to dump the issue in our lap.  In fact, when I interviewed for the job one of the senior engineers told me that if I was hired the first thing he wanted me to do was to “fix this damned State Plane thing.”

As we started talking with the engineering staff about the problem it became apparent they all had an institutional distrust of State Plane, or any spatial coordinate system for that matter.  They placed the entire blame for the data fit issues on ‘inaccuracies’ in the State Plane system – inaccuracies they couldn’t articulate.  In their minds all data prepared in their local grid system was golden.  After all, the local grid system was known.  It was proven.  It was simple.  They had built an entire world-class airport on it.  This goofy State Plane thing just got everybody confused and besides, when they did move their CAD data to State Plane it ‘got shifted all around’ and didn’t work anymore.  It might fit OK at one corner of the airport, but didn’t fit too well at the other.

We eventually got the grid transformation issue solved.  We attacked it from several directions and ended up with a very accurate local grid system projection file for use in both AutoCAD and ArcGIS, and a best-fit definition for use in Blue Marble (for bulk coordinate point conversions).  All of these definitions are based on the same survey data so errors are consistent and controllable from system to system.  We can hold transformation errors to about 0.2′ across the airport property.  And yet our engineering staff still retained a latent distrust of State Plane-based data.  The old institutional bias remained.  The perception that ran deep is that the old ‘known’ CAD data in the local coordinate system is somehow better, more accurate, than any newly collected GIS data.  There is a natural distrust of geospatial data; few civil engineers understand what geospatial data is, how it differs from CAD data and how geospatial data can be incorporated into planning and design projects.  If the data file doesn’t have a .dwg at the end of it they don’t like it.
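For anyone curious about the mechanics, the textbook way to tie a local construction grid to State Plane is a four-parameter similarity (Helmert) transformation – a scale, a rotation and two translations – solved by least squares from monuments with coordinates in both systems. The sketch below is purely illustrative; the coordinates are made up (consistent with a grid rotated roughly two degrees), and our production definitions live in projection files, not in a Python script.

```python
# Illustrative four-parameter (similarity/Helmert) fit between a local
# construction grid and State Plane, solved from shared control points.
# All coordinates are synthetic. Model: E = a*x - b*y + tx, N = b*x + a*y + ty,
# where a = s*cos(theta) and b = s*sin(theta).
import numpy as np

# (easting, northing) of the same monuments in each system
local = np.array([[1000.0, 1000.0], [9000.0, 1200.0], [5000.0, 8000.0]])
state = np.array([[2214964.491, 1367034.290],
                  [2222952.638, 1367513.364],
                  [2218717.758, 1374169.624]])

rows, rhs = [], []
for (x, y), (E, N) in zip(local, state):
    rows.append([x, -y, 1, 0]); rhs.append(E)   # easting equation
    rows.append([y,  x, 0, 1]); rhs.append(N)   # northing equation
A, obs = np.array(rows), np.array(rhs)

(a, b, tx, ty), *_ = np.linalg.lstsq(A, obs, rcond=None)
scale = np.hypot(a, b)
rotation_deg = np.degrees(np.arctan2(b, a))
print(f"scale={scale:.8f}  rotation={rotation_deg:.5f} deg  tx={tx:.2f} ft  ty={ty:.2f} ft")

# Residuals at the control points show how well the definition hangs together
residuals = A @ np.array([a, b, tx, ty]) - obs
print("max residual (ft):", np.abs(residuals).max())
```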

We decided to approach the perception issue from two directions.  The first was a current high resolution, high accuracy orthophoto of the airport.  Using our newly developed projection file we were able to reproject the aerial from State Plane to the local grid system for use in AutoCAD.  For the first time ever the engineers and CAD staff had a single continuous coverage aerial image in their grid system that could be used as a base for project planning and drawing development.  Next, we acquired RTK-based data collectors that are capable of centimeter-level accuracy.  We launched an aggressive project to collect photo-identifiable data – manholes, paint markings, slab joints, airfield lights – and provide the data in both grid systems as a tool to check current and historical data against.  From this we created a ‘trusted’ CAD file, one the engineering group verified using their own sources.  Ever so slowly some of the doubters started to come around.  Once they started matching their legacy data against these new sources and saw the problems for themselves they began to do more aggressive data checks and not take CAD data, old or new, at face value.

Yet we continued to have perception problems.  The old-line engineering staff retained a deeply embedded distrust of GIS data in State Plane and of our insistence that all legacy data be closely checked and adjusted if necessary.  Their reasoning actually sounded pretty good – “We spent decades building a world class airport with this CAD data and it all came together.  How can the data be wrong?”

Our GIS group didn’t really have a good response until some of the long time CAD staff complained that “it’s impossible to get as-builts around here.”  Our antennae went up and we started to do some digging on the issue.  Very quickly the problem revealed itself.  Our engineering staff rarely received true as-builts from the contractors that do construction on the airport.  The as-built delivery requirement is written into most contracts but is rarely enforced.  Contractors would regularly walk away from the as-built requirement and eat the contract penalty because they were too busy on other airport projects or the cost of developing the as-builts exceeded the monetary penalty.  If a contractor did deliver what they labeled as ‘as-built’ drawings they were seldom, if ever, checked for accuracy and completeness by the project manager.  The data was accepted at face value and often recycled for use on the next project.  Errors in spatial accuracy or attributes (pipe sizes, slab thicknesses, etc.) were unknowingly propagated from project to project as the project planners and designers used the same inaccurate data over and over again.  Down the line some errors became so glaringly obvious (like a stormwater line flowing uphill) that the engineering staff would hire engineering firms to go to the field and conduct existing condition surveys.  It was not unusual for the airport to hire the same firm that originally delivered the bad data to go back out and field verify what they should have originally delivered years before in the project as-builts!

But this only addresses half of the question.  The fact remains that this airport got built, and got built pretty darned well.  Was it all built on sloppy CAD data and it’s just a happy accident that everything fits?  Well, once we understood the as-built issue the rest of the story fell into place.  The engineering staff at this airport only does planning and initial design.  The final design work and construction drawings are done by contracted engineering firms.  Construction drawings are based on a number of sources – initial design, existing condition surveys and final design plans.  Construction drawings are what the project engineers and tradesmen take to the field to actually build against.  These are the drawings that get marked up as modifications are done in the field and it’s these drawings that should be used to generate the as-builts.  These engineering firms do a very good job of making sure everything fits within the designated project space, and any ties to existing systems – utility lines, roadways, buildings, etc. – are adjusted for in the final design or in the field.  But we are back to the old as-built issue.  Much of what was actually constructed in the field never makes it back into the airport’s master CAD drawing.

So the reality is that the airport got built, but the airport doesn’t have a complete and accurate record of what got built.

But I do get the sense that we are over the hump.  In the last two years we’ve noticed an improvement in the consistency of the spatial accuracy of the CAD data being delivered.  We still find a good number of attribute data issues (stormwater manholes labeled as sewer manholes, that sort of thing), but as far as spatial accuracy things seem to be greatly improved.  I put it down to our engineering staff’s increased use of known good data as a quality control check, increased emphasis on as-built delivery, a willingness to let us be part of the quality control check process, increased dialog between the CAD and GIS analysts and an increased dependence on our RTK data collectors to do quick field verification.  In fact, our engineering staff is now the #1 hands-on  user of our RTK systems.  The GIS group also has tight relationships with many of the major construction contractors doing work at the airport and we provide the coordinate system definition files and verified base data for use in project planning.  We also offer ourselves up as the data conversion experts and will help contractors verify that their data has been properly moved from one grid system to the other.  Over time our insistence on spatial accuracy has ‘leaked into’ the engineering business processes and workflows here at the airport.

We’ve shifted the paradigm just a bit and the momentum is in our favor.  Geospatial engineering 1, bad data 0.  That’s the way it should be.

– Brian

*SBAS = Space Based Augmentation System.  There are several SBAS systems in use around the world.  These are satellites in geosynchronous orbit that transmit correction data for the US GPS satellite constellation.  The Wide Area Augmentation System (WAAS) is a set of satellites transmitting correction data over the US and the eastern Pacific and is maintained by the FAA and DOT.  If your GPS unit can receive and use these signals they will roughly double the accuracy of the position fix your unit provides.  You can read more about WAAS and SBAS on Wikipedia.

Are You Lining Up With The Dinosaurs?

Yesterday there was an interesting article posted in the NY Times Business Day section about a sharp scientist and entrepreneur named Gil Elbaz and his company Factual.

In layman’s terms Factual can be described as a data warehouse.  A real BIG data warehouse.  But beyond being just a big storage bucket in the cloud, Factual applies quality measures to the data and puts it into context.

The real message here is that Factual is structuring its data holdings to accommodate and enhance automated spatial analysis and reasoning.  The goal is that highly intelligent software systems, acting against quality data, will perform much of the same complex spatial analysis and decision support operations that human GIS analysts currently do with desktop applications like ArcGIS.

In the future – and the future isn’t too far off – high end software packages like ArcGIS will be nothing more than embedded applications in larger, highly intelligent software systems.  Think I’m joking?  Ever heard of ArcObjects?

So, can you read the writing on the wall?  Can you hear me stomping my foot on the floor?  Do you get the hint?

If you are a GIS ‘professional’ and you describe yourself and your role in an organization in relation to the software you use you are a dinosaur, and you are headed for extinction.

The Geospatial professional needs to be a complex problem solver, not a software jockey.

Adapt or die.