A First Look At ArcGIS Pro

For the past week I’ve been playing with the pre-release version of ESRI’s ArcGIS Pro. The operative word is ‘playing’, because this software application is just one heck of a lot of fun to play with.

Arcgis Pro

If you are a user of ESRI’s GIS software you’ve likely heard of ArcGIS Pro. ESRI first announced it at the 2012 International User Conference. I was there and watched the software development team lead run through a quick demo. It looked interesting, but at the time I thought it would be little more than a re-packaging of the current version 10.x desktop software – something like a new skin that would sit on top of ArcGIS for Desktop 10.x.

With the release of ArcGIS 10.3 in the latter half of 2014, ESRI started to seed out the pre-release version of ArcGIS Pro. Because of other attention-grabbing issues I didn’t get the chance to install the software until early last week.

So what is ArcGIS Pro? Well here’s what it’s not – it is not a re-packaging of ArcGIS 10.3. ArcGIS Pro is (as far as I can tell) a completely new application. If ESRI were to hang a numeric tag on the release I imagine they’d label it something like ArcGIS 11. It is a generational leap in features and capabilities. Without getting into details let’s look at some of the things ArcGIS Pro brings to the desktop:

  • ArcGIS Pro is a native 64-bit application. This means the application takes full advantage of all the multi-threaded processing horsepower available in 64-bit desktop systems. ArcGIS for Desktop 10.3 is a mostly 32-bit application and just loafs along on 64-bit hardware. ArcGIS Pro unlocks all the available processor capability (but there’s a price to pay – see ‘issues to be aware of’ below).
  • ArcGIS Pro ‘lives’ in 3D. The new 3D visualization and analysis tools in ArcGIS Pro are what make it fun to play with. If you have 3D data (z-enabled) then ArcGIS Pro relates to the data in 3D. If you have simple 2D data (say, building footprints) then it’s easy to turn it into 3D data using the Extrusion toolset.
  • The new ribbon menu interface is context sensitive and works well. Yes, I know ribbon menus have been out for years and are generally well implemented (Microsoft Office), but they can also be poorly done (<cough> AutoCAD <cough>). The default ArcGIS Pro ribbon menu is very well done. It’s clear ESRI spent a lot of time getting this right.
  • ESRI manages to keep most of the workflow options and commands where experienced ArcGIS for Desktop users are used to finding them. While there is a learning curve it is fairly short for most common tasks.
  • The 3D navigation tools are well implemented using mouse commands. This functionality is a core part of ArcGIS Pro – it’s there when you open the application. Even if all your data is 2D, you still have immediate access to 3D tools.
  • ArcGIS Pro moves beyond the .mxd map file and introduces the concept of the GIS ‘project’. Think of a project as a collection of map files, data links, configuration files, rules and settings that define a particular GIS project. Projects can be shared across individuals and work groups to help standardize workflows and other processes (see the short scripting sketch after this list).

Arcgis Pro Projects

  • Licensing. This is a huge paradigm shift for ESRI. ArcGIS Pro is authorized via ArcGIS Online. The two are tightly linked and can’t be separated. Your ArcGIS Online administrator now authorizes you to use ArcGIS Pro from the new ‘Manage Licenses’ interface in ArcGIS Online. It looks like no more separate license manager, authorization codes and provisioning files!

ArcGIS Pro License Manager

  • It appears as though you get one ArcGIS Pro license for every ArcGIS for Desktop 10.3 Advanced license you have. This includes those who have a Home Use license.
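
Circling back to the project concept a few bullets up: the project also surfaces in ArcGIS Pro’s Python environment through the new arcpy.mp module, which (as I understand it) replaces the old arcpy.mapping module. Below is a minimal sketch of walking a project’s maps and layouts from a script. The project path is made up, and since this is pre-release software the details could certainly change before final release.

```python
# Minimal sketch: poke around an ArcGIS Pro project (.aprx) with arcpy.mp.
# The path is hypothetical; run this from the Python environment that ships
# with ArcGIS Pro. API details may change before final release.
import arcpy

aprx = arcpy.mp.ArcGISProject(r"C:\GIS\Projects\Campus\Campus.aprx")

# A project bundles maps, layouts, connections and settings rather than
# living in a single .mxd the way ArcMap documents do.
for m in aprx.listMaps():
    print("Map:", m.name)
    for lyr in m.listLayers():
        print("  Layer:", lyr.name)

for layout in aprx.listLayouts():
    print("Layout:", layout.name)

del aprx  # release the lock on the project file
```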

Some questions I can’t answer yet include:

  • What is the relationship between ArcGIS for Desktop 10.3 and ArcGIS Pro? Can ArcGIS Pro stand alone, or does the application have dependencies on 10.3?
  • What about ArcCatalog? There appears to be no ArcGIS Pro equivalent to Catalog.

Some issues to be aware of:

  • Keep in mind this is pre-release (beta) software. There are still some rough edges. While everything seems to work well I have experienced some crashes. If you do decide to test it, don’t use it for production purposes!
  • There seem to be some issues publishing a 3D scene from ArcGIS Pro to ArcGIS Online (the new 3D Scene functionality). So far neither I nor anyone on the GIS team at work has successfully moved a scene from the desktop to the web. Again, this is pre-release software…
  • This is perhaps the #1 issue to be aware of – ArcGIS Pro is a workstation-class desktop application. Some would classify it as a resource hog, but more accurately it is a fully 64-bit application that makes full use of all available resources. You can forget about running it on your dual-core Pentium 4 box running Windows XP.
  • ArcGIS Pro is heavily graphics intensive and demands a fairly robust graphics card. When I first installed ArcGIS Pro on my home computer it quickly brought my cheap 512 MB gaming card to its knees – the card was literally crying for mercy. ESRI recommends at least a 1 GB OpenGL compliant card. I ended up buying a 2 GB NVIDIA Quadro K620 card and that vastly improved performance.

Before we close, let’s take a look at a short video that highlights some of the ArcGIS Pro features I discussed above.

That’s it for now. As ArcGIS Pro moves out of pre-release and we get more experience with it I’ll come back with updates. But for now, this seems to be a great update and I’m really quite excited about it.

– Brian

Merry Christmas 2014

 

Patton2

On 14 December 1944 General George S. Patton, commander of the Third US Army, released his Christmas greeting to his troops. It’s a classic greeting and military prayer that was made famous in the movie ‘Patton’, but there’s much more to the story and it has a wonderful topographic and, for me, personal connection.

Click here to read more.

And a very blessed and merry Christmas to you and your family!

– Brian

The 2014 ESRI UC

Well kiddies, I’m back from San Diego and the 2014 ESRI International User Conference. This is my third conference in five years, and it’s starting to feel like Groundhog Day.

Now please, do not get me wrong – it was a very good conference and in some respects a great one. My team and I learned a lot and picked up plenty of new information. But in the plenary and in the technical sessions and presentations it was the same drumbeat we’ve been hearing for the past several years – ‘rich’ web maps and apps (could someone at ESRI please tell me just what ‘rich’ means?), ArcGIS as an integrated system rather than just a bunch of parts, ArcGIS Online, ArcGIS Online, ArcGIS Online, yaddah, yaddah, yaddah. In the Solutions Expo (i.e., the vendor displays) it was the same vendors, many in the same locations, showing the same stuff and giving the same spiel.

You know, Groundhog Day. C’mon ESRI, mix it up a bit. It’s getting a little stale.

OK, that’s out of the way. Let’s change tack. If you listened closely to the presentations and have been paying attention to what ESRI’s been doing in the past few months you were able to tease out some great information regarding new and emerging capabilities. Let’s start with one of ESRI’s current flagship products, ArcGIS Online.

If anyone reading this blog still doubts that ESRI considers ArcGIS Online and web mapping a central part of ESRI’s GIS universe then this UC would have set you straight. The message was obvious and unmistakable, like a croquet mallet to the head. ArcGIS Online is here to stay, is only going to get bigger, and if you are going to play in the ESRI sandbox you need to know (and buy into) ArcGIS Online. I didn’t attend a single ESRI session, whether it was the plenary or a one-on-one discussion with a product expert, where the topic of ArcGIS Online integration didn’t pop up early and often. Most vendors I talked to – and certainly all of those that ‘got it’ – had ArcGIS Online integration as a key selling point for their product or service. Heck, even IBM, with their painfully complex work order management program called Maximo, ‘got it’ and touted how I could now ‘easily and seamlessly’ integrate ArcGIS Online feature services with Maximo. Anybody who knows Maximo knows it doesn’t do anything ‘easily and seamlessly’. I don’t really think Maximo can use hosted feature services from ArcGIS Online, at least not yet. The REST endpoints I saw Maximo consuming looked like dynamic map services. But at least the IBM sales team took the time to read the memo from Redlands.

ArcGIS Online

The ArcGIS Online product space was the single biggest product presence ESRI had set up in the Expo. It was huge, and a reflection of the importance ESRI places on the product.

ESRI’s incessant chatter about ArcGIS Online would have fallen flat with long-time users of the product if ESRI had not done a product update just a few weeks ago. The July update of ArcGIS Online included a number of significant improvements and new features that signaled to those who know the product that ESRI is serious about ArcGIS Online being more than just a toy for making simple web maps. The upgrades in system security certification, administration tools, data management, data integration, analysis and cartographic tools show ESRI has full confidence in ArcGIS Online as a serious enterprise tool. I’ll admit that a few years ago I was having doubts that ESRI would be able to pull this off. Today I’m convinced that ArcGIS Online and web mapping is the most significant development in geographic content delivery since the invention of the printing press.

This year I spent more time wandering the Solutions Expo hall than I did attending the technical sessions. In past years there were sessions I felt I just couldn’t miss, but this year my technical needs were somewhat less well defined and I wanted to spend more time speaking with the vendors and visiting the ESRI product islands. It was time well spent.

One of the focuses (foci?) of this year’s plenary presentation was the issue of ‘open data’. Open data is nothing more than data that is available free to any user. Open data can take any format (though it is understood that for data to be truly ‘open’ it needs to be available in a non-proprietary format). For decades the federal and state governments have made GIS data available in a variety of GIS formats. A good example of this is census data. The data for most censuses held in the last 40 years or so is freely available in GIS format from the US government. It’s easy to pull that data into a GIS system and do all kinds of analysis against it. In fact, census data is one of the first data types that new GIS students learn to analyze in their core classes. In the same vein, many states make state-specific GIS data available from freely accessible data servers. Things like elevation data, transportation network data, hydrology, landcover and more have been commonly available for years.

However, it was often difficult for smaller government entities – cities, counties, or regional authorities – to share out their public data because of the IT and GIS management overhead involved. Let’s face it, nobody makes money sharing out free data so there’s little incentive to put a lot of resources behind the effort. As a result a lot of currently available open GIS data is pretty stale. ESRI is backing a push to pump new vitality into the sharing of open data via the new Open Data tools embedded in ArcGIS Online (see, there it is again). OK, I admit that ArcGIS Online isn’t exactly free to the organization looking to share out data, but if you do happen to be an ArcGIS Online subscriber then setting up an Open Data site is fast and easy. One of the great concepts behind ESRI’s effort is that the organization is really sharing a feature service from which an Open Data user can extract the data. This means that the data should not suffer from ‘shelf life’ issues; as long as the data behind the feature service is regularly updated the Open Data user will have the latest and greatest representation of what’s being shared.
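
Part of what makes this approach attractive is that the ‘download’ is really just a query against the live feature service, so the public always sees whatever the organization last published. As a rough illustration (the service URL below is hypothetical, and this is plain REST rather than anything specific to the Open Data tools), pulling the current features out of a hosted feature layer looks something like this:

```python
# Rough illustration: query a hosted feature service's REST endpoint.
# The URL is hypothetical -- substitute a real ArcGIS Online feature layer.
import json
import urllib.request

URL = ("https://services.arcgis.com/EXAMPLE/arcgis/rest/services/"
       "RoadCenterlines/FeatureServer/0/query")
PARAMS = "?where=1%3D1&outFields=*&f=json"   # all features, all fields

with urllib.request.urlopen(URL + PARAMS) as response:
    data = json.load(response)

# Because the query hits the live service there is no 'shelf life' problem;
# the result reflects whatever the publishing organization last updated.
print(len(data.get("features", [])), "features retrieved")
```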

On one of my laps around the Expo floor I stopped at the Open Data demonstration kiosks set up in the ArcGIS Online area and talked through the concept and implementation with one of the ESRI technical reps. At first I didn’t think my organization would have much use for this feature, but after thinking about the types of data we routinely pass out to anyone who asks – road centerlines, jurisdictional boundaries, parcels, etc. – I began to think this might be of some value to us. In about 15 minutes she helped me set up my organization’s Open Data site and share some common-use data out to the public. If nothing else, an Open Data site could lift some of the data distribution burden off of us.

OpenData1

The new Open Data tab in ArcGIS Online allows the administrator to configure an open data page from which the organization can share data with the public

Another lap took me to the US Geological Survey information tables. The USGS tables were set up in the Federal Government area and while most of the agencies suffered from a serious lack of attendee interest (and I pity the poor souls who had to man the Veterans Administration table), the USGS tables were doing a good business. The USGS reps were stirring the pot a bit. It seems that there’s a move afoot in the USGS to do away with the National Atlas. I’m not sure yet how I feel about this move. Clearly elimination of the National Atlas is a cost-cutting move (and the USGS makes no bones about it on their website), but if the same digital data can be made available via other portals, like the National Map portal, then this may all be a moot point. Still, this is the National Atlas and as such should be a point of pride not just for the USGS but for the nation. If for no other reason than that, I’d keep it alive. The USGS reps working the tables were clearly pro-National Atlas and were running a petition campaign to garner support to keep the program going.

I also spent some time discussing the new US Topo series of maps with the USGS reps. If you’ve read any of my posts on the US Topo maps you know that from a cartographic perspective I think they stink. The map base – imagery – is poorly selected and processed, and the maps look like crap when printed out. That’s the basic problem: the US Topo series is compiled as though the intent is to print the maps out at full scale for use in the field – they carry full legends and marginal data – yet it’s clear they were compiled specifically to look best on a back-lit computer screen. When printed out the maps are dark and muddy, and the image data is difficult to discern. When I brought this up to one of the USGS reps she turned her badge around to indicate she was speaking for herself and said, “I agree completely, and we get a lot of complaints about the visual and cartographic quality of these maps.” Here’s hoping the USGS doesn’t go tone-deaf on this issue and takes steps to improve the quality of the US Topo series. She also let me know that there’s growing support within the USGS to provide the US Topo series maps not just in GeoPDF format but also in GeoTIFF. That would be a great move, especially if the USGS provided them in a collarless format for use in systems like ArcGIS for Desktop.

I took the time to mosey over to the Trimble display area and talk to a rep about my favorite Trimble issue – the lack of a Google certified version of Android on their very capable (and very expensive) Juno 5-series of handheld data collectors. I’ve bugged Trimble so much about this that I have to assume my picture is hanging on a dartboard in the executive conference room at Trimble’s headquarters. I got the same response out of the Trimble rep that I’ve been getting for about a year now, “We hear it’s coming but we don’t know when”. Yeah right.

After I left the Trimble area I found myself a few rows over at the table of a company I’d never heard of before, Cedar Tree Technologies. It was just a couple of guys with a couple of pieces of hardware, but my eye caught something that looked a lot like a beefed-up smartphone and the guys at the booth were eager to give me their story. It seems that Cedar Tree Technologies is a brand new spin-off of Juniper Systems, a company that’s been making rugged handheld systems for the surveying and GIS community since the 1990s. Cedar Tree’s specific focus is on the Android OS, and each of the devices on display was running a Google certified version of Android 4.2. The device that caught my eye was the CT4. The CT4 is just what it looks like – a ruggedized smartphone that runs on Android. It looked like an OK product with very good specs – a quad-core processor running at 1.2 GHz, a 4.3″ Gorilla Glass display, an 8 MP camera, a 3000 mAh battery, Bluetooth and an IP68 rating. It did have a few drawbacks – only 16 GB of system memory and a 3G (not 4G or LTE) cell radio – and I forgot to ask if it was fully GNSS capable. But here’s the kicker – this damned thing is only $489! Roughly one third the price of the baseline Juno 5, yet it looks like it offers three quarters or more of the Juno’s capability. You can bet I’ll be contacting Cedar Tree about borrowing one of these for an evaluation.

Cedar Tree1

He’s smiling because he thinks he’s got Trimble beat in the Android space. I think he might be right!

 

Cedar Tree2

The Cedar Tree Technologies CT4. Perhaps the first truly usable Android-based field data collector.

OK, I don’t want to get too far into the weeds on other topics of interest, so let me just do some quick summaries:

  • I talked to Trimble, Leica, Carlson, Juniper and Topcon reps about their software offerings. All plan to remain tightly wedded to the Windows Mobile 6.5 OS (a.k.a. Windows Embedded Handheld), which hasn’t had any significant updates in over two years. Many of the reps indicated that the mobile version of Windows 8 still has some issues and they are very reluctant to move in that direction. So it looks like the industry will be stuck with an archaic and moribund OS for some time yet.
  • What the world needs, in addition to a good 5¢ cigar, is a good spatially based document management system. Lord knows my organization is in desperate need of something like this. I saw only one document management system vendor at the show, and their system has a strong dependency on ArcGIS Online (there it is again). I think this is a market area that is ripe for exploitation. The tools are now in place with ArcGIS Online and reliable cloud services to bring this type of functionality quickly and cheaply to an enterprise and I’d love to see some new developments in this area. Pleeeeze!
  • I attended a very interesting working session where the GIS team from Pierce County, WA discussed their adoption of enterprise GIS and ArcGIS Online. I felt like I was sitting through a presentation I had written about my own team’s struggles and experiences. Like us, Pierce County faced a lot of push-back and foot dragging from their IT department on implementing IT-dependent GIS initiatives, and productivity among the county’s field maintenance crews suffered. Here’s my point – for every GIS/IT success story I’ve heard or read about I’ve heard an equal number of stories where thick-headed IT departments get in the way of a successful GIS initiative. If you are IT and don’t fully support the GIS initiatives in your organization then watch out. You will wake up one day soon to find you’ve been replaced by a cloud based service. It’s happened in my organization and it’s happening across the industry.
  • How come I never heard of the Association of American Geographers? I’m not joking. I’ve been in this industry for over 30 years and have been attending trade shows for all of that time. I’ve heard of the ASPRS – the American Society for Photogrammetry and Remote Sensing – and others, but never the Association of American Geographers. Seems like a good organization. May have to join!
  • Like a good 5¢ cigar, the world also needs more quality geospatial sciences masters program options. I talked to a number of the universities set up at the conference and while they all seemed to be offering quality programs, too many are targeted at the traditional full-time student, someone who heads into a masters program directly from a bachelors program. For example, here in Atlanta, Georgia State University offers what looks like a cracking good geosciences masters program with a focus on geospatial science, but it’s structured so that all of the coursework is classroom based and only offered during working hours. For someone making a living in the real world this type of program really isn’t feasible. We need more fully online options, and more local colleges and universities offering evening and weekend programs.
  • Let’s get back on the ArcGIS Online horse and discuss a very interesting service that the developers tell me is under serious consideration. One of the gripes that users of Collector for ArcGIS have is the lousy positions provided by the GPS/GNSS receivers on handheld units. Keep in mind that this is not a Collector issue, but a hardware issue. One of the improvements ESRI is looking at is a subscription-based correction service for use with Collector. It will probably work like this – collect a point or a series of vertices, and when they are synced with the ArcGIS Online server the points first pass through a correction service before being passed on to ArcGIS Online. This will likely be a single base station correction solution, but it could offer sub-meter accuracy when using a data collector with a more advanced GPS/GNSS receiver (sorry, this will not work with your iPhone or Android smartphone because of the low-quality receivers they use). Sort of like on-the-fly post-processing (a toy sketch of the idea follows this list). A very interesting concept, and it could move hardware manufacturers like Trimble, Topcon and Leica to put out low(er) cost Android-based field data collectors with improved receivers.
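
To make that correction idea a bit more concrete, here is a toy sketch of what a single base station correction boils down to: compare the base station’s known, surveyed position with the position its receiver actually reported, then apply that offset to the rover points collected around the same time. This is just my rough mental model, not a description of ESRI’s planned service, and real correction services work on the raw satellite observables rather than on finished coordinates.

```python
# Toy sketch of a single-base-station correction (my mental model only --
# real services correct the raw GNSS observables, not finished coordinates).

def correct_points(base_known, base_measured, rover_points):
    """Shift rover points by the error observed at the base station."""
    dx = base_known[0] - base_measured[0]
    dy = base_known[1] - base_measured[1]
    return [(x + dx, y + dy) for x, y in rover_points]

# Made-up projected coordinates, in meters:
base_known = (500000.00, 3750000.00)      # surveyed base station position
base_measured = (500001.80, 3749998.70)   # position the base receiver reported
rover = [(500120.40, 3750233.10), (500125.90, 3750240.60)]

print(correct_points(base_known, base_measured, rover))
# approximately [(500118.60, 3750234.40), (500124.10, 3750241.90)]
```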

Before I go, some kudos:

  • To the City of San Diego. I can’t think of a better place to hold this conference
  • To GIS, Inc for a wonderful dinner cruise with NO sales pressure (Mrs. OldTopographer just loved it!)
  • To Andrew Stauffer from ESRI and fellow BGSU grad. Andrew provided invaluable support to our organization over the past few years while we worked through our ArcGIS Online implementation issues. I finally got to meet him in person and thank him
  • To Pat Wallis from ESRI who proved you can hold down a serious job and still be a 1990’s era ‘dude’
  • To Courtney Claessens and Lauri Dafner from ESRI who entertained all of my dumb questions about Open Data
  • To Kurt Schowppe from ESRI. I’m sure my pool party invite got lost in the mail <grin>
  • To Adam Carnow, for putting up with all of my other dumb questions and requests
  • To all the great people I bumped into at random and had wonderful conversations with

And finally, it was good to see my alma mater making a showing at the Map Gallery exhibition. Go Falcons!

BGSU poster

– Brian

 

 

Map Makers At Work

I was going through some photos in my collection for some Facebook friends and ran across a collection of old shots I acquired while assigned to the 320th Engineer Company (Topographic) back in the late 1990s. I say ‘acquired’ because I just happened to find them lying around my office as part of some leftover historical records from the old 649th Engineer Battalion (Topographic). All are 1970s-vintage photos that highlight the map making activities of the 649th. The 649th provided comprehensive topographic support – survey, map production & distribution, terrain analysis, and geographic intelligence analysis – to US Army Europe (USAREUR).

In the early 1980s I was assigned to the 649th at Tompkins Barracks in Schwetzingen, Germany. I served as the commander of one of the terrain analysis detachments. I have a lot of good (and some not-so-good) memories of the 649th and the Soldiers that served in the unit. Fifteen years later I found myself back in Germany. By then the 649th had been deactivated and my unit, the 320th Engineer Company (Topographic), was the last surviving remnant of the 649th. The 320th had received a lot of leftover equipment and paperwork from the 649th and one of my duties was to sort through it all to determine what was worth keeping and what could be trashed. During this process I found the photos you see in this post.

All of these photos were rejects, shots the photo editor didn’t think were worthy of using in a presentation. That accounts for a lot of the grease pencil ‘mark outs’ you see on many of the images. The interesting part is that the pictures that made it past the editor and into various publications or presentations are long gone – either packed up with the unit’s archives and locked away in an Army records warehouse or tossed into a trash can. Only the rejects survived to make it to my desk long after the battalion inactivated.

The 649th rarely made a new map from scratch. Most of the work involved updating existing maps or creating specialized overlays (like military operational graphics) to be overprinted onto existing maps. Still, the battalion had all the functional components necessary to create a new map:

  • Topographic and geodetic survey
  • Photomapping and cartography
  • Layout, photolithography and printing press operations

To accomplish these tasks each topographic company within the 649th was divided into three platoons – the Survey Platoon (topographic & geodetic survey), the Photomapping Platoon (photo control, compilation and drafting) and the Reproduction Platoon (layout, photolithography and press).

So let’s take a look at these processes as practiced by Soldiers of the 649th. This isn’t intended to be a comprehensive overview of map making; there are some large gaps in this tale simply because I don’t have any photos depicting specific steps. The real goal here is to highlight the Soldiers and their activities.

 


Survey

Each map starts with two things – stereo aerial photography and a topographic survey to ‘tie’ the photos to their location on the face of the earth. Army topographic units relied on US Air Force photomapping squadrons to provide the aerial photography. Once the topographic unit got the aerial photography in hand it was up to the surveyors to go out and collect precise location data for points visible in the photographs – things like road intersections, prominent terrain features or pre-positioned survey point targets.

APPS

APPS – the Analytical Photogrammetric Positioning System. The APPS permitted surveyors and terrain analysts to precisely determine points on the ground using georeferenced stereo images. The system consisted of a point positioning stereoscope hooked to an early HP desktop computer. Each set of stereo images came with a computer tape that held the ephemeris data for each image. The operator would pick a point on the image in stereo (road intersection, building roof, etc.), tap a foot switch and the computer would print out the point location in latitude and longitude. The APPS was often used by surveyors to collect photo control point coordinates without having to do a formal field survey. For its time this was a revolutionary system.

 

Survey1

Surveyors ‘turning angles’ with a conventional theodolite (probably a Wild T2). The instrument would be set up over a known control point and used to measure the precise angle to other control points. The operator would call out the angle readings and the Soldier standing behind with the notebook would record the readings and do quick checks of the angle measurements to ensure the readings were accurate.

 

Survey4

What are the surveyors in the previous photo aiming at? One of these, of course! This is a Wild survey target. It would get set up over a control point that is key to the survey. The theodolite operator adjusts the crosshairs in the theodolite’s telescope so they bisect the white ‘arms’ and ‘skirt’ of the target, then reads the angle measurement.

 

Survey2

If a theodolite measures angles, how do we measure distances? Well, in the 1970s we used microwave distance-measuring equipment called the Tellurometer. The system consisted of a master and a remote unit and measured the time it took a reflected microwave signal to return to the master unit, which was then converted into distance. While I don’t have any hands-on experience using these units, I do remember sitting in on several meetings at the 649th where the surveyors discussed what a headache they were to operate and maintain. By 1980s standards this was old technology and the units the Army had adopted were becoming maintenance nightmares. But in their time these distance-measuring units were a revolutionary time saver. Note the headset the Soldier is using. He’s actually talking to the operator at the remote unit via a built-in radio link. The two operators had to continuously coordinate settings and monitor performance during the distance measurement operation.
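
The arithmetic behind the measurement is simple even if the hardware wasn’t: the signal travels out and back at essentially the speed of light, so the one-way distance is half the round-trip time multiplied by c. A back-of-the-envelope version, ignoring the atmospheric corrections the operators had to apply, looks like this:

```python
# Back-of-the-envelope time-to-distance conversion for a reflected signal.
# Real instruments also corrected for temperature, pressure and humidity.
SPEED_OF_LIGHT = 299792458.0  # meters per second (vacuum value)

def round_trip_to_distance(round_trip_seconds):
    """One-way distance for a signal that travels out and back."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A round trip of about 67 microseconds corresponds to a 10 km line:
print(round_trip_to_distance(66.7e-6))  # roughly 10,000 meters
```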

 

Survey3

Here is the back of the Tellurometer unit showing the instrument settings panel.

 

Calculator

This picture shows the only Soldier in this series that I’ve met. CW2 Thomas (on the right) is demonstrating a new Hewlett-Packard calculator to a visiting British Army officer. CW2 Thomas was one of several survey warrant officers assigned to the 649th. We met at Fort Bragg, North Carolina years after this photo was taken. The Hewlett-Packard (HP) calculators are interesting because they were widely adopted by surveyors due to their rugged construction and advanced functions that were well suited to surveying applications. While I don’t think any HP pocket calculator was ever officially adopted by the Army, they were in wide use at the Defense Mapping School where our surveyors were trained, and many found their way into survey units through local purchase.

 

 


Photomapping

Photomapping is the process of compiling a topographic map based on information seen in the aerial photographs. Military topographic maps consisted of at least five distinct information layers, each with its own color – cultural features (black), water (blue), vegetation (green), contour lines (brown) and boundaries and built-up areas (red). It is the job of the cartographer to extract each of these information layers from the aerial photography to create a map manuscript. First the cartographers would rectify each aerial photo by removing any tip or tilt and tying it to the survey control points. Then they would use a device called a multiplex plotter to project the photos in 3D so the cartographer could trace out the key features while viewing the photos in stereo. Once the information was traced onto a manuscript sheet it was passed over to other cartographers who re-drew it using the precise symbols we see on the finished map.

Carto3

This photo shows a cartographer setting up a multiplex plotter in preparation for tracing out a new map information layer using photomapping techniques. The multiplex plotter used a stereo pair of aerial photos to project a 3-dimensional image of the terrain onto the white disc or platen of the multiplex tracing ‘table’ (the device with the white disc seen sitting on the table). The tracing table has a small tracing dot engraved on it that sits directly above a pen holder. The stereo images have been transferred to two small glass diapositives, and in this picture you see the cartographer holding one of the diapositives in his left hand as he mounts it into the projection stage. The tall ‘can’ in his right hand is the projector assembly that contains the projection lamp, lens assembly and filter. One diapositive is filtered red, the other blue, and the operator wears a pair of glasses with one red and one blue lens (just like the old-time 3D movie glasses), enabling him to see the projected image in stereo.

 

multiplex plotter

Here we see a cartographer tracing data from the projected image onto a manuscript sheet (usually a sheet of dimensionally stable material like Mylar). You can see the entire image projected onto the table surface just under his forearm, but only the small area projected onto the multiplex table platen is in focus. Engraved on the center of the platen is a small tracing ‘dot’, and mounted directly below it is a tracing pencil. The cartographer carefully adjusts the platen up or down so the tracing dot appears to rest directly on the ground in the stereo image, and he then begins to trace out features. He traces one layer type onto each sheet: one sheet for cultural features, one sheet for hydrology (water), one sheet for vegetation, etc.

 

Cartographer

Once a manuscript map layer is drawn it is turned over to other cartographers who carefully trace out the data using approved map symbols and line types. This is called the compilation process, where the cartographers compile the data into standard formats. Map compilation is precise and exacting work, and a cartographer can spend days, sometimes weeks, working on a single sheet.

 

Carto2

Once each manuscript layer is complete, the information it holds is photographically ‘burned’, or transferred, to a specially coated plastic called either scribe coat or peel coat (seen above). The words ‘scribe’ and ‘peel’ describe the manner in which the orange coating is removed to create clear windows through which a photographic negative can be exposed. Because the orange coating completely blocks all the light wavelengths that a film negative is sensitive to, the orange areas come out black (or unexposed) when the negative is processed. Cartographers use specially designed scribing tools to carefully etch away the areas that represent point or linear features like individual buildings or roads. For larger areas like lakes or farm fields a sheet of peel coat is used, which allows large areas to be carefully cut with a sharp knife or razor blade and peeled away.

 

Carto4

All military maps have grids, and the grid on each map is unique based on the area of the world it covers. Calculating and drawing these grids requires great precision and accuracy since the grids must be exact or the grid coordinates a Soldier derives from an improperly gridded map could be hundreds or thousands of meters off. Here a cartographer is setting up an automated plotting device used to precisely draw the grid for a particular map sheet. Automated tools like this greatly reduced the human error often encountered when drawing grids and speeded up map compilation.

 


Map Reproduction

Once the cartographers completed the map compilation phase the manuscript sheets were turned over to the Reproduction Platoon for all the steps necessary to print the final map. This normally involved preparing negatives from the scribe and peel coat layers prepared by the Photomapping Platoon, editing and correcting the negatives, making press plates and finally, printing the map.

Layout3

To speed up the map compilation process cartographers made heavy use of pre-printed text. Things like standard place names (cities, towns, etc.), major feature names (rivers, mountains, etc.) and road identification symbols (highways, autobahns, etc.) would all be identified using standardized text prepared by the Reproduction Platoon. The cartographer would submit a list of feature names with text style and size requirements, and the Reproduction Platoon personnel would provide the text on clear adhesive-backed sheets created using a photo transfer process. Here we see a Soldier setting up some text as requested on the order sheet attached to the clipboard.

 

Layout1

One of the last steps in map production is making the negatives from which the press plates are produced. Here we see a Soldier from the Reproduction Platoon doing a final check of a negative before approving it for plate production.

 

Layout2

The final step before going to press is plate making, or ‘burning’. Press plates are just thin sheets of aluminum specifically sized to fit on a printing press. A map layer negative representing all features of the same color (black, blue, green, brown or red) is placed on top of a press plate that has been coated with a photo-sensitive emulsion, and the two are placed in a vacuum frame plate maker that uses a high-intensity lamp to ‘burn’, or expose, the positive image onto the plate. Once the plate is burned it is washed to remove the emulsion, and the resulting image is what gets printed on the map. In this photo we see a Soldier from the Reproduction Platoon doing a final cleaning of a press plate before sending it on to the press section.

 

Press3

The 649th ran a number of presses of different sizes and capacities, everything from small trailer-mounted presses capable of producing only 1:50,000 and 1:250,000 scale maps to large-format presses permanently installed in the battalion’s base plant and capable of producing over-sized maps and other geographic products. In this picture we see a press operator from the Reproduction Platoon loading a press plate onto a van-mounted Harris offset press.

 

Press2

Here’s a photo of one of the 649th’s presses installed in its base plant at Tompkins Barracks in Schwetzingen.

 

Press1

Here’s a photo of the feeder end of another one of the 649th’s large format presses. In this photo it looks like already printed map sheets are being fed back through the press to add another information layer or military overprint.

 

Map Distro

The very last step in map production is actually map distribution. The 649th also had a Map Distribution Platoon that stocked and distributed the printed maps to units all across Europe. The platoon was responsible for getting the most current maps into the hands of the front line soldier as quickly as possible. The 649th maintained a map distribution warehouse at Tompkins Barracks and at several contingency sites around Europe. The platoon even had specially designed vans that were mobile distribution warehouses that could service forward deployed headquarters. Here we see Soldiers from the Map Distribution Platoon restocking maps at the distribution warehouse at Tompkins Barracks.

 

And there you have it, Army field map production circa 1970. As I mentioned at the start of this post, I’ve taken a lot of liberties by over-simplifying the map making process with the intent of highlighting the Soldiers and activities of the 649th. If any readers recognize any of the Soldiers shown in these photos (or if you happen to be one of these Soldiers) I’d love to hear from you. You can either add a comment to this post or contact me at oldtopographer(at)gmail.com. If I’ve made any factual errors in describing the map making process, or if I’ve mis-identified any of the processes shown in these photos, please leave a comment here and I’ll make the necessary corrections. Thanks, and I hope you’ve enjoyed this trip down memory lane!

– Brian