The Options dialog has five tabs for customizing different aspects of the GCD 5 software. All options are user-specific, meaning that any changes are saved for the current user only.

Workspace Tab

The Workspace tab controls several settings that determine the default behavior of GCD.

This video explains the different commands:

Survey Types Tab

Survey types can be customized to represent different topographic survey methods. This list controls which survey types are available as the Survey Method in the DEM Survey tab of the DEM Survey Properties dialog of the Survey Library. The error value is in map units (e.g. meters) and represents the default elevation uncertainty used for this survey method when it is a spatially uniform error estimate (i.e. the Error Calculations tab of the DEM Survey Properties dialog). The default values shown are very crude rules of thumb from our experience and the literature, and can vary dramatically depending on the specifics of survey implementation, instrumentation, sampling design, post-processing, and surface creation.
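Conceptually, a spatially uniform error estimate is just a constant error surface that matches the DEM's grid, filled with the survey method's default error value. A minimal sketch (GCD itself is not Python; the shape and error value below are illustrative, not taken from the software):

```python
import numpy as np

# Illustrative DEM grid dimensions (rows, cols) and a hypothetical
# default error of 0.1 map units for some survey method.
dem_shape = (200, 300)
default_error = 0.1

# A spatially uniform error estimate: every cell carries the same
# elevation uncertainty as the survey method's default.
error_surface = np.full(dem_shape, default_error)
```

A spatially variable error estimate would instead assign a different uncertainty to each cell, which is why the value entered here is only a default.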

This video shows you how to modify and/or add new survey types.

Symbology Tab

This tab has no functionality in the current release. It is a placeholder for allowing the user to override the default symbology when rasters loaded into the Survey Library or created by GCD are added to the map.

Graphs Tab

The Graphs tab controls the output resolution and dimensions (in pixels) of the *.png graph image files that GCD produces automatically. The default is a square graph at 1000 x 1000 pixels. The graphs currently exported by GCD include:

NOTE: All graphs produced in GCD can also be exported manually from their respective panels/dialogs (with a right click), and all the data required to produce these graphs is also exported to the output folder automatically.


Coordinate Precision Tab

This tab determines how GCD checks for grid orthogonality and dimensional divisibility. ArcGIS introduces small rounding errors into raster dimensions (width and height) and raster resolution. As a result, a raster intended to have a cell resolution of 0.1 m may actually be stored and treated as 0.0999999999999998372 or 0.10000000000000003432 (even though the raster properties may still report 0.1), which can leave the recorded height and width of the raster not evenly divisible by the cell resolution. Most users can ignore this issue, but it is critical during change detection, because it leads to unnecessary resampling of rasters and introduces minor interpolation errors into your data. When the software runs its orthogonality, divisibility, and concurrency checks, it rounds the ESRI-reported values (by default) to 4 decimal places so that these minor precision inaccuracies do not propagate. In the example above, a value reported as 0.1 is treated as 0.1000, regardless of how it is actually stored in memory. You can change that precision here:
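The effect of the rounding step can be sketched in a few lines. This is an illustration of the idea, not GCD's actual implementation (GCD is not Python); the function name, the tolerance, and the example values are assumptions:

```python
def is_divisible(extent, cell_size, precision=4):
    """Check whether a raster extent is evenly divisible by its cell
    resolution, after rounding both values to `precision` decimal
    places to absorb ESRI's floating-point storage noise."""
    extent = round(extent, precision)
    cell_size = round(cell_size, precision)
    ratio = extent / cell_size
    # Accept the check if the ratio is a whole number within tolerance.
    return abs(ratio - round(ratio)) < 10 ** -precision

# A cell size stored as 0.0999999999999998 rounds to 0.1 at the
# default 4-decimal precision, so a 100 m wide raster passes.
stored_cell = 0.0999999999999998
print(round(stored_cell, 4))             # 0.1
print(is_divisible(100.0, stored_cell))  # True
```

Raising the precision setting makes the checks stricter (more rasters fail and trigger resampling); lowering it makes them more forgiving of storage noise.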