The thirteenth session of Blue Marble’s GeoTalks Express online webinar series, An Introduction to Scripting in Global Mapper, was conducted on September 2, 2020. During the live session, numerous questions were submitted to the presenters. The following is a list of these questions and the answers provided by Blue Marble’s technical support team.
Can you use relative filenames, recursive/subfolders?
With the IMPORT command you can use file paths relative to the location of the saved Global Mapper script you are running. For example, if a script is saved in the C:\Scripting_Data folder and the data is in C:\Scripting_Data\Imagery_Tiles, the image can be referenced as simply Imagery_Tiles\19TDK380050.jp2.
Global Mapper script also supports some built-in variables, including %SCRIPT_FOLDER%, defined as the folder where the Global Mapper script is saved. You can use this variable in file path parameter values to make file locations dependent on where the script is saved.
With the IMPORT_DIR_TREE command or a directory loop you do have the option to search subfolders for files to import. Use the parameter RECURSE_DIR=YES to search the subfolders of a directory.
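The three approaches above can be sketched as follows. This is a minimal illustration, not a complete script: the DIRECTORY parameter name and the sample filenames are assumptions based on the webinar's sample data; check the Scripting Reference for the full parameter list.

```
/ Import a single tile using a path relative to the saved script
IMPORT FILENAME="Imagery_Tiles\19TDK380050.jp2"

/ The same import written with the built-in script-folder variable
IMPORT FILENAME="%SCRIPT_FOLDER%\Imagery_Tiles\19TDK380050.jp2"

/ Import every file under a folder, searching its subfolders as well
IMPORT_DIR_TREE DIRECTORY="%SCRIPT_FOLDER%\Imagery_Tiles" RECURSE_DIR=YES
```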
Is it possible to use EPSG instead of the definition of the projection?
Yes, you can load a projection from a recognized EPSG code using the command and parameter LOAD_PROJ PROJ_EPSG_CODE=XXXX. This will change the workspace projection as you would through Configuration > Projection in the Global Mapper user interface.
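For example, a script line like the following would switch the workspace projection; the EPSG code shown (32619, WGS 84 / UTM zone 19N) is only an illustration and should be replaced with the code for your data.

```
/ Set the workspace projection from an EPSG code
LOAD_PROJ PROJ_EPSG_CODE=32619
```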
Is there an option to export, e.g., image tiles based on the names of shapefiles/areas opened in the project?
Yes, to tile an export based on area features from a vector file, like a shapefile, you would use the POLYGON_CROP_FILE parameter in the EXPORT command along with some other specific parameters.
The POLYGON_CROP_FILE value would be the filename or loaded layer name for the areas you would like to use as your tiles. In addition to this parameter, use POLYGON_CROP_USE_EACH=YES to export a file for each area in the layer, and POLYGON_CROP_NAME_ATTR with the value being the name of an attribute you would like to use in the exported file names. The attribute value from the POLYGON_CROP_NAME_ATTR parameter will be added to the end of the specified export file name.
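A tiled export along these lines might look like the sketch below. The EXPORT_RASTER command, the GeoTIFF type, the file paths, and the NAME attribute are illustrative assumptions; substitute the export command, format, and attribute appropriate for your data.

```
/ Export one GeoTIFF per area feature in tiles.shp, appending the
/ value of each area's NAME attribute to the output filename
EXPORT_RASTER FILENAME="%SCRIPT_FOLDER%\Output\tile.tif" TYPE=GEOTIFF POLYGON_CROP_FILE="%SCRIPT_FOLDER%\tiles.shp" POLYGON_CROP_USE_EACH=YES POLYGON_CROP_NAME_ATTR="NAME"
```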
Is it possible to run a script to automate the loading of pictures, run pixel-to-point, DEM, and contour processes?
Yes, with Global Mapper and the Lidar Module registered, you can use the GENERATE_POINT_CLOUD command to run the Pixels to Points process. The command is limited by the script format in that you cannot manually place control points or mask your input images via Global Mapper script; these actions need to be completed in the Global Mapper user interface. If you would like to use ground control points and/or image masks, you may want to create a Pixels to Points workspace in the user interface and save it. You can then reference the saved Pixels to Points workspace in the GENERATE_POINT_CLOUD command in your script.
Can you import CAD files and export into kml using script?
Is the / character like a REM type command where whatever you type after is just for reference and runs “nothing”?
Yes, the / character at the beginning of a line indicates a comment in the script. Global Mapper will not try to run any lines that begin with a forward slash character (/).
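For example (the filename here is hypothetical):

```
/ This line is a comment and will not be executed
IMPORT FILENAME="%SCRIPT_FOLDER%\elevation.dem"
```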
Is there a possibility to crop a specific area by script?
Yes, you can crop data during import, export, or when performing other analysis functions by using crop parameters. POLYGON_CROP_FILE can be used to define an area based on an existing file. POLYGON_CROP_NAME can be used to reference a shape previously defined (DEFINE_SHAPE) in the script.
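A sketch of the DEFINE_SHAPE approach is below. The SHAPE_NAME parameter, the END_DEFINE_SHAPE terminator, and the coordinate values are assumptions based on the scripting documentation; the coordinates must be given in the current workspace projection.

```
/ Define a rectangular crop area (coordinates are illustrative)
DEFINE_SHAPE SHAPE_NAME=CropArea
490000,3050000
500000,3050000
500000,3060000
490000,3060000
490000,3050000
END_DEFINE_SHAPE

/ Crop to the named shape during import
IMPORT FILENAME="%SCRIPT_FOLDER%\Imagery_Tiles\19TDK380050.jp2" POLYGON_CROP_NAME=CropArea
```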
What raster formats can you export to?
Global Mapper supports many raster formats for export. Formats supported in Global Mapper script along with additional export parameters can be found here in the Scripting Reference section of the Global Mapper knowledge base.
Can you just point to a folder with all the jp2 files and the script will import all jp2 in the folder?
Yes, you can use the IMPORT_DIR_TREE command to import all files from a directory. To limit the files being imported to a specific type, use a FILENAME_MASKS parameter to filter for a file extension.
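For example, assuming the DIRECTORY parameter name and the sample folder from the webinar data:

```
/ Import only the .jp2 files from a folder and its subfolders
IMPORT_DIR_TREE DIRECTORY="C:\Scripting_Data\Imagery_Tiles" FILENAME_MASKS="*.jp2" RECURSE_DIR=YES
```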
Where can I find the zip file of Scripting_Data?
The sample data and scripts can be downloaded using this link. To ensure that the script files work as designed, unzip the downloaded folder and copy it to the root of your C drive (C:\Scripting_Data).
With 19-Create_Watersheds.gms as an example, is it possible to join stream reaches into continuous lines starting from the outlet or from a key location? In practice, this would allow export of a series of stream profiles from the head to the outlet for all cases.
The watershed created in the sample script 19-Create_Watersheds.gms is a catchment area. The generated area shows the area from which water would flow to the point indicated by the parameter FLOW_TO_POS="3059028.37,497357.79".
The COMBINE_LINES command in Global Mapper script can combine lines in a layer, but this would not calculate new watershed attributes for the stream features. Additionally, since the streams come from different locations and flow together into larger streams, a single line from head to outlet would likely not be created.
Alternatively, a batch export of each profile to XYZ could be used and connected via an external code. Just looking for your thoughts on this.
The stream line features generated by the create watershed process are 3D lines with per-vertex elevations. This means you can export the lines to a format that preserves vertex elevations, such as the shapefile format, using the parameter GEN_3D_FEATURES=YES. Alternatively, you can create points at the line vertices and export those to an XYZ format; to create the vertex points, use the EDIT_VECTOR command with SHAPE_TYPE=LINES and CREATE_VERTEX_POINTS=YES.
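The two options above can be sketched as follows; the EXPORT_VECTOR command name, the output paths, and the SHAPEFILE type are assumptions to be checked against the Scripting Reference.

```
/ Option 1: export the 3D stream lines to a shapefile,
/ keeping the per-vertex elevations
EXPORT_VECTOR FILENAME="%SCRIPT_FOLDER%\streams.shp" TYPE=SHAPEFILE SHAPE_TYPE=LINES GEN_3D_FEATURES=YES

/ Option 2: create a point at every line vertex first,
/ then export the points to an XYZ-style format
EDIT_VECTOR SHAPE_TYPE=LINES CREATE_VERTEX_POINTS=YES
```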
Will exporting via script help bypass the size and memory issues you can run into in Global Mapper, for example when creating a points file from a large text file and then exporting it as a tab file?
Using Global Mapper script will not resolve memory limitations on your machine. Even though the data may not be displayed when running a script, Global Mapper is still loading the data and executing specific commands like exports.
If you are seeing consistent issues with a specific workflow or dataset, first check that your machine meets the system requirements for running Global Mapper. If possible, share some additional information on your workflow steps, including file sizes and feature counts, any settings you use for exports, and any errors you are seeing in the Global Mapper program.
I am running v19. I have the script folder in the root of C:\ as directed. None of the scripts are working. Is it a software version issue?
Even in the older version 19 of Global Mapper, many of the provided scripts should run. Make sure Scripting_Data is unzipped to the root of C:\. What errors or warnings are you seeing when you run the script through the Global Mapper user interface? These errors or warnings may help us determine why the scripts are not running as expected.