Tag Archives: LiDAR

A script to find and download GEDI passes

The Global Ecosystem Dynamics Investigation (GEDI) is a spaceborne lidar instrument mounted on the International Space Station (ISS). The GEDI instrument is a geodetic-class lidar with 3 lasers that produce 8 parallel tracks. Each laser illuminates a 25 m footprint on the ground and fires 242 times per second. Footprints are separated by 60 m in the along-track direction, with an across-track distance of 600 m between each of the 8 tracks. GEDI’s precise measurements of forest canopy height, canopy vertical structure, and surface elevation will be critical to characterizing and understanding carbon and water cycling processes to further our knowledge of the world we live in: http://www.gedi.umd.edu.

GEDI is mounted on the ISS and its orbit is therefore dictated by that of the ISS. This prevents repeat acquisitions in the way that Landsat or Sentinel satellite data are collected; while this enables wider coverage, data acquisition is not consistent. Further to this, GEDI has the ability to point its lasers, so the area imaged on the ground is not necessarily that at nadir beneath the sensor.

To date, GEDI data is available via two means:

Earthdata (https://search.earthdata.nasa.gov/search) provides the data but does not currently provide visualization sufficient for users to see whether the data intersect their region of interest (ROI).

Alternatively, the NASA LP DAAC provides the data in list/FTP format (https://e4ftl01.cr.usgs.gov/GEDI/). This allows easy access to all the data but requires that the user knows which orbit they require. Since February 2020, this has been supplemented by a web tool called GEDI Finder (https://lpdaacsvc.cr.usgs.gov/services/gedifinder). Users pass in bounding box coordinates alongside the GEDI product and version they require and it returns the subset of files that intersect their bounding box. While this is a simple method for finding data, it requires that a user either downloads each file manually or passes each name to software such as wget or curl to pull the data.

SearchPullGEDI.py was written to make the best use of these existing tools and automate the whole search and download process. It relies heavily on the GEDI Finder tool to search for data that intersect a bounding box and is, in its simplest form, a wrapper for this tool. It also allows an optional date range to be specified if users are only interested in data collected during a specific time period. SearchPullGEDI.py takes a number of command-line options: the GEDI product, version, bounding box, output path and Earthdata login credentials. An example call is:

python SearchPullGEDI.py -p GEDI02_B -v 001 \
-bb 0.3714633 9.277247 -0.08164874 10.00922 \
-d 2019-01-01 2019-04-30 \
-o /Users/Me/Data/Gedi \
-u MyEarthDataUsername -pw MyEarthDataPassword

whereby:

-p is the GEDI product (e.g., GEDI02_B)

-v is the GEDI version (e.g., 001)

-bb is the bounding box given as UpperLeftLat (maxY), UpperLeftLon (minX), LowerRightLat (minY) and LowerRightLon (maxX). These should be passed to the terminal separated by spaces.

-d is the optional date range specified in the format YYYY-MM-DD, with the start and end dates separated by a space

-o is the local path where you want to download the data

-u is your Earthdata login username (note: an Earthdata account is required)

-pw is your Earthdata login password

Running:

python SearchPullGEDI.py -h

will also provide this information.

SearchPullGEDI.py contains three functions, sketched in the example after this list:

  • The first constructs the query from the command-line options, passes it to the existing GEDI Finder tool and pulls the list of GEDI files from the returned HTML page.
  • The second function parses the search results pulled from the GEDI Finder URL and constructs a list in which each element is a separate GEDI H5 file.
  • The final function iterates over the list of H5 files and downloads each one using wget.
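
The sketch below is illustrative only: the function names are invented here and the exact GEDI Finder query parameters are assumptions, so refer to the real script (linked below) for the details.

import subprocess
import urllib.request

def build_gedi_finder_url(product, version, bbox):
    """Build a GEDI Finder query URL (parameter names are assumed)."""
    bbox_str = ",".join(str(b) for b in bbox)  # UL lat, UL lon, LR lat, LR lon
    return ("https://lpdaacsvc.cr.usgs.gov/services/gedifinder"
            "?product={0}&version={1}&bbox=[{2}]".format(product, version, bbox_str))

def parse_file_list(url):
    """Read the search results and keep the lines that are .h5 granule URLs."""
    page = urllib.request.urlopen(url).read().decode("utf-8")
    return [line.strip() for line in page.splitlines() if line.strip().endswith(".h5")]

def download_files(file_urls, out_dir, username, password):
    """Pull each granule with wget (real Earthdata downloads may also need cookie handling)."""
    for granule_url in file_urls:
        subprocess.check_call(["wget", "--user=" + username, "--password=" + password,
                               "-P", out_dir, granule_url])

if __name__ == "__main__":
    search_url = build_gedi_finder_url("GEDI02_B", "001",
                                       [0.3714633, 9.277247, -0.08164874, 10.00922])
    download_files(parse_file_list(search_url), "/Users/Me/Data/Gedi",
                   "MyEarthDataUsername", "MyEarthDataPassword")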

SearchPullGEDI.py is available here: https://bitbucket.org/nathanmthomas/bucket-of-rs-and-gis-scripts/src/master/SearchPullGEDI.py

SearchPullGEDI.py requires that wget is installed. On Linux this can be installed through the package manager if not already present; on macOS you can install it from conda-forge using:

conda create -n gedi -c conda-forge python wget

The command will print the number of files found and then start downloading them. Each file is approximately 1 GB.

Author: Nathan Thomas (@DrNASApants)

Nathan is a UMD Earth System Science Interdisciplinary Center (ESSIC) PostDoc based at the NASA Goddard Space Flight Center. Nathan’s research is focused primarily on land cover mapping and characterizing the above ground structure of vegetation, particularly mangrove forests. Through this he uses Python to pull, preprocess, calibrate, analyze and display remote sensing data. Some of this code is distributed through his bitbucket: https://bitbucket.org/nathanmthomas/bucket-of-rs-and-gis-scripts/src/master/

Working with full waveform LiDAR data in SPDLib (part 2)

The first post of this series was written quite a while ago now. Apologies it has taken so long for a follow-up. Since the first post was written there have been two exciting developments:

  1. The methods described for generating full waveform metrics have been used to perform the LiDAR analysis for a paper led by Chloe Brown, University of Nottingham: Brown, C.; Boyd, D.S.; Sjögersten, S.; Clewley, D.; Evers, S.L.; Aplin, P. Tropical Peatland Vegetation Structure and Biomass: Optimal Exploitation of Airborne Laser Scanning. Remote Sens. 2018, 10, 671. https://doi.org/10.3390/rs10050671
  2. SPDLib is now available on Windows, macOS and Linux through conda-forge (as is RSGISLib). See part one for updated install instructions.

At the end of part one we had imported the LAS 1.3 file into SPDLib and decomposed the waveforms. This next section will cover ground classification and metrics generation.

  1. Spatially index data

    In part one we had been working with an SPD file without a spatial index (UPD file). However, for subsequent processing steps a spatial index is needed, so a spatially indexed file is generated using the spdtranslate command.

    As the gridding can use a lot of RAM we are going to process in tiles and then stitch them together. We create a temporary directory to store the tiles using:

    mkdir spd_tmp
    

    Then run the translate command:

    spdtranslate --if SPD --of SPD \
                 -x LAST_RETURN \
                 -b 1 \
                 --temppath spd_tmp \
                 -i LDR-FW-RG13_06-2014-303-05_subset_decomp.spd \
                 -o LDR-FW-RG13_06-2014-303-05_subset_decomp_gridded.spd
    

    The temporary directory can be removed after processing has completed:

    rm -fr spd_tmp
    
  2. Classify ground returns and populate height

    Many metrics use the height above ground rather than absolute elevation, so this must be defined. To derive heights from LiDAR data it is first necessary to determine the ground elevation so heights can be calculated above it. Within SPDLib, ground classification is achieved using a combination of two algorithms: a Progressive Morphology Filter (PMF; [1]) followed by the Multi-Scale Curvature algorithm (MCC; [2]). Both algorithms use only the discrete points rather than the waveform information.

    Apply a Progressive Morphology Filter using the following command:

    spdpmfgrd --grd 1 -i LDR-FW-RG13_06-2014-303-05_subset_decomp_gridded.spd \
              -o LDR-FW-RG13_06-2014-303-05_subset_decomp_gridded_pmf_grd.spd
    

    Then apply the Multi-Scale Curvature algorithm to the output file using:

    spdmccgrd --class 3 --initcurvetol 1 \
              -i LDR-FW-RG13_06-2014-303-05_subset_decomp_gridded_pmf_grd.spd \
              -o LDR-FW-RG13_06-2014-303-05_subset_decomp_gridded_pmf_mcc_grd.spd
    
  3. Attribute with height

    The final step of the SPD processing is to attribute each pulse with heights above ground level. An interpolation is used for ground points, similar to generating a Digital Terrain Model (DTM), but rather than using a regular grid the ground height is calculated for the position of each point.

    spddefheight --interp --in NATURAL_NEIGHBOR_CGAL \
                 -i LDR-FW-RG13_06-2014-303-05_subset_decomp_gridded_pmf_mcc_grd.spd \
                 -o LDR-FW-RG13_06-2014-303-05_subset_decomp_gridded_pmf_mcc_grd_defheight.spd
    
  4. Calculate metrics

    After all the pre-processing steps to convert the LAS 1.3 file into a gridded SPD file with heights defined, it is possible to generate a number of metrics from the waveform data. The command to calculate metrics within SPDLib (`spdmetrics`) takes an XML file in which the metrics are defined. There are a large number of metrics available, as well as operators (addition, subtraction etc.) that allow existing metrics to be combined into new metrics. The full list of metrics is available in the SPDMetrics.xml file, distributed with the source of SPDLib. Most metrics have an option to specify the minimum number of returns (`minNumReturns`): setting this to 0 will use the waveform information to calculate the metric, while setting it to 1 (the default) or above will use the discrete data. In this way full waveform and discrete metrics can be created at the same time.

    For this exercise we will be calculating Height of Median Energy (HOME) and waveform distance (WD); a detailed description of these metrics is given in [3].

    First, create a file containing these metrics: create a text file called ‘spd_metrics.xml’ and paste the text below into it:

    
    <!-- SPDLib Metrics file -->
    <spdlib:metrics xmlns:spdlib="http://www.spdlib.org/xml/">
        <!-- HOME -->
        <spdlib:metric metric="home" field="HOME"/>
        <!-- WD -->
        <spdlib:metric metric="maxheight" field="WD" minNumReturns="0"/>
    </spdlib:metrics>
    

    The same metrics file is also available from https://gist.github.com/danclewley/4eefda2200e7593f1e5e2aaa6bae2c03

    To calculate the metrics and produce an image as output, run:

    spdmetrics --image -o LDR-FW-RG13_06-2014-303-05_subset_metrics.bsq \
               -f ENVI \
               -i LDR-FW-RG13_06-2014-303-05_subset_decomp_gridded_pmf_mcc_grd_defheight.spd \
               -m spd_metrics.xml
    

    Once the command has finished, open the metrics image using:

    tuiview LDR-FW-RG13_06-2014-303-05_subset_metrics.bsq
    

More metrics can be added to the ‘spd_metrics.xml’ file as needed; it is also possible to define new metrics using the operator tags.

This post was derived from the LiDAR practical given as part of the NERC-ARF workshop held at BAS, Cambridge in March 2018. If you have any questions about working with NERC-ARF data contact the NERC-ARF Data Analysis Node (NERC-ARF-DAN); see https://nerc-arf-dan.pml.ac.uk/ or follow us on Twitter: @NERC_ARF_DAN.

[1] Zhang, K., Chen, S.-C., Whitman, D., Shyu, M.-L., Yan, J., & Zhang, C. (2003). A progressive morphological filter for removing nonground measurements from airborne LIDAR data. IEEE Transactions on Geoscience and Remote Sensing, 41(4), 872–882. http://doi.org/10.1109/TGRS.2003.810682
[2] Evans, J.S., & Hudak, A.T. (2007). A multiscale curvature algorithm for classifying discrete return lidar in forested environments. IEEE Transactions on Geoscience and Remote Sensing, 45(4), 1029–1038.
[3] Cao, L., Coops, N., Hermosilla, T., Innes, J., Dai, J., & She, G. (2014). Using Small-Footprint Discrete and Full-Waveform Airborne LiDAR Metrics to Estimate Total Biomass and Biomass Components in Subtropical Forests. Remote Sensing, 6(8), 7110–7135. http://doi.org/10.3390/rs6087110

Working with full waveform LiDAR data in SPDLib (part 1)

The SPD file format was designed around storing LiDAR pulses with digitised waveforms and associated points. The most recent version (3.3) has the ability to import waveform data from LAS files using LASlib, which is part of LAStools. Binaries are available for Linux, macOS and Windows and can be installed through conda:

conda create -n spdlib -c conda-forge spdlib tuiview
. activate spdlib
conda update -c conda-forge --all

For this example LAS 1.3 files acquired by the NERC Airborne Research Facility (NERC-ARF, previously ARSF) over Borneo using a Leica ALS50-II instrument with full waveform digitiser will be used.

Once you have registered with CEDA and applied for access to the ARSF archive these data can be downloaded from: http://browse.ceda.ac.uk/browse/neodc/arsf/2014/RG13_06/RG13_06-2014_303_Maliau_Basin/LiDAR/flightlines/fw_laser/las1.3

You can also follow along with any of the other NERC-ARF datasets or other waveform LAS files you have.

  1. Convert LAS to SPD format
    First you need a WKT file to define the projection. This step is optional but is more reliable than reading the projection from the LAS file. For this example the projection is UTM50N; you can download a WKT file using:

    wget https://bitbucket.org/petebunting/rsgis_scripts/raw/c8cf94528cdb58b753029df3bc631a2509740ad1/WKT/UTM_WGS84/UTM_WGS84_Z50_N.wkt
    

    Then convert to an unsorted SPD file (UPD).

    spdtranslate --if LAS --of UPD \
                 -x LAST_RETURN \
                 --input_proj UTM_WGS84_Z50_N.wkt \
                 -i LDR-FW-RG13_06-2014-303-05.LAS \
                 -o LDR-FW-RG13_06-2014-303-05.spd
    
  2. Subset SPD file (optional)

    As full waveform processing is quite intensive it is recommended to subset the data for the purpose of running through this tutorial; you can do this using the spdsubset command.

    spdsubset --xmin 494400 --ymin 524800 \
              --xmax 494800 --ymax 525000 \
              -i LDR-FW-RG13_06-2014-303-05.spd \
              -o LDR-FW-RG13_06-2014-303-05_subset.spd
    
  3. Decompose waveform

    One of the limitations of discrete return systems is that only a given number of ‘points’ are recorded (normally 2–4) and the rest of the information is lost. As full waveform data record the entire waveform it is possible to extract more returns after the data are acquired. A common approach to this is ‘Gaussian Decomposition’, which involves fitting Gaussian distributions to the peaks; within SPDLib this is available as the ‘spddecomp’ command.

    spddecomp --all --noise --threshold 25 \
              -i  LDR-FW-RG13_06-2014-303-05_subset.spd \
              -o  LDR-FW-RG13_06-2014-303-05_subset_decomp.spd
    

    This will still take around 5 minutes to run. If you decide to decompose the full dataset afterwards, expect it to take an hour or so.

  4. Export returns to LAS file
    The final step for this part of the tutorial is to export the returns to a LAS file, using the spdtranslate command.

    spdtranslate --if SPD --of LAS \
                 -i LDR-FW-RG13_06-2014-303-05_subset_decomp.spd \
                 -o LDR-FW-RG13_06-2014-303-05_subset_decomp.las
    

Classifying ground returns and calculating LiDAR metrics using SPDLib are covered in part two.

If you have further questions about using SPDLib please contact the mailing list (details available from https://lists.sourceforge.net/lists/listinfo/spdlib-develop). If you have any questions about working with NERC-ARF data contact the NERC-ARF Data Analysis Node (NERC-ARF-DAN); see https://nerc-arf-dan.pml.ac.uk/ or follow us on Twitter: @NERC_ARF_DAN.

Mosaic Environment Agency DTM/DSM tiles using RSGISLib

The Environment Agency have just made their high resolution LiDAR-derived Digital Surface Model (DSM) and Digital Terrain Model (DTM) data available under the Open Government Licence through http://environment.data.gov.uk/ds/survey

The files are downloaded as zipped archives containing ASCII format files. To create a mosaic from them I have created a script (sketched after the list below) using:

  1. The archive reader functionality from TuiView, which can get a list of files within a zip archive and convert them to paths which can be read with GDAL using a virtual filesystem as described in an earlier post
  2. The ‘createImageMosaic’ function in RSGISLib to create the mosaic, changing the no data value from -9999 to 0
  3. The ‘assignProj’ function in RSGISLib to assign the correct projection to the mosaic
  4. The ‘popImageStats’ function in RSGISLib to calculate stats and overviews for fast display in TuiView and other programs
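
The sketch below is illustrative: the GDAL ‘/vsizip/’ path handling is standard, but the exact RSGISLib signatures (and the WKT file name) are assumptions here, so check the RSGISLib documentation; the full mosaic_ea_lidar.py script linked below handles multiple archives and other details.

import zipfile
import rsgislib
from rsgislib import imageutils

def mosaic_ea_zip(zip_file, out_image, wkt_file):
    # List the ASCII grids inside the zip and build GDAL virtual filesystem paths
    with zipfile.ZipFile(zip_file) as archive:
        in_images = ["/vsizip/{0}/{1}".format(zip_file, name)
                     for name in archive.namelist() if name.lower().endswith(".asc")]

    # Mosaic the tiles, skipping the -9999 no data value and writing 0 as the background
    imageutils.createImageMosaic(in_images, out_image, 0.0, -9999.0, 1, 0,
                                 "KEA", rsgislib.TYPE_32FLOAT)

    # Assign the correct projection (British National Grid) from a WKT file
    with open(wkt_file) as f:
        imageutils.assignProj(out_image, f.read())

    # Calculate statistics and overviews for fast display in TuiView
    imageutils.popImageStats(out_image, True, 0.0, True)

mosaic_ea_zip("LIDAR-DSM-1M-SX36.zip", "sx36_dsm_mosaic.kea", "bng_osgb1936.wkt")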

The script mosaic_ea_lidar.py is available to download and is run using:

python mosaic_ea_lidar.py -o sx36_dsm_mosaic.kea \
           ~/Downloads/LIDAR-DSM-1M-SX36.zip

Multiple zip files can be passed in at once to be added to the same mosaic. The output format is set based on the output file extension.

A similar script using GRASS through the ARSF DEM Scripts is also available to download from here. Usage is the same.

Creating a DEM from LiDAR data using the ARSF DEM Scripts

The Airborne Research and Survey Facility Data Analysis Node (ARSF-DAN) based at Plymouth Marine Laboratory (PML) process airborne hyperspectral and lidar data acquired by the ARSF. As part of the LiDAR pre-processing a Digital Surface Model (DSM) is produced from discrete lidar returns, patched with a lower resolution DSM (normally ASTER) suitable for use in APL for hyperspectral data processing. The scripts used to produce these DSMs use GRASS. Updated versions, which use the GRASS Python bindings, have recently been made available on GitHub under a GPLv3 license:

https://github.com/pmlrsg/arsf_dem_scripts

Installation

There are two main pre-requisites for the ARSF DEM Scripts: GRASS and the open source LAStools (namely las2txt); both are available for Windows, Linux and OS X. Once these are installed, download the scripts from GitHub (direct link) and install using:

python setup.py install

As the scripts use the GRASS Python bindings they need to be run using the same version of Python used by GRASS, which will likely be Python 2.7. If you have Python 3 installed as your default Python you should be able to specify Python 2.7 by installing using:

python2 setup.py install

For more detailed installation instructions see here.

Create a DSM Mosaic

To create a mosaic of all LAS files the following command is used:

create_dem_from_lidar.py --in_projection UTM33N \
                      --outdem EUFAR11_02-2011-187_dsm.dem \
                      las1.2

This will (a simplified sketch of these steps is given after the list):

  1. Set up a GRASS database.
  2. For each line, convert the LAS file to a temporary text file using las2txt, keeping only the first returns and dropping points flagged as noise (class 7).
  3. Import the text file into GRASS using r.in.xyz.
  4. Patch all LAS files together using r.patch.
  5. Export the mosaic using r.out.gdal.
  6. Remove the GRASS database (unless --keepgrassdb is specified) and any other temp files created.
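
The sketch below assumes the GRASS Python scripting interface (grass.script), that it is run from within an existing GRASS session with the computational region already set, and that las2txt is on the path; the LAStools flags and GRASS parameters shown are illustrative and should be checked against the versions you have installed.

import glob
import subprocess
import grass.script as gscript

line_rasters = []
for las_file in glob.glob("las1.2/*.LAS"):
    txt_file = las_file.replace(".LAS", ".txt")
    # Convert the LAS file to text, keeping first returns and dropping noise (class 7)
    subprocess.check_call(["las2txt", "-i", las_file, "-o", txt_file,
                           "-parse", "xyz", "-keep_first",
                           "-drop_classification", "7"])
    raster_name = las_file.split("/")[-1].replace(".LAS", "")
    # Import the points, binning the maximum elevation in each cell of the current region
    gscript.run_command("r.in.xyz", input=txt_file, output=raster_name,
                        method="max", separator="space")
    line_rasters.append(raster_name)

# Patch the individual lines into a single mosaic and export it with GDAL
gscript.run_command("r.patch", input=",".join(line_rasters), output="dsm_mosaic")
gscript.run_command("r.out.gdal", input="dsm_mosaic",
                    output="EUFAR11_02-2011-187_dsm.dem", format="ENVI")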

The library supports horizontal and vertical transforms by setting the ‘--out_projection’ flag. For transforms from British National Grid some additional files are required to ensure an accurate transform.

  • The OSTN02 transform file, which can be downloaded from Ordnance Survey for horizontal transforms.
  • A vertical difference file between the WGS-84 ellipsoid and the Ordnance Survey datum – ARSF can provide a copy of this ready for use in the DEM scripts.

The location of these needs to be set in the arsf_dem.cfg file. This is installed with the library but can be overridden by placing a copy in the current working directory (using the same name) or the home folder (stored with the name ‘.arsf_dem’ or ‘.arsf_dem.cfg’ so it is hidden).

DSM / DTM Utility Scripts

In addition to the scripts used as part of the standard ARSF processing, based on GRASS, there are two utility scripts to create a DSM and Digital Terrain Model (DTM). These two scripts (las_to_dsm and las_to_dtm) provide a common interface to a number of open source (e.g., SPDLib, FUSION) and paid packages (e.g., LAStools). If SPDLib is installed a DTM/DSM can be created using:

las_to_dsm -o LDR-EUFAR11_02-2011-187-01_spdlib_dsm.tif \
           --hillshade LDR-EUFAR11_02-2011-187-01_spdlib_dsm_hillshade.tif \
           --projection UTM33N  \
           --method SPDLib \
           LDR-EUFAR11_02-2011-187-01.LAS

las_to_dtm -o LDR-EUFAR11_02-2011-187-01_spdlib_dtm.tif \
           --hillshade LDR-EUFAR11_02-2011-187-01_spdlib_dtm_hillshade.tif \
           --projection UTM33N  \
           --method SPDLib \
           LDR-EUFAR11_02-2011-187-01.LAS

Note: these utility scripts create a DSM/DTM using the default settings. For more control over the output, accessing the programs directly is advised. See the previous post for an example of creating a DSM/DTM using SPDLib.

For more details on the use of the ARSF DEM scripts, including creating patched DSM for use in APL see the tutorial.

Archived LiDAR data from ARSF flights is available to download from NEODC. Registered users can apply for access to the ARSF archive here.

Import ASCII format LiDAR data to SPDLib

It is common to get LiDAR data (from airborne or terrestrial systems) in ASCII format. However, the format of data within the file (which data are in which column, the delimiter, etc.) often varies. To account for these differences SPDLib uses an XML schema to define which data are in which column and the delimiter. There are some example schemas provided in the ‘schemas’ folder with the SPDLib source.

As an example to demonstrate reading an ASCII file into SPDLib, LiDAR data from NERC’s Airborne Research & Survey Facility (ARSF) is available to download from http://neodc.nerc.ac.uk. You just need to sign up for a NEODC account and then apply for access to the ARSF data.

The ASCII lidar files supplied by ARSF are space separated and contain the following columns:

Time, Easting, Northing, elevation, intensity, classification, return number, number of returns for given pulse, scan angle rank

Note: the ASCII files are only used as an example here; ARSF also supply data in LAS format, which is recommended for importing into SPDLib.

To import the points with associated intensity and classification the following schema is used:

<?xml version="1.0" encoding="UTF-8" ?>
<line delimiter=" " comment="#" ignorelines="0" >
    <field name="X" type="spd_double" index="1" />
    <field name="Y" type="spd_double" index="2" />
    <field name="Z" type="spd_float" index="3" />
    <field name="AMPLITUDE_RETURN" type="spd_uint" index="4" />
    <field name="CLASSIFICATION" type="spd_uint" index="5" />
</line>

This is saved as ‘arsf_ascii_lidar.xml’ and is passed into SPDLib using the ‘--schema’ flag:

spdtranslate -i LDR-BAS13_01-2014-063-20.txt \
             --if ASCII --schema arsf_ascii_lidar.xml \
             -o LDR-BAS13_01-2014-063-20.spd \
             --of SPD \
             -b 2

Create a DTM and DSM from LAS Format LiDAR data using SPDLib

A common product of LiDAR data is a high resolution Digital Elevation Model (DEM), which is a raster (gridded) product. There are two types of DEM: a Digital Terrain Model (DTM) is a model of the bare earth and doesn’t contain trees or buildings, while a Digital Surface Model (DSM) is a model of the surface which includes the tops of buildings and trees. The difference between these can provide the height of trees and buildings.
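
As a quick illustration of that last point, once a DSM and DTM have been created (as described below) the height layer can be obtained by differencing the two rasters. The GDAL sketch below uses the file names produced later in this post; the output name is a placeholder and both rasters are assumed to share the same grid.

from osgeo import gdal

dsm_ds = gdal.Open("LiDAR_1m_dsm.tif")
dtm_ds = gdal.Open("LiDAR_1m_dtm.tif")

# Height above ground is simply the DSM minus the DTM
height = dsm_ds.GetRasterBand(1).ReadAsArray() - dtm_ds.GetRasterBand(1).ReadAsArray()

# Write the difference out with the same geotransform and projection as the DSM
driver = gdal.GetDriverByName("GTiff")
out_ds = driver.Create("LiDAR_1m_height.tif", dsm_ds.RasterXSize,
                       dsm_ds.RasterYSize, 1, gdal.GDT_Float32)
out_ds.SetGeoTransform(dsm_ds.GetGeoTransform())
out_ds.SetProjection(dsm_ds.GetProjection())
out_ds.GetRasterBand(1).WriteArray(height)
out_ds = None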

The Sorted Pulse Data (SPD) library is a format for storing, and a set of tools for processing, full waveform and discrete return LiDAR data. More details on SPDLib are available from spdlib.org and in the following publications:

  • Bunting, P.J., Armston, J., Lucas, R. M., Clewley, D. 2013. Sorted pulse data (SPD) library. Part I: A generic file format for LiDAR data from pulsed laser systems in terrestrial environments. Computers & Geosciences, 56, pp.197–206. (link)
  • Bunting, P.J., Armston, J., Clewley, D., Lucas, R. M. 2013. Sorted pulse data (SPD) – Part II: A processing framework for LiDAR data from pulsed laser systems in terrestrial environments. Computers & geosciences, 56, pp.207–215. (link)

For Linux the software can be installed through conda. Once anaconda / miniconda has been installed (Python 3.5) SPDLib and the prerequisites can be installed using:

conda create -n spdlib_env \
   -c conda-forge -c rios -c osgeo \
   spdlib spd3dpointsviewer tuiview
source activate spdlib_env

To create the DTM and DSM the following steps are required.

  1. Convert to SPD Format

    Assuming discrete LiDAR data in LAS 1.2 format:

    spdtranslate --if LAS --of SPD -b 10 -x LAST_RETURN \
    -i LiDAR.las -o LiDAR_10m.spd
    

    The spdtranslate command can also be used with data in ASCII format. For full waveform data in LAS 1.3 format a separate script is needed – there will be more details on working with waveform data in SPDLib in a future post.

    If the LAS file doesn’t have the projection defined properly you can specify it by setting --input_proj and --output_proj to point to a WKT file with the correct projection.

    Using the -b option creates a spatially indexed point cloud, which is needed for display and many of the other algorithms in SPDLib. In this example a bin size of 10 m is used.

  2. Classify ground returns

    There are multiple algorithms in SPDLib to classify ground returns; one of the recommended algorithms is a progressive morphology filter [1]. This is available through the command spdpmfgrd.

    spdpmfgrd -r 50 --overlap 10 --initelev 0.1 --maxfilter 14 -b 0.5 \
    -i LiDAR_10m.spd -o LiDAR_10m_pmfgrd.spd
    

    To view the classified points open the file in SPDPoints viewer, and select ‘Classification’ as the point colour.

    [Screenshot: ground returns coloured by classification in SPD Points Viewer]

    Note – you might have problems running SPDPointsViewer if you’re running Linux through VirtualBox.

    Within SPDLib there is also an implementation of the multi-scale curvature algorithm [2], created at the US Forest Service. This does a good job at classifying ground returns under a forest canopy while retaining the terrain but it does not differentiate the buildings.

    spdmccgrd -r 50 --overlap 10 -i LiDAR_10m.spd -o LiDAR_10m_mccgrd.spd
    

    More details on both these algorithms and the required parameters are available in the SPDLib documentation and their respective references.

  3. Interpolate to DTM and DSM

    Once the points have been classified they need to be interpolated to create a raster DEM. A key parameter is the resolution of the raster which is generated (-b); the SPD bin size needs to be a whole number multiple of this. For example, if the SPD file has a bin size of 10 m then the output raster resolution can be 1, 2 or 5 m but not 3 m. There are different interpolation algorithms available in SPDLib; the Natural Neighbour algorithm is recommended.

    # DTM
    spdinterp --dtm --topo -r 50 --overlap 10 --in NATURAL_NEIGHBOR \
    -f GTiff -b 1 -i LiDAR_10m_pmfgrd.spd -o LiDAR_1m_dtm.tif
    
    # DSM
    spdinterp --dsm --topo -r 50 --overlap 10 --in NATURAL_NEIGHBOR \
    -f GTiff -b 1 -i LiDAR_10m_pmfgrd.spd -o LiDAR_1m_dsm.tif
    

    The DTM and DSM can be opened in TuiView. To see the differences you can create a colour composite and look at a profile (the DSM and DTM will be shown in different colours).

    # Create virtual raster stack
    gdalbuildvrt -separate LiDAR_1m_dtm_dms_stack.vrt \
        LiDAR_1m_dtm.tif LiDAR_1m_dsm.tif
    
    # Open composite in TuiView
    tuiview --rgb -b 2,1,1 --stddev LiDAR_1m_dtm_dms_stack.vrt
    

    [Screenshot: DTM / DSM colour composite displayed in TuiView]

    You can also create a hillshade image using GDAL:

    gdaldem hillshade -of GTiff LiDAR_1m_dtm.tif \
    LiDAR_1m_dtm_hillshade.tif
    

    And view in TuiView:

    [Screenshot: hillshade of the DTM displayed in TuiView]

    Using the spdinterp command it is also possible to interpolate a canopy height model (CHM) to provide the height of a forest canopy using --chm.

    This post was based on the SPDLib documentation written by Pete Bunting. A PDF of the documentation and example datasets can be downloaded from bitbucket.org/petebunting/spdlib-documentation/

    [1] Zhang, K., Chen, S., Whitman, D., Shyu, M., Yan, J., Zhang, C., 2003. A progressive morphological filter for removing nonground measurements from airborne LIDAR data. IEEE Transactions on Geoscience and Remote Sensing 41 (4), pp. 872–882.
    [2] Evans, J. S., Hudak, A. T., 2007. A multiscale curvature algorithm for classifying discrete return lidar in forested environments. IEEE Transactions on Geoscience and Remote Sensing 45 (4), pp. 1029–1038.