Do you also sometimes get maps which contain zero (0) rather than NULL (no data) in some areas? This can easily be solved with “floodfilling”, even in a GIS.
My original map looks like this (here, Trentino elevation model):
Now what? In paint software we would simply use the bucket fill tool, but what about GIS data? Well, we can do something similar using “clumping”. It requires a bit of computational time but works perfectly, even for large DEMs, e.g., all of Italy at 20m resolution. Using the open source software GRASS GIS 7, we can compute all “clumps” (there are many of them for a floating point DEM!):
# first we set the computational region to the raster map:
g.region rast=pat_DTM_2008_derived_2m -p
r.clump pat_DTM_2008_derived_2m out=pat_DTM_2008_derived_2m_clump
The resulting clump map produced by r.clump is nicely colorized:
As we can see, the area of interest (province) is now surrounded by three clumps. With a simple map algebra statement (r.mapcalc or GUI calculator) we can create a MASK by assigning these outer boundary clumps to NULL and the other “good” clumps to 1:
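A minimal sketch of such a statement (the three clump category numbers are placeholders; check the actual IDs of the outer clumps, e.g. with r.report or by querying the clump map):
# create the mask map used below; the outer clump IDs 1, 2 and 3 are examples only
r.mapcalc "no_data_mask = if(pat_DTM_2008_derived_2m_clump == 1 || pat_DTM_2008_derived_2m_clump == 2 || pat_DTM_2008_derived_2m_clump == 3, null(), 1)"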
We now activate this MASK and generate a copy of the original map into a new map name by using map algebra again (this just keeps the data matched by the MASK). Eventually we remove the MASK and verify the result:
# apply the mask
r.mask no_data_mask
# generate a copy of the DEM, filter on the fly
r.mapcalc "pat_DTM_2008_derived_2m_fixed = pat_DTM_2008_derived_2m"
# assign a nice color table
r.colors pat_DTM_2008_derived_2m_fixed color=srtmplus
# remove the MASK
r.mask -r
And the final DEM is now properly cleaned up in terms of NULL values (no data):
LiDAR point cloud data are commonly delivered in the ASPRS LAS format. The format is supported by libLAS, a BSD-licensed C++ library for reading/writing these data. GRASS GIS 7 supports the LAS format directly when built against libLAS (as is the case for most binary packages available for download).
In this exercise we will import a sample LAS data set covering a tiny area close to Raleigh, NC (USA), belonging to the North Carolina sample data set. Sample LAS data download: https://grass.osgeo.org/sampledata/north_carolina/ (25MB).
For a full exercise, we will, however, assume that no GRASS GIS location exists yet (so: newbies are welcome!) and create a new one first.
1. Having the LAS file: now what?
In the first place, check the metadata of the LAS file using the lasinfo command (comes with libLAS; here only parts of the output shown):
lasinfo points.las
---------------------------------------------------------
Header Summary
---------------------------------------------------------
Version: 1.2
Source ID: 0
Reserved: 0
Project ID/GUID: '00000000-0000-0000-0000-000000000000'
System ID: 'libLAS'
Generating Software: 'libLAS 1.2'
[...]
Spatial Reference:
None
[...]
---------------------------------------------------------
Point Inspection Summary
---------------------------------------------------------
Header Point Count: 1287775
Actual Point Count: 1287775
Minimum and Maximum Attributes (min,max)
---------------------------------------------------------
Min X, Y, Z: 6066629.86, 2190053.45, -3.60
Max X, Y, Z: 6070237.92, 2193507.74, 906.00
Bounding Box: 6066629.86, 2190053.45, 6070237.92, 2193507.74
Time: 0.000000, 0.000000
Return Number: 1, 3
Return Count: 1, 7
Flightline Edge: 0, 0
Intensity: 0, 256
Scan Direction Flag: 0, 0
Scan Angle Rank: 0, 0
Classification: 2, 7
Point Source Id: 0, 0
User Data: 0, 0
Minimum Color (RGB): 0 0 0
Maximum Color (RGB): 0 0 0
Number of Points by Return
---------------------------------------------------------
(1) 1225886 (2) 61430 (3) 459
Number of Returns by Pulse
---------------------------------------------------------
(1) 30877 (2) 153 (5) 1225886 (6) 30706 (7) 153
Point Classifications
---------------------------------------------------------
647337 Ground (2)
639673 Low Vegetation (3)
740 Building (6)
25 Low Point (noise) (7)
-------------------------------------------------------
0 withheld
0 keypoint
0 synthetic
-------------------------------------------------------
We see: no spatial reference system indicated!
Luckily we know that the projection is NAD83(HARN) / North Carolina (LCC 2SP, metric), EPSG code 3358. Furthermore we see:
Number of Points by Return: 3 (i.e., first, mid, last)
Point Classifications: the points are already classified as “Ground” (class 2), “Low Vegetation” (class 3), “Building” (class 6), and “Low Point (noise)” (class 7). Something to play with later.
Time to create a GRASS GIS location and import the LAS file.
2. Creating a GRASS GIS location for the LAS file
Since we know the EPSG code of the projection, that’s an easy task. Please note that GRASS GIS can generate locations directly from Shapefiles (with .prj file), GeoTIFF files, and more.
We fire up GRASS GIS 7 and open the Location Wizard:
In the Location Wizard, we first define a name for the new location:
We select the “EPSG code” method for creating a new location:
You can search for “North Carolina” and select the EPSG code 3358 from the list:
The next summary should show up as follows (be sure that the metric projection is shown!):
With “Finish” you reach this notification (indeed, nothing to change! It is already fine):
Since we want to import the LAS file, no need to manually define any region extent here – just say “No”:
While we could also import the data into the PERMANENT mapset, we prefer to create a separate mapset “lasdata” for our LAS data (once you reach hundreds of maps to manage, you will be happy about the concept of mapsets):
Voilà, we get back to the initial startup screen and can now start our GRASS GIS session with our “nc_nad83_lcc” location and “lasdata” mapset within the location: “Start GRASS”!
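For command-line users, the same location can alternatively be created non-interactively (a sketch, assuming the GRASS 7 startup script is called grass70, supports the -c flag, and the GRASS database lives in ~/grassdata):
# create the new location from the EPSG code and start GRASS in it
grass70 -c EPSG:3358 ~/grassdata/nc_nad83_lcc
# inside the session, create and switch to the "lasdata" mapset
g.mapset -c mapset=lasdata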
3. Import of the LAS file
When creating a new location from a GeoTIFF or SHAPE file (or other GDAL supported format), the data set can be imported right away. This is not the case for LAS files, not least because we can directly apply binning statistics (e.g. percentiles, min or max) during import of the LAS file and create a raster surface from the points right away rather than importing them as vector points.
3. a) Creating a raster surface from LAS during import
The LAS import into a raster surface is available through r.in.lidar:
First the LAS file needs to be selected and an output file name specified (in this example, we want to extract the 95th percentile as binning method, hence a reasonable map name):
In the “Statistic” tab, we select the “percentile” method along with 95 as value:
In the “Optional” tab we activate to extend the computational region from the LAS file and, since the spatial reference system metadata are lacking from the LAS file, also “override dataset projection” to use that predefined in the location. Finally, we define 5m as desired raster resolution for the resulting raster map:
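For scripted use, a sketch of the equivalent command line (the output map name is an example; -e extends the region from the data, -o overrides the projection check):
r.in.lidar -e -o input=points.las output=points_percentile_95 method=percentile pth=95 resolution=5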
Upon completion of the import/binning, the new raster elevation map is shown after zooming to the map (r.in.lidar -e … restores the previous region settings upon completion, hence we may have to zoom):
Now we can start to analyze or visually explore the imported LAS file.
4. Visual LiDAR data exploration
Using the wxNVIZ 3D viewer, we can easily fly through the new DEM. Switch in the Map Display to “3D view” (1). Note that the default rendering is initially done at low visual resolution for speed reasons. You can switch to “Rotate mode” as well to easily navigate with the mouse. In the “Data” tab (2) you can increase the visual resolution (3) to obtain a crisp view:
Recently, in order to nicely plan ahead for a birthday lunch at the agriturismo Malga Brigolina (a farm-restaurant near Sopramonte di Trento, Italy, at 1,000m a.s.l. in the Southern Alps), friends of mine asked me the day before:
“Will the place be sunny at lunch time for making a nice walk?“
[well, the weather was close to clear sky conditions but mountains are high here and cast long shadows in winter time].
A rather easy task, I thought, so I got my tools ready since this was an occasion to verify the predictions with some photos! Thanks to the new EU-DEM at 25m I was able to perform the computations right away in a metric system rather than dealing with degrees in LatLong.
Direct sunlight can be assessed from the beam radiation map of GRASS GIS’ r.sun when running it for a specific day and time. But even easier, there is a new Addon for GRASS GIS 7 which calculates right away time series of insolation maps given start/stop timestamps and a time step: r.sun.hourly. This shrinks the overall effort to almost nothing.
Creating the direct sunlight maps
The first step is to calculate where direct sunlight reaches the ground. Here the input elevation map is the European “eu_dem_25”, while the output is the beam radiation for a certain day (15 Dec 2013 is DOY 349).
Important hint: the computational region must be large enough to the east, south, and west to capture the cast shadow effects of all relevant surrounding mountains.
I let the calculations start at 8am and finish at 5pm with an hourly time step. The authors have kindly parallelized r.sun.hourly, so I let it run on four processors simultaneously to speed things up:
# calculate DOY (day-of-year) from given date:
date -d 2013-12-15 +%j
349
# calculate beam radiation maps for a given time period
# (note: minutes are to be given in decimal time: 30min = 0.5)
r.sun.hourly elev_in=eu_dem_25 beam_rad=trento_beam_doy349 \
start_time=8 end_time=17 time_step=1 day=349 year=2013 \
nprocs=4
The ten resulting maps contain the beam radiation for each pixel, considering the cast shadow effects of the pixel-surrounding mountains. However, the question was not to calculate irradiance raster maps in Wh/m^2 but simply “sun-yes” or “sun-no”. So a subsequent filtering had to be applied to each beam radiation map: if the pixel value equals 0, then “sun-no”, otherwise “sun-yes” (what my friends wanted to know; effectively a conversion into a binary map).
Best done in a simple shell script loop:
for map in `g.mlist rast pattern="trento_beam_doy349*"` ; do
# rename current map to tmp
g.rename rast=$map,$map.tmp
# filter and save with original name
r.mapcalc "$map = if($map.tmp == 0, null(), 1)"
# colorize the binary map
echo "1 yellow" | r.colors $map rules=-
# remove cruft
g.remove rast=$map.tmp
done
As a result we got ten binary maps, ideal for using them as overlay with shaded DEMs or OpenStreetMap layers. The areas exposed to direct sunlight are shown in yellow.
Trento, direct sunlight, 15 Dec 2013 between 10am and 5pm (See here for creating an animated GIF). Quality reduced for this blog.
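One simple way to assemble such an animation (a sketch using ImageMagick, assuming the individual time steps were exported from the map display as PNG frames with these example names):
# combine the exported frames into an animated GIF
convert -delay 75 -loop 0 trento_sun_doy349_*.png trento_sun_doy349.gif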
Time to look at some details: So, is Malga Brigolina in sunlight at lunch time?
Situation at 12pm (noon): predicted that the restaurant is still in shadow – confirmed in the photo:
(click to enlarge)
Situation at 1:30/2:00pm: the sun is getting closer to the Malga, as confirmed in the photo (note that the left photo is 20min ahead of the map). The small street in the right photo is still in the shadow, as predicted in the map:
(click to enlarge)
Situation at 3:00pm: sun here and there at Malga Brigolina:
Sunlight map blended with OpenStreetmap layer (r.blend + r.composite) and draped over DEM in wxNVIZ of GRASS GIS 7 (click to enlarge). The sunday walk path around Malga Brigolina is the blue/red vector line shown in the view center.
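For reference, the blending mentioned in the caption can be sketched as follows (map names and the blend percentage are examples; the exact suffix of the hourly maps depends on r.sun.hourly’s naming):
# blend the 3pm sunlight map with the OSM layer; r.blend writes sun_osm.r/.g/.b
r.blend first=trento_beam_doy349_15 second=osm output=sun_osm percent=60
# combine the three components into one displayable RGB map
r.composite red=sun_osm.r green=sun_osm.g blue=sun_osm.b output=sun_osm_blend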
Situation at 4:00pm: we are close to sunset in Trentino… view towards the Rotaliana (Mezzocorona, S. Michele all’Adige), last sunlit summits also seen in photo:
(click to enlarge)
Outcome
The resulting sunlight/shadow maps appear to match reality nicely. Perhaps r.sun.hourly should get an additional flag to generate the binary “sun-yes” / “sun-no” maps directly.
Direct sunlight zones (yellow, 15 Dec 2013, 3pm): Trento with Monte Bondone, Paganella, Marzolan, Lago di Caldonazzo, Lago di Levico and surroundings (click to enlarge)
GRASS GIS usage note
The wxGUI settings were as simple as this (note the transparency values for the various layers):
To warm up, I registered the “normally” shaded DEM (previously generated with gdaldem) with r.external in a GRASS GIS 7 location (EPSG 3035, LAEA) and overlaid the OpenStreetMap layer using WMS with GRASS 7’s r.in.wms. An easy task thanks to University of Heidelberg’s www.osm-wms.de. Indeed, they offer a similar shading via WMS; however, in the screenshot below you see the new EU data being used so that we can control the lighting ourselves:
OpenStreetMap shaded with EU DEM 25m (click to enlarge)
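A sketch of those two steps (file name, URL, and layer name below are placeholders; the actual WMS connection details must be taken from the osm-wms.de service documentation):
# register the shaded DEM prepared with gdaldem, without importing it
r.external input=eu_dem_25_shade.tif output=eu_dem_25_shade
# fetch the OSM layer for the current region via WMS (placeholder URL/layer)
r.in.wms url="http://www.osm-wms.de/service" layers=osm output=osm format=png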
Next item: trying r.shaded.pca… It supports multi-core calculation and the possibility to strengthen the effects through z-rescaling. In my example, I used:
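A minimal sketch of such a call (map names are examples; the exact options for multi-core processing and z-rescaling should be checked in the module manual, e.g. via g.manual r.shaded.pca):
# minimal r.shaded.pca call with example map names
r.shaded.pca input=eu_dem_25 output=eu_dem_25_pca_shade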
EU DEM 25m upper Garda Lake area with Riva del Garda and Arco (Italy). 3D view in wxNVIZ – GRASS GIS 7
The 25m European Digital Elevation Model (EU-DEM, Version 1) is a Digital Surface Model (DSM) representing the first surface as illuminated by the sensors:
EU DEM Rotaliana with Mezzocorona and S. Michele (Italy). Produced using Copernicus data and information funded by the European Union – EU-DEM layers.
Its elevations were captured at 1 arc second postings (2.78E-4 degrees). The tiles are provided at 25m resolution in EU-LAEA (EPSG 3035) projection, temporal coverage: 2000, published in Oct 2013. It is a realisation of the Copernicus programme, managed by the European Commission, DG Enterprise and Industry. Metadata are provided here. According to their “Methodology” page it is a hybrid product based on SRTM and ASTER GDEM data fused by a weighted averaging approach, and it has been generated as a contiguous dataset divided into 1 degree by 1 degree tiles, corresponding to the SRTM naming convention. In addition to the DEM data, a colour shaded relief image over Europe is provided.
From the metadata page: “The EU-DEM data are provided as is, i.e. without a formal validation yet. An independent statistical validation is scheduled as part of the GIO land monitoring service activities, and will be made available in the course of 2014.”
Data download
Note that the GeoTIFF files are quite large, up to 5 GB each:
Hint for GRASS GIS users: instead of importing the data, you can use the r.external command to register the GeoTIFF DEM file within an EU LAEA projected location.
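A minimal sketch of that approach (the GeoTIFF file name is an example):
# register the large GeoTIFF in the EU LAEA location without copying the data
r.external input=eu_dem_v1.tif output=eu_dem_25
# set the computational region to match the registered map
g.region rast=eu_dem_25 -p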
USGS has published a nice new data set called the “Global Multi-resolution Terrain Elevation Data 2010” (GMTED2010). It is offered at three different resolutions (approximately 1,000, 500, and 250 meters).
Example (MEA = mean dataset) Trento – Garda Lake – Verona area (Northern Italy):
The 250m product looks quite smooth: a nice new DEM product…
The recent publication of the ASTER GDEM (30m) is a great step towards a worldwide high resolution elevation model. I have done some tests in the Southern Alps around Trento, which is among the most complex terrains in Europe. The goal was to calculate a difference map against the local high resolution DEM.
The steps
1. Mosaicking and reprojection to UTM32/WGS84 of the ASTER GDEM tiles of the area (using gdalwarp)
2. Import of the ASTER GDEM map into GRASS (r.in.gdal)
3. Creation of a difference map to the provincial DEM (r.mapcalc, r.colors)
4. Histogram and univariate statistics (d.histogram, r.univar)
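A sketch of these steps on the command line (tile and map names are examples only):
# 1. mosaic and reproject the ASTER GDEM tiles to UTM32/WGS84 (example tile names)
gdalwarp -t_srs EPSG:32632 ASTGTM_N45E010_dem.tif ASTGTM_N46E010_dem.tif aster_gdem_utm32.tif
# 2. import into GRASS and set the computational region
r.in.gdal input=aster_gdem_utm32.tif output=aster_gdem
g.region rast=aster_gdem -p
# 3. difference map against the provincial DEM (map name is an example)
r.mapcalc "aster_diff = aster_gdem - provincial_dem"
r.colors map=aster_diff color=differences
# 4. univariate statistics of the differences
r.univar map=aster_diff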
The results (click for higher resolution or download slides as PDF):
Comments
The hydrological pattern appears to coincide with the provincial rivers map. It can be seen that lakes weren’t masked during the ASTER GDEM preparation; however, these could be identified for many areas in the world with OpenStreetMap layers. Despite spikes (including unexpected craters), the overall quality appears to be acceptable for this first version of ASTER GDEM – the standard deviation is 18m for the test area, with outliers predominantly found in areas of complex terrain rather than in the valley floors. A shift of -2.3m is also observed (a reprojection artifact?). It will be worth checking whether SRTM data (90m) could help to identify and remove the spikes from the DEM. More hopefully in the near future…
Acknowledgments: ASTER GDEM is a product of METI and NASA.