QGIS GPG issue

Trying to get QGIS 3.8 to install on Ubuntu 19.04 caused some issues. I'm not sure if the following fix is a good way to go, but it allowed the install to proceed, and I do broadly trust the QGIS project.

With the relevant sources added to /etc/apt/sources.list:

deb https://qgis.org/ubuntu disco main
deb-src https://qgis.org/ubuntu disco main

I tried to pull the 2019 (and 2017) GPG keys:

wget -O - https://qgis.org/downloads/qgis-2019.gpg.key | gpg --import
gpg --fingerprint 51F523511C7028C3

The keys kept coming back with an (unknown) trust level, so they wouldn't be trusted and therefore (I assume) the software wouldn't be downloaded.

So I looked online and found this (https://trog.qgl.org/20091030/troubleshooting-gnupg-gpg-no-ultimately-trusted-keys-found/)

Following those instructions I edited the key to make it trusted:

1) gpg --edit-key [your key id]
2) type ‘trust’ to change the owner trust
3) select option 3, “I trust marginally”
4) type ‘quit’
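The same ownertrust change can also be scripted rather than done through the interactive menu, via gpg's --import-ownertrust. This is just a sketch: the key id from above stands in for the full fingerprint, which is what gpg actually prefers here, and the value 4 corresponds to "marginal" in the ownertrust database.

```shell
# Sketch: build an ownertrust line non-interactively.
# 4 = marginal trust (5 = full, 6 = ultimate) in gpg's ownertrust database.
# In practice use the full fingerprint printed by 'gpg --fingerprint'.
fpr="51F523511C7028C3"
trustline="${fpr}:4:"
echo "$trustline"
# Uncomment once the key has been imported:
# echo "$trustline" | gpg --import-ownertrust
```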

Like I say, that's probably a totally bad thing to do, but I needed the software and this got it installed.


Allow Ubuntu to find scanner

I had an issue with Ubuntu 16.04 and 18.04 not being able to scan using my multi-function device. Turns out I needed to install libusb:

sudo apt install libusb-0.1-4
scanimage -L

Running the scanimage command should return a message identifying your scanner.


Count files in a folder

A quick Bash command to count the number of files in a folder whose names match a search term:

ls -d *searchterm* | wc -l

So for example, count all the shapefiles in a folder called spatialdata in your home folder:

~$ ls -d spatialdata/*shp* | wc -l
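As an aside, counting by parsing ls output miscounts files with newlines in their names; a find-based version is more robust. This is a sketch run in a throwaway demo folder (the filenames are made up):

```shell
# Demo in a throwaway folder: three files, two of which match the pattern.
dir="$(mktemp -d)" && cd "$dir"
touch one.shp two.shp notes.txt
find . -maxdepth 1 -name '*shp*' | wc -l   # -> 2
```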

Zipping shapefiles using Bash

Using bash, the following command takes each .shp file in a folder, strips the .shp extension to get the base name, and then zips all of that shapefile's component files (.shp, .shx, .dbf, .prj, etc.) into an archive in the zips sub-directory (create it first with mkdir zips).


for file in *.shp; do fname="${file%.shp}"; zip "zips/${fname}.zip" "${fname}".*; done
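The extension-stripping step can be sanity-checked in plain bash with a made-up filename; `${file%.shp}` removes a trailing `.shp` from the value:

```shell
# Check the extension stripping on a made-up filename.
file="roads.shp"
fname="${file%.shp}"   # remove the trailing .shp
echo "$fname"          # -> roads
# (note: 'basename roads.shp shp' strips only "shp", leaving "roads.")
```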

Notes on processing large files with GDAL


I have recently had a project where I was provided with a reasonable number of large aerial image files. These were 600-800MB each, and there were 10-15 of them. I needed to cut out the area of interest (which spanned parts of all of the images), mosaic the images into a single representation of the site, and find a way to share the data with non-specialists.

The workflow in my head was Clip – Mosaic – Webmap

Things to note

The images were too slow to render for me to mess about trying to do this in a GUI, and the GDAL functions in QGIS3 (for me at least) don't seem to be up and running properly (mileage may vary).

A couple of things I have found when doing this in GDAL: it's best to define a data type for the bands, and to structure the data as a cloud optimised GeoTIFF. The easiest way to create cloud optimised GeoTIFFs is the following:

gdal_translate in.tif out.tif -co TILED=YES -co COPY_SRC_OVERVIEWS=YES -co COMPRESS=DEFLATE

Initially I was going to use gdal_merge.py to mosaic the clipped images as that is what is used in QGIS. However, gdal_merge.py reads everything into memory and I quickly found out that 16GB of RAM wasn’t enough as my swap partition started to be used and the whole thing ground to a halt.

The trick is to use the CPU if you can. First make a virtual raster using gdalbuildvrt, and then use gdal_translate (which utilises the CPU) to change the vrt file to whatever format you want (e.g. cloud optimised geotiff). This was actually really fast, and used the structure found in the following commands:

gdalbuildvrt output.vrt /path/to/folder/of/*.tif
gdal_translate -of GTiff output.vrt mosaic.tif

This ends up with an image that will be relatively large (1.2 GB in my case) which is still tricky to share. But fear not! We can use gdal2tiles.py to create a whole load of tiles and automatically generate some leaflet code to allow you to see the data. Then it’s just a question of moving that folder onto a web server and sending your contacts the link.

A quick thing to note/remember is that the webmap zoom factors relate to the following:

  • 0 represents the whole world (1:500,000,000)
  • 19 is very close up (1:1000)
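The scale denominator roughly halves with each zoom step, so as a rough sketch (using the zoom-0 figure above) scale(z) ≈ 500,000,000 / 2^z:

```shell
# Rough scale denominator at a given zoom level,
# halving per level from 1:500,000,000 at zoom 0.
zoom=19
scale=$(( 500000000 / (1 << zoom) ))
echo "$scale"   # ~1:953, close to the 1:1000 quoted above
```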


The workflow that I used in the end was as follows:

First, crop each aerial image to the site boundary shapefile:

cd ${maindir}

for f in *.tif; do
gdalwarp -ot Byte -of GTiff -cutline siteBoundary.shp \
-crop_to_cutline -dstnodata 0.0 -overwrite ${f} \
outputfolder/${f}_cropped.tif -co TILED=YES -co COPY_SRC_OVERVIEWS=YES
done

Then build the virtual image:

gdalbuildvrt output.vrt /path/to/folder/of/*.tif

Output the virtual image to a cloud optimised GeoTIFF:

gdal_translate -ot Byte -of GTiff -co TILED=YES \
-co COPY_SRC_OVERVIEWS=YES -co COMPRESS=DEFLATE output.vrt mosaic.tif

Create the web tiles and map file using the following command:

gdal2tiles.py -s EPSG:27700 -z 11-19 mosaic.tif



