
AWS & satellite data: a primer

Massive thanks to Annekatrien for her original post on this topic, which is available here: http://www.digital-geography.com/accessing-landsat-and-sentinel-2-on-amazon-web-services/

This is mainly a post to remind me of the steps I took (following the blog post above) to access Sentinel-2 data on AWS (an archive developed and managed by Sinergise).

I use an Ubuntu MATE 16.04 VM with shared folders to the host machine as my processing/development box. I use the Anaconda Python 3.5 distribution, and the following has been tested and works on that system.

Python setup

I installed the AWS command-line package from the conda-forge channel:
conda install -c conda-forge awscli=1.10.44

and tested the installation using
aws help

which returned the man pages.

AWS Setup

I am currently on the free tier (https://aws.amazon.com/free/) which is fine for what is listed in this post.

Sign in and go to the Amazon Console.

Go to the Services tab at the top of the console, then select:
Security & Identity > IAM > Users > Create a new user

Ensure that the check box to generate a new key is ticked, add your user name and click Create. Download your user credentials BEFORE clicking Close.

To access the data on Amazon S3, change the permissions using the Console by clicking on
Services > Security & Identity > IAM > Users

again and then clicking on the name of your new user. This will open a summary page where you can manage a user. Click on
Permissions > Attach policy

and choose AmazonEC2FullAccess and AmazonS3FullAccess (and any others you want) before clicking Attach. The IAM user should now be set up.
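
If you already have CLI credentials with IAM permissions, the same user setup can also be scripted. This is just a sketch; rs-data-user is a hypothetical name and the policies are the two mentioned above:
aws iam create-user --user-name rs-data-user
aws iam create-access-key --user-name rs-data-user
aws iam attach-user-policy --user-name rs-data-user --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam attach-user-policy --user-name rs-data-user --policy-arn arn:aws:iam::aws:policy/AmazonEC2FullAccess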

AWSCLI Config

In the bash terminal, type:
aws configure

and enter your access key ID and secret access key (from the credentials file downloaded earlier) when prompted. For the region, use the appropriate value from the table below, based on the data you want to access. For the output format, use json.

Data          Region
Landsat       us-west-2
Sentinel-2    eu-central-1
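
Since the two archives live in different regions, one option (just a sketch; the profile names are my own) is to keep a named AWS CLI profile for each and pass --profile on every command:
aws configure --profile landsat     # enter us-west-2 as the region
aws configure --profile sentinel    # enter eu-central-1 as the region
aws s3 ls landsat-pds --profile landsat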

Accessing data

To list the data available for Landsat use the following Terminal commands:
aws s3 ls landsat-pds

and for Sentinel use
aws s3 ls sentinel-s2-l1c

As Annekatrien says in her blog post, ‘to go deeper into the storage, and see separate images, you have to know what you’re looking for’.

In general, you will use the following structure for Landsat:
aws s3 ls landsat-pds/L8/<path>/<row>/<image name>/

and this for Sentinel-2:
aws s3 ls sentinel-s2-l1c/tiles/<UTM zone number>/<grid number>/<subgrid number>/<year>/<month>/<day>/
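
For example, to list what is available for tile 30UYC on 13 April 2016 (the same tile and date used in the download example below):
aws s3 ls sentinel-s2-l1c/tiles/30/U/YC/2016/4/13/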

To download an image use one of the following commands:
aws s3 cp s3://landsat-pds/L8/201/024/LC82015242016111LGN00/ ~/Downloads/ --recursive
aws s3 cp s3://sentinel-s2-l1c/tiles/30/U/YC/2016/4/13/0/ ~/Downloads/ --recursive

More effort is then required to sort the downloads (if fetching more than one image at a time) into a sensible file structure on the local computer, as all images are downloaded to the same directory.
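
One way round this (a minimal sketch, reusing the Landsat scene above; adjust the path, row and scene ID to suit) is to copy each scene into its own directory named after the scene ID:
SCENE=LC82015242016111LGN00
aws s3 cp s3://landsat-pds/L8/201/024/$SCENE/ ~/Downloads/$SCENE/ --recursive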

GeoServer notes

This post is mainly for my own notes. On a clean install of Ubuntu 14.04 Server I went through the following steps.

1) Java

sudo apt-get update 

Then, check if Java is already installed:

java -version 

If it returns “The program java can be found in the following packages”, Java hasn’t been installed yet, so execute the following to install the default runtime environment:

sudo apt-get install default-jre 

2) GeoServer

These installation notes are based on those found on the Geoserver website.

  • Select the Stable version of GeoServer to download.
  • Select Platform Independent Binary on the download page.
  • Download the archive and unpack it into the directory where you would like the program to be located (assumed to be /usr/local/geoserver for these notes). If the archive unpacks into a versioned folder (e.g. geoserver-2.7.0), rename it to geoserver so the paths below match.
wget http://sourceforge.net/projects/geoserver/files/GeoServer/2.7.0/geoserver-2.7.0-bin.zip
cd /usr/local
sudo mv ~/geoserver-2.7.0-bin.zip .
sudo unzip geoserver-2.7.0-bin.zip
  • Add an environment variable to save the location of GeoServer by typing the following command:
echo "export GEOSERVER_HOME=/usr/local/geoserver" >> ~/.profile
. ~/.profile
  • Make yourself the owner of the geoserver folder using the following command, replacing USER_NAME with your own username:
sudo chown -R USER_NAME /usr/local/geoserver/
  • Start GeoServer by changing into the directory geoserver/bin and executing the startup.sh script:
cd /usr/local/geoserver/bin
sh startup.sh

If you see the GeoServer logo, then GeoServer is successfully installed. If you get a message about Java, you may need to set JAVA_HOME, adjusting the path to match the JVM installed on your system:

echo "export JAVA_HOME="/usr/lib/jvm/java-6-sun-1.6.0.07" >> ~/.profile
. ~/.profile
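
Once startup.sh is running, a quick sanity check (assuming the default embedded Jetty port of 8080) is to request the web admin page from another terminal:
curl -I http://localhost:8080/geoserver/web/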

ARCSI

At the end of 2014 I used ARCSI (Atmospheric and Radiometric Correction of Satellite Imagery) for the first time. ARCSI is an open-source software project that provides a command-line tool for the atmospheric correction of Earth Observation imagery. It provides a largely automated way of running the 6S radiative transfer model.

Details of the software can be found here: https://bitbucket.org/petebunting/arcsi

A tutorial can be found here: https://spectraldifferences.wordpress.com/2014/05/27/arcsi/

This post is based largely on the tutorial linked to above, but I also try to pull together some of the tips I read about in the help forums here: https://groups.google.com/forum/#!forum/rsgislib-support

Set up

All of the following instructions are run on an installation of the Ubuntu 14.04 operating system.

First, download the Python 3.4 version of the Anaconda distribution: http://docs.continuum.io/anaconda/

Install the Anaconda version of Python and the conda package manager using the command line. Once the bash script has run, install ARCSI and TuiView (a fast image viewer) using conda. If a warning comes up regarding GDAL, install the gdal-data package.

bash Anaconda3-2.1.0-Linux-x86_64.sh

conda install -c https://conda.binstar.org/osgeo arcsi tuiview
conda update rsgislib arcsi

conda install -c jjhelmus gdal-data

For a successful installation I needed to change the default installation path for Anaconda from ~/anaconda3 to ~/anaconda.

You should now be able to check your installation using the following command:


arcsi.py -h | less

To finalise the setup you need to point the GDAL driver path at the KEA driver and set the GDAL data directory. Run the following to set up the paths:


export GDAL_DRIVER_PATH=~/anaconda/gdalplugins:$GDAL_DRIVER_PATH

export GDAL_DATA="/home/username/anaconda/share/gdal"

where username is the user account on the Ubuntu installation.
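
To make these settings persistent, the two export lines can be appended to ~/.bashrc. A quick way to confirm that the KEA driver has been picked up (assuming the conda-installed GDAL tools are on your PATH) is:
gdalinfo --formats | grep -i kea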

Run the code

When you run ARCSI, you’ll enter a command similar to the following:


arcsi.py -s ls8 -f KEA --stats -p RAD TOA SREF --aeropro NoAerosols --atmospro Tropical --aot 0.25 -o dir/to/outputs -i metadatafile_MTL.txt

To break this down a bit, it first calls the arcsi.py script, passing to it the following parameters:

  • Sensor (-s) – landsat 8
  • Output image format (-f) – KEA
  • Parameters to compute (-p) – RAD radiance conversion, TOA top-of-atmosphere reflectance, SREF surface reflectance using 6S (for the full range please consult the official documentation)
  • Output directory (-o) – dir/to/outputs, into which the three computed outputs will be saved
  • Input (-i) – the Landsat metadata file (_MTL.txt), using the format provided with images available through Earth Explorer

The remaining flags set the 6S aerosol profile (--aeropro), atmosphere profile (--atmospro) and aerosol optical thickness (--aot) used for the correction, while --stats calculates statistics and pyramids for the output images. Further information on all the parameters can be found in the help forums, in the arcsi.py -h output and in the code.

If an error is reported when you first run the ARCSI command, it might be that libgfortran.so.3 cannot be found, which will cause the 6S model to fail. Install the required library using the following command:


sudo apt-get install libgfortran3

Dealing with outputs

The best way to view the outputs is to start the supplied viewer using the tuiview command. This is an intuitive and very responsive viewer that handles the KEA format natively. To convert the KEA outputs into a format readable by a wide range of GIS software, use gdal_translate:


gdal_translate -of GTiff keafile.kea outputfile.tif
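
If you want to convert everything in the output directory in one go, a simple loop works (a sketch, assuming the outputs sit in dir/to/outputs as in the example command above):
for f in dir/to/outputs/*.kea; do
    gdal_translate -of GTiff "$f" "${f%.kea}.tif"
done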


Remote Sensing is everywhere!

Although the ongoing UK winter floods and storms of late 2013 and early 2014 must be an ordeal for those experiencing them first hand in their homes and businesses, they have also been a great showcase for the power and benefits of remote sensing. All over Twitter, LinkedIn and other social media are examples of maps showing either satellite imagery, or the extent of the floods derived from satellite imagery. People who previously weren’t aware of, or interested in, climate dynamics are now talking about the jet stream and the feedback loops between it and North Atlantic low-pressure systems! New methods of visualising and disseminating information (I’m thinking of JavaScript libraries and web mapping, specifically) created using atmospheric models, or derived from global satellite measurements, are helping to inform and educate about the reasons behind this period of impressive weather.

But it isn’t just satellites that are getting press coverage. Land-based remote sensing was mentioned on Radio 4’s PM programme on 11 February in an interview with the Coastal Processes Research Group (University of Plymouth), in the context of using laser scanning systems and video to monitor wave heights and to profile beaches. On the BBC website there are videos of flooded railway lines in the area around Windsor, collected using unmanned aerial systems.

Remote sensing is becoming all-pervasive as a method of rapidly collecting information across wide areas and quickly disseminating it to the public. The general population may not even consciously register that this is the case, and for the correct information to be obtained, extracted and visualised in the most accessible and meaningful way, there will be a continued requirement for well-trained RS experts.


LiDAR processing

sudo apt-get install liblas-bin

Install the libLAS tools and you are good to go in terms of understanding what you have in your .las file.

A .las file is a standard binary format containing LiDAR instrument data. LiDAR provides a source of high-quality and very dense topographic data, usually represented as an unstructured point cloud. Each point has an X, Y, Z coordinate so that it can be placed in 3D space, but can have much more information associated with it, such as return intensity, point classification and the return number. From this information it is possible to infer facts about the target being observed, such as whether it is vegetation and what the density of the vegetation canopy might be. The last return (or the only return, if there is just one) is usually taken to be the ground surface (or the roof of a building).

liblas provides access to some useful command line utilities:

1) lasinfo

lasinfo options:
  -h [ --help ]         produce help message
  -i [ --input ] arg    input LAS file
  -v [ --verbose ]      Verbose message output
  --no-vlrs             Don't show VLRs
  --no-schema           Don't show schema
  --no-check            Don't scan points
  --xml                 Output as XML
  -p [ --point ] arg    Display a point with a given id.

2) las2txt

las2txt options:
  -h [ --help ]         produce help message
  -i [ --input ] arg    input LAS file.
  -o [ --output ] arg   output text file.  Use 'stdout' if you want it written
                        to the standard output stream
  --parse arg           The '--parse txyz' flag specifies how to format each
                        each line of the ASCII file. For example, 'txyzia'
                        means that the first number of each line should be the
                        gpstime, the next three numbers should be the x, y, and
                        z coordinate, the next number should be the intensity
                        and the next number should be the scan angle.

                         The supported entries are:
                           x - x coordinate as a double
                           y - y coordinate as a double
                           z - z coordinate as a double
                           X - x coordinate as unscaled integer
                           Y - y coordinate as unscaled integer
                           Z - z coordinate as unscaled integer
                           a - scan angle
                           i - intensity
                           n - number of returns for given pulse
                           r - number of this return
                           c - classification number
                           C - classification name
                           u - user data
                           p - point source ID
                           e - edge of flight line
                           d - direction of scan flag
                           R - red channel of RGB color
                           G - green channel of RGB color
                           B - blue channel of RGB color
                           M - vertex index number

  --precision arg       The number of decimal places to use for x,y,z,[t]
                        output.
                         --precision 7 7 3
                         --precision 3 3 4 6
                        If you don't specify any precision, las2txt uses the
                        implicit values defined by the header's scale value
                        (and a precision of 8 is used for any time values.)
  --delimiter arg       The character to use for delimiting fields in the
                        output.
                         --delimiter ","
                         --delimiter ""
                         --delimiter " "
  --labels              Print row of header labels
  --header              Print header information
  -v [ --verbose ]      Verbose message output
  --xml                 Output as XML -- no formatting given by --parse is
                        respected in this case.
  --stdout              Output data to stdout

3) las2ogr

las2ogr options:
    -h print this message
    -i <infile>     input ASPRS LAS file
    -o <outfile>    output file
    -f <format>     OGR format for output file
    -formats        list supported OGR formats

Together these tools can help get the information contained in a .las file into a GIS-ready format that can then be taken into a desktop package such as SAGA or QGIS. Interpolating the Z values for the relevant return number will then create an elevation model that can be used in subsequent analyses.
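
As a minimal sketch (input.las is just a placeholder file name), the following inspects a file and then dumps x, y, z, intensity, return number and classification to a comma-delimited text file ready for import into a GIS:
lasinfo -i input.las
las2txt -i input.las -o points.txt --parse xyzirc --delimiter "," --labels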

More information can be found here:  http://www.liblas.org/utilities/index.html
