An AWS Serverless Application that exposes a REST API to extract the digital elevation for a European latitude/longitude using EU DEM 1.1.
There is a public demo of the application:
https://dp6yf9dpul.execute-api.eu-west-1.amazonaws.com/Prod/public/{lat}/{lng}
You can find a location in Google Maps Terrain view: https://www.google.it/maps/@65.5427462,-18.2604901,14z/data=!5m1!1e4
and test the API: https://dp6yf9dpul.execute-api.eu-west-1.amazonaws.com/Prod/public/65.5427462/-18.2604901
There is a throttle configuration of 1 request per 20 seconds
EU DEM 1.1 (Digital Elevation Model) consists of 27 DEM GeoTIFFs of 1000x1000 km that cover Europe with a spatial resolution of 25 meters and a vertical accuracy of +/- 7 meters, provided by Copernicus, produced with funding by the European Union, and downloadable here: EU DEM 1.1.
The elevation values in the EU DEM GeoTIFFs are much more precise and stable than GPS elevation values.
With the GDAL tool gdallocationinfo you can query these GeoTIFFs by latitude and longitude coordinates to extract the digital elevation. For instance:
gdallocationinfo eu_dem_v11_E30N40.TIF -wgs84 -18.278987 65.550225 -xml
<Report pixel="1910" line="817">
<BandReport band="1">
<Value>1315.36547851562</Value>
</BandReport>
</Report>
The total size of the 27 EU DEM GeoTIFFs is 46 GB; some single GeoTIFFs are larger than 4.6 GB.
In order to use GeoTIFFs in the cloud, the Cloud Optimized GeoTIFF format allows a GeoTIFF to be queried remotely, downloading only the pieces of the file that contain the requested location instead of the whole TIF, using HTTP Range header requests.
AWS Simple Storage Service (S3) supports HTTP Range requests and GDAL commands support S3 as a virtual remote filesystem, so you can query a GeoTIFF stored in an S3 bucket without downloading the entire file:
gdallocationinfo /vsis3/<S3_BUCKET>/<S3_GEOTIFF_KEY> -wgs84 -18.278987 65.550225 -xml
The Serverless Application geotiffdem is composed of:
- 27 Cloud Optimized GeoTIFFs uploaded into an S3 bucket
- an AWS Lambda Layer containing the gdallocationinfo Linux command
- an AWS Lambda Function based on the layer that exposes the REST APIs /{lat}/{lng} and /public/{lat}/{lng}, returning the elevation by calling gdallocationinfo on the correct GeoTIFF
In order to reduce costs, the GeoTIFFs are uploaded into an S3 bucket with the One Zone-IA storage class. You can use the Standard storage class if you have a high-availability constraint or a lot of requests, but you spend more.
Following the Cloud Optimized GeoTIFF developer guide, the 27 GeoTIFFs have been optimized with this gdal_translate command:
gdal_translate eu_dem_v11_E00N20.TIF optimized/eu_dem_v11_E00N20.TIF -co TILED=YES -co COPY_SRC_OVERVIEWS=YES -co COMPRESS=DEFLATE
This optimization reduces the total size from 46 GB to 30 GB and reduces the number of bytes returned by the HTTP requests of a gdallocationinfo query. cog_optimizer.sh is a bash script that optimizes all the TIFFs in a directory.
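As a rough sketch of the same batching idea in Node (this is not the actual cog_optimizer.sh; the script name and paths are placeholders), one could run gdal_translate over every TIFF in a directory like this:

```js
// cog-optimize.js (illustrative sketch, not the repository's cog_optimizer.sh)
// Usage: node cog-optimize.js <input_dir> <output_dir>
const { execFileSync } = require('child_process');
const fs = require('fs');
const path = require('path');

const [inputDir, outputDir] = process.argv.slice(2);
fs.mkdirSync(outputDir, { recursive: true });

for (const file of fs.readdirSync(inputDir)) {
  if (!/\.tif$/i.test(file)) continue; // skip non-GeoTIFF files
  // same creation options used above for the EU DEM tiles
  execFileSync('gdal_translate', [
    path.join(inputDir, file),
    path.join(outputDir, file),
    '-co', 'TILED=YES',
    '-co', 'COPY_SRC_OVERVIEWS=YES',
    '-co', 'COMPRESS=DEFLATE',
  ], { stdio: 'inherit' });
}
```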
The gdalinfo command calculates the bounds (the four corner coordinates) of each GeoTIFF. These bounds are loaded into the dictionary geotiffdem/src/lib/dictionary.js, which is queried by the Lambda Function to find the GeoTIFF file containing the requested latitude and longitude.
geotiffdem/src/lib/dictionary.js is generated once, after the GeoTIFFs have been uploaded to S3, by the Node script geotiffdem/src/build-tiff-dictionary.js via the command
cd geotiffdem
npm run dictionary
Before running it, you have to configure the credentials of a user able to read the S3 bucket using aws configure.
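The generated file is not reproduced here, but the idea can be sketched as follows; the entry shape, the sample bounds and the findGeotiff name are illustrative assumptions, not the actual content of dictionary.js:

```js
// Illustrative sketch of src/lib/dictionary.js (shape and values are assumptions).
const dictionary = [
  // one entry per GeoTIFF (27 in total), with the bounds reported by gdalinfo
  { key: 'eu_dem_v11_E30N40.TIF', minLat: 59.0, maxLat: 70.0, minLng: -25.0, maxLng: -9.0 }, // placeholder bounds
  // ...
];

// Return the entry whose bounding box contains the requested point, if any.
function findGeotiff(lat, lng) {
  return dictionary.find(
    (t) => lat >= t.minLat && lat <= t.maxLat && lng >= t.minLng && lng <= t.maxLng
  );
}

module.exports = { findGeotiff };
```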
None of the predefined Lambda runtimes provides the GDAL commands.
In order to provide these commands to your Lambda, you have to create an AWS Lambda Layer with GDAL's libraries and binaries.
To do this, follow these general instructions, valid for any Linux command you want to use in your Lambda:
- create an EC2 Amazon Linux 2 instance (which is the base of all Lambda runtimes)
- install the packages you need inside the layer
- create a layer directory with bin and lib subdirs
- discover the library dependencies of the command using ldd <ABS_PATH_COMMAND>
- copy the library dependencies into the lib subdir
- copy the commands into the bin subdir
- zip the lib and bin directories
- copy the zip into an S3 bucket
- publish the lambda layer
By specifying the layer ARN in the Layers section of your Lambda configuration in the SAM template template.yml, you are able to invoke the commands via /opt/bin/<COMMAND>, independently of the runtime you choose.
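For example, from a Node.js runtime a layer-provided command can be invoked like this (a minimal sketch, assuming a layer providing gdalinfo is attached to the function):

```js
// Minimal sketch: calling a binary shipped in a Lambda Layer from Node.js.
const { execFile } = require('child_process');

exports.handler = async () => {
  const version = await new Promise((resolve, reject) => {
    execFile('/opt/bin/gdalinfo', ['--version'], (err, stdout) =>
      err ? reject(err) : resolve(stdout.trim()));
  });
  return { statusCode: 200, body: version };
};
```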
In this project we need a layer with the gdallocationinfo and gdalinfo commands, so:
- install from yum: make automake gcc gcc-c++ libcurl-devel sqlite-devel
- download and install from sources proj-6.0.0 (a dependency of GDAL)
- download and install from sources gdal-3.1.2
- discover the lib dependencies of gdallocationinfo and gdalinfo using ldd
- copy the dependencies into lib and the commands into bin
- publish the layer
You can find the complete script here: ec2-publish-gdal-layer.sh
This GDAL Lambda Layer can be used in combination with all Lambda runtimes based on Amazon Linux 2: Node.js, Python, Java, Ruby, Go, .NET.
The Lambda Function getDemFunction is based on the nodejs12.x runtime and uses the GDAL Lambda Layer published for this project.
This Lambda reads the latitude and longitude from the HTTP request and returns the digital elevation in the HTTP response, in this way:
- query dictionary.js by latitude and longitude in order to find the name of the GeoTIFF file containing the coordinates
- if a GeoTIFF is found, query it in S3 via gdallocationinfo
- return an HTTP response containing the digital elevation
The source code of the Lambda handler: geotiffdem/src/handlers/get-dem.js
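The following is only a rough sketch of those three steps, not the actual get-dem.js; findGeotiff, DEM_BUCKET, getDemHandler and the XML parsing are illustrative assumptions:

```js
// Illustrative sketch of the handler flow (names and details are assumptions).
const { execFile } = require('child_process');
const { findGeotiff } = require('../lib/dictionary'); // hypothetical lookup helper

const DEM_BUCKET = process.env.DEM_BUCKET; // hypothetical env var with the bucket name

exports.getDemHandler = async (event) => {
  const lat = parseFloat(event.pathParameters.lat);
  const lng = parseFloat(event.pathParameters.lng);

  // 1. find the GeoTIFF whose bounds contain the requested point
  const tiff = findGeotiff(lat, lng);
  if (!tiff) {
    return { statusCode: 404, body: 'No DEM coverage for this location' };
  }

  // 2. query the Cloud Optimized GeoTIFF on S3 through the GDAL layer
  const xml = await new Promise((resolve, reject) => {
    execFile('/opt/bin/gdallocationinfo',
      [`/vsis3/${DEM_BUCKET}/${tiff.key}`, '-wgs84', String(lng), String(lat), '-xml'],
      (err, stdout) => (err ? reject(err) : resolve(stdout)));
  });

  // 3. extract the elevation from the <Value> element of the XML report
  const match = xml.match(/<Value>([^<]+)<\/Value>/);
  const elevation = match ? parseFloat(match[1]) : null;
  return { statusCode: 200, body: JSON.stringify({ elevation }) };
};
```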
The Lambda Function is exposed via two AWS API Gateway endpoints:
/{lat}/{lng}
API protected by an API key, with this throttle configuration:
- Burst Limit: 100 requests per second
- Rate Limit: 10 requests per second
To invoke the REST API you have to specify the API key in the x-api-key header:
curl -H "x-api-key:<API_KEY>" https://<AWS_GATEWAY_ID>.execute-api.<AWS_REGION>.amazonaws.com/Prod/<LATITUDE>/<LONGITUDE>
/public/{lat}/{lng}
API without an API key, with this throttle configuration:
- Burst Limit: 1 request per second
- Rate Limit: 1 request per 20 seconds
curl https://<AWS_GATEWAY_ID>.execute-api.<AWS_REGION>.amazonaws.com/Prod/public/<LATITUDE>/<LONGITUDE>
If these limits are exceeded, a Too Many Requests error is raised.
The Lambda and API Gateway configurations are defined in the SAM template template.yml. To deploy into an AWS region:
sam build
sam deploy
To install and use SAM, visit the SAM Developer Guide.
The duration of a Lambda call is consistently around 800 ms, except for the first invocation, when the Lambda is initialized and the duration is about 3000 ms (cold start).
The most expensive instruction is the gdallocationinfo execution on the S3 GeoTIFF, which takes about 650 ms.
The Lambda runtime is nodejs12.x and the allocated Lambda memory size is 128 MB.
In a Serverless Application you pay for what you use, so you pay:
- per request (Lambda and API Gateway)
- per GB stored in S3
Using the AWS Pricing Calculator with 1,000,000 requests per year (about 83,000 requests per month):
- S3
30 GB x 0.01 USD = 0.30 USD (S3 One Zone-IA storage cost)
80,000 GET requests for S3 One Zone-IA storage x 0.000001 USD per request = 0.08 USD (S3 One Zone-IA GET requests cost)
0.30 USD + 0.08 USD = 0.38 USD (total S3 One Zone-IA storage and other costs)
S3 One Zone - Infrequent Access (S3 One Zone-IA) cost (monthly): 0.38 USD
Data transfer between Lambda and S3 is free
- API Gateway
3.50 USD per million requests
API Gateway costs (monthly): 0.29 USD (3.50 USD / 12 months)
- AWS Lambda Function
RoundUp(800) = 800 (duration rounded to the nearest 100 ms)
1,000,000 requests x 800 ms x 0.001 ms-to-sec conversion factor = 800,000.00 total compute (seconds)
0.125 GB x 800,000.00 seconds = 100,000.00 total compute (GB-s)
100,000.00 GB-s - 400,000 free tier GB-s = -300,000.00 GB-s
Max(-300,000.00 GB-s, 0) = 0.00 total billable GB-s
Max(0 monthly billable requests, 0) = 0.00 total monthly billable requests
Lambda costs - with free tier (monthly): 0.00 USD (the first million requests are free)
Total costs per month (80,000 API requests): 0.67 USD
Total costs per year (1,000,000 API requests): 8.04 USD
npm run dictionary
is an alias of
node src/build-tiff-dictionary.js
npm test
runs
sam build && sam local invoke getDemFunction --event events/event-get-dem.json
sam build
sam deploy