# Enable Sentinel Hub Batch API #52
[WIP] list of questions to ask as I'm looking at the Batch Processing API:
**Initial Research**

**Goal**

Generate orthorectified sigma0 Sentinel-1 GRD chips in an S3 bucket under Azavea control via the SentinelHub Batch Process API (beta).

**Summary of Findings**

The API is straightforward to use, and Batch Process jobs (once submitted) only take a few minutes to complete. The Kansas flood used ~850 processing units out of the 30,000/mo available on our current demo plan. Results are single-band COGs: a data mask plus the VV and VH bands for the Sentinel-1 GRD orthorectified sigma0 chips. Chips use one of three processing "grids" that output small chips in local UTM. IIUC, we'd have to post-process these output chips by merging them into a single EPSG:4326 COG for input to Raster Vision. The evalscript v3 spec allows essentially arbitrary per-pixel processing of the input bands, mapped to user-specified output bands/tifs/images.

tl;dr: @jamesmcclain and/or @echeipesh should take a look at the evalscript v3 and tiling grid specs linked in the paragraph above to determine whether this API still meets our needs. Otherwise, processing is quick and straightforward, and won't run us out of credits too quickly. I could see us wrapping the API calls in a reusable Python or Node CLI in a day or two of effort. We'd get somewhere on the order of 30-40 Batch Process jobs/mo on the current plan, assuming the average job is about the size of the sen1floods11 Kansas bbox.

**Process**

**SentinelHub Batch Process API Workflow**

The docs at the link in the goal section above have a nice diagram and description of the steps necessary to perform a batch process request. High-level steps: verify data availability, create the Batch Process request, analyse it, start it, and poll for completion.
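All of the steps (availability check, create, analyse, start, poll) first require an OAuth2 bearer token. A minimal stdlib sketch of fetching one via the client-credentials grant, assuming a client ID/secret from a SentinelHub dashboard OAuth client; the `opener` parameter is my own addition so the function can be exercised without network access:

```python
import json
import urllib.parse
import urllib.request

TOKEN_URL = "https://services.sentinel-hub.com/oauth/token"

def get_token(client_id, client_secret, opener=urllib.request.urlopen):
    """Fetch an OAuth2 bearer token via the client-credentials grant.

    `opener` is injectable so the function can be tested offline.
    """
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    req = urllib.request.Request(TOKEN_URL, data=body, method="POST")
    with opener(req) as resp:
        return json.loads(resp.read())["access_token"]

# Every Batch API call below then sends:
#   Authorization: Bearer <token>
```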
**User AWS S3 Bucket Configuration**

In order to run Batch Process requests, the API needs to be able to put results into a user-owned S3 bucket. I created and configured the bucket.

**Details**

Before running a Batch Process request, users can verify that imagery of the desired type (in this case Sentinel-1 GRD) is available via their STAC search endpoint (or you could skip this and just YOLO it). Here we request the approximate bbox and time of the sen1floods11 Kansas flood.
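As a sketch of that availability check (not the exact query used): a standard STAC API search payload, assuming SentinelHub's `sentinel-1-grd` collection id; the bbox and dates below are placeholders, not the exact Kansas flood values:

```python
import json

# Placeholder values: not the exact sen1floods11 Kansas bbox/time range.
KANSAS_BBOX = [-95.0, 38.0, -94.0, 39.0]  # [min lon, min lat, max lon, max lat]

def stac_search_body(bbox, start, end, collection="sentinel-1-grd", limit=10):
    """Build a standard STAC API search payload."""
    return {
        "bbox": bbox,
        "datetime": f"{start}/{end}",
        "collections": [collection],
        "limit": limit,
    }

body = stac_search_body(
    KANSAS_BBOX, "2019-05-22T00:00:00Z", "2019-05-23T00:00:00Z"
)
# POST this as JSON (with the bearer token) to the catalog search endpoint;
# a non-empty `features` array in the response confirms scene availability.
print(json.dumps(body, indent=2))
```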
Next, generate a Batch Process request for orthorectified sigma0 Sentinel-1 GRD tiles for the approximate bounding box of the sen1floods11 Kansas flood on May 22, 2019. Here's the JSON body for that request:
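(The original body isn't reproduced in this transcript; the following is a representative sketch based on my reading of the public Batch API docs. Field names should be double-checked against the spec, and the bbox, dates, evalscript, and bucket path are placeholders.)

```json
{
  "processRequest": {
    "input": {
      "bounds": {
        "bbox": [-95.0, 38.0, -94.0, 39.0],
        "properties": { "crs": "http://www.opengis.net/def/crs/EPSG/0/4326" }
      },
      "data": [
        {
          "type": "S1GRD",
          "dataFilter": {
            "timeRange": {
              "from": "2019-05-22T00:00:00Z",
              "to": "2019-05-23T00:00:00Z"
            }
          },
          "processing": { "orthorectify": "true", "backCoeff": "SIGMA0_ELLIPSOID" }
        }
      ]
    },
    "evalscript": "//VERSION=3 (map VV, VH, and dataMask to output bands here)"
  },
  "tilingGrid": { "id": 0, "resolution": 10.0 },
  "output": {
    "defaultTilePath": "s3://<azavea-bucket>/<requestId>/<tileName>/<outputId>.tif",
    "cogOutput": true
  },
  "description": "sen1floods11 Kansas S1 GRD orthorectified sigma0"
}
```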
Next we analyze the request we created, which will tell us the cost, with a POST to https://services.sentinel-hub.com/api/v1/batch/process/fbabeaa1-6378-4242-8149-9ec0f26de989/analyse. This request returns a 204, so you have to query the Batch Process detail endpoint at GET https://services.sentinel-hub.com/api/v1/batch/process/fbabeaa1-6378-4242-8149-9ec0f26de989 and wait for the analysis to complete. If the cost and tile count appear acceptable, the Batch Process job can be started with POST https://services.sentinel-hub.com/api/v1/batch/process/fbabeaa1-6378-4242-8149-9ec0f26de989/start. Again, this returns a 204, and you can query the same GET endpoint until the job reaches a terminal status.
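The analyse → poll → start → poll flow is easy to wrap. A sketch with the HTTP call injected so it's testable offline; the status names (`ANALYSIS_DONE`, `DONE`, `FAILED`) are from the Batch API docs as I recall them and should be verified, and `get_json`/`post` are hypothetical helpers, not real library functions:

```python
import time

TERMINAL_STATUSES = {"DONE", "FAILED"}

def wait_for_status(fetch_status, targets, poll_seconds=5.0, sleep=time.sleep):
    """Poll `fetch_status()` (a zero-arg callable returning the job's current
    status string) until it returns a status in `targets`, then return it."""
    while True:
        status = fetch_status()
        if status in targets:
            return status
        sleep(poll_seconds)

# Usage sketch, assuming hypothetical helpers get_json/post that attach
# the bearer token:
#   detail = f"https://services.sentinel-hub.com/api/v1/batch/process/{job_id}"
#   post(f"{detail}/analyse")                                    # returns 204
#   wait_for_status(lambda: get_json(detail)["status"], {"ANALYSIS_DONE"})
#   post(f"{detail}/start")                                      # returns 204
#   wait_for_status(lambda: get_json(detail)["status"], TERMINAL_STATUSES)
```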
SUCCESS!!! I made all my requests with the macOS Paw API client; here's a project that can be loaded if you have the software. If not, let me know: it looks like I can export the project in Postman format.
Two follow-up requests so far during an offline discussion with @echeipesh:
**1.** I'm honestly still not sure. I've been able to make the case that they're close, with a few assumptions:

Holy crap, research in this area must be frustrating, with multiple different data sources that are all the same but slightly different.

**2.** Probably yes. SentinelHub provides an OpenAPI 3.0.2 specification for v1.0 of the SentinelHub API, which includes the Batch Processing endpoints. I was able to build a Python client that includes the Batch Process endpoints with:
There's no param checking or anything for the POST Create Batch Process endpoint, and work like reading and formatting an evalscript from a file is left as an exercise for the reader. Another option is to write the endpoints we need as a contribution to the SentinelHub Python library. I verified that they'd eventually want these endpoints in that library in sentinel-hub/sentinelhub-py#136
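In lieu of a generated client, a thin hand-rolled wrapper would cover the four endpoints used above. This is only a sketch: the base URL matches the requests earlier in the thread, but the class, method names, and the injectable `send` callable are my own invention for testability, not part of any SentinelHub library:

```python
BASE = "https://services.sentinel-hub.com/api/v1/batch/process"

class BatchClient:
    """Minimal wrapper over the Batch Process endpoints used in this thread.

    `send(method, url, body)` is an injected callable (e.g. built on an HTTP
    library plus a bearer token) returning the decoded JSON response, or
    None for 204 responses.
    """

    def __init__(self, send):
        self.send = send

    def create(self, request_body):
        # POST the Batch Process request body; response includes the job id.
        return self.send("POST", BASE, request_body)

    def analyse(self, job_id):
        # Returns 204; poll status() until analysis completes.
        return self.send("POST", f"{BASE}/{job_id}/analyse", None)

    def start(self, job_id):
        # Returns 204; poll status() until the job finishes.
        return self.send("POST", f"{BASE}/{job_id}/start", None)

    def status(self, job_id):
        return self.send("GET", f"{BASE}/{job_id}", None)["status"]
```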
I found this issue by following a trail of breadcrumbs from a sentinelhub-py issue. I just wanted to let you know that Sentinel Hub now offers Sentinel-1 orthorectification with the Copernicus DEM (10 m resolution within the EEA, 30 m worldwide) as well as radiometric terrain correction. See the processing options, or inspect it in EO Browser.
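For reference, per my reading of the S1GRD processing options (parameter names and values here are assumptions to verify against the current docs), selecting the Copernicus DEM plus radiometric terrain correction would look roughly like this fragment of the request's `data` entry:

```json
"processing": {
  "orthorectify": "true",
  "demInstance": "COPERNICUS",
  "backCoeff": "GAMMA0_TERRAIN"
}
```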
This is a nice find. Thank you!
After exploration of #39 it seems like the AWS Sentinel-1 catalogs are not as useful for this project as initially expected.
We may still be able to use it, pending research in #49 ... however, looking at Sentinel Hub as a Sentinel-1 data source has been suggested by our advisors.
At first glance it looks like a very attractive option.
Given that flood events are relatively rare and happen over a limited area, our data requirements will likely be quite modest, even allowing for iteration.
This issue is to more carefully evaluate that option.