Fix AGRI L1 C07 having a valid LUT value for its fill value #2726
Conversation
Codecov Report

All modified and coverable lines are covered by tests ✅

```
@@ Coverage Diff @@
##             main    #2726   +/- ##
=======================================
  Coverage   95.40%   95.40%
  Files         371      371
  Lines       52825    52825
=======================================
  Hits        50399    50399
  Misses       2426     2426
```
Pull Request Test Coverage Report for Build 7657460878
💛 - Coveralls
OK, this is ready for review @mraspaud and @simonrp84 if you have the time. The existing test code was...rough. The expected values and the coefficients used were based on the channel's index in the file, not the actual channel number, which made it very confusing. Additionally, a lot of code had been grouped into methods just to make linters happy about complexity, but much of it didn't need to be that way. Or rather, the refactoring should have gone even further to make the code simpler.
Looks good to me! Thanks for refactoring the tests on the way there.
```diff
@@ -390,6 +370,7 @@ def test_agri_for_one_resolution(self, resolution_to_test, satname):
     available_datasets = reader.available_dataset_ids
     band_names = CHANNELS_BY_RESOLUTION[resolution_to_test]
     self._assert_which_channels_are_loaded(available_datasets, band_names, resolution_to_test)
+    # band_names = ["C07"]
```
Is this a reminder of the contents of band_names or a forgotten comment?
Forgotten comment. I'll get rid of it.
Unless I'm doing something wrong, this doesn't seem to solve the issue. With the data linked to earlier in this issue I still get a value of 200K for deep space pixels.
Created with:

```python
In [1]: from satpy import Scene; from glob import glob

In [2]: scn = Scene(reader="agri_fy4a_l1", filenames=glob("/data/satellite/agri_l1/20220120/FY4A-_AGRI--_N_DISK_1047E_L1-_*"))

In [3]: scn.load(["C07"])

In [4]: scn["C07"].plot.imshow().figure.show()
```

@simonrp84 possibly incorrect installation on your end?
Aah, I was using the FY4B reader for FY4A data. Maybe that is the problem. Will test later.
I'm going to consider this good to go. Merging...
Closes ssec/polar2grid#565
CC @simonrp84
This was first encountered in Geo2Grid development by @kathys. Some debugging showed that C07 files for AGRI L1 have a LUT with 65536 elements in it even though 65535 is a fill value. The end result is fill values being replaced by real BT values.
This PR implements one possible solution: if the LUT already includes the fill value, force that LUT entry to NaN. An alternative would be slicing the LUT array to the input data's `valid_range` and returning NaN for anything beyond the LUT; the resulting implementation ends up looking pretty similar, I think.

Note: I haven't added tests for this yet because I wanted to give @simonrp84 time to review the solution as soon as possible. I also wanted to see if this would work without having to (likely) rewrite a ton of the tests to fit the situation...which is basically a bug in the input data.
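A minimal NumPy sketch of the idea described above, assuming a 1-D brightness-temperature LUT indexed by raw counts (the function name and values here are illustrative, not satpy's actual reader code):

```python
import numpy as np

def mask_fill_in_lut(lut: np.ndarray, fill_value: int) -> np.ndarray:
    """Return a float copy of the LUT with the fill-value slot forced to NaN.

    If the LUT is long enough that the fill value would index a real entry
    (e.g. a 65536-element LUT when 65535 is the fill value), fill pixels
    would otherwise be calibrated to real BTs instead of staying invalid.
    """
    lut = lut.astype(np.float64, copy=True)
    if fill_value < lut.shape[0]:
        lut[fill_value] = np.nan
    return lut

# Hypothetical C07-style LUT: 65536 entries, index 65535 is the fill value.
lut = np.linspace(180.0, 330.0, 65536)
masked = mask_fill_in_lut(lut, 65535)
print(np.isnan(masked[65535]))  # → True: fill pixels now map to NaN
```

Lookups through `masked` then propagate NaN for fill pixels, while the `valid_range`-slicing alternative would achieve the same effect by never indexing that slot at all.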
- [ ] Add your name to `AUTHORS.md` if not there already