
fix: wk_trans inconsistent meta flags handling #217

Merged 3 commits into paleolimbot:master on Mar 7, 2024

Conversation

anthonynorth
Contributor

Fixes wk_trans's inconsistent handling of meta flags, which can result in garbage values in the m dimension when the input doesn't contain an m dimension and use_m = TRUE.

Fixes #216
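
A minimal sketch of the reported case (assuming the wk constructors and the wk_trans_set()/wk_transform() API as exported by the package; this snippet is an illustration of the report, not code from the PR):

library(wk)

# Input has only x/y, but use_m = TRUE is requested. Before this fix the
# m value forwarded to the next handler could contain garbage; with the fix
# it should be a well-defined NaN.
pt <- xy(1, 2)
wk_transform(pt, wk_trans_set(xyzm(NA, NA, NA, NA), use_m = TRUE))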

codecov-commenter commented Mar 7, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 98.85%. Comparing base (67f7013) to head (9ee8dd2).

Additional details and impacted files
@@           Coverage Diff           @@
##           master     #217   +/-   ##
=======================================
  Coverage   98.85%   98.85%           
=======================================
  Files          85       85           
  Lines        6211     6211           
=======================================
  Hits         6140     6140           
  Misses         71       71           


anthonynorth (Contributor, Author) commented Mar 7, 2024

I wasn't entirely certain whether transformers should see dimensions that aren't present in the output. E.g. if the input contains z but use_z = FALSE, should a transformer get z = NaN, or the z from the input?

I've assumed the intent was to forward the input as-is to transformers (ignoring use_z/m) and to omit the transformed z/m from the output when use_z/m = FALSE.

Similarly, if use_z/m = TRUE and these dimensions aren't in the input, the transformer is handed a default z/m = NaN rather than a junk value, and the next handler in the pipeline sees z/m = NaN.
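
A hedged illustration of that interpretation (again assuming the exported wk_transform()/wk_trans_set() API; the expected behaviour noted in the comments reflects the reading above, not verified output):

library(wk)

# Input carries z, but use_z = FALSE: the transformer still receives the
# input z as-is, and the z dimension is simply omitted from the transformed
# output rather than being replaced with NaN.
wk_transform(xyz(1, 2, 3), wk_trans_set(xy(NA, NA), use_z = FALSE))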

paleolimbot (Owner) left a comment

Thanks! I'm actually not entirely sure what I intended when I wrote this, so I trust your interpretation. In any case, your change corrects a bit of code that is obviously out of place relative to the surrounding code.

@paleolimbot paleolimbot merged commit 40573b8 into paleolimbot:master Mar 7, 2024
10 checks passed
Successfully merging this pull request may close these issues.

Transformers are using meta flags inconsistently when operating on coords