Add `pmetrictest.IgnoreMetricsFloatPrecision()` #35060
greatestusername added the enhancement (New feature or request) and needs triage (New item requiring triage) labels on Sep 6, 2024
Pinging code owners:
See Adding Labels via Comments if you do not have permissions to add labels yourself.
Makes sense to me. Want to make the PR, @greatestusername?
Yup! Will do! Thanks!
djaglowski pushed a commit that referenced this issue on Sep 11, 2024:

… using `CompareMetrics` (#35085)

**Description:** Addresses issue #35060. Adds an option to the `CompareMetrics` testing function that rounds floats to an arbitrary number of decimals. This is needed to avoid issues with float precision during testing.
**Link to tracking issue:** #35060
**Testing:** Test and test data added for this `CompareMetrics` option.
**Documentation:** No new docs. Let me know if I should be adding something somewhere!

Co-authored-by: Antoine Toulme <atoulme@splunk.com>
PR is merged!
jriguera pushed a commit to springernature/opentelemetry-collector-contrib that referenced this issue on Oct 4, 2024:

… using `CompareMetrics` (open-telemetry#35085)
Component(s)
pkg/pdatatest
Is your feature request related to a problem? Please describe.
I'm currently writing a connector that sums attribute values into metric values. Because some of those values may be decimals, I need to use floats, and I'm running into float-precision issues during testing with `CompareMetrics`. This has cropped up while writing tests for the component's main summing logic, where we expect something like `23.9` and get back `23.899999`.

Describe the solution you'd like
I'd like to add another optional flag to `pmetrictest` that allows ignoring float discrepancies below a certain threshold. If an expected `23.9` actually comes back as `23.899999`, this slight difference should be ignorable and count as a pass. This would be an optional flag similar to `pmetrictest.IgnoreMetricsOrder()` and the like.

Describe alternatives you've considered
This is the best option I've been able to come up with.
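Not part of the thread, but the behavior the proposed flag would need can be sketched in a few lines of self-contained Go: treat two floats as equal when they round to the same value at a chosen number of decimal places. The names `roundTo` and `floatsEqualAt` are illustrative only, not part of `pmetrictest`.

```go
package main

import (
	"fmt"
	"math"
)

// roundTo rounds v to the given number of decimal places.
func roundTo(v float64, decimals int) float64 {
	p := math.Pow(10, float64(decimals))
	return math.Round(v*p) / p
}

// floatsEqualAt reports whether a and b are equal once both are
// rounded to the given number of decimal places. (Illustrative
// name, not an actual pmetrictest API.)
func floatsEqualAt(a, b float64, decimals int) bool {
	return roundTo(a, decimals) == roundTo(b, decimals)
}

func main() {
	expected := 23.9
	actual := 23.899999 // what float summation actually produced

	fmt.Println(expected == actual)                 // false: strict comparison fails
	fmt.Println(floatsEqualAt(expected, actual, 3)) // true: equal at 3 decimals
}
```

At six decimal places the same pair would compare unequal, so the caller controls how much drift to tolerate.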
Additional context
PR where this issue cropped up:
#34797
The insidious part of this is that `CompareMetrics` doesn't seem to report the discrepancy properly in the test output, but if I print the value in question to the console it is `23.899999`.
For now, I've added a rounding function to work around this temporarily, get the tests passing, and get my logic reviewed.
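A temporary rounding workaround of the kind described here could look like the following self-contained Go sketch (`roundTo` is a hypothetical stand-in, not the author's actual code). It also shows how the drift arises from ordinary float addition:

```go
package main

import (
	"fmt"
	"math"
)

// roundTo rounds v to the given number of decimal places, so a
// drifted sum can be compared against an exact expected value.
func roundTo(v float64, decimals int) float64 {
	p := math.Pow(10, float64(decimals))
	return math.Round(v*p) / p
}

func main() {
	// Summing decimal attribute values accumulates representation
	// error; variables (not constants) force runtime float64 math.
	a, b := 23.8, 0.1
	sum := a + b

	fmt.Println(sum == 23.9)             // false: the sum is not exactly 23.9
	fmt.Println(roundTo(sum, 3) == 23.9) // true after rounding
}
```

Note that in Go the untyped constant expression `23.8 + 0.1 == 23.9` would evaluate to true at compile time with exact arithmetic; the drift only appears once the values live in `float64` variables, which is exactly the situation inside a connector.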