Can we change the bin size in the pipeline for tokenizing during inference? #282
Unanswered
sriyachakravarthy asked this question in Q&A
Replies: 1 comment 2 replies

-

Hi!
I would like to know if we can work with high-frequency data and, if so, what the maximum frequency is. Can we reduce the bin size for tokenizing during inference?
Thanks!

-
@sriyachakravarthy the models were trained on data down to 5-minute granularity. No, it's not possible to change the bin size during inference: the bin size used for quantization is deeply tied to how the model represents real-valued data internally, so changing it would break the model.
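To make that point concrete, here is a minimal sketch of uniform-bin quantization. Everything in it is an illustrative assumption rather than the repository's actual implementation: the vocabulary size (4096), the bin limits (±15 in scaled units), the mean-absolute scaling, and the function names are all hypothetical.

```python
import torch

# Illustrative sketch (not the repository's actual code) of why the bin
# size is fixed at inference: token IDs index into a set of bin centers
# that the model's embeddings were trained against.

N_TOKENS = 4096          # assumed vocabulary size for value tokens
LOW, HIGH = -15.0, 15.0  # assumed quantization range in scaled units

# Bin centers are baked in when the vocabulary is defined; the model
# learns one embedding per center during training.
centers = torch.linspace(LOW, HIGH, N_TOKENS)
edges = (centers[:-1] + centers[1:]) / 2  # boundaries between adjacent centers

def tokenize(context: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
    """Scale a real-valued series and map each value to a bin-center token ID."""
    scale = context.abs().mean().clamp(min=1e-8)
    return torch.bucketize(context / scale, edges), scale

def detokenize(token_ids: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
    """Map token IDs back to real values via the same fixed centers."""
    return centers[token_ids] * scale

# Changing N_TOKENS, LOW, or HIGH at inference would remap every token ID
# to a different real value than the one the trained embeddings stand for,
# so the model's outputs would no longer decode to meaningful numbers.
series = torch.tensor([10.0, 12.0, 9.5, 11.0])
ids, scale = tokenize(series)
print(detokenize(ids, scale))  # approximately recovers the input
```

The underlying constraint is that the bin layout is part of the trained vocabulary: each learned embedding corresponds to one fixed bin center, so the bin size can only be chosen before training, not swapped at inference.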