Make it possible to pickle simple kernels and likelihoods #1876
Conversation
Nice, thanks! Very useful.
@@ -172,7 +172,7 @@ def __init__(
     if not isinstance(lengthscale_prior, Prior):
         raise TypeError("Expected gpytorch.priors.Prior but got " + type(lengthscale_prior).__name__)
     self.register_prior(
-        "lengthscale_prior", lengthscale_prior, lambda m: m.lengthscale, lambda m, v: m._set_lengthscale(v)
+        "lengthscale_prior", lengthscale_prior, self._lengthscale_param, self._lengthscale_closure
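The pattern in this change can be sketched in isolation: an object that stores local lambdas cannot be pickled, while the same object storing bound methods can. The `Unpicklable`/`Picklable` classes below are hypothetical stand-ins, not GPyTorch code:

```python
import pickle


class Unpicklable:
    """Stores local lambdas, as the old register_prior call did."""

    def __init__(self):
        # lambdas defined inside __init__ have no importable name
        self.param_closure = lambda m: m.lengthscale
        self.setting_closure = lambda m, v: m._set_lengthscale(v)


class Picklable:
    """Stores bound methods instead; pickle serializes these by reference."""

    def __init__(self):
        self.param_closure = self._lengthscale_param
        self.setting_closure = self._lengthscale_closure

    def _lengthscale_param(self, m):
        return m.lengthscale

    def _lengthscale_closure(self, m, v):
        return m._set_lengthscale(v)


try:
    pickle.dumps(Unpicklable())
except (AttributeError, pickle.PicklingError):
    print("lambda closures cannot be pickled")

# the bound-method variant round-trips without error
restored = pickle.loads(pickle.dumps(Picklable()))
```

Bound methods pickle via their class's qualified name plus the instance they are bound to, which is why moving the closures from lambdas to methods is enough to make the module serializable.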
I guess this should also work here?
-"lengthscale_prior", lengthscale_prior, self._lengthscale_param, self._lengthscale_closure
+"lengthscale_prior", lengthscale_prior, "lengthscale", self._lengthscale_closure
I don't think this works (making this change causes 31 tests to fail). Note that _lengthscale_param returns m.lengthscale and not self.lengthscale.
Hmm, not sure I understand why it doesn't; it seems like the closure definition here should do the same (Line 235 in 32cde57):

def closure(module):

Either way, we can figure that out later (or I'm missing sth obvious). Accepting this as is.
Sounds good! Merging is blocked, though, until the docs build successfully. Judging from https://readthedocs.org/projects/gpytorch/builds/15684237/, the failure is a warning unrelated to this PR.
Ok, fixed the doc building issue by pinning
We sometimes need to pickle (simple) GPyTorch models, but this doesn't work when using priors, since register_prior creates local lambdas. This PR makes it possible to pickle simple kernels such as Matern, RBF, and Scale, as well as the Gaussian likelihood.
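The root cause described above can be reproduced with the standard library alone: pickle serializes functions and methods by importable name, so a lambda (whose qualified name is just `<lambda>`) is rejected, while a bound method of a top-level class works. The `Model` class below is a hypothetical stand-in:

```python
import pickle


class Model:
    def closure(self, m):
        # a bound method pickles via its class and qualified name
        return m


model = Model()

# the bound method round-trips through pickle
pickle.loads(pickle.dumps(model.closure))

# a lambda has no importable qualified name, so pickle rejects it
try:
    pickle.dumps(lambda x: x)
except (pickle.PicklingError, AttributeError):
    print("lambdas are not picklable")
```

This is why the PR replaces the lambdas that register_prior previously captured with named bound methods.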