Welcome to the non_param_score_est Python package (GitHub repository, PyPI library).
To install the package, a Python 3.10 or newer environment is recommended. Then simply run:
```
pip install non_param_score_est
```
The following estimators are available (with their corresponding import names):
Estimator | Import Name |
---|---|
Tikhonov regularization | Tikhonov |
NKEF (e.g., rate 0.75) | Tikhonov(subsample_rate=0.75) |
Kernel density estimator | KDE |
Landweber iteration | Landweber |
Nu-method | NuMethod |
Spectral Stein gradient estimator | SSGE |
Stein estimator | Stein |
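
All of the import names in the table are assumed to live in the non_param_score_est.estimators module (the same module used in the examples below), so they can be pulled in with a single import:

```python
# Import names taken from the table above; the module path matches the
# usage examples later in this README.
from non_param_score_est.estimators import (
    Tikhonov,
    KDE,
    Landweber,
    NuMethod,
    SSGE,
    Stein,
)
```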
To use the estimators in your code, simply import the estimator and call its estimate_gradients_s_x or estimate_gradients_s method. For example, to use the Tikhonov estimator, you would write:
```python
import numpy as np
from non_param_score_est.estimators import Tikhonov

# generate 1000 one-dimensional samples from a standard normal distribution
samples = np.random.normal(size=(1000, 1))

est = Tikhonov(bandwidth=1., lam=1e-4)

# estimate the score (gradient of the log-density) at the generated samples
score_estimate = est.estimate_gradients_s(samples=samples)

# estimate the score at new query points while fitting the score estimator
# to the previously generated samples
new_query = np.random.normal(size=(100, 1))
new_estimate = est.estimate_gradients_s_x(queries=new_query, samples=samples)
```
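
As a quick sanity check (a sketch, not part of the package): the score of a standard normal is -x, so the estimate above should be close to it, assuming estimate_gradients_s returns an array with the same shape as samples.

```python
# Minimal sanity check (assumes score_estimate has the same shape as samples):
# the true score of a standard normal is -x, so the two should roughly agree.
analytic_score = -samples
mse = np.mean((score_estimate - analytic_score) ** 2)
print(f"MSE between estimated and analytic score: {mse:.4f}")
```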
A great way to further investigate how the estimators work is to check the plots.py file. It contains a script that generates plots of the estimators on simple 1D and 2D examples. The plots can be generated as follows:
```python
from non_param_score_est.estimators import Tikhonov
from non_param_score_est.tests.plots import plotOneDim, plotTwoDim

# select Tikhonov regularization as the score estimator
est = Tikhonov(bandwidth=10., lam=1e-5)

# one-dimensional Gaussian distribution experiment
plotOneDim(estimator=est)

# two-dimensional Gaussian distribution experiment
plotTwoDim(estimator=est)
```
Running this script produces two figures: one for the one-dimensional experiment and one for the two-dimensional experiment.
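
The same plotting utilities should work with any estimator from the table above; for instance, swapping in the NKEF variant (using the subsample rate listed in the table):

```python
from non_param_score_est.estimators import Tikhonov
from non_param_score_est.tests.plots import plotOneDim

# NKEF variant from the table: Tikhonov regularization with subsampling
nkef = Tikhonov(subsample_rate=0.75)
plotOneDim(estimator=nkef)
```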
We welcome contributions! Please follow these guidelines if you'd like to contribute to the project:
- Fork our GitHub repository and clone it to your local machine.
- Create a new branch for your feature or bug fix.
- Make your changes and ensure that tests pass.
- Submit a pull request with a clear title and description.
This project is licensed under the MIT License - see the LICENSE file for details.
The JAX code was inspired by the repository accompanying the Nonparametric Score Estimators paper by Yuhao Zhou, Jiaxin Shi, and Jun Zhu.
Krunoslav Lehman Pavasovic (email: krunolp@gmail.com, GitHub: krunolp)