FairQuantize

Repository for the MICCAI 2024 paper FairQuantize: Achieving Fairness Through Weight Quantization for Dermatological Disease Diagnosis.

Update

11/11/2024

Fixed the args.pre_load bug; added Apple MPS support; removed some code that is not used in this project.

Note: args.pre_load loads datasets in advance to accelerate the program on certain devices. It is far from fully developed (e.g., it does not support CelebA for now), so it is disabled by default; use it with caution if you really need it.
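If you do decide to enable it, the command-line flag presumably mirrors the argparse attribute name, so the invocation would look roughly like the line below (the flag spelling is an assumption; confirm with --help):

python pre_trained.py --pre_load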

Dependency

The repo has been built and run on Python 3.9.18.

First, install PyTorch and related packages. See PyTorch website for installation instructions that work for your machine.
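For example, a typical pip-based install looks like the following (this assumes the default build is acceptable; pick the exact command from the selector on the PyTorch website for your CUDA version and package manager):

pip install torch torchvision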

Then, install the following packages (the first command is for conda; modify according to your environment):

conda install pandas bokeh tqdm scikit-learn scikit-image
pip install torch-pruning backpack-for-pytorch

Also, install inq from INQ-pytorch, which offers its own installation methods.
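In practice this usually means cloning that repository and installing it from source, roughly as sketched below (repository URL omitted here; use the INQ-pytorch link referenced above and follow its own README if its steps differ):

git clone <INQ-pytorch repository URL>
cd INQ-pytorch
pip install .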

Dataset

The datasets are too large to upload to GitHub, so we instead provide links where you can find or simply download them.

Fitzpatrick 17k

official website

packaged dataset

direct link

The following commands will make the data ready for default paths in the code:

wget https://notredame.box.com/shared/static/pjf9kw5y1rtljnh81kuri4poecuiqngf.tar -O fitzpatrick17k.tar
tar -xvf fitzpatrick17k.tar

ISIC 2019

official website

packaged dataset

direct link

The following commands will make the data ready for default paths in the code:

wget https://notredame.box.com/shared/static/uw8g5urs7m4n4ztxfo100kkga6arzi9k.tar -O ISIC_2019_train.tar
tar -xvf ISIC_2019_train.tar

CelebA

Note: our paper has not used CelebA for any experiment so far.

official website

packaged dataset

direct link

The following commands will make the data ready for default paths in the code:

wget https://notredame.box.com/shared/static/s2ais65ldhzpm6wx4sej11ltecajbtqt.tar -O img_align_celeba.tar
tar -xvf img_align_celeba.tar

Usage

For most modules (e.g., pre_trained.py), use --help or -h for usage information.

python pre_trained.py --help

pre_trained.py trains a vanilla model on the selected dataset.

quantize.py applies quantization to a given model.

test.py and test_group.py evaluate given models: test.py tests one model at a time, while test_group.py tests all models in a given directory. If you have many models to test (e.g., the output models from quantization), test_group.py is faster than calling test.py for each model.
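For illustration, a full pipeline run could look like the sketch below. All flag names here (--dataset, --model, --model_dir) are hypothetical placeholders, not the scripts' actual arguments; check each script's --help output for the real interface:

# train a vanilla model (flags are hypothetical placeholders)
python pre_trained.py --dataset fitzpatrick17k
# quantize the pretrained model (flags are hypothetical placeholders)
python quantize.py --model <path/to/pretrained_model>
# evaluate every quantized model in a directory (flags are hypothetical placeholders)
python test_group.py --model_dir <path/to/quantized_models>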

Contact

If you have any questions, feel free to submit issues and/or email me (Yuanbo Guo): yguo6 AT nd DOT edu. Thank you so much for your support!