Koifish is a C++ framework focused on efficient training and fine-tuning of language models on edge devices and PCs.
- Efficient on-device training of billion-parameter language models
- Efficient fine-tuning of 10-billion-parameter LLMs on edge devices
- Rematerialisation and fusion of operators (rematerialisation is sketched after this list)
- Mixture of models
- Support for LLAMA/GPT ...
- CPU, GPU, and hybrid training
- JSON config file
- Pure C++ project
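The rematerialisation mentioned in the feature list is, conceptually, activation checkpointing: keep only a block's input resident and recompute its intermediate values during the backward pass, trading extra compute for memory. Below is a minimal, self-contained C++ sketch of that idea for a single ReLU block; it is purely illustrative and does not use koifish's actual operator, tensor, or graph types.

```cpp
// Conceptual sketch of operator rematerialisation (activation checkpointing).
// Only the block input x is kept between forward and backward; the
// pre-activation w*x is recomputed in backward instead of being stored.
#include <algorithm>
#include <vector>

using Vec = std::vector<float>;

// Forward: y = relu(w * x). Stands in for an arbitrary (possibly fused) block.
static Vec forward_block(const Vec& x, float w) {
    Vec y(x.size());
    for (size_t i = 0; i < x.size(); ++i) y[i] = std::max(0.0f, w * x[i]);
    return y;
}

// Backward: given dL/dy, recompute the pre-activation from the checkpoint x
// and produce dL/dx and dL/dw without ever having stored the activations.
static void backward_block(const Vec& x, float w, const Vec& dy, Vec& dx, float& dw) {
    dx.assign(x.size(), 0.0f);
    dw = 0.0f;
    for (size_t i = 0; i < x.size(); ++i) {
        const float pre  = w * x[i];                 // rematerialised value
        const float mask = pre > 0.0f ? 1.0f : 0.0f; // relu derivative
        dx[i] = dy[i] * mask * w;
        dw   += dy[i] * mask * x[i];
    }
}

int main() {
    const Vec   x = {1.0f, -2.0f, 3.0f};
    const float w = 0.5f;

    Vec y = forward_block(x, w);   // the block's activations are dropped here

    Vec dy(y.size(), 1.0f), dx;    // pretend upstream gradient of ones
    float dw = 0.0f;
    backward_block(x, w, dy, dx, dw);
    return 0;
}
```

The same trade-off applies per fused block: the larger the block that is fused and checkpointed, the more activations are recomputed instead of held in memory.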
git clone /~https://github.com/gruai/koifish
cd koifish
# build ggml lib first
cd llama.cpp
mkdir build && cd build && cmake ..
make clean && make VERBOSE=TRUE
cd ../../
mkdir build && cd build && cmake ..
# export CPATH=~/cudnn-frontend/include/:/usr/local/cuda-12.1/include:$CPATH # maybe need this to export CPATH
make clean && make VERBOSE=TRUE
- Hybrid 1-bit Optimizer
- Support for DEEPSEEK/MAMBA
- Sparse mapping of token embeddings to logits (see the sketch below)
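One reading of the sparse token-embedding-to-logits mapping is that, with tied input/output embeddings, logits need not be computed against the full vocabulary; a dot product against a candidate subset of embedding rows is enough. The following is a minimal C++ sketch of that idea under those assumptions; the `Embedding` struct and `sparse_logits` function are hypothetical names for illustration, not koifish's API.

```cpp
// Conceptual sketch of a sparse token-embedding -> logits mapping: compute
// logits only for a candidate subset of the vocabulary, so the output
// projection costs O(|subset| * dim) instead of O(vocab_size * dim).
#include <cstdint>
#include <vector>

// Row-major [vocab_size x dim] tied embedding matrix (hypothetical layout).
struct Embedding {
    std::vector<float> weight;
    int64_t vocab_size;
    int64_t dim;
};

// Dot the hidden state against only the embedding rows listed in `candidates`.
static std::vector<float> sparse_logits(const Embedding& emb,
                                        const std::vector<float>& hidden,       // [dim]
                                        const std::vector<int64_t>& candidates) {
    std::vector<float> logits(candidates.size(), 0.0f);
    for (size_t c = 0; c < candidates.size(); ++c) {
        const float* row = emb.weight.data() + candidates[c] * emb.dim;
        float dot = 0.0f;
        for (int64_t i = 0; i < emb.dim; ++i) dot += row[i] * hidden[i];
        logits[c] = dot;
    }
    return logits;
}

int main() {
    Embedding emb{std::vector<float>(1000 * 64, 0.01f), 1000, 64};
    std::vector<float> hidden(64, 0.5f);
    std::vector<int64_t> candidates = {3, 42, 777};   // e.g. tokens seen in the batch
    auto logits = sparse_logits(emb, hidden, candidates);
    (void)logits;
    return 0;
}
```

How the candidate set is chosen (tokens in the current batch, a frequency prior, etc.) is a separate design choice; the point is that the output projection no longer scales with the full vocabulary size.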
- Contributors can open PRs
- Collaborators can push to branches in the `koifish` repo and merge PRs into the `master` branch
- Collaborators will be invited based on contributions
- Any help with managing issues, PRs and projects is greatly appreciated!