
Commit 705e183 — Merge branch 'hiyouga:main' into main
Authored by yuwangnexusera on Nov 11, 2024
2 parents: e695325 + 3eebae8

1 changed file: src/llamafactory/chat/vllm_engine.py (2 additions, 1 deletion)
@@ -83,7 +83,8 @@ def __init__(
             "enable_lora": model_args.adapter_name_or_path is not None,
             "max_lora_rank": model_args.vllm_max_lora_rank,
         }
-        engine_args.update(model_args.vllm_config)
+        if isinstance(model_args.vllm_config, dict):
+            engine_args.update(model_args.vllm_config)

         if getattr(config, "is_yi_vl_derived_model", None):
             import vllm.model_executor.models.llava
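The change guards the merge of user-supplied vLLM overrides with an `isinstance` check: if `model_args.vllm_config` was not parsed into a dict (e.g. it is `None` or still a raw string), calling `dict.update()` on it would raise a `TypeError` or `ValueError`. A minimal standalone sketch of the pattern (the `build_engine_args` helper and the default values are illustrative, not part of the actual LLaMA-Factory code; only the guarded-update shape mirrors the diff):

```python
def build_engine_args(vllm_config):
    """Build engine kwargs, merging overrides only when they are a dict."""
    # Defaults stand in for the fields built in vllm_engine.py; the
    # actual values there come from model_args.
    engine_args = {
        "enable_lora": True,
        "max_lora_rank": 32,
    }
    # Guard mirrors the commit: skip the merge unless the override
    # was actually parsed into a dict, so None or a raw string from
    # the CLI cannot crash dict.update().
    if isinstance(vllm_config, dict):
        engine_args.update(vllm_config)
    return engine_args


# Overrides merge when given as a dict; anything else is ignored safely.
print(build_engine_args({"gpu_memory_utilization": 0.9}))
print(build_engine_args(None))
```

Without the guard, `build_engine_args(None)` would fail inside `update()`; with it, non-dict inputs simply leave the defaults untouched.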
