support GQA #7906
29.41% of diff hit (target 80.00%)
Annotations

codecov / codecov/patch — warnings in paddlenlp/transformers/llama/modeling.py:

- Lines #L934-L940: added lines were not covered by tests
- Lines #L943-L945: added lines were not covered by tests
- Line #L954: added line was not covered by tests
- Line #L986: added line was not covered by tests
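For context on what this PR adds: GQA (grouped-query attention) uses fewer key/value heads than query heads, and each KV head is shared by a group of query heads. The core trick is repeating the KV heads so their count matches the query heads before the attention product. Below is a minimal NumPy sketch of that repetition step; the function name, shapes, and group size here are illustrative assumptions, not the actual paddlenlp implementation.

```python
import numpy as np

def repeat_kv(hidden: np.ndarray, n_rep: int) -> np.ndarray:
    """Repeat each KV head n_rep times so the KV head count matches
    the query head count, as GQA requires.

    hidden: [batch, num_kv_heads, seq_len, head_dim]
    returns: [batch, num_kv_heads * n_rep, seq_len, head_dim]
    """
    if n_rep == 1:
        return hidden  # MHA case: nothing to repeat
    batch, num_kv_heads, seq_len, head_dim = hidden.shape
    # Insert a repeat axis, broadcast, then fold it into the head axis.
    expanded = np.broadcast_to(
        hidden[:, :, None, :, :],
        (batch, num_kv_heads, n_rep, seq_len, head_dim),
    )
    return expanded.reshape(batch, num_kv_heads * n_rep, seq_len, head_dim)

# Hypothetical shapes: 4 KV heads serving 32 query heads (group size 8).
kv = np.random.rand(2, 4, 16, 8)
out = repeat_kv(kv, 8)
print(out.shape)  # (2, 32, 16, 8)
```

Repeats of the same KV head land adjacently in the output head axis, so query heads 0-7 all attend against a copy of KV head 0, and so on.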