OpenCompass/opencompass/configs/models/deepseek/vllm_deepseek_moe_16b_chat.py

from opencompass.models import VLLMwithChatTemplate

models = [
    dict(
        type=VLLMwithChatTemplate,
        abbr='deepseek-moe-16b-chat-vllm',
        # HuggingFace model ID served through vLLM
        path='deepseek-ai/deepseek-moe-16b-chat',
        # passed through to the vLLM engine
        model_kwargs=dict(tensor_parallel_size=1, gpu_memory_utilization=0.6),
        max_out_len=1024,
        batch_size=16,
        # GPUs requested per task by the OpenCompass runner
        run_cfg=dict(num_gpus=1),
    )
]
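
For context, a minimal sketch of a top-level evaluation config that pulls in this models list through OpenCompass's read_base mechanism. The file name (eval_deepseek_moe_16b_chat.py), the gsm8k dataset import, and the import alias are illustrative assumptions, not part of this file:

# eval_deepseek_moe_16b_chat.py -- hypothetical top-level config
from mmengine.config import read_base

with read_base():
    # the models list defined in this file, imported via its packaged path
    from opencompass.configs.models.deepseek.vllm_deepseek_moe_16b_chat import \
        models as vllm_deepseek_moe_16b_chat_model
    # dataset import is an assumed example; any dataset config works here
    from opencompass.configs.datasets.gsm8k.gsm8k_gen import gsm8k_datasets

datasets = gsm8k_datasets
models = vllm_deepseek_moe_16b_chat_model

Such a file would typically be launched with the opencompass CLI (for example, opencompass eval_deepseek_moe_16b_chat.py), assuming a pip-installed OpenCompass and a working vLLM installation.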