OpenCompass/configs/models
Latest commit: bcb707dbfc by Yi Ding, 2024-11-26 19:24:47 +08:00
[Fix] Fix BailingAPI model (#1707)

Commit message:
* [fix] sequence under the multiple samples
* resolve the lint problems
* change the parameter name
* add another error code for retry
* output the log for invalid response
* format correction
* update
* update
* update
* update
* add two model python files
* update the default parameter
* use random for delay
* update the api example of bailing
* remove the unnecessary parameter
Subdirectories (each listed with the latest commit touching it and that commit's date):
accessory [Format] Add config lints (#892) 2024-05-14 15:35:58 +08:00
alaya [Format] Add config lints (#892) 2024-05-14 15:35:58 +08:00
aquila [Format] Add config lints (#892) 2024-05-14 15:35:58 +08:00
baichuan [Format] Add config lints (#892) 2024-05-14 15:35:58 +08:00
bailing_api [Fix] Fix BailingAPI model (#1707) 2024-11-26 19:24:47 +08:00
bluelm [Format] Add config lints (#892) 2024-05-14 15:35:58 +08:00
chatglm [Feature] Update the max_out_len for many models (#1559) 2024-09-24 21:52:28 +08:00
claude [Format] Add config lints (#892) 2024-05-14 15:35:58 +08:00
codegeex2 [Format] Add config lints (#892) 2024-05-14 15:35:58 +08:00
codellama [Feature] Add huggingface apply_chat_template (#1098) 2024-05-14 14:50:16 +08:00
deepseek [Feature] Update the max_out_len for many models (#1559) 2024-09-24 21:52:28 +08:00
falcon [Feature] Add huggingface apply_chat_template (#1098) 2024-05-14 14:50:16 +08:00
gemini [Feature] Update CHARM Memorization (#1230) 2024-07-26 18:42:30 +08:00
gemma [Fix] Update SciCode and Gemma model (#1449) 2024-08-23 10:42:27 +08:00
hf_internlm [Feature] Integrate lmdeploy pipeline api (#1198) 2024-10-09 22:58:06 +08:00
hf_llama [Feature] Support LiveCodeBench (#1617) 2024-10-21 20:50:39 +08:00
internlm [Format] Add config lints (#892) 2024-05-14 15:35:58 +08:00
judge_llm [Format] Add config lints (#892) 2024-05-14 15:35:58 +08:00
lemur [Format] Add config lints (#892) 2024-05-14 15:35:58 +08:00
lingowhale [Sync] update model configs (#574) 2023-11-13 15:15:34 +08:00
llama [Format] Add config lints (#892) 2024-05-14 15:35:58 +08:00
mistral [Feature] Add long context evaluation for base models (#1666) 2024-11-08 10:53:29 +08:00
moss [Fix] Fix moss template config (#897) 2024-02-21 11:19:24 +08:00
mpt [Format] Add config lints (#892) 2024-05-14 15:35:58 +08:00
ms_internlm [Feature] support download from modelscope (#534) 2023-11-22 15:32:21 +08:00
nanbeige [Sync] Sync with internal codes 2024.06.28 (#1279) 2024-06-28 14:16:34 +08:00
openai [Feature] Support OpenAI O1 models (#1539) 2024-09-18 22:41:17 +08:00
openbmb [Feature] Update Models (#1518) 2024-09-12 23:35:30 +08:00
opt [Fix] Rollback opt model configs (#1213) 2024-05-30 00:03:22 +08:00
others [Sync] Sync with internal codes 2024.06.28 (#1279) 2024-06-28 14:16:34 +08:00
phi [Feature] Update Models (#1518) 2024-09-12 23:35:30 +08:00
pulse [Feature] add support for hf_pulse_7b (#1255) 2024-07-29 19:01:52 +08:00
qwen [Feature] Update the max_out_len for many models (#1559) 2024-09-24 21:52:28 +08:00
qwen2_5 [Fix] Qwen 2.5 model config (#1626) 2024-10-21 16:58:18 +08:00
rwkv [Format] Add config lints (#892) 2024-05-14 15:35:58 +08:00
skywork [Feature] Add huggingface apply_chat_template (#1098) 2024-05-14 14:50:16 +08:00
tigerbot [Format] Add config lints (#892) 2024-05-14 15:35:58 +08:00
vicuna Fix VLLM argument error (#1207) 2024-05-29 10:14:08 +08:00
wizardcoder [Format] Add config lints (#892) 2024-05-14 15:35:58 +08:00
wizardlm Fix VLLM argument error (#1207) 2024-05-29 10:14:08 +08:00
yi [Feature] Update the max_out_len for many models (#1559) 2024-09-24 21:52:28 +08:00
zephyr Fix VLLM argument error (#1207) 2024-05-29 10:14:08 +08:00
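For reference, each of these directories holds small Python config files that define a `models` list consumed by OpenCompass. The snippet below is a minimal illustrative sketch only, assuming the HuggingFace chat-template style used by newer entries (e.g. hf_llama); the abbr and path values are placeholders, not files from this listing, and exact fields vary by model family.

    from opencompass.models import HuggingFacewithChatTemplate

    models = [
        dict(
            type=HuggingFacewithChatTemplate,   # HuggingFace-backed chat model runner
            abbr='example-chat-8b-hf',          # short name shown in result tables (placeholder)
            path='org/example-chat-8b',         # HF model ID or local path (placeholder)
            max_out_len=1024,                   # cap on generated tokens
            batch_size=8,
            run_cfg=dict(num_gpus=1),           # resources requested per evaluation task
        )
    ]

API-backed directories such as bailing_api, openai, or claude follow the same pattern but use the corresponding API client class and credential settings instead of a local HuggingFace path.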