[Feature]: Allow core functions to specify a model by awwaawwa · Pull Request #1708 · binary-husky/gpt_academic

Merged
Changes from 1 commit
allow model mutex override
binary-husky committed May 17, 2024
commit 6a8329c74ff25cd6a7da0b06e3422be470c2c26f
41 changes: 28 additions & 13 deletions request_llms/bridge_all.py
@@ -852,6 +852,13 @@ def decode(self, *args, **kwargs):
AVAIL_LLM_MODELS += [azure_model_name]


# -=-=-=-=-=-=--=-=-=-=-=-=--=-=-=-=-=-=--=-=-=-=-=-=-=-=
# -=-=-=-=-=-=-=-=-=- Above: model routing -=-=-=-=-=-=-=-=-=
# -=-=-=-=-=-=--=-=-=-=-=-=--=-=-=-=-=-=--=-=-=-=-=-=-=-=

# -=-=-=-=-=-=--=-=-=-=-=-=--=-=-=-=-=-=--=-=-=-=-=-=-=-=
# -=-=-=-=-=-=-= Below: multi-model routing & switching functions -=-=-=-=-=-=-=
# -=-=-=-=-=-=--=-=-=-=-=-=--=-=-=-=-=-=--=-=-=-=-=-=-=-=


def LLM_CATCH_EXCEPTION(f):
@@ -888,13 +895,11 @@ def predict_no_ui_long_connection(inputs:str, llm_kwargs:dict, history:list, sys
model = llm_kwargs['llm_model']
n_model = 1
if '&' not in model:

# If only 1 LLM is queried:
# If only 'one' LLM is queried (the common case):
method = model_info[model]["fn_without_ui"]
return method(inputs, llm_kwargs, history, sys_prompt, observe_window, console_slience)
else:

# If multiple LLMs are queried at the same time, this part is a bit more involved, but the idea is the same; there is no need to dig into this else branch
# If 'multiple' LLMs are queried at the same time, this part is a bit more involved, but the idea is the same; there is no need to dig into this else branch
executor = ThreadPoolExecutor(max_workers=4)
models = model.split('&')
n_model = len(models)
@@ -947,8 +952,23 @@ def mutex_manager(window_mutex, observe_window):
res = '<br/><br/>\n\n---\n\n'.join(return_string_collect)
return res

import core_functional
# Adjust the model according to the core functions' ModelOverride setting; used inside `predict`
import importlib
import core_functional
def execute_model_override(llm_kwargs, additional_fn, method):
functional = core_functional.get_core_functions()
if 'ModelOverride' in functional[additional_fn]:
# Hot-reload Prompt & ModelOverride
importlib.reload(core_functional)
functional = core_functional.get_core_functions()
model_override = functional[additional_fn]['ModelOverride']
if model_override not in model_info:
raise ValueError(f"The model override parameter '{model_override}' points to a model that is not supported; please check the configuration file.")
method = model_info[model_override]["fn_with_ui"]
llm_kwargs['llm_model'] = model_override
return llm_kwargs, additional_fn, method
# Otherwise, return the original arguments unchanged
return llm_kwargs, additional_fn, method

def predict(inputs:str, llm_kwargs:dict, plugin_kwargs:dict, chatbot,
history:list=[], system_prompt:str='', stream:bool=True, additional_fn:str=None):
@@ -970,16 +990,11 @@ def predict(inputs:str, llm_kwargs:dict, plugin_kwargs:dict, chatbot,
"""

inputs = apply_gpt_academic_string_mask(inputs, mode="show_llm")

method = model_info[llm_kwargs['llm_model']]["fn_with_ui"] # If this raises an error, check the AVAIL_LLM_MODELS option in config

importlib.reload(core_functional) # hot-reload prompt
functional = core_functional.get_core_functions()
if 'ModelOverride' in functional[additional_fn]:
model_override = functional[additional_fn]['ModelOverride']
if model_override not in model_info:
raise ValueError(f"The model override parameter '{model_override}' points to a model that is not supported; please check the configuration file.")
method = model_info[model_override]["fn_with_ui"]
llm_kwargs['llm_model'] = model_override
if additional_fn: # adjust the model according to the core functions' ModelOverride setting
llm_kwargs, additional_fn, method = execute_model_override(llm_kwargs, additional_fn, method)

yield from method(inputs, llm_kwargs, plugin_kwargs, chatbot, history, system_prompt, stream, additional_fn)
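
For reference, below is a minimal sketch of how a core-function entry could opt into this override. The entry name, prompt text, and model id are illustrative placeholders, and the Prefix/Suffix layout is only assumed to match the usual shape returned by core_functional.get_core_functions(); the only key this commit actually reads is ModelOverride, which must name an entry of model_info, otherwise execute_model_override raises a ValueError.

# Hypothetical core_functional.py entry (names, prompt text and model id are placeholders)
def get_core_functions():
    return {
        "AcademicPolish": {
            "Prefix": "Please polish the following paragraph:\n\n",  # assumed prompt-prefix convention
            "Suffix": "",                                            # assumed prompt-suffix convention
            "ModelOverride": "gpt-4",                                # must be a key of model_info
        },
    }

With such an entry, triggering that core function makes predict route through model_info["gpt-4"]["fn_with_ui"] and set llm_kwargs['llm_model'] to the override; entries without a ModelOverride key keep whatever model is currently selected in the UI.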

- "漢字路" 한글한자자동변환 서비스는 교육부 고전문헌국역지원사업의 지원으로 구축되었습니다.
- "漢字路" 한글한자자동변환 서비스는 전통문화연구회 "울산대학교한국어처리연구실 옥철영(IT융합전공)교수팀"에서 개발한 한글한자자동변환기를 바탕하여 지속적으로 공동 연구 개발하고 있는 서비스입니다.
- 현재 고유명사(인명, 지명등)을 비롯한 여러 변환오류가 있으며 이를 해결하고자 많은 연구 개발을 진행하고자 하고 있습니다. 이를 인지하시고 다른 곳에서 인용시 한자 변환 결과를 한번 더 검토하시고 사용해 주시기 바랍니다.
- 변환오류 및 건의,문의사항은 juntong@juntong.or.kr로 메일로 보내주시면 감사하겠습니다. .
Copyright ⓒ 2020 By '전통문화연구회(傳統文化硏究會)' All Rights reserved.
 한국   대만   중국   일본