[Feature]: Allow specifying a model per core function by awwaawwa · Pull Request #1708 · binary-husky/gpt_academic · GitHub
[Feature]: Allow specifying a model per core function #1708

Merged
Changes from 1 commit
Model override: support hot-reloading & raise an error when the override points to a nonexistent model
awwaawwa committed Apr 15, 2024
commit fc69bbb0e7848e67d779ced73930f47aa9152e7a
11 changes: 8 additions & 3 deletions request_llms/bridge_all.py
@@ -947,8 +947,9 @@ def mutex_manager(window_mutex, observe_window):
res = '<br/><br/>\n\n---\n\n'.join(return_string_collect)
return res

-from core_functional import get_core_functions
-functional = get_core_functions()
+import core_functional
+import importlib

def predict(inputs:str, llm_kwargs:dict, plugin_kwargs:dict, chatbot,
history:list=[], system_prompt:str='', stream:bool=True, additional_fn:str=None):
"""
@@ -971,10 +972,14 @@ def predict(inputs:str, llm_kwargs:dict, plugin_kwargs:dict, chatbot,
    inputs = apply_gpt_academic_string_mask(inputs, mode="show_llm")
    method = model_info[llm_kwargs['llm_model']]["fn_with_ui"] # If this line raises an error, check the AVAIL_LLM_MODELS option in config

+    importlib.reload(core_functional) # hot-reload the prompts
+    functional = core_functional.get_core_functions()
     if 'ModelOverride' in functional[additional_fn]:
         model_override = functional[additional_fn]['ModelOverride']
+        if model_override not in model_info:
+            raise ValueError(f"Model override '{model_override}' points to an unsupported model; please check the configuration file.")
         method = model_info[model_override]["fn_with_ui"]
-        llm_kwargs['llm_model'] = functional[additional_fn]['ModelOverride']
+        llm_kwargs['llm_model'] = model_override

     yield from method(inputs, llm_kwargs, plugin_kwargs, chatbot, history, system_prompt, stream, additional_fn)
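
The override logic in the diff can be sketched in isolation. This is a minimal, hedged reconstruction: `model_info`, `core_functions`, and `pick_method` are illustrative stand-ins, not the actual gpt_academic data structures, and the handler functions are stubs.

```python
# Illustrative stand-in for gpt_academic's model registry (assumption, not the real table).
model_info = {
    "gpt-3.5-turbo": {"fn_with_ui": lambda *a: "handled by gpt-3.5-turbo"},
    "gpt-4": {"fn_with_ui": lambda *a: "handled by gpt-4"},
}

# Illustrative core-function table: an entry may pin a model via 'ModelOverride'.
core_functions = {
    "SummarizePaper": {"Prefix": "Summarize:\n", "ModelOverride": "gpt-4"},
    "FixTypos": {"Prefix": "Proofread:\n"},  # no override: the session model is used
}

def pick_method(llm_kwargs: dict, additional_fn: str):
    """Resolve the UI handler, honoring a per-function model override
    (hypothetical helper name; mirrors the logic added in this commit)."""
    method = model_info[llm_kwargs["llm_model"]]["fn_with_ui"]
    entry = core_functions[additional_fn]
    if "ModelOverride" in entry:
        model_override = entry["ModelOverride"]
        if model_override not in model_info:
            # Fail fast instead of dispatching to a nonexistent model.
            raise ValueError(f"Model override '{model_override}' points to an unsupported model.")
        method = model_info[model_override]["fn_with_ui"]
        llm_kwargs["llm_model"] = model_override  # downstream code sees the override
    return method

kwargs = {"llm_model": "gpt-3.5-turbo"}
print(pick_method(kwargs, "SummarizePaper")())  # → handled by gpt-4
print(kwargs["llm_model"])  # → gpt-4
```

Note the design choice preserved from the diff: the override mutates `llm_kwargs['llm_model']` in place, so everything downstream (logging, billing, retries) consistently sees the overriding model rather than the one the user selected.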

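The other half of the commit is the `importlib.reload(core_functional)` call, which lets prompt edits take effect without restarting the app. A minimal, self-contained sketch of that hot-reload pattern, using a throwaway temp module instead of the real `core_functional.py`:

```python
import importlib
import os
import sys
import tempfile
import time

# Create a throwaway module on disk (stand-in for core_functional.py).
tmpdir = tempfile.mkdtemp()
mod_path = os.path.join(tmpdir, "demo_prompts.py")
with open(mod_path, "w") as f:
    f.write("PROMPT = 'v1'\n")

sys.path.insert(0, tmpdir)
import demo_prompts
print(demo_prompts.PROMPT)  # → v1

# Simulate editing the prompt file while the app is running.
with open(mod_path, "w") as f:
    f.write("PROMPT = 'v2'\n")
# Bump the mtime so the bytecode cache is definitely treated as stale.
future = time.time() + 10
os.utime(mod_path, (future, future))

importlib.reload(demo_prompts)  # re-executes the source: no restart needed
print(demo_prompts.PROMPT)  # → v2
```

In the PR, reloading on every `predict` call keeps `core_functions` (and any `ModelOverride` entries) in sync with the file on disk, at the cost of a small per-request import overhead.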