Hi,

the error is thrown at line 61 of litellm.py:

```python
results.append(result["choices"][0]["message"]["content"])
```

```
'CustomStreamWrapper' object is not subscriptable
```

With `stream=True`, the result is a `CustomStreamWrapper` generator rather than a response dict, so the dict-style indexing fails.
The call is made through txtai:

```python
from txtai.pipeline import LLM

MODEL_NAME = "huggingface/TheBloke/leo-hessianai-70B-chat-GPTQ"
llm = LLM(path=MODEL_NAME, method="litellm", api_base=api_base, stream=True)
```
This works fine:

```python
import litellm
from litellm import completion

MODEL_NAME = "huggingface/TheBloke/leo-hessianai-70B-chat-GPTQ"
messages = [{"content": "C", "role": "user"}]  # LiteLLM follows the OpenAI format
api_base = "http://127.0.0.1:8080"

# CALLING ENDPOINT
response = completion(model=MODEL_NAME, messages=messages, api_base=api_base, stream=True)
for part in response:
    print(part.choices[0].delta.content or "")
```
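For reference, a minimal sketch of how the indexing at line 61 could branch on the two response types; this is an assumption about a possible fix, not txtai's actual implementation, and the `stream` flag and `results` list are taken from the snippets above:

```python
# Sketch: handle both streaming and non-streaming litellm responses.
# With stream=True, completion() returns a CustomStreamWrapper that yields
# chunks, so the dict-style indexing at line 61 fails on it.
if stream:
    # Concatenate the delta fragments yielded by the stream wrapper
    results.append("".join(part.choices[0].delta.content or "" for part in result))
else:
    results.append(result["choices"][0]["message"]["content"])
```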
Thank you for the issue. I'll see if streaming support can be added in.