Running cli_demo.py in an offline environment: no response after entering a question #1124
Unanswered
Chen-Jianxiong
asked this question in Q&A
Replies: 2 comments 2 replies
-
Additional context: after starting api_server.py and running python openai_api_request.py, the model does produce about half of a complete story (a minimal request sketch follows these comments):
-
What hardware configuration are you running this on?
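For reference, here is a minimal sketch of the kind of request openai_api_request.py sends to the local api_server.py. The host, port, endpoint path, and model name below are assumptions based on the usual OpenAI-compatible layout, so check api_server.py / openai_api_request.py for the actual values:

```python
# Hypothetical minimal client for the OpenAI-compatible endpoint exposed by api_server.py.
# The base URL, port, and model name are assumptions; adjust them to match your setup.
import requests

BASE_URL = "http://127.0.0.1:8000/v1"   # assumed default host/port of api_server.py

payload = {
    "model": "chatglm3-6b",              # assumed model name registered by the server
    "messages": [{"role": "user", "content": "请给我讲一个完整的故事"}],
    "temperature": 0.8,
    "stream": False,                     # non-streaming keeps the example simple
}

resp = requests.post(f"{BASE_URL}/chat/completions", json=payload, timeout=300)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

If the story also cuts off halfway with a plain request like this, it may be worth checking the generation limits (e.g. max length) configured on the server side, though that is only a guess.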
-
Running cli_demo.py in an offline environment: no response after entering a question

After stopping it with Ctrl+C, the traceback is as follows:
```
Traceback (most recent call last):
  File "/data/cjx/ChatGLM3/basic_demo/cli_demo.py", line 57, in <module>
    main()
  File "/data/cjx/ChatGLM3/basic_demo/cli_demo.py", line 43, in main
    for response, history, past_key_values in model.stream_chat(tokenizer, query, history=history, top_p=1,
  File "/data/cjx/python3.9/site-packages/torch/utils/_contextlib.py", line 35, in generator_context
    response = gen.send(None)
  File "/home/chenjianxiong/.cache/huggingface/modules/transformers_modules/chatglm3-6b/modeling_chatglm.py", line 1076, in stream_chat
    for outputs in self.stream_generate(**inputs, past_key_values=past_key_values,
  File "/data/cjx/python3.9/site-packages/torch/utils/_contextlib.py", line 35, in generator_context
    response = gen.send(None)
  File "/home/chenjianxiong/.cache/huggingface/modules/transformers_modules/chatglm3-6b/modeling_chatglm.py", line 1163, in stream_generate
    outputs = self(
  File "/data/cjx/python3.9/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/data/cjx/python3.9/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/chenjianxiong/.cache/huggingface/modules/transformers_modules/chatglm3-6b/modeling_chatglm.py", line 941, in forward
    transformer_outputs = self.transformer(
  File "/data/cjx/python3.9/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/data/cjx/python3.9/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/chenjianxiong/.cache/huggingface/modules/transformers_modules/chatglm3-6b/modeling_chatglm.py", line 834, in forward
    hidden_states, presents, all_hidden_states, all_self_attentions = self.encoder(
  File "/data/cjx/python3.9/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/data/cjx/python3.9/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/chenjianxiong/.cache/huggingface/modules/transformers_modules/chatglm3-6b/modeling_chatglm.py", line 641, in forward
    layer_ret = layer(
  File "/data/cjx/python3.9/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/data/cjx/python3.9/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/chenjianxiong/.cache/huggingface/modules/transformers_modules/chatglm3-6b/modeling_chatglm.py", line 565, in forward
    mlp_output = self.mlp(layernorm_output)
  File "/data/cjx/python3.9/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/data/cjx/python3.9/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/chenjianxiong/.cache/huggingface/modules/transformers_modules/chatglm3-6b/modeling_chatglm.py", line 501, in forward
    output = self.dense_4h_to_h(intermediate_parallel)
  File "/data/cjx/python3.9/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/data/cjx/python3.9/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/data/cjx/python3.9/site-packages/torch/nn/modules/linear.py", line 116, in forward
    return F.linear(input, self.weight, self.bias)
KeyboardInterrupt
```
What is going wrong here?
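For context on the traceback above: the KeyboardInterrupt was caught inside F.linear in the MLP block, so the model was still in the middle of a forward pass rather than blocked on I/O. Below is a minimal sketch for checking which device the model is on and how long the first streamed chunk takes; MODEL_PATH is a placeholder for the offline chatglm3-6b checkpoint, and the stream_chat call mirrors the one shown in the traceback:

```python
# Rough check that the model is actually computing (just slowly, e.g. on CPU)
# rather than hanging. MODEL_PATH is a hypothetical local path; the stream_chat
# call mirrors cli_demo.py as it appears in the traceback.
import time

import torch
from transformers import AutoModel, AutoTokenizer

MODEL_PATH = "/path/to/chatglm3-6b"  # placeholder for the offline checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_PATH, trust_remote_code=True)
if torch.cuda.is_available():
    model = model.half().cuda()      # without a GPU the 6B model can take minutes per reply
model = model.eval()

print("model device:", next(model.parameters()).device)

start = time.time()
for response, history, past_key_values in model.stream_chat(
    tokenizer, "你好", history=[], top_p=1, past_key_values=None,
    return_past_key_values=True,
):
    # Report how long the first streamed chunk took, then stop.
    print(f"first chunk after {time.time() - start:.1f}s: {response!r}")
    break
```

If `model device` prints `cpu` and the first chunk takes a very long time to arrive, the "no response" in cli_demo.py is probably just very slow CPU inference rather than a real hang, but that is only a guess based on the traceback.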