This repository was archived by the owner on May 11, 2025. It is now read-only.
Getting error with Llama2- and Llama3-based AWQ models. #462
Unanswered · alok-abhishek asked this question in Q&A
Replies: 1 comment
- Having the same issue. I think it's an issue with certain 13B AWQ models, because GGUF models and <7B AWQ models work.
-
AttributeError: 'LlamaLikeModel' object has no attribute 'layers'

```
Traceback (most recent call last):
  File "inference\awq\inference_awq_hf.py", line 50, in <module>
    generation_output = model.generate(
                        ^^^^^^^^^^^^^^^
  File ".venv\Lib\site-packages\awq\models\base.py", line 111, in generate
    return self.model.generate(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".venv\Lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File ".venv\Lib\site-packages\transformers\generation\utils.py", line 1622, in generate
    result = self._sample(
             ^^^^^^^^^^^^^
  File ".venv\Lib\site-packages\transformers\generation\utils.py", line 2788, in _sample
    model_inputs = self.prepare_inputs_for_generation(input_ids, **model_kwargs)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".venv\Lib\site-packages\transformers\models\llama\modeling_llama.py", line 1262, in prepare_inputs_for_generation
    past_key_values = getattr(getattr(self.model.layers[0], "self_attn", {}), "past_key_value", None)
                                      ^^^^^^^^^^^^^^^^^
  File ".venv\Lib\site-packages\torch\nn\modules\module.py", line 1688, in __getattr__
    raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
AttributeError: 'LlamaLikeModel' object has no attribute 'layers'
```
I also saw the same issue reported here: oobabooga/text-generation-webui#5778
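The failure mode in the traceback can be reproduced in isolation: AutoAWQ's fused wrapper stores its decoder layers under a different attribute name than transformers' `LlamaModel` uses, so when `prepare_inputs_for_generation` probes `self.model.layers[0]`, `nn.Module.__getattr__` raises. A minimal sketch of the mechanism (the attribute name `blocks` here is an assumption standing in for whatever the fused wrapper actually uses; this is not AutoAWQ's real class):

```python
import torch.nn as nn

# Hypothetical stand-in for AutoAWQ's fused LlamaLikeModel, which keeps
# its decoder layers under a name other than `layers`.
class LlamaLikeModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.blocks = nn.ModuleList([nn.Linear(4, 4)])  # note: not `layers`

model = LlamaLikeModel()

# transformers' prepare_inputs_for_generation (at the version in the
# traceback) probes `model.layers[0]`; nn.Module.__getattr__ only finds
# `blocks`, so the lookup fails exactly as reported:
try:
    getattr(getattr(model.layers[0], "self_attn", {}), "past_key_value", None)
except AttributeError as e:
    print(e)  # AttributeError: 'LlamaLikeModel' object has no attribute 'layers'
```

If this is the cause, reported workarounds to try (untested here) include loading with `fuse_layers=False` in `from_quantized`, so the stock transformers model, which does have `.layers`, is used instead of the fused wrapper, or pinning transformers to a version whose `prepare_inputs_for_generation` does not probe `model.layers`.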