Replies: 1 comment
-
I see you're running into an error with your Stable Diffusion WebUI installation. The core issue is a missing native component of the Python tokenizers library. Let me help you resolve it.

Root Cause Analysis

The main error is:

ModuleNotFoundError: No module named 'tokenizers.tokenizers'
This single failure cascades into import errors in every library that depends on the tokenizers package (transformers, torchmetrics, pytorch_lightning).

Solution

Reinstall the tokenizers library. The package ships a compiled Rust extension (the tokenizers.tokenizers module); this error means the extension wasn't properly built or installed. Here's how to fix it:
# 1. Remove the broken install and rebuild from source
pip uninstall -y tokenizers
pip install --no-binary tokenizers tokenizers

# 2. Building from source requires a Rust toolchain; install it if missing,
#    then retry the source build
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source $HOME/.cargo/env
pip install --no-binary tokenizers tokenizers

# 3. Reinstall transformers so it picks up the rebuilt tokenizers
pip uninstall -y transformers
pip install transformers
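After reinstalling, you can sanity-check the imports before relaunching the WebUI. A minimal sketch (the `check_module` helper is mine, not part of any of these packages):

```python
import importlib

def check_module(name):
    """Return True if `name` imports cleanly, False on ImportError."""
    try:
        importlib.import_module(name)
        return True
    except ImportError:
        return False

# 'tokenizers.tokenizers' is the compiled Rust core; the original error
# means the pure-Python wrapper imported but this compiled module did not.
for mod in ("tokenizers", "tokenizers.tokenizers", "transformers"):
    print(mod, "->", "OK" if check_module(mod) else "MISSING")
```

If `tokenizers` reports OK but `tokenizers.tokenizers` reports MISSING, the wheel or source build is still incomplete and the reinstall steps above need to be repeated.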
Alternative Solution

If the above doesn't work, try a complete reinstallation of the relevant packages:

pip uninstall -y tokenizers transformers pytorch-lightning torchmetrics
pip install tokenizers transformers pytorch-lightning torchmetrics

Additional Context

This error is commonly seen when running Stable Diffusion WebUI in environments like Google Colab, where package compatibility can be an issue. It surfaces during initialization, when the WebUI loads its dependencies. Let me know if you need any clarification or run into other issues after trying these fixes!
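If the reinstall still fails, it helps to see which versions pip actually resolved, since each transformers release pins a compatible tokenizers range. A small sketch using only the standard library (the `installed_version` helper is hypothetical, written for this check):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(dist):
    """Return the installed version string, or None if the package is absent."""
    try:
        return version(dist)
    except PackageNotFoundError:
        return None

# Print what is actually installed, so mismatched or missing pins stand out.
for dist in ("tokenizers", "transformers", "pytorch-lightning", "torchmetrics"):
    print(dist, "->", installed_version(dist) or "not installed")
```

Comparing this output against the requirements printed by `pip show transformers` will tell you whether pip silently kept an incompatible tokenizers build.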
-
On launch it outputs this:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1146, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1126, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/usr/local/lib/python3.11/dist-packages/transformers/models/__init__.py", line 15, in <module>
    from . import (
  File "/usr/local/lib/python3.11/dist-packages/transformers/models/mt5/__init__.py", line 29, in <module>
    from ..t5.tokenization_t5 import T5Tokenizer
  File "/usr/local/lib/python3.11/dist-packages/transformers/models/t5/tokenization_t5.py", line 26, in <module>
    from ...tokenization_utils import PreTrainedTokenizer
  File "/usr/local/lib/python3.11/dist-packages/transformers/tokenization_utils.py", line 26, in <module>
    from .tokenization_utils_base import (
  File "/usr/local/lib/python3.11/dist-packages/transformers/tokenization_utils_base.py", line 74, in <module>
    from tokenizers import AddedToken
  File "/usr/local/lib/python3.11/dist-packages/tokenizers/__init__.py", line 80, in <module>
    from .tokenizers import (
ModuleNotFoundError: No module named 'tokenizers.tokenizers'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):

  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/webui.py", line 13, in <module>
    initialize.imports()
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/modules/initialize.py", line 17, in imports
    import pytorch_lightning  # noqa: F401
    ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/pytorch_lightning/__init__.py", line 34, in <module>
    from pytorch_lightning.callbacks import Callback  # noqa: E402
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/pytorch_lightning/callbacks/__init__.py", line 14, in <module>
    from pytorch_lightning.callbacks.callback import Callback
  File "/usr/local/lib/python3.11/dist-packages/pytorch_lightning/callbacks/callback.py", line 25, in <module>
    from pytorch_lightning.utilities.types import STEP_OUTPUT
  File "/usr/local/lib/python3.11/dist-packages/pytorch_lightning/utilities/types.py", line 28, in <module>
    from torchmetrics import Metric
  File "/usr/local/lib/python3.11/dist-packages/torchmetrics/__init__.py", line 14, in <module>
    from torchmetrics import functional  # noqa: E402
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/torchmetrics/functional/__init__.py", line 77, in <module>
    from torchmetrics.functional.text.bleu import bleu_score
  File "/usr/local/lib/python3.11/dist-packages/torchmetrics/functional/text/__init__.py", line 30, in <module>
    from torchmetrics.functional.text.bert import bert_score  # noqa: F401
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/torchmetrics/functional/text/bert.py", line 24, in <module>
    from torchmetrics.functional.text.helper_embedding_metric import (
  File "/usr/local/lib/python3.11/dist-packages/torchmetrics/functional/text/helper_embedding_metric.py", line 26, in <module>
    from transformers import AutoModelForMaskedLM, AutoTokenizer, PreTrainedModel, PreTrainedTokenizerBase
  File "<frozen importlib._bootstrap>", line 1229, in _handle_fromlist
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1136, in __getattr__
    module = self._get_module(self._class_to_module[name])
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1148, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.auto because of the following error (look up to see its traceback):
No module named 'tokenizers.tokenizers'