
GPT-SoVITS-v3-nvidia50: error when calling the API #2294

1060778506 opened this issue Apr 17, 2025 · 3 comments

Comments

@1060778506


I've been stuck on this for days now; I'd really appreciate some help. Calling the API is the last step.


INFO: 未指定默认参考音频
INFO: 半精: True
INFO: 编码格式: wav
INFO: 数据类型: int16
INFO: Started server process [2464]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:9880 (Press CTRL+C to quit)
INFO: 127.0.0.1:10448 - "GET /docs HTTP/1.1" 200 OK
INFO: 127.0.0.1:10448 - "GET /openapi.json HTTP/1.1" 200 OK
INFO: 127.0.0.1:10449 - "GET /?refer_wav_path=C%3A%5CUsers%5Cfengmaode%5CDesktop%5C1%E6%9C%80%E6%96%B0%E7%9A%84GPT-SoVITS-v3lora-20250228-nvidia50%5C123.wav&prompt_text=%E4%B8%AD%E6%96%87&prompt_language=%E4%B8%AD%E6%96%87&text=%E4%B8%AD%E6%96%87&text_language=%E4%B8%AD%E6%96%87&top_k=15&top_p=1&temperature=1&speed=1 HTTP/1.1" 200 OK
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 407, in run_asgi
    result = await app( # type: ignore[func-returns-value]
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\middleware\errors.py", line 186, in __call__
    raise exc
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\middleware\errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\middleware\exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\routing.py", line 754, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\routing.py", line 774, in app
    await route.handle(scope, receive, send)
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\routing.py", line 295, in handle
    await self.app(scope, receive, send)
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\routing.py", line 75, in app
    await response(scope, receive, send)
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\responses.py", line 265, in __call__
    await wrap(partial(self.listen_for_disconnect, receive))
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\anyio\_backends\_asyncio.py", line 662, in __aexit__
    raise exceptions[0]
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\responses.py", line 261, in wrap
    await func()
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\responses.py", line 250, in stream_response
    async for chunk in self.body_iterator:
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\concurrency.py", line 65, in iterate_in_threadpool
    yield await anyio.to_thread.run_sync(_next, as_iterator)
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\anyio\to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\anyio\_backends\_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\anyio\_backends\_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\runtime\lib\site-packages\starlette\concurrency.py", line 54, in _next
    return next(iterator)
  File "C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\api.py", line 560, in get_tts_wav
    infer_sovits = speaker_list[spk].sovits
KeyError: 'default'

@1060778506
Author

By the way, audio generation works fine from the WebUI in the browser; it only fails when I call the API.

The path I'm using (an absolute path):
C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\123.wav

My GPU is a 50-series card, but I have already downloaded all of the 50-series files, updated the V3 files, and downloaded all of the V3 models; the error still occurs.
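For reference, the request in the log above can also be issued from Python instead of the browser. Below is a minimal sketch using the requests library; the parameter names and values are copied from the logged GET line, the host/port and wav path are the ones from this setup, and it assumes the endpoint returns the audio bytes in the response body (as the "编码格式: wav" startup line suggests). Letting requests build the query string also guarantees the URL-encoding happens exactly once.

import requests

# Parameters exactly as they appear in the logged request to the GET / endpoint of api.py.
params = {
    "refer_wav_path": r"C:\Users\fengmaode\Desktop\1最新的GPT-SoVITS-v3lora-20250228-nvidia50\123.wav",
    "prompt_text": "中文",        # what is actually spoken in the reference wav
    "prompt_language": "中文",
    "text": "中文",               # text to synthesize
    "text_language": "中文",
    "top_k": 15,
    "top_p": 1,
    "temperature": 1,
    "speed": 1,
}

resp = requests.get("http://127.0.0.1:9880/", params=params)
resp.raise_for_status()

# Save whatever audio the server streamed back.
with open("output.wav", "wb") as f:
    f.write(resp.content)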


@1060778506
Author

I've solved the problem and written an article about it:
https://blog.csdn.net/m0_60628575/article/details/147299459?sharetype=blogdetail&sharerId=147299459&sharerefer=PC&sharesource=m0_60628575&spm=1011.2480.3001.8118

@zzb1420

zzb1420 commented Apr 17, 2025

The error showed up after the author updated api_v2.py. See the traceback below: passing the text by hand works fine, but as soon as I switch to a browser extension, the exact same piece of text immediately throws an error.
Following the hint in the error I ran:
import nltk
nltk.download('averaged_perceptron_tagger')  # download the English POS tagger
After that the text gets read as English and comes out as garbled humming.

Solution: I rolled back api_v2.py, replacing the one on Kaggle with this version:
https://github.com/RVC-Boss/GPT-SoVITS/blob/7394dc7b0c9e5012b614f8d7b48404a1d6c5ad38/api_v2.py

INFO: 13.229.250.233:0 - "GET /tts?text_lang=zh&ref_audio_path=/kaggle/input/cv-reference/MoBai1.wav&prompt_lang=zh&prompt_text=%25E4%25BD%25A0%25E8%25BF%2599%25E4%25B8%25AA%25E6%25B7%25B7%25E8%259B%258B%25EF%25BC%2581%25E8%25BF%2599%25E5%2588%25B0%25E5%25BA%2595%25E6%2598%25AF%25E6%2580%258E%25E4%25B9%2588%25E5%259B%259E%25E4%25BA%258B%25E5%2595%258A%25EF%25BC%259F&text_split_method=cut5&batch_size=1&media_type=wav&streaming_mode=true&text=%E9%97%BB%E7%9D%80%E6%88%BF%E9%97%B4%E9%87%8C%E9%82%A3%E4%BE%9D%E6%97%A7%E6%B5%93 HTTP/1.1" 200 OK
Set seed to 646836920
Parallel Inference Mode Enabled
Segmented Return Mode Enabled
Segmented Return Mode does not support Bucket Processing, Bucket Processing Disabled automatically
当开启并行推理模式时,SoVits V3模型不支持分桶处理,已自动关闭分桶处理
Actual Input Reference Text: %E4%BD%A0%E8%BF%99%E4%B8%AA%E6%B7%B7%E8%9B%8B%EF%BC%81%E8%BF%99%E5%88%B0%E5%BA%95%E6%98%AF%E6%80%8E%E4%B9%88%E5%9B%9E%E4%BA%8B%E5%95%8A%EF%BC%9F。
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/usr/local/lib/python3.11/dist-packages/uvicorn/protocols/http/h11_impl.py", line 403, in run_asgi
result = await app( # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/uvicorn/middleware/proxy_headers.py", line 60, in call
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/fastapi/applications.py", line 1054, in call
await super().call(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/applications.py", line 112, in call
await self.middleware_stack(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/middleware/errors.py", line 187, in call
raise exc
File "/usr/local/lib/python3.11/dist-packages/starlette/middleware/errors.py", line 165, in call
await self.app(scope, receive, _send)
File "/usr/local/lib/python3.11/dist-packages/starlette/middleware/exceptions.py", line 62, in call
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/_exception_handler.py", line 53, in wrapped_app
raise exc
File "/usr/local/lib/python3.11/dist-packages/starlette/_exception_handler.py", line 42, in wrapped_app
await app(scope, receive, sender)
File "/usr/local/lib/python3.11/dist-packages/starlette/routing.py", line 714, in call
await self.middleware_stack(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/routing.py", line 734, in app
await route.handle(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/routing.py", line 288, in handle
await self.app(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/routing.py", line 76, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/_exception_handler.py", line 53, in wrapped_app
raise exc
File "/usr/local/lib/python3.11/dist-packages/starlette/_exception_handler.py", line 42, in wrapped_app
await app(scope, receive, sender)
File "/usr/local/lib/python3.11/dist-packages/starlette/routing.py", line 74, in app
await response(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/responses.py", line 263, in call
async with anyio.create_task_group() as task_group:
File "/usr/local/lib/python3.11/dist-packages/anyio/_backends/_asyncio.py", line 597, in aexit
raise exceptions[0]
File "/usr/local/lib/python3.11/dist-packages/starlette/responses.py", line 266, in wrap
await func()
File "/usr/local/lib/python3.11/dist-packages/starlette/responses.py", line 246, in stream_response
async for chunk in self.body_iterator:
File "/usr/local/lib/python3.11/dist-packages/starlette/concurrency.py", line 60, in iterate_in_threadpool
yield await anyio.to_thread.run_sync(_next, as_iterator)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/anyio/to_thread.py", line 33, in run_sync
return await get_asynclib().run_sync_in_worker_thread(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
return await future
^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/anyio/_backends/_asyncio.py", line 807, in run
result = context.run(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/starlette/concurrency.py", line 49, in _next
return next(iterator)
^^^^^^^^^^^^^^
File "/kaggle/working/GPT-SoVITS/api_v2.py", line 352, in streaming_generator
for sr, chunk in tts_generator:
File "/usr/local/lib/python3.11/dist-packages/torch/utils/_contextlib.py", line 36, in generator_context
response = gen.send(None)
^^^^^^^^^^^^^^
File "/kaggle/working/GPT-SoVITS/GPT_SoVITS/TTS_infer_pack/TTS.py", line 975, in run
phones, bert_features, norm_text = self.text_preprocessor.segment_and_extract_feature_for_text(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/kaggle/working/GPT-SoVITS/GPT_SoVITS/TTS_infer_pack/TextPreprocessor.py", line 120, in segment_and_extract_feature_for_text
return self.get_phones_and_bert(text, language, version)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/kaggle/working/GPT-SoVITS/GPT_SoVITS/TTS_infer_pack/TextPreprocessor.py", line 175, in get_phones_and_bert
phones, word2ph, norm_text = self.clean_text_inf(textlist[i], lang, version)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/kaggle/working/GPT-SoVITS/GPT_SoVITS/TTS_infer_pack/TextPreprocessor.py", line 206, in clean_text_inf
phones, word2ph, norm_text = clean_text(text, language, version)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/kaggle/working/GPT-SoVITS/GPT_SoVITS/text/cleaner.py", line 47, in clean_text
phones = language_module.g2p(norm_text)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/kaggle/working/GPT-SoVITS/GPT_SoVITS/text/english.py", line 365, in g2p
phone_list = _g2p(text)
^^^^^^^^^^
File "/kaggle/working/GPT-SoVITS/GPT_SoVITS/text/english.py", line 273, in call
tokens = pos_tag(words) # tuples of (word, tag)
^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/nltk/tag/init.py", line 168, in pos_tag
tagger = _get_tagger(lang)
^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/nltk/tag/init.py", line 110, in get_tagger
tagger = PerceptronTagger()
^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/nltk/tag/perceptron.py", line 183, in init
self.load_from_json(lang)
File "/usr/local/lib/python3.11/dist-packages/nltk/tag/perceptron.py", line 273, in load_from_json
loc = find(f"taggers/averaged_perceptron_tagger{lang}/")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/nltk/data.py", line 579, in find
raise LookupError(resource_not_found)
LookupError:
Resource averaged_perceptron_tagger_eng not found.
Please use the NLTK Downloader to obtain the resource:
import nltk
nltk.download('averaged_perceptron_tagger_eng')
For more information see: https://www.nltk.org/data.html
Attempted to load taggers/averaged_perceptron_tagger_eng/
Searched in:

  • '/root/nltk_data'
  • '/usr/nltk_data'
  • '/usr/share/nltk_data'
  • '/usr/lib/nltk_data'
  • '/usr/share/nltk_data'
  • '/usr/local/share/nltk_data'
  • '/usr/lib/nltk_data'
  • '/usr/local/lib/nltk_data'
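For reference, the LookupError above asks specifically for the 'averaged_perceptron_tagger_eng' resource (note the _eng suffix used by newer NLTK releases), which is not the same as the 'averaged_perceptron_tagger' resource downloaded earlier in this comment. A minimal sketch of the download the error message itself suggests, run once in the same Python environment before starting api_v2.py:

import nltk

# Newer NLTK versions split the tagger per language; the resource name in the
# LookupError has the "_eng" suffix, so 'averaged_perceptron_tagger' alone
# does not satisfy this lookup.
nltk.download('averaged_perceptron_tagger_eng')

Whether that also fixes the garbled output is a separate question: the log line "Actual Input Reference Text: %E4%BD%A0..." shows the Chinese prompt arriving still percent-encoded, i.e. as plain ASCII, which would explain why the English g2p/NLTK path is taken at all and suggests the browser extension is encoding the query string twice.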
