I need to avoid the loading sequence below each time I run StyleTTS2:
D:\PythonProjects\StyleTTS2>python style2wav.py
[nltk_data] Downloading package punkt to
[nltk_data] C:\Users\Max\AppData\Roaming\nltk_data...
[nltk_data] Package punkt is already up-to-date!
177
C:\Users\Max\AppData\Local\Programs\Python\Python311\Lib\site-packages\torch\nn\utils\weight_norm.py:28: UserWarning: torch.nn.utils.weight_norm is deprecated in favor of torch.nn.utils.parametrizations.weight_norm.
warnings.warn("torch.nn.utils.weight_norm is deprecated in favor of torch.nn.utils.parametrizations.weight_norm.")
C:\Users\Max\AppData\Local\Programs\Python\Python311\Lib\site-packages\torch\nn\modules\rnn.py:83: UserWarning: dropout option adds dropout after all but last recurrent layer, so non-zero dropout expects num_layers greater than 1, but got dropout=0.2 and num_layers=1
warnings.warn("dropout option adds dropout after all but last "
bert loaded
bert_encoder loaded
predictor loaded
decoder loaded
text_encoder loaded
predictor_encoder loaded
style_encoder loaded
diffusion loaded
text_aligner loaded
pitch_extractor loaded
mpd loaded
msd loaded
wd loaded
C:\Users\Max\AppData\Local\Programs\Python\Python311\Lib\site-packages\torch\nn\modules\conv.py:306: UserWarning: Plan failed with a cudnnException: CUDNN_BACKEND_EXECUTION_PLAN_DESCRIPTOR: cudnnFinalize Descriptor Failed cudnn_status: CUDNN_STATUS_NOT_SUPPORTED (Triggered internally at ..\aten\src\ATen\native\cudnn\Conv_v8.cpp:919.)
return F.conv1d(input, weight, bias, self.stride,
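As an aside, the punkt lines at the top come from an unconditional nltk.download('punkt') call somewhere in the code (I have not tracked down exactly where); a guarded version like this sketch would at least skip that check once the tokenizer is already installed:

import nltk

try:
    # nltk.data.find raises LookupError when the resource is missing
    nltk.data.find('tokenizers/punkt')
except LookupError:
    nltk.download('punkt')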
How can I keep the models loaded in memory between runs so that inference time is minimal?
I use this script:
from scipy.io.wavfile import write
import msinference

text = 'Hello world!'
voice = msinference.compute_style('voice.wav')  # style vector from a reference wav
wav = msinference.inference(text, voice, alpha=0.3, beta=0.7, diffusion_steps=7, embedding_scale=1)
write('result.wav', 24000, wav)  # StyleTTS2 outputs 24 kHz audio
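The heavy cost seems to be the model loading that happens when msinference is imported. A long-lived process that imports it once would presumably avoid the reload; here is a minimal sketch built only from the calls above (the input loop and output filenames are my own):

from scipy.io.wavfile import write
import msinference  # all models load here, once per process

# Reuse one style vector for every request.
voice = msinference.compute_style('voice.wav')

i = 0
while True:
    text = input('text> ')  # empty line exits
    if not text:
        break
    # Models are already resident, so only inference time is paid here.
    wav = msinference.inference(text, voice, alpha=0.3, beta=0.7,
                                diffusion_steps=7, embedding_scale=1)
    write(f'result_{i}.wav', 24000, wav)
    i += 1

The same idea would extend to wrapping msinference in a small HTTP server if synthesis needs to be triggered from other processes.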