CLIPTextEncode Error running ComfyUI in Paperspace #2715
Unanswered
dreamlogic-X asked this question in Q&A
Replies: 1 comment
-
Hi Ben, I'm getting this error when running ComfyUI in Paperspace.
I tried a fresh install of the PPS-ComfyUI.ipynb and still had no luck.
It seems to happen at the CLIP Text Encode (Prompt) node.
Any ideas? Thanks
```
Error occurred when executing CLIPTextEncode:

No operator found for this attention: Inputs(
    query=tensor([[[[-6.7488e-01,  5.1577e-01,  6.8423e-01,  ...]]], ...], device='cuda:0'),
    key=tensor([[[[ 0.2225,  0.3829,  0.3919,  ...]]], ...], device='cuda:0'),
    value=tensor([[[[-1.9607e-01, -1.0340e-01, -4.5641e-02,  ...]]], ...], device='cuda:0'),
    attn_bias=tensor([[[0., -inf, -inf,  ..., -inf, -inf, -inf],
                       ...,
                       [0., 0., 0.,  ..., 0., 0., 0.]], ...], device='cuda:0'),
    p=0.0, scale=None)

File "/notebooks/ComfyUI/execution.py", line 154, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
File "/notebooks/ComfyUI/execution.py", line 84, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "/notebooks/ComfyUI/execution.py", line 77, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "/notebooks/ComfyUI/nodes.py", line 56, in encode
    cond, pooled = clip.encode_from_tokens(tokens, return_pooled=True)
File "/notebooks/ComfyUI/comfy/sd.py", line 131, in encode_from_tokens
    cond, pooled = self.cond_stage_model.encode_token_weights(tokens)
File "/notebooks/ComfyUI/comfy/sd1_clip.py", line 515, in encode_token_weights
    out, pooled = getattr(self, self.clip).encode_token_weights(token_weight_pairs)
File "/notebooks/ComfyUI/comfy/sd1_clip.py", line 40, in encode_token_weights
    out, pooled = self.encode(to_encode)
File "/notebooks/ComfyUI/comfy/sd1_clip.py", line 191, in encode
    return self(tokens)
File "/usr/local/lib/python3.9/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
File "/notebooks/ComfyUI/comfy/sd1_clip.py", line 173, in forward
    outputs = self.transformer(tokens, attention_mask, intermediate_output=self.layer_idx, final_layer_norm_intermediate=self.layer_norm_hidden_state)
File "/usr/local/lib/python3.9/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
File "/notebooks/ComfyUI/comfy/clip_model.py", line 131, in forward
    return self.text_model(*args, **kwargs)
File "/usr/local/lib/python3.9/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
File "/notebooks/ComfyUI/comfy/clip_model.py", line 109, in forward
    x, i = self.encoder(x, mask=mask, intermediate_output=intermediate_output)
File "/usr/local/lib/python3.9/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
File "/notebooks/ComfyUI/comfy/clip_model.py", line 68, in forward
    x = l(x, mask, optimized_attention)
File "/usr/local/lib/python3.9/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
File "/notebooks/ComfyUI/comfy/clip_model.py", line 49, in forward
    x += self.self_attn(self.layer_norm1(x), mask, optimized_attention)
File "/usr/local/lib/python3.9/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
File "/notebooks/ComfyUI/comfy/clip_model.py", line 20, in forward
    out = optimized_attention(q, k, v, self.heads, mask)
File "/notebooks/ComfyUI/comfy/ldm/modules/attention.py", line 310, in attention_xformers
    out = xformers.ops.memory_efficient_attention(q, k, v, attn_bias=mask)
File "/usr/local/lib/python3.9/dist-packages/xformers/ops/fmha/__init__.py", line 197, in memory_efficient_attention
    return _memory_efficient_attention(
File "/usr/local/lib/python3.9/dist-packages/xformers/ops/fmha/__init__.py", line 293, in _memory_efficient_attention
    return _memory_efficient_attention_forward(
File "/usr/local/lib/python3.9/dist-packages/xformers/ops/fmha/__init__.py", line 309, in _memory_efficient_attention_forward
    op = _dispatch_fw(inp)
File "/usr/local/lib/python3.9/dist-packages/xformers/ops/fmha/dispatch.py", line 53, in _dispatch_fw
    raise NotImplementedError(f"No operator found for this attention: {inp}")
```
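For context on what the error means: the final frame is xformers' kernel dispatcher giving up because no compiled attention operator supports these inputs, which usually points at an xformers wheel built for a different torch/CUDA than the one installed (which would fit the fix below). A minimal sketch of the same call path, not taken from this thread; shapes and the causal bias are illustrative (77 is just CLIP's token length):

```python
# Minimal sketch (illustrative, not from this thread) of the call that fails
# in the traceback above. Assumes a CUDA GPU and a working xformers install.
import torch
import xformers.ops as xops

# xformers expects (batch, seq_len, heads, head_dim); 77 is CLIP's context
# length and 64 a typical head dim, both just illustrative here.
q = torch.randn(1, 77, 12, 64, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# The 0/-inf triangular matrix in the error dump is a causal mask; xformers
# ships a dedicated bias type for it.
bias = xops.LowerTriangularMask()

# When no registered kernel supports this dtype/shape/bias combination, the
# dispatcher raises NotImplementedError("No operator found for this
# attention: ..."), which is exactly the failure above.
out = xops.memory_efficient_attention(q, k, v, attn_bias=bias, p=0.0)
print(out.shape)  # torch.Size([1, 77, 12, 64])
```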
-

Hey, so I got it to run by running `pip install --upgrade xformers` in the terminal. In case that helps for info... the script may need to be updated? Thanks
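For anyone hitting this later: after the upgrade it's worth confirming that the xformers wheel actually matches the installed torch build, since a mismatched pair is the usual cause of this dispatch error. A small sketch, assuming nothing beyond the two packages themselves:

```python
# Quick sanity check (a suggestion, not from the thread): print the installed
# torch build and the xformers version that should be compiled against it.
import torch
import xformers

print(torch.__version__, torch.version.cuda)  # torch build + its CUDA version
print(xformers.__version__)                   # wheel should target that torch

# xformers also bundles a diagnostic that lists which attention kernels are
# usable on the current GPU:
#   python -m xformers.info
```

If upgrading doesn't help, ComfyUI could also be launched with its `--disable-xformers` flag (present in its launch options at the time of writing, worth double-checking) to fall back to PyTorch attention at some speed cost.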