Proper way to return tensor from node #124
-
I am having issues converting a NumPy array to a proper tensor that doesn't come out as 768 individual slices on the Y axis x.x I am adding a MiDaS depth node to my custom nodes, but can't get a workable image for other nodes. I can save the image out just fine from PIL (below). Example:

depth = prediction.cpu().numpy()
depth = (depth * 255 / (np.max(depth) + 1)).astype('uint8')
depth_image = Image.fromarray(depth)
depth_image.save('TEST_DEPTH.png')
tensor = torch.from_numpy(depth)
tensors = (tensor,)
-
See the LoadImage node: https://github.com/comfyanonymous/ComfyUI/blob/master/nodes.py#L863
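For reference, a minimal sketch (not from the thread) of applying the same pattern LoadImage uses to the depth map above. The linked code converts a PIL image to a float32 array divided by 255 and adds a leading batch dimension, which suggests the IMAGE tensors other nodes expect are shaped [batch, height, width, channels] with values in [0, 1]; a bare uint8 H x W tensor is then read as 768 single-row images. The helper name depth_to_image_tensor is just for illustration.

import numpy as np
import torch

def depth_to_image_tensor(prediction):
    depth = prediction.cpu().numpy()                      # H x W float depth map
    depth = depth / (np.max(depth) + 1e-8)                # normalize to [0, 1] instead of uint8 0-255
    depth = np.stack([depth] * 3, axis=-1)                # H x W x 3: repeat grayscale into RGB channels
    tensor = torch.from_numpy(depth.astype(np.float32))   # float32, same dtype LoadImage produces
    return tensor.unsqueeze(0)                            # 1 x H x W x 3: add the batch dimension

The node would then return it the same way LoadImage does, e.g. return (depth_to_image_tensor(prediction),).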
-
But sometimes the tensor is not an image, so it cannot simply be visualized as one.
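A hedged workaround sketch, not from the thread: when a tensor is not already image data, it can at least be min-max normalized and repeated across three channels so downstream image nodes have something to display. The helper name tensor_to_preview_image is hypothetical, and it assumes the tensor squeezes down to a 2-D map.

import torch

def tensor_to_preview_image(t: torch.Tensor) -> torch.Tensor:
    t = t.detach().float().cpu().squeeze()           # assumes this leaves a 2-D H x W map
    t = (t - t.min()) / (t.max() - t.min() + 1e-8)   # min-max normalize to [0, 1]
    t = t.unsqueeze(-1).repeat(1, 1, 3)              # H x W x 3 grayscale preview
    return t.unsqueeze(0)                            # 1 x H x W x 3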