
GPU Utilization? #2

@bblakeslee-maker


I've been running inference with the provided pre-trained model, but I've noticed that it only runs on the CPU. I attempted to convert the code to run on a GPU (a rough sketch of what I tried is below); however, I get numerous runtime errors about mismatched CPU and GPU tensors. I see that several C++ source files are included. Does this mean that this implementation of CRF-as-RNN cannot run on a GPU because that code is compiled for the CPU only? Or am I missing something in my conversion of your code?
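
For reference, my conversion attempt looked roughly like the sketch below. The import path, class name, and weight-file name are placeholders rather than the actual names in this repository; the point is only that I moved both the model and the input tensor to the GPU in the usual PyTorch way.

```python
import torch

# Placeholder: substitute the actual CRF-as-RNN model class shipped with this repo.
from crfasrnn_model import CrfRnnNet  # hypothetical module/class names

# Pick the GPU when one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = CrfRnnNet()
# "crfasrnn_weights.pth" is a stand-in for the provided pre-trained checkpoint.
model.load_state_dict(torch.load("crfasrnn_weights.pth", map_location=device))
model.to(device)    # move registered parameters and buffers onto the chosen device
model.eval()

# Stand-in for a real preprocessed image batch in (N, C, H, W) layout.
image = torch.rand(1, 3, 500, 500, device=device)

with torch.no_grad():
    output = model(image)   # this is roughly where the CPU-vs-GPU tensor errors appear
```

Even with the model and the input on the same device, I still see the errors, which is why I suspect the compiled C++ code only handles CPU tensors.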

Thanks!
