Cannot find ‘all_response’ #10

@lingzehui-123

Description

Hello, your work is great, but I ran into a problem while running the experiments. After starting math_server.py, executing run_rloo_1.5B.sh throws the error shown in the screenshot below.

[screenshot of the error traceback]

I noticed that all_response appears in the _generate_vllm function in openrlhf/trainer/ppo_utils/experience_maker.py, but it never appears in the generate_samples function. Since I only have two A100 GPUs, I cannot use vLLM. How can I run the experiments without vLLM?
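For reference, here is a rough sketch of what I imagine the non-vLLM path would need to produce, i.e. an all_response-style list of decoded completions built with plain Hugging Face generation. The model name, prompts, and decoding settings below are only placeholders I made up for illustration; this is not the repository's actual code.

```python
# Hedged sketch: building an all_response-style list without vLLM,
# using ordinary Hugging Face generation. All names below are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen2.5-1.5B-Instruct"  # placeholder model, not the repo's config
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.bfloat16, device_map="auto"
)

# Left padding so the newly generated tokens line up at the end of each sequence.
tokenizer.padding_side = "left"
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

prompts = ["1 + 1 = ?", "Compute 12 * 7."]  # placeholder math prompts
inputs = tokenizer(prompts, return_tensors="pt", padding=True).to(model.device)

with torch.no_grad():
    sequences = model.generate(
        **inputs,
        max_new_tokens=128,
        do_sample=True,
        temperature=1.0,
    )

# Keep only the newly generated tokens, so each entry plays the role of a
# vLLM "response" string.
prompt_len = inputs["input_ids"].shape[1]
all_response = tokenizer.batch_decode(
    sequences[:, prompt_len:], skip_special_tokens=True
)
print(all_response)
```

Is something along these lines the intended fallback when vLLM is unavailable, or is there a supported way to make generate_samples return all_response?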
