Request too large #6

@ikbened

Description

Hi Many,

Maybe this is more a feature request than a bug, but I encountered the following.

In my test I search for an article in a webshop and verify that a counter with the number of articles found shows 1 or more. I intentionally made the CSS selector for that counter invalid (a minimal sketch of the test is included below). When running this test, I get the following output:

Bol com search :: Search an existing product.                         Keyword 'Browser.Get Text' with arguments '['css=p[data-test="number-of-articlesX"]']' used on line 37 failed.

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm._turn_on_debug()'.

Attempt 1 failed: litellm.RateLimitError: RateLimitError: OpenAIException - Error code: 429 - {'error': {'message': 'Request too large for gpt-4o in organization org-XXXXXX on tokens per min (TPM): Limit 30000, Requested 253262. The input or output tokens must be reduced in order to run successfully. Visit https://platform.openai.com/account/rate-limits to learn more.', 'type': 'tokens', 'param': None, 'code': 'rate_limit_exceeded'}}

Although this page is not that complex or big, the request sent to OpenAI is apparently too large. Is there a way to limit the request size? For now I only get these messages.
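
For reference, here is roughly what the failing test looks like. This is only a minimal sketch: the URL and the search-field selector are simplified placeholders, and only the article-counter selector is the (intentionally broken) one from the output above.

*** Settings ***
Library    Browser

*** Test Cases ***
Bol com search
    [Documentation]    Search an existing product.
    New Browser    chromium    headless=True
    New Page    https://www.bol.com    # placeholder URL for the webshop
    # Placeholder selector for the search field; the real test uses the shop's own locators
    Fill Text    css=input[type="search"]    lego
    Keyboard Key    press    Enter
    # Intentionally broken selector from the log; the working one ends in "number-of-articles"
    ${count}=    Get Text    css=p[data-test="number-of-articlesX"]
    ${count}=    Convert To Integer    ${count}
    Should Be True    ${count} >= 1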

Cheers,
Ed
