
Async reader node #2752


@nasrin-taghizadeh the predict function is not intended to be async. The whole pipeline can only proceed once the prediction has been made, so we wouldn't gain much by making it async-compatible. You can, however, make a blocking call to the remote server.
If you want to make multiple predictions at once, check out the predict_batch function. This function is not async either, but you could fire multiple requests from it, collect them, and return the collected batch results.
If you're interested in the latter approach, check out how asyncio.run is used in https://blog.devgenius.io/how-to-send-concurrent-http-requests-in-python-d9cda284c86a
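To sketch what that could look like: the batch function itself stays synchronous, and asyncio.run drives the concurrent requests internally, blocking until all results are collected. This is only an illustration — `predict_batch`'s real signature depends on your node, and `_fetch_prediction` is a hypothetical stand-in for an actual async HTTP call (e.g. via aiohttp); here the remote round-trip is simulated with asyncio.sleep.

```python
import asyncio
from typing import List

async def _fetch_prediction(item: str) -> str:
    # Hypothetical stand-in for an async HTTP request to the
    # remote prediction server; simulated latency only.
    await asyncio.sleep(0.01)
    return f"prediction-for-{item}"

def predict_batch(items: List[str]) -> List[str]:
    # The function stays synchronous from the pipeline's point of
    # view: asyncio.run fires all requests concurrently and blocks
    # until the whole batch of results has been gathered.
    async def _gather() -> List[str]:
        return await asyncio.gather(
            *(_fetch_prediction(item) for item in items)
        )
    return asyncio.run(_gather())
```

Because asyncio.gather preserves input order, the returned list lines up with the input batch, e.g. `predict_batch(["a", "b"])` yields `["prediction-for-a", "prediction-for-b"]`.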

Answer selected by nasrin-taghizadeh