Buffer IAsyncEnumerable #73898
-
I have an IAsyncEnumerable that pulls data from a (very slow) web service. Ideally I'd like a way to buffer items ahead of time. As an alternative, is it safe to do the following?
If I consume all items, I think this works fine, but my concern is that the "fill it up" task could be blocked and live forever if I abandon the enumerable after pulling out only a few items. Is my approach safe, or is it a bad one?
Replies: 4 comments 1 reply
-
Just use Channels. Why not just create a Channel?

It's possible with channels, too, but it sounds more complicated (KISS).
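As a sketch of the Channel-based suggestion (the `Buffer` name and shape are illustrative, not an existing API): a background "fill it up" task eagerly pulls from the slow source into a bounded channel, and the linked cancellation in the `finally` block is what keeps that task from living forever if the consumer abandons the enumerable early:

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;

public static class AsyncEnumerableBufferExtensions
{
    // Illustrative read-ahead wrapper: a background task eagerly pulls items
    // from the slow source into a bounded channel while the consumer works.
    public static async IAsyncEnumerable<T> Buffer<T>(
        this IAsyncEnumerable<T> source,
        int capacity,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        var channel = Channel.CreateBounded<T>(capacity);
        using var cts = CancellationTokenSource.CreateLinkedTokenSource(cancellationToken);

        // The "fill it up" task. Without cancellation, WriteAsync would block
        // forever on a full channel once the consumer walks away.
        var fill = Task.Run(async () =>
        {
            try
            {
                await foreach (var item in source.WithCancellation(cts.Token))
                    await channel.Writer.WriteAsync(item, cts.Token);
                channel.Writer.Complete();
            }
            catch (Exception ex)
            {
                // Propagate producer failures (including cancellation) to the reader.
                channel.Writer.Complete(ex);
            }
        });

        try
        {
            await foreach (var item in channel.Reader.ReadAllAsync(cts.Token))
                yield return item;
        }
        finally
        {
            // Runs even if the consumer abandons the enumerable early,
            // so the fill task is always unblocked.
            cts.Cancel();
        }
    }
}
```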
-
In the example you gave, if I understand correctly, it would essentially function like Chunk. What I want is to still work over the items individually in the enumerable, but with items cached locally when they are available, so that the network delay is eliminated.
-
@gfoidl That wasn't exactly what I wanted, but you inspired me toward the answer I was looking for. Here is what I wanted:
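The code from this comment did not survive in this export. Purely as an illustration of the kind of per-item read-ahead described above (the `ReadAhead` name is hypothetical), one can start fetching the next item while the consumer is still processing the current one:

```csharp
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;

public static class ReadAheadExtensions
{
    // Hypothetical one-item read-ahead: kick off the fetch of the next item
    // before handing the current one to the consumer, overlapping the network
    // round-trip with the consumer's processing time.
    public static async IAsyncEnumerable<T> ReadAhead<T>(
        this IAsyncEnumerable<T> source,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        await using var e = source.GetAsyncEnumerator(cancellationToken);
        if (!await e.MoveNextAsync())
            yield break;

        var current = e.Current;
        while (true)
        {
            // Start fetching the next item; it runs while the consumer
            // processes `current` after the yield below.
            var nextTask = e.MoveNextAsync();
            yield return current;

            // Caveat: if the consumer abandons the enumerable at the yield,
            // this in-flight MoveNextAsync is never awaited; a production
            // version would need to account for that before disposal.
            if (!await nextTask)
                yield break;
            current = e.Current;
        }
    }
}
```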
-
Maybe not what OP asked for, but I stumbled in here looking for chunking/batching similar to System.Linq.Enumerable.Chunk, and I found this gist helpful:

```csharp
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading;

public static class AsyncEnumerableBatchExtensions
{
    public static async IAsyncEnumerable<T[]> BatchAsync<T>(
        this IAsyncEnumerable<T> source,
        int batchSize,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        var batch = new List<T>(batchSize);
        await foreach (var item in source.WithCancellation(cancellationToken))
        {
            if (cancellationToken.IsCancellationRequested)
                yield break;

            batch.Add(item);
            if (batch.Count >= batchSize)
            {
                yield return batch.ToArray();
                batch.Clear();
            }
        }

        // Emit the final partial batch, if any.
        if (batch.Count > 0)
            yield return batch.ToArray();
    }
}
```

Some other implementations as well:
Found in a similar discussion of the MoreLINQ package:
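For completeness, a quick usage sketch of the `BatchAsync` extension above (the `Numbers` source here is made up for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

static async IAsyncEnumerable<int> Numbers()
{
    for (int i = 1; i <= 7; i++)
    {
        await Task.Yield();
        yield return i;
    }
}

// 7 items with batchSize 3 yield [1,2,3], [4,5,6], and the partial batch [7].
await foreach (var batch in Numbers().BatchAsync(3))
    Console.WriteLine(string.Join(",", batch));
```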