Description
I'm not sure if this is a bug or if I'm doing something wrong, so I'll just describe what I'm trying to do; maybe someone will know whether I'm doing something blatantly wrong with the approach.
I set up a pretty simple consumer inside an Artisan command:
```php
$consumer = Kafka::consumer(['topic'])
    ->withSasl(
        config('kafka.sasl.username'),
        config('kafka.sasl.password'),
        config('kafka.sasl.mechanisms'),
        config('kafka.securityProtocol')
    )
    ->withHandler(function (ConsumedMessage $message) {
        $body = $message->getBody();
        event(new Event($body));
        $this->info('Received message: '.json_encode($body));
    })
    ->build();

$consumer->consume();
```
This seems to work just fine for a period of time: my consumer reads from the topic, pushes the events, and continues consuming, and so on.
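One thing I plan to try, since long-running Laravel processes accumulate state over time: disabling the database query log before consuming, since it grows with every executed query and is never trimmed while the process lives. This is just a hygiene sketch using stock Laravel/PHP APIs, not something I've confirmed is the culprit here:

```php
use Illuminate\Support\Facades\DB;

// The query log records every executed query in memory and is never
// trimmed in a long-running process; turn it off before consuming.
DB::connection()->disableQueryLog();

// Stock PHP: reclaim memory held by unreachable reference cycles.
// Worth calling periodically in a process that runs for hours.
gc_collect_cycles();
```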
After a while, I get a memory exhaustion error like the following:
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 20480 bytes) in /local-path/vendor/laravel/framework/src/Illuminate/Collections/Arr.php on line 213
I don't believe this is related to the messages themselves, as the payloads are pretty slim: a couple of numeric/string fields each. I think it may be related to the sheer volume of data I'm trying to consume.
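To confirm whether memory actually grows steadily per message (a leak) rather than spiking on one payload, I'm logging usage every N messages inside the handler. `memory_get_usage()` and `gc_collect_cycles()` are stock PHP functions; the handler shape mirrors my snippet above:

```php
$count = 0;

$consumer = Kafka::consumer(['topic'])
    ->withHandler(function (ConsumedMessage $message) use (&$count) {
        event(new Event($message->getBody()));

        // Log memory every 1000 messages: a steady climb suggests a leak,
        // while flat usage followed by one failure suggests a spike.
        if (++$count % 1000 === 0) {
            $this->info(sprintf(
                'Messages: %d, memory: %.1f MB',
                $count,
                memory_get_usage(true) / 1048576
            ));
            gc_collect_cycles();
        }
    })
    ->build();
```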
For reference, the topic I'm consuming from receives maybe 10K messages every minute or so, probably 1M+ messages a day. Is there a better way to consume from a topic like this, or is there some configuration issue I've missed?
Can this library handle the scale described?
Thanks in advance; it's been very easy to work with the library so far.
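For what it's worth, the workaround I'm considering in the meantime is bounding each run of the consumer and letting a process manager restart the command, so the process's memory is reclaimed on every restart. I believe the library exposes `withMaxMessages()` for this (please correct me if I'm misreading the docs):

```php
// Consume at most 50k messages, then return; Supervisor (or similar,
// with autorestart enabled) relaunches the command, which resets the
// process's memory footprint on each run.
$consumer = Kafka::consumer(['topic'])
    ->withMaxMessages(50000)
    ->withHandler(fn (ConsumedMessage $message) => event(new Event($message->getBody())))
    ->build();

$consumer->consume();
```

Not ideal, but it would at least put a ceiling on memory growth per run.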