Dynamic send #276
Conversation
Codecov Report
Attention: Patch coverage is
Additional details and impacted files

@@           Coverage Diff            @@
##             main     #276    +/-   ##
==========================================
- Coverage   90.17%   89.00%   -1.17%
==========================================
  Files          80       81       +1
  Lines        7390     7541     +151
==========================================
+ Hits         6664     6712      +48
- Misses        726      829     +103

View full report in Codecov by Sentry.
@@ -66,3 +66,6 @@ path="examples/superstreams/send_super_stream.rs"
 name="environment_deserialization"
 path="examples/environment_deserialization.rs"

+[[bin]]
+name = "perf-producer"
+path = "src/bin/perf-producer.rs"
Since this file is doing perf tests, is there any possibility to move it into the tests folder? (Maybe creating a folder like `performances`.)
We have different possibilities:
- Put it under a `bench` folder, where you typically put performance tests, commonly developed with the `criterion` crate. However, this actually isn't a benchmark in the same sense as `criterion`, which invokes the function multiple times and measures the duration.
- Put it under the `tests` folder and treat it as an integration test. However, as a test, the output is not shown to the developer without the appropriate flag (`--nocapture`).
- Put it under `bin` and treat it as a binary, as in this PR. You can run it using `cargo run --release --bin perf-producer` and print the output easily.
Anyway, I don't have a strong opinion on that and I can follow a different solution. Let me know how you want to proceed.
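As an aside on the `criterion` point above: the distinction drawn between a criterion-style benchmark (invoke the function many times, report the mean duration) and this PR's per-message latency bin can be sketched with `std::time::Instant` alone. This is a hedged, stdlib-only illustration; `mean_duration_ns` is a hypothetical helper, not code from the PR or from `criterion` itself.

```rust
use std::time::Instant;

/// Criterion-style timing: invoke the closure many times and
/// report the mean duration per call in nanoseconds.
fn mean_duration_ns(iterations: u32, mut f: impl FnMut()) -> u128 {
    let start = Instant::now();
    for _ in 0..iterations {
        f();
    }
    start.elapsed().as_nanos() / iterations as u128
}

fn main() {
    // Hypothetical workload standing in for a send call.
    let mean = mean_duration_ns(1_000, || {
        let v: u64 = (0..100).sum();
        std::hint::black_box(v); // keep the work from being optimized away
    });
    println!("mean duration: {mean} ns");
}
```

By contrast, the bin in this PR timestamps each message and measures end-to-end latency, which a repeated-invocation mean would not capture.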
I don't have a strong opinion either. As a preference I would choose the `bench` folder. It could also be useful in case we want to add other performance tests in the future and group them together.
Looks good to me! @wolf4ood, if you want to do a final validation too!
This is not relevant, but is there a way to increase the code coverage?
Thanks a lot @allevo @DanielePalaia |
This PR implements a new method to send a message: `batch_send`.
- `batch_send` follows the same flow as `send`
- `batch_publishing_delay` params (breaking)
- serializes the `tests` to avoid parallel execution; parallel execution causes problems running the tests locally due to the limitation on the OS file descriptor number
- `send_with_confirm`
I also introduced a bin to measure the latency performance, to prove the improvement at a low send rate. The bin creates a producer and a consumer. Every minute, 50 messages with the current timestamp are sent to RabbitMQ and the consumer stores the metric. Every minute, a log is printed with the statistics.
Before this PR:
After this PR:
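The consumer-side metric store described above (timestamps embedded in messages, consumer aggregates and periodically logs statistics) can be sketched as follows. This is a hedged, stdlib-only illustration of the approach, not the actual `src/bin/perf-producer.rs` source; `LatencyStats` and its methods are hypothetical names, and the real bin talks to RabbitMQ rather than using simulated samples.

```rust
// Hedged sketch of the metric store the perf bin's consumer could use.
#[derive(Default)]
struct LatencyStats {
    samples_us: Vec<u128>,
}

impl LatencyStats {
    /// Record one latency sample, computed from the timestamp the
    /// producer embedded in the message body.
    fn record(&mut self, sent_us: u128, received_us: u128) {
        self.samples_us.push(received_us.saturating_sub(sent_us));
    }

    /// Min/avg/max over the samples collected so far, in microseconds.
    fn summary(&self) -> Option<(u128, u128, u128)> {
        let min = *self.samples_us.iter().min()?;
        let max = *self.samples_us.iter().max()?;
        let avg = self.samples_us.iter().sum::<u128>() / self.samples_us.len() as u128;
        Some((min, avg, max))
    }
}

fn main() {
    let mut stats = LatencyStats::default();
    // Simulated (sent, received) timestamps in microseconds; the real
    // bin would take these from the message body and the receive time.
    for (sent, received) in [(0, 120), (100, 190), (200, 350)] {
        stats.record(sent, received);
    }
    if let Some((min, avg, max)) = stats.summary() {
        println!("latency us: min={min} avg={avg} max={max}");
    }
}
```

A periodic task (e.g. one tick per minute, as the description says) would call `summary()` and log the result, then optionally clear the samples for the next window.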