Enable raft message batching (#1519) #3964
Conversation
Force-pushed from ba57680 to 0cbff7b
Force-pushed from f6cbb58 to 5917246
A bigger queue size is not required if batching is enabled.

Force-pushed from 5917246 to 997566f
Could you elaborate a bit on the observable effects of this change? 📈

A slight (and more stable) speedup on top of #3931. E.g., instead of sending 100 separate messages (doing 100 sequential gRPC calls), we send 1 batched message (doing 1 gRPC call). IIRC, the results were something like ~2500-2800 ops / 5 seconds with #3931 and ~2900-3000 ops / 5 seconds with batching enabled. The only possible concern is whether we have a limit on the maximal size of a single gRPC message; if so, we may need to increase it.

I thought we didn't have a limit on the internal API, but we should confirm it.

Seems like we don't.
I don't think I would have any insights without giving it a run on CI & chaos testing.
This PR enables built-in raft message batching. (See #1519.)
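Assuming "built-in" refers to the batching support in the raft-rs crate (an assumption on my part, as is the exact field name), enabling it is roughly a one-flag config change:

```rust
// Sketch only: assumes the raft-rs crate and its `Config::batch_append`
// flag; all other fields are left at their defaults.
let config = raft::Config {
    id: 1,              // this node's id (illustrative value)
    batch_append: true, // coalesce outgoing append messages per peer
    ..Default::default()
};
config.validate()?;     // raft-rs validates configs before use
```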
All Submissions:

- [ ] Contributions should target the `dev` branch. Did you create your branch from `dev`?

New Feature Submissions:

- [ ] Have you formatted your code using the `cargo +nightly fmt --all` command prior to submission?
- [ ] Have you checked your code using the `cargo clippy --all --all-features` command?

Changes to Core Features: