Latency added by proxy climbs and remains high during load test #2282

Answered by jakebanks
jakebanks asked this question in Q&A

OK I think I am onto something here...

To recap, we have 6 proxies proxying to 12 mock servers via an AWS Application Load Balancer.

Since we are seeing request queuing, we know that MAX_CONCURRENT_STREAMS is being reached. It seems like the default would be 100, but I couldn't find a better reference than this unfortunately. I could see that Kestrel's Limits.Http2.MaxStreamsPerConnection defaults to 100, but I think that applies to inbound connections.
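
For reference, a minimal sketch of where that inbound limit lives, assuming a standard ASP.NET Core minimal-hosting setup (this is the server-side knob, not the outbound limit the proxy would be hitting):

```csharp
var builder = WebApplication.CreateBuilder(args);

builder.WebHost.ConfigureKestrel(options =>
{
    // Inbound only: how many concurrent HTTP/2 streams Kestrel will accept
    // per client connection. Defaults to 100.
    options.Limits.Http2.MaxStreamsPerConnection = 100;
});

var app = builder.Build();
app.Run();
```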

We have 12 mock servers, so what gives? Shouldn't we theoretically have 6 (proxies) * 100 (streams) * 12 (mock targets) = 7200 concurrent requests without queuing? Well, because our back-end target is just an AWS Application Load Balancer, that is considered the sam…
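
If the outbound per-connection stream limit toward the single ALB endpoint really is the bottleneck, one setting worth looking at (just a sketch, not something we have verified) is YARP's per-cluster EnableMultipleHttp2Connections, which lets SocketsHttpHandler open additional HTTP/2 connections once a connection's MAX_CONCURRENT_STREAMS is exhausted. The route/cluster names and the ALB address below are placeholders:

```csharp
using Yarp.ReverseProxy.Configuration;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddReverseProxy()
    .LoadFromMemory(
        new[]
        {
            new RouteConfig
            {
                RouteId = "catch-all",
                ClusterId = "alb",
                Match = new RouteMatch { Path = "{**catch-all}" }
            }
        },
        new[]
        {
            new ClusterConfig
            {
                ClusterId = "alb",
                // Single destination: the ALB hostname (placeholder).
                Destinations = new Dictionary<string, DestinationConfig>
                {
                    ["alb"] = new DestinationConfig { Address = "https://my-alb.example.com/" }
                },
                HttpClient = new HttpClientConfig
                {
                    // Let SocketsHttpHandler open extra HTTP/2 connections once a
                    // connection's MAX_CONCURRENT_STREAMS limit is reached,
                    // instead of queuing requests on the existing connection.
                    EnableMultipleHttp2Connections = true
                }
            }
        });

var app = builder.Build();
app.MapReverseProxy();
app.Run();
```

The same flag can also be set from the cluster's HttpClient section in file-based configuration.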

Answer selected by jakebanks