
Upgrade to 2.1.7 dramatically increased CPU utilization #7308

Open
BastianVoigt opened this issue Jul 10, 2023 · 6 comments
@BastianVoigt

We recently upgraded our services from Dropwizard 2.1.6 to 2.1.7. When we deployed the change to production with no changes to our application code, CPU usage suddenly jumped from ~20% to ~30% (see attached chart). There were no changes to traffic patterns either. Later the same day, CPU usage even hit our alarm threshold, which normally never happens.

[Chart: CPU utilization rising from ~20% to ~30% at the time of the deployment]

Wondering what might have caused this, we reverted the change and indeed the CPU usage went down again.

I scanned the release notes quickly but could not find an apparent reason for this behaviour. Do you have any ideas?

@joschi joschi added the bug label Jul 10, 2023
@joschi
Member

joschi commented Jul 10, 2023

@BastianVoigt Thanks for reporting this!

Dropwizard 2.1.7 mostly contains dependency upgrades and no changes to the production code:
https://github.com/dropwizard/dropwizard/releases/tag/v2.1.7

Could you please tell us which Dropwizard modules you are using in your project so that we can further narrow down the cause of this change?

And just to make sure: The increased CPU usage started when deploying the application which was updated to Dropwizard 2.1.7, correct?

Does the new application version also require more memory?

Did you change anything else besides upgrading to Dropwizard 2.1.7?

@BastianVoigt
Author

Thanks for the quick response!

We use the following modules (now reverted to 2.1.6):

implementation group: 'io.dropwizard', name: 'dropwizard-core', version: '2.1.6'
implementation group: 'io.dropwizard', name: 'dropwizard-assets', version: '2.1.6'
implementation group: 'io.dropwizard', name: 'dropwizard-auth', version: '2.1.6'
implementation group: 'io.dropwizard', name: 'dropwizard-jdbi3', version: '2.1.6'
implementation group: 'io.dropwizard', name: 'dropwizard-migrations', version: '2.1.6'

testImplementation group: 'io.dropwizard', name: 'dropwizard-testing', version: '2.1.6'

There were some other dependency updates deployed at the same time, but after reverting only the Dropwizard upgrade, the CPU usage went back to normal. So we are pretty sure it was caused by this.

I am afraid we currently do not collect garbage collection metrics, but I am pretty sure the heap is large enough. We also saw the increased CPU usage at night, when traffic is an order of magnitude lower than our peak, so my gut feeling is that this is not a garbage collection overhead issue. But we will probably start monitoring these metrics soonish.
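(Not from the thread, but for reference: even without a metrics pipeline, cumulative GC counts and pause times can be read from the JDK's standard JMX beans to quickly rule garbage collection in or out. This is a minimal sketch using only `java.lang.management`; the class name `GcStats` is made up for illustration.)

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

public class GcStats {
    // Summarize cumulative GC activity since JVM start using the standard
    // JMX beans; works on any JVM, no extra dependencies required.
    static String summary() {
        StringBuilder sb = new StringBuilder();
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            sb.append(String.format("%s: count=%d, time=%dms%n",
                    gc.getName(), gc.getCollectionCount(), gc.getCollectionTime()));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(summary());
    }
}
```

Sampling this output before and after the suspected regression window would show whether GC time grew in step with the extra CPU load.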

@BastianVoigt
Author

@joschi Please let me know what information you need to analyze the issue.

@zUniQueX
Member

Hi @BastianVoigt. I agree with @joschi that this behavior was most likely introduced by a dependency upgrade. It might be hard for us to create a reproducing example with enough load. Would you have a chance to bump the dependency versions in your project to the ones from the new release?

The release mostly contains upgrades of infrastructure dependencies, so your project probably uses only a few of the updated libraries at runtime. The changelogs of those libraries seem unobtrusive IMHO.

Based on your list of Dropwizard modules I'd expect the increased load to be introduced in jdbi or tomcat-jdbc.
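(A hypothetical sketch of the suggested bisection, not from the thread: staying on Dropwizard 2.1.6 while forcing one suspect library at a time to the version shipped with 2.1.7, via Gradle's `resolutionStrategy`. The coordinates are the suspects named above; the version placeholders would need to be filled in from the v2.1.7 release notes.)

```groovy
configurations.all {
    resolutionStrategy {
        // Force one of these at a time, redeploy, and watch CPU usage.
        force 'org.jdbi:jdbi3-core:<version from the v2.1.7 release notes>'
        force 'org.apache.tomcat:tomcat-jdbc:<version from the v2.1.7 release notes>'
    }
}
```

Whichever forced upgrade reproduces the regression points at the library responsible.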

@joschi
Member

joschi commented Nov 4, 2023

@BastianVoigt Did you have any chance to check whether the issue still exists in Dropwizard 2.1.9?

@joschi
Member

joschi commented Dec 20, 2023

@BastianVoigt ping 😉
