Other than the pending changes in https://github.com/AleoHQ/snarkVM/pull/1985 (including the ones in the review comments), the following issues were found via snarkOS network message fuzzing:
- The `size` member of `EvaluationDomain` is used to resize vectors; it is a `u64`, which can be maliciously crafted to be large enough to cause an OOM.
  - proposed solution: manually implement `CanonicalDeserialize` and limit the maximum allowed `size` to some concrete value, or use a smaller data type
- The `degree` of the `EpochChallenge`, which is derived from the `epoch_polynomial` member, can cause an OOM when used in `hash_to_polynomial` (and ultimately in `hash_to_coefficients`), as it determines the number of coefficients collected into a vector.
  - proposed solution: check the `degree` against a fixed maximum after it is read in `EpochChallenge::read_le`, or reduce its data type below `u32`
- (varuna) `Commitments::deserialize_with_mode` can panic due to integer overflow when calling `batch_sizes.iter().sum()` at the very beginning.
  - proposed solution: change that line to `let mut w = Vec::new();`
- `BatchCertificate::read_le` needs to use `IndexMap::new` instead of `::with_capacity` (no longer an issue with the 2nd version of this object, and the 1st one is going to be removed).
I'm now fuzzing the current version; the 1st point is no longer applicable, the next 2 can still cause an OOM, and the recent deserialization patches omitted a bit in the `BatchCertificate`, probably because it's being deprecated (added to the list).