AssertionError: Item selection is supported only on python list/tuple objects #2176
Our support for converting untraced PyTorch models is only experimental. You should have received a warning telling you that. What is the full error (including stack trace) that you are getting? Can you give us minimal, self-contained code to reproduce this issue?
Thank you so much @TobyRoseman and team for the quick response on this issue. Here I'm providing the required code, logs, and information to trace and resolve the issue easily. Please look at the sample code below, which reproduces the above-mentioned error:
Here is the whole traceback of the error:
The particular error was raised from the `getitem` op function in the "coremltools/converters/mil/frontend/torch/ops.py" file of coremltools.
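For context, the check that produces this message is roughly of the following shape (a simplified sketch for illustration, not the actual coremltools source):

```python
def select_item(value, index):
    # Simplified sketch: the converter only allows item selection on
    # concrete Python lists/tuples, not on traced tensor values.
    assert isinstance(value, (list, tuple)), \
        "Item selection is supported only on python list/tuple objects"
    return value[index]
```

So any indexing expression that reaches this op with a tensor value (e.g. the result of `x.size()`) instead of a plain Python list/tuple will trip the assertion.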
You can replace `bs = x.size(0)` […] Note that the output of […]
Thank you @M-Quadra, it worked.
As you have mentioned generating the random number externally, any help on how to generate the z vector externally without changing the model's behavior would be appreciated.
We sincerely appreciate your prompt response and ongoing support, @TobyRoseman. We have encountered another issue that appears similar to the one previously addressed in the same repository. Would you kindly review the comment linked below for further details?
I'll try splitting the model into two parts:

```python
import torch
import torch.nn as nn


class ShapeModel(nn.Module):
    def forward(self, input1: torch.Tensor, input2: torch.Tensor):
        processed = input1 + input2

        @torch.jit.script_if_tracing
        def _shape(input1, input2):
            return torch.LongTensor([input1.size(0), input2.size(1)]).squeeze()

        return processed, _shape(input1, input2)


class ThenModel(nn.Module):
    def forward(self, z: torch.Tensor, processed: torch.Tensor):
        # self.cvae is the user's decoder module, defined elsewhere
        out = self.cvae(z, processed)
        return out
```

Next, generate random numbers and feed them into `ThenModel`. Like this:

```swift
import CoreML

extension MLMultiArray {
    /// torch.randn([...]) * scale
    static func randn(
        shape: [NSNumber], dataType: MLMultiArrayDataType = .float32,
        scale: Double = 1
    ) throws -> MLMultiArray {
        let ts = try MLMultiArray(shape: shape, dataType: dataType)
        // Box-Muller transform on pairs of uniform samples
        let mean = 0.0, std = 1.0
        let arr = (0..<(ts.count/16 + 1) * 16).map { _ in Double.random(in: 0..<1) }
        var i = 0
        while i < ts.count {
            for j in i..<(i+8) {
                let u1 = 1 - arr[j]   // avoid log(0)
                let u2 = arr[j + 8]
                let radius = sqrt(-2 * log(u1))
                let theta = 2 * Double.pi * u2
                let z0 = radius * cos(theta) * std + mean
                let z1 = radius * sin(theta) * std + mean
                if j < ts.count { ts[j] = z0 * scale as NSNumber }
                else { return ts }
                if j+8 < ts.count { ts[j+8] = z1 * scale as NSNumber }
            }
            i += 16
        }
        return ts
    }
}

// `randn` throws, so call it with `try`
let z = try MLMultiArray.randn(shape: [shape[0], shape[1]])
```
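For reference, the Box-Muller transform used in the Swift extension above can be sketched in plain Python (a simplified, unvectorized version without the padded/strided layout of the Swift code):

```python
import math
import random


def box_muller(n, mean=0.0, std=1.0, scale=1.0):
    """Return n approximately normal samples via the Box-Muller transform."""
    out = []
    while len(out) < n:
        u1 = 1.0 - random.random()  # in (0, 1], avoids log(0)
        u2 = random.random()
        radius = math.sqrt(-2.0 * math.log(u1))
        theta = 2.0 * math.pi * u2
        # Each uniform pair yields two independent normal samples
        out.append((radius * math.cos(theta) * std + mean) * scale)
        if len(out) < n:
            out.append((radius * math.sin(theta) * std + mean) * scale)
    return out
```

Each pair of uniform draws produces two independent standard-normal samples, which is why the Swift version writes `z0` and `z1` at offsets `j` and `j+8`.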
Hello Developers,
We are trying to convert PyTorch models to Core ML using coremltools. While converting, we used jit.trace to create a trace of the model, where we encountered a warning that if the model has control flow and conditions, it is not advisable to use trace; instead, the model should be converted to TorchScript using jit.script.
However, after successfully converting the model to TorchScript, in the next step of conversion from TorchScript to Core ML, here is the error we get from the coremltools Python package.
The root error is so abstract that we are not able to trace back where it is occurring.
AssertionError: Item selection is supported only on python list/tuple objects