Possible reason for increased encoding time? (v1.4.0 --> v1.5.2) #1723

Answered by ggerganov
LVCSRer asked this question in Q&A


If you change this:

    // select which device to run the Core ML model on
    MLModelConfiguration *config = [[MLModelConfiguration alloc] init];
    config.computeUnits = MLComputeUnitsCPUAndGPU;
    //config.computeUnits = MLComputeUnitsCPUAndNeuralEngine;
    //config.computeUnits = MLComputeUnitsAll;

To this:

    // select which device to run the Core ML model on
    MLModelConfiguration *config = [[MLModelConfiguration alloc] init];
    //config.computeUnits = MLComputeUnitsCPUAndGPU;
    //config.computeUnits = MLComputeUnitsCPUAndNeuralEngine;
    config.computeUnits = MLComputeUnitsAll;

Does it help?
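For context, `MLComputeUnitsCPUAndGPU` excludes the Apple Neural Engine, while `MLComputeUnitsAll` lets Core ML dispatch to whichever backend (CPU, GPU, or ANE) it judges fastest. A minimal sketch of loading a compiled Core ML model with that setting; the model path here is a hypothetical example, not taken from the thread:

    #import <CoreML/CoreML.h>

    // Allow Core ML to pick any available backend, including the ANE.
    MLModelConfiguration *config = [[MLModelConfiguration alloc] init];
    config.computeUnits = MLComputeUnitsAll;

    // Hypothetical path to a compiled (.mlmodelc) encoder model.
    NSURL *url = [NSURL fileURLWithPath:@"ggml-base.en-encoder.mlmodelc"];
    NSError *error = nil;
    MLModel *model = [MLModel modelWithContentsOfURL:url
                                       configuration:config
                                               error:&error];
    if (!model) {
        NSLog(@"failed to load Core ML model: %@", error);
    }

Note that `MLComputeUnitsAll` only makes the Neural Engine eligible; Core ML still decides at load time whether the model actually runs on it.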

Answer selected by LVCSRer