Tortoise read.py - test large text files #314
Hi, thanks for checking in. I'm guessing what you want is for this to be in the web UI. Are you using the React UI? It's far easier for me to add this function and make it work seamlessly within that UI than within the Gradio one.
Hi, I'm happy that it worked out!
As for the 8 hours: tortoise is very sensitive to the parameters you choose; low-quality parameters are exponentially faster. I need to verify that the presets work properly, because I noticed a bug earlier.
On Mon, May 13, 2024, 8:29 AM Christopher Lowden wrote:
Hello,
I worked it out with the new React UI interface. I activated the "Split
prompt by lines" button and removed any strange pagination from the text. I
found that having more than one consecutive line break stopped the process.
A single line break helped slightly with intonation. It took me
8 hours using an RTX 3090 GPU at 100%, running very hot and noisy, to produce
8 minutes of narration. The result is 100 times better than anything else I
have found and compares favorably to a similar production by Eleven Labs.
The voice does go a little strange at some points, but that can be
corrected, as the system produces separate files for each line split, so
recalculating is easier than correcting files from Eleven Labs.
Thank you so much for putting this UI together. It's fantastic.
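The observation above — that more than one consecutive line break stalls generation, while a single break helps intonation — suggests pre-cleaning prompts before feeding them in. A minimal sketch of such a cleanup step (a hypothetical helper, not part of this project's code):

```python
import re

def normalize_line_breaks(text: str) -> str:
    """Collapse runs of blank lines so the prompt never contains
    more than one consecutive line break.
    """
    # Strip trailing whitespace on each line first, so lines holding
    # only spaces do not survive as hidden blank lines.
    text = "\n".join(line.rstrip() for line in text.splitlines())
    # Collapse two or more newlines into a single newline.
    return re.sub(r"\n{2,}", "\n", text).strip()

print(normalize_line_breaks("Line one.\n\n\nLine two.\n"))
```

Running this over a prompt before enabling "Split prompt by lines" would leave exactly one line break between lines, which is the layout the comment reports working best.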
Below is the setup I used:
Fixed the presets; now if you change the preset, it will actually update the values (#315).
Hello,
I am trying to test how reasonably large text files are interpreted by tortoise. I noticed that there is a command-line option for Tortoise to break up large texts into small chunks using the read.py file. I can see the file in the tortoise install, but not in your webgui version. Is there a way to have this function working in the webgui, please?
Many thanks for such a great interface.
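The chunking the question refers to can be sketched as follows. This is a hypothetical illustration of the general idea, not the actual splitter used by tortoise's read.py: split at sentence boundaries and pack sentences into chunks under a character budget, so each chunk is short enough for the model to handle.

```python
import re

def split_text_into_chunks(text: str, max_chars: int = 200) -> list[str]:
    """Split text at sentence boundaries into chunks of at most
    max_chars characters (a lone sentence longer than the budget
    is kept whole rather than cut mid-sentence).
    """
    # Break after sentence-ending punctuation followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    chunks, current = [], ""
    for sentence in sentences:
        if current and len(current) + 1 + len(sentence) > max_chars:
            chunks.append(current)
            current = sentence
        else:
            current = f"{current} {sentence}".strip() if current else sentence
    if current:
        chunks.append(current)
    return chunks

print(split_text_into_chunks("One. Two. Three.", max_chars=10))
```

Each chunk could then be synthesized separately and the audio concatenated, which is also what makes per-line regeneration cheap, as noted in the comment above.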