Can we calculate the Token costs (aka figure out what OpenAI will charge) #15

Open
lrauhockey opened this issue Mar 7, 2023 · 1 comment

Comments

@lrauhockey

First: this is really great, and I can see tremendous value that I and others can get out of it.

This is more of a question than an issue...

Is there any way to build the data model / sample questions so we can estimate costs? As with any tool we bring in, it's about ROI, so it would help to have some way to evaluate it. (I assume the questions count as tokens, but sending the data to OpenAI does too.)

@bluecoconut
Contributor

Hey @lrauhockey, I like this idea, but I don't have time to add it. Feel free to open a PR if you can think of a nice kwarg or path to test this.

When running locally, if you change this line

code = howto_prompt(dfname=dfname, data_description=description, how=how)

you can extract the exact prompt that will be sent to the endpoint with the command:

prompt_string = howto_prompt.prompt_template.render(dfname=dfname, data_description=description, how=how)

From this, you could use the OpenAI tokenizer to get the total number of requested tokens.
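
To make that concrete, here is a minimal sketch using OpenAI's `tiktoken` library to count and price the prompt tokens. The model name and per-1K-token price are assumptions for illustration, and `prompt_string` stands in for the rendered prompt extracted above; they are not values taken from this repo.

```python
import tiktoken

# `prompt_string` is the rendered prompt extracted above; shown here as a
# placeholder so the snippet runs on its own.
prompt_string = "How to plot a histogram of column 'age' in dataframe df?"

# Pick the encoding for the model you actually call
# (text-davinci-003 is an assumption here).
encoding = tiktoken.encoding_for_model("text-davinci-003")
prompt_tokens = len(encoding.encode(prompt_string))

# Rough cost estimate for the prompt side of the request.
price_per_1k = 0.02  # assumed USD per 1K tokens -- check OpenAI's pricing page
estimated_prompt_cost = prompt_tokens / 1000 * price_per_1k

print(f"{prompt_tokens} prompt tokens, ~${estimated_prompt_cost:.4f}")
```

Note that completion tokens are billed as well, so the real cost is this plus whatever the model returns, bounded by the request's max_tokens.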
