Error: The query is too large. The maximum standard SQL query length is 1024.00K characters
when using `insert_rows_method` `chunk`
#675
Comments
Hi @aibazhang, thanks for opening this issue.
Speaking to our use case: we have a sizable dbt project, and it appears that we need both options in place on the post-hook run in order for it to execute. When uploading the dbt test metadata we hit errors:
@tsmo4 Have you tried lowering the default `chunk_size`?
We have a similar use case to @tsmo4's.
We've tried to lower the default, but we still hit the error:

(see the quota limits at https://cloud.google.com/bigquery/quotas). As a result, we considered using the
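For a rough sense of how quickly the 1,024,000-character limit is reached, here is a back-of-envelope calculation. All the numbers below are illustrative assumptions, not measurements from this project; the point is only that a few thousand characters per rendered row already caps a chunk at a few hundred rows:

```python
# Illustrative arithmetic only: measure these values on your own data.
MAX_QUERY_CHARS = 1_024_000      # BigQuery's standard SQL query length limit
overhead_chars = 10_000          # assumed: INSERT header, column list, comments
avg_rendered_row_chars = 2_500   # assumed average size of one rendered values tuple

# Largest row count per INSERT that stays under the limit with these assumptions.
safe_chunk_size = (MAX_QUERY_CHARS - overhead_chars) // avg_rendered_row_chars
print(safe_chunk_size)  # 405 with these illustrative numbers
```

With long per-row payloads (such as large `compiled_code` values), the safe row count shrinks further, which is why a fixed row-count chunk size alone is fragile.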
Hi guys!
Several days ago I added an arg `chunk_size` when calling `insert_rows` (#669) to fix the error:

Maximum number of resources referenced per query are 1,000 resources

However, a new error has occurred:

The query is too large. The maximum standard SQL query length is 1024.00K characters, including comments and white space characters.

It appears that the `compiled_code` was too long when running the following query, even though we only have 400 records to insert. Is there a way to make `max_query_size` work even when using `insert_rows_method` `chunk`? If you have any ideas, please let me know; I can submit a PR.
Thank you
Environment
Variables
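One way to make a row-count chunker also respect a query-size limit would be to budget chunks by rendered statement length instead of a fixed row count. Below is a minimal sketch of that idea; it assumes the caller supplies a `render_values` function that turns one row into its SQL values tuple, and all names here are hypothetical, not the package's actual API:

```python
def chunk_rows_by_query_size(rows, render_values, base_query_len,
                             max_query_size=900_000):
    """Yield lists of rows whose rendered INSERT stays under max_query_size chars.

    Budgets by rendered length rather than a fixed row count, so a handful of
    very long rows (e.g. large compiled_code values) cannot push one chunk past
    BigQuery's 1,024,000-character standard SQL query limit. A single oversized
    row still gets its own chunk, since one row cannot be split.
    """
    chunk, used = [], base_query_len
    for row in rows:
        cost = len(render_values(row)) + 2  # +2 for the ", " separator
        if chunk and used + cost > max_query_size:
            yield chunk
            chunk, used = [], base_query_len
        chunk.append(row)
        used += cost
    if chunk:
        yield chunk
```

A fixed `chunk_size` could then act as an additional upper bound on top of this, so whichever limit is hit first closes the chunk.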