[Bug]: LOAD DATA of a bz2-compressed file on the TKE env has run for almost an hour without succeeding #16250
Open
Labels: kind/bug (Something isn't working), severity/s0 (Extreme impact: causes the application to break down and seriously affects use)
Is there an existing issue for the same bug?
I have checked the existing issues (1 task done).
Branch Name
main
Commit ID
6ff549f
Other Environment Information
No response
Actual Behavior
job: https://github.com/matrixorigin/mo-nightly-regression/actions/runs/9154938692
mo log:
https://grafana.ci.matrixorigin.cn/explore?panes=%7B%22XGF%22:%7B%22datasource%22:%22loki%22,%22queries%22:%5B%7B%22refId%22:%22A%22,%22expr%22:%22%7Bnamespace%3D%5C%22mo-nightly-regression-20240520%5C%22%7D%20%7C%3D%20%60%60%22,%22queryType%22:%22range%22,%22datasource%22:%7B%22type%22:%22loki%22,%22uid%22:%22loki%22%7D,%22editorMode%22:%22builder%22%7D%5D,%22range%22:%7B%22from%22:%221716190082822%22,%22to%22:%221716193620430%22%7D%7D%7D&schemaVersion=1&orgId=1
profile: 2024-05-20_16_00_31.zip
Expected Behavior
No response
Steps to Reproduce
ddl:
create table table_100_columns (
    clo1 tinyint, clo2 smallint, clo3 int, clo4 bigint,
    clo5 tinyint unsigned, clo6 smallint unsigned, clo7 int unsigned, clo8 bigint unsigned,
    col9 float, col10 double, col11 varchar(255), col12 Date, col13 DateTime,
    col14 timestamp, col15 bool, col16 decimal(5,2), col17 text,
    col18 varchar(255), col19 varchar(255), col20 varchar(255), col21 varchar(255), col22 varchar(255), col23 varchar(255), col24 varchar(255), col25 varchar(255),
    col26 varchar(255), col27 varchar(255), col28 varchar(255), col29 varchar(255), col30 varchar(255), col31 varchar(255), col32 varchar(255), col33 varchar(255),
    col34 varchar(255), col35 varchar(255), col36 varchar(255), col37 varchar(255), col38 varchar(255), col39 varchar(255), col40 varchar(255), col41 varchar(255),
    col42 varchar(255), col43 varchar(255), col44 varchar(255), col45 varchar(255), col46 varchar(255), col47 varchar(255), col48 varchar(255), col49 varchar(255),
    col50 varchar(255), col51 varchar(255), col52 varchar(255), col53 varchar(255), col54 varchar(255), col55 varchar(255), col56 varchar(255), col57 varchar(255),
    col58 varchar(255), col59 varchar(255), col60 varchar(255), col61 varchar(255), col62 varchar(255), col63 varchar(255), col64 varchar(255), col65 varchar(255),
    col66 varchar(255), col67 varchar(255), col68 varchar(255), col69 varchar(255), col70 varchar(255), col71 varchar(255), col72 varchar(255), col73 varchar(255),
    col74 varchar(255), col75 varchar(255), col76 varchar(255), col77 varchar(255), col78 varchar(255), col79 varchar(255), col80 varchar(255), col81 varchar(255),
    col82 varchar(255), col83 varchar(255), col84 varchar(255), col85 varchar(255), col86 varchar(255), col87 varchar(255), col88 varchar(255), col89 varchar(255),
    col90 varchar(255), col91 varchar(255), col92 varchar(255), col93 varchar(255), col94 varchar(255), col95 varchar(255), col96 varchar(255), col97 varchar(255),
    col98 varchar(255), col99 varchar(255), col100 varchar(255)
);

load statement:
load data url s3option {
    'endpoint'='http://cos.ap-guangzhou.myqcloud.com',
    'access_key_id'='***',
    'secret_access_key'='***',
    'bucket'='mo-load-guangzhou-1308875761',
    'filepath'='compressed_file/100000000_100_columns_load_data.csv.bz2',
    'compression'='bz2'
}
into table test.table_100_columns
fields terminated by ','
lines terminated by '\n'
parallel 'true';
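For a local repro without access to the COS bucket, a test object with the same shape can be generated and compressed with bz2. Below is a minimal sketch using only the Python standard library; the output file name, row count, and generated values are placeholders (the original object holds 100,000,000 rows):

```python
# Minimal sketch: write a bz2-compressed CSV whose 100 columns match
# table_100_columns, to exercise the LOAD DATA ... 'compression'='bz2'
# path on a smaller input. All generated values are arbitrary placeholders.
import bz2
import csv
import random
import string

ROWS = 100_000  # assumption: scaled down from the original 100,000,000 rows

def rand_str(n=16):
    return "".join(random.choices(string.ascii_letters, k=n))

with bz2.open("100_columns_load_data.csv.bz2", "wt", newline="") as f:
    # lineterminator must match the statement's `lines terminated by '\n'`
    w = csv.writer(f, lineterminator="\n")
    for i in range(ROWS):
        row = [
            i % 128, i % 32768, i, i,        # clo1..clo4 (signed ints)
            i % 256, i % 65536, i, i,        # clo5..clo8 (unsigned ints)
            1.5, 2.5,                        # col9 float, col10 double
            rand_str(),                      # col11 varchar(255)
            "2024-05-20",                    # col12 Date
            "2024-05-20 16:00:31",           # col13 DateTime
            "2024-05-20 16:00:31",           # col14 timestamp
            1,                               # col15 bool
            "123.45",                        # col16 decimal(5,2)
            rand_str(64),                    # col17 text
        ] + [rand_str() for _ in range(83)]  # col18..col100 varchar(255)
        w.writerow(row)
```

Uploading the resulting file to the bucket and pointing 'filepath' at it drives the same bz2 decode path; if even a scaled-down object is disproportionately slow, that may help narrow the problem to the compressed-read path rather than the sheer data volume.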
Additional information
No response