[Fix](MySQLLoad) Fix a bug when loading a big local file: the ByteBuffer from the MySQL packet reuses the same byte array (#16901)

Loading a big local file can fail with an `[INTERNAL_ERROR]too many filtered rows` error, because the ByteBuffer read from the MySQL client always reuses the same backing byte array.

Bytes from later packets overwrite the earlier ones, so the data sent over the network arrives in the wrong order.

The fix is to copy the byte array before writing it to the network.
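The aliasing bug described above can be sketched as follows. This is a minimal illustration, not the actual Doris code: `readPacket`, `unsafeBytes`, and `safeBytes` are hypothetical names standing in for the packet reader and the buggy/fixed consumers; the point is that `ByteBuffer.array()` exposes the shared backing array, while an explicit copy does not.

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

public class PacketCopyDemo {
    // Simulates a MySQL packet reader that reuses one backing array
    // for every packet (the root cause of the bug).
    static final byte[] shared = new byte[4];

    static ByteBuffer readPacket(byte fill) {
        Arrays.fill(shared, fill);
        return ByteBuffer.wrap(shared); // same backing array every call
    }

    // Buggy: hands out a reference to the shared backing array.
    static byte[] unsafeBytes(ByteBuffer buf) {
        return buf.array();
    }

    // Fixed: copies the remaining bytes before they reach the network layer.
    static byte[] safeBytes(ByteBuffer buf) {
        byte[] copy = new byte[buf.remaining()];
        buf.get(copy);
        return copy;
    }

    public static void main(String[] args) {
        ByteBuffer first = readPacket((byte) 1);
        byte[] unsafe = unsafeBytes(first);
        byte[] safe = safeBytes(first.duplicate());
        readPacket((byte) 2); // the next packet overwrites the shared array
        System.out.println(unsafe[0]); // 2 -- earlier bytes were clobbered
        System.out.println(safe[0]);   // 1 -- the copy is unaffected
    }
}
```

With the copy in place, each packet's bytes survive the arrival of the next packet, so the byte order on the wire stays correct.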
huangzhaowei
2023-02-28 00:06:44 +08:00
committed by GitHub
parent 84413f33b8
commit d3a6cab716
5 changed files with 486 additions and 4 deletions

@@ -81,10 +81,10 @@ public class MysqlLoadManager {
         InputStreamEntity entity = getInputStreamEntity(context, dataDesc.isClientLocal(), file);
         HttpPut request = generateRequestForMySqlLoad(entity, dataDesc, database, table, token);
         try (final CloseableHttpResponse response = httpclient.execute(request)) {
-            JsonObject result = JsonParser.parseString(EntityUtils.toString(response.getEntity()))
-                    .getAsJsonObject();
+            String body = EntityUtils.toString(response.getEntity());
+            JsonObject result = JsonParser.parseString(body).getAsJsonObject();
             if (!result.get("Status").getAsString().equalsIgnoreCase("Success")) {
-                LOG.warn("Execute stream load for mysql data load failed with message: " + request);
+                LOG.warn("Execute mysql data load failed with request: {} and response: {}", request, body);
                 throw new LoadException(result.get("Message").getAsString());
             }
             loadResult.incRecords(result.get("NumberLoadedRows").getAsLong());