[Fix](MySQLLoad) Fix bug when loading a big local file, caused by the MySQL packet ByteBuffer reusing the same byte array (#16901)

Loading a big local file can trigger an `[INTERNAL_ERROR]too many filtered rows` error, because the ByteBuffer coming from the MySQL client always reuses the same underlying byte array.

Later bytes then overwrite earlier ones, corrupting the byte order of the data sent over the network.

The fix is to copy the byte array before writing it to the network.
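
The standalone sketch below is not part of this commit (the class, queue, and packet names are illustrative); assuming the client layer reuses one ByteBuffer for every packet, it reproduces the aliasing bug and shows why copying the bytes before queueing the stream fixes it:

```java
import java.io.ByteArrayInputStream;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.ArrayDeque;
import java.util.Queue;

public class ReusedBufferDemo {
    public static void main(String[] args) throws Exception {
        // Assumption for this demo: the client hands every packet over in one reused ByteBuffer.
        ByteBuffer reused = ByteBuffer.allocate(16);

        Queue<ByteArrayInputStream> aliased = new ArrayDeque<>();
        Queue<ByteArrayInputStream> copied = new ArrayDeque<>();

        for (String packet : new String[] {"packet-1", "packet-2"}) {
            reused.clear();
            reused.put(packet.getBytes(StandardCharsets.UTF_8));
            reused.flip();

            // Buggy: the stream aliases the shared backing array, so the next
            // packet overwrites the bytes this stream will read later.
            aliased.offer(new ByteArrayInputStream(
                    reused.array(), reused.position(), reused.remaining()));

            // Fixed: copy the remaining bytes before queueing the stream.
            byte[] copy = new byte[reused.remaining()];
            System.arraycopy(reused.array(), reused.position(), copy, 0, copy.length);
            copied.offer(new ByteArrayInputStream(copy));
        }

        System.out.println("aliased: " + drain(aliased)); // packet-2 packet-2
        System.out.println("copied:  " + drain(copied));  // packet-1 packet-2
    }

    // Read every queued stream and join the results for printing.
    private static String drain(Queue<ByteArrayInputStream> queue) throws Exception {
        StringBuilder sb = new StringBuilder();
        ByteArrayInputStream in;
        while ((in = queue.poll()) != null) {
            sb.append(new String(in.readAllBytes(), StandardCharsets.UTF_8)).append(' ');
        }
        return sb.toString().trim();
    }
}
```

Running the sketch prints `packet-2 packet-2` for the aliased queue and `packet-1 packet-2` for the copied one, mirroring how later packets clobbered earlier ones during the load.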
Authored by huangzhaowei on 2023-02-28 00:06:44 +08:00, committed by GitHub
parent 84413f33b8
commit d3a6cab716
5 changed files with 486 additions and 4 deletions

@@ -42,10 +42,16 @@ public class ByteBufferNetworkInputStream extends InputStream {
         if (closed) {
             throw new IOException("Stream is already closed.");
         }
-        ByteArrayInputStream inputStream = new ByteArrayInputStream(buffer.array(), buffer.position(), buffer.limit());
+        ByteArrayInputStream inputStream = new ByteArrayInputStream(bytesCopy(buffer));
         queue.offer(inputStream, 300, TimeUnit.SECONDS);
     }
 
+    public byte[] bytesCopy(ByteBuffer buffer) {
+        byte[] result = new byte[buffer.limit() - buffer.position()];
+        System.arraycopy(buffer.array(), buffer.position(), result, 0, result.length);
+        return result;
+    }
+
     public void markFinished() {
         this.finished = true;
     }
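
As a side note (an alternative sketched by this edit, not something the commit does), a copy based on ByteBuffer's relative bulk get would behave the same for heap buffers while also working for direct buffers, whose array() method throws UnsupportedOperationException. The hypothetical copyRemaining helper below illustrates that variant:

```java
import java.nio.ByteBuffer;

public final class BufferCopy {
    private BufferCopy() {}

    // Illustrative alternative to bytesCopy (not from the commit): a relative
    // bulk get() copies the bytes between position and limit and works for
    // both heap and direct buffers.
    public static byte[] copyRemaining(ByteBuffer buffer) {
        byte[] result = new byte[buffer.remaining()];
        buffer.duplicate().get(result); // duplicate() keeps the caller's position untouched
        return result;
    }
}
```

The commit's System.arraycopy approach matches the rest of the class, which already assumes a heap buffer via buffer.array().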