From 0318da1df3661663cd0affdf6c073e686eb99dfe Mon Sep 17 00:00:00 2001
From: xyf <116467462+xyfsjq@users.noreply.github.com>
Date: Sat, 2 Sep 2023 23:17:56 +0800
Subject: [PATCH] [fix](doc) Modify README.md Change ‘kafka_default_offset’ to
 ‘kafka_default_offsets’. (#23791)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 .../Load/CREATE-ROUTINE-LOAD.md | 6 +++---
 .../Load/CREATE-ROUTINE-LOAD.md | 6 +++---
 2 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/docs/en/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/CREATE-ROUTINE-LOAD.md b/docs/en/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/CREATE-ROUTINE-LOAD.md
index 7acd012770..c97748401d 100644
--- a/docs/en/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/CREATE-ROUTINE-LOAD.md
+++ b/docs/en/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/CREATE-ROUTINE-LOAD.md
@@ -582,7 +582,7 @@ Assuming that we need to import data from Kafka into tables "test1" and "test2"
 (
     "kafka_broker_list" = "broker1:9092,broker2:9092",
     "kafka_topic" = "my_topic",
-    "kafka_default_offset" = "2021-05-21 10:00:00"
+    "kafka_default_offsets" = "2021-05-21 10:00:00"
 );
 ````
 
@@ -600,11 +600,11 @@ There are three relevant parameters:
 
 - `kafka_partitions`: Specify a list of partitions to be consumed, such as "0, 1, 2, 3".
 - `kafka_offsets`: Specify the starting offset of each partition, which must correspond to the number of `kafka_partitions` list. For example: "1000, 1000, 2000, 2000"
-- `property.kafka_default_offset`: Specifies the default starting offset of the partition.
+- `property.kafka_default_offsets`: Specifies the default starting offset of the partition.
 
 When creating an import job, these three parameters can have the following combinations:
 
-| Composition | `kafka_partitions` | `kafka_offsets` | `property.kafka_default_offset` | Behavior |
+| Composition | `kafka_partitions` | `kafka_offsets` | `property.kafka_default_offsets` | Behavior |
 | ----------- | ------------------ | --------------- | ------------------------------- | ------------------------------------------------------------ |
 | 1 | No | No | No | The system will automatically find all partitions corresponding to the topic and start consumption from OFFSET_END |
 | 2 | No | No | Yes | The system will automatically find all partitions corresponding to the topic and start consumption from the location specified by default offset |
diff --git a/docs/zh-CN/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/CREATE-ROUTINE-LOAD.md b/docs/zh-CN/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/CREATE-ROUTINE-LOAD.md
index 1206c53305..77cac7ce4a 100644
--- a/docs/zh-CN/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/CREATE-ROUTINE-LOAD.md
+++ b/docs/zh-CN/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/CREATE-ROUTINE-LOAD.md
@@ -583,7 +583,7 @@ FROM data_source [data_source_properties]
 (
     "kafka_broker_list" = "broker1:9092,broker2:9092",
     "kafka_topic" = "my_topic",
-    "kafka_default_offset" = "2021-05-21 10:00:00"
+    "kafka_default_offsets" = "2021-05-21 10:00:00"
 );
 ```
 
@@ -601,11 +601,11 @@ Doris 支持指定 Partition 和 Offset 开始消费，还支持了指定时间
 
 - `kafka_partitions`：指定待消费的 partition 列表，如："0, 1, 2, 3"。
 - `kafka_offsets`：指定每个分区的起始offset，必须和 `kafka_partitions` 列表个数对应。如："1000, 1000, 2000, 2000"
-- `property.kafka_default_offset`：指定分区默认的起始offset。
+- `property.kafka_default_offsets`：指定分区默认的起始offset。
 
 在创建导入作业时，这三个参数可以有以下组合：
 
-| 组合 | `kafka_partitions` | `kafka_offsets` | `property.kafka_default_offset` | 行为 |
+| 组合 | `kafka_partitions` | `kafka_offsets` | `property.kafka_default_offsets` | 行为 |
 | ---- | ------------------ | --------------- | ------------------------------- | ------------------------------------------------------------ |
 | 1 | No | No | No | 系统会自动查找topic对应的所有分区并从 OFFSET_END 开始消费 |
 | 2 | No | No | Yes | 系统会自动查找topic对应的所有分区并从 default offset 指定的位置开始消费 |
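
For reference (not part of the patch itself), below is a minimal sketch of a routine load job that exercises the corrected `property.kafka_default_offsets` key, corresponding to combination 2 in the table above: no `kafka_partitions` or `kafka_offsets` are given, so all partitions are discovered automatically and consumption starts from the default offset. The database, job, table, and column names are hypothetical; only `kafka_broker_list`, `kafka_topic`, and `property.kafka_default_offsets` come from the documentation being patched.

```sql
-- Hypothetical routine load job (names are illustrative, not from the patch).
-- No kafka_partitions / kafka_offsets are specified, so all partitions of
-- my_topic are discovered automatically and consumption starts from the
-- position named by property.kafka_default_offsets.
CREATE ROUTINE LOAD example_db.example_job ON example_tbl
COLUMNS TERMINATED BY ",",
COLUMNS(k1, k2, v1)
PROPERTIES
(
    "desired_concurrent_number" = "1",
    "max_batch_interval" = "20"
)
FROM KAFKA
(
    "kafka_broker_list" = "broker1:9092,broker2:9092",
    "kafka_topic" = "my_topic",
    "property.kafka_default_offsets" = "OFFSET_BEGINNING"
);
```

With the misspelled `kafka_default_offset` key, the property would not be recognized and the job would fall back to the default OFFSET_END behavior, which is why the doc fix matters.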