[typo](docs) fix docs format (#16279)
This commit is contained in:
@ -524,6 +524,7 @@ curl --location-trusted -u root: -H "max_filter_ration:0.01" -H "format:json" -
```

Import result:

```
MySQL > select * from array_test_decimal;
+------+----------------------------------+
| k1   | k2                               |
@ -531,6 +532,7 @@ MySQL > select * from array_test_decimal;
| 39   | [-818.2173181]                   |
| 40   | [100000000000000000.001111111]   |
+------+----------------------------------+
```

```json
@ -542,12 +544,14 @@ curl --location-trusted -u root: -H "max_filter_ration:0.01" -H "format:json" -
```

Import result:

```
MySQL > select * from array_test_largeint;
+------+------------------------------------------------------------------------------------+
| k1   | k2                                                                                 |
+------+------------------------------------------------------------------------------------+
| 999  | [76959836937749932879763573681792701709, 26017042825937891692910431521038521227]   |
+------+------------------------------------------------------------------------------------+
```

### Routine Load

@ -33,6 +33,7 @@ under the License.

Returns the week number for date. The value of the mode argument defaults to 0.
The following table describes how the mode argument works.

|Mode |First day of week |Range |Week 1 is the first week … |
|:----|:-----------------|:------|:-----------------------------|
|0 |Sunday |0-53 |with a Sunday in this year |

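As a quick local cross-check of the mode 0 semantics, a sketch assuming GNU coreutils `date` is available: its `%U` conversion follows the same convention (weeks start on Sunday, range 00-53, and days before the first Sunday of the year fall in week 0).

```shell
# Not Doris itself: GNU date's %U mirrors mode 0 (week starts on Sunday,
# range 00-53, week 1 is the first week containing a Sunday in this year).
date -d '2023-01-01' +%U   # 2023-01-01 is a Sunday -> prints 01
date -d '2022-01-01' +%U   # 2022-01-01 is a Saturday -> prints 00
```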
@ -194,50 +194,36 @@ ERRORS:
````

2. Import the data in the local file 'testData' into the table 'testTbl' in the database 'testDb', use a label for deduplication, and import only rows where k1 equals 20180601

````
curl --location-trusted -u root -H "label:123" -H "where: k1=20180601" -T testData http://host:port/api/testDb/testTbl/_stream_load
````

3. Import the data in the local file 'testData' into the table 'testTbl' in the database 'testDb', allowing a 20% error rate (the user is in default_cluster)

````
curl --location-trusted -u root -H "label:123" -H "max_filter_ratio:0.2" -T testData http://host:port/api/testDb/testTbl/_stream_load
````

4. Import the data in the local file 'testData' into the table 'testTbl' in the database 'testDb', allowing a 20% error rate and specifying the column names of the file (the user is in default_cluster)

````
curl --location-trusted -u root -H "label:123" -H "max_filter_ratio:0.2" -H "columns: k2, k1, v1" -T testData http://host:port/api/testDb/testTbl/_stream_load
````

5. Import the data in the local file 'testData' into the p1 and p2 partitions of the table 'testTbl' in the database 'testDb', allowing a 20% error rate.

````
curl --location-trusted -u root -H "label:123" -H "max_filter_ratio:0.2" -H "partitions: p1, p2" -T testData http://host:port/api/testDb/testTbl/_stream_load
````

6. Import using streaming (the user is in default_cluster)

````
seq 1 10 | awk '{OFS="\t"}{print $1, $1 * 10}' | curl --location-trusted -u root -T - http://host:port/api/testDb/testTbl/_stream_load
````

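The streaming example builds its payload on the fly; you can preview the tab-separated rows locally before pointing the pipe at a real FE host:

```shell
# Same generator as example 6, truncated to 3 rows for a quick look.
# Each row is "<n><TAB><n*10>", the two-column layout expected here.
seq 1 3 | awk '{OFS="\t"}{print $1, $1 * 10}'
```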
7. Import into a table containing HLL columns; the HLL columns can be generated from columns in the table or columns in the data, or use hll_empty to fill in columns that are absent from the data

````
curl --location-trusted -u root -H "columns: k1, k2, v1=hll_hash(k1), v2=hll_empty()" -T testData http://host:port/api/testDb/testTbl/_stream_load
````

8. Import data with strict-mode filtering and set the time zone to Africa/Abidjan

````
curl --location-trusted -u root -H "strict_mode: true" -H "timezone: Africa/Abidjan" -T testData http://host:port/api/testDb/testTbl/_stream_load
````

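Each stream load call above answers with a JSON body describing the result; a minimal sketch of checking it follows. The inlined `response` string is a stand-in for real curl output, and the snippet assumes `python3` is on the PATH.

```shell
# Stand-in for the JSON a stream load call returns; in practice you would
# capture the curl output instead of hard-coding it.
response='{"Status": "Success", "NumberLoadedRows": 10}'
echo "$response" | python3 -c 'import sys, json; print(json.load(sys.stdin)["Status"])'
```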
@ -526,6 +526,7 @@ curl --location-trusted -u root: -H "max_filter_ration:0.01" -H "format:json" -
```

Import result:

```
MySQL > select * from array_test_decimal;
+------+----------------------------------+
| k1   | k2                               |
@ -533,6 +534,7 @@ MySQL > select * from array_test_decimal;
| 39   | [-818.2173181]                   |
| 40   | [100000000000000000.001111111]   |
+------+----------------------------------+
```

```json
@ -544,12 +546,14 @@ curl --location-trusted -u root: -H "max_filter_ration:0.01" -H "format:json" -
```

Import result:

```
MySQL > select * from array_test_largeint;
+------+------------------------------------------------------------------------------------+
| k1   | k2                                                                                 |
+------+------------------------------------------------------------------------------------+
| 999  | [76959836937749932879763573681792701709, 26017042825937891692910431521038521227]   |
+------+------------------------------------------------------------------------------------+
```

### Routine Load

@ -33,6 +33,7 @@ under the License.

Returns the week number for the specified date. The value of mode defaults to 0.
See the table below for how the mode argument works:

|Mode |First day of week |Range |Definition of the first week |
|:---|:-------------|:-----------|:--------------------------------------------|
|0 |Sunday |0-53 |the week containing the first Sunday of this year |

@ -191,43 +191,36 @@ ERRORS:
```

2. Import the data in the local file 'testData' into the table 'testTbl' in the database 'testDb', use a label for deduplication, and import only rows where k1 equals 20180601

```
curl --location-trusted -u root -H "label:123" -H "where: k1=20180601" -T testData http://host:port/api/testDb/testTbl/_stream_load
```

3. Import the data in the local file 'testData' into the table 'testTbl' in the database 'testDb', allowing a 20% error rate (the user is in default_cluster)

```
curl --location-trusted -u root -H "label:123" -H "max_filter_ratio:0.2" -T testData http://host:port/api/testDb/testTbl/_stream_load
```

4. Import the data in the local file 'testData' into the table 'testTbl' in the database 'testDb', allowing a 20% error rate and specifying the column names of the file (the user is in default_cluster)

```
curl --location-trusted -u root -H "label:123" -H "max_filter_ratio:0.2" -H "columns: k2, k1, v1" -T testData http://host:port/api/testDb/testTbl/_stream_load
```

5. Import the data in the local file 'testData' into the p1 and p2 partitions of the table 'testTbl' in the database 'testDb', allowing a 20% error rate.

```
curl --location-trusted -u root -H "label:123" -H "max_filter_ratio:0.2" -H "partitions: p1, p2" -T testData http://host:port/api/testDb/testTbl/_stream_load
```

6. Import using streaming (the user is in default_cluster)

```
seq 1 10 | awk '{OFS="\t"}{print $1, $1 * 10}' | curl --location-trusted -u root -T - http://host:port/api/testDb/testTbl/_stream_load
```

7. Import into a table containing HLL columns; the HLL columns can be generated from columns in the table or columns in the data, or use hll_empty to fill in columns that are absent from the data

```
curl --location-trusted -u root -H "columns: k1, k2, v1=hll_hash(k1), v2=hll_empty()" -T testData http://host:port/api/testDb/testTbl/_stream_load
```

8. Import data with strict-mode filtering and set the time zone to Africa/Abidjan

```
curl --location-trusted -u root -H "strict_mode: true" -H "timezone: Africa/Abidjan" -T testData http://host:port/api/testDb/testTbl/_stream_load
```
