---
{
"title": "SET PROPERTY",
"language": "en"
}
---
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# SET PROPERTY
## Description
Syntax:
SET PROPERTY [FOR 'user'] 'key' = 'value' [, 'key' = 'value']
Sets user properties, including the resources allocated to the user, the import cluster, and so on. The properties set here apply to the user, not to a user_identity. That is, if two users 'jack'@'%' and 'jack'@'192%' are created through the CREATE USER statement, SET PROPERTY can only be applied to the user jack, not to 'jack'@'%' or 'jack'@'192%'.
Import clusters are only applicable to Baidu internal users.
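To illustrate the user vs. user_identity distinction, here is a hedged sketch (the password is a made-up placeholder):

```sql
-- Two user identities share the same user name "jack":
CREATE USER 'jack'@'%' IDENTIFIED BY 'example_password';
CREATE USER 'jack'@'192%' IDENTIFIED BY 'example_password';

-- SET PROPERTY targets the user "jack" as a whole,
-- so this limit covers both identities above:
SET PROPERTY FOR 'jack' 'max_user_connections' = '100';
```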
key:
Super user privileges:
max_user_connections: maximum number of connections.
resource.cpu_share: CPU resource allocation.
load_cluster.{cluster_name}.priority: assigns a priority, HIGH or NORMAL, to the specified cluster.
Ordinary user privileges:
quota.normal: resource allocation at the normal level.
quota.high: resource allocation at the high level.
quota.low: resource allocation at the low level.
load_cluster.{cluster_name}.hadoop_palo_path: the Hadoop directory used by Palo. It stores the ETL programs and the intermediate data generated by ETL for Palo imports. After an import completes, the intermediate data is cleaned up automatically, while the ETL programs are kept for the next use.
load_cluster.{cluster_name}.hadoop_configs: Hadoop configuration, in which fs.default.name, mapred.job.tracker, and hadoop.job.ugi are required.
load_cluster.{cluster_name}.hadoop_http_port: Hadoop HDFS NameNode HTTP port.
default_load_cluster: the default import cluster.
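Once set, user properties can be inspected with the related SHOW PROPERTY statement; a brief usage sketch:

```sql
-- List all properties of user jack:
SHOW PROPERTY FOR 'jack';

-- Filter for a single key:
SHOW PROPERTY FOR 'jack' LIKE '%max_user_connections%';
```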
## example
1. Modify the maximum number of connections for user jack to 1000
SET PROPERTY FOR 'jack' 'max_user_connections' = '1000';
2. Modify the cpu_share of user jack to 1000
SET PROPERTY FOR 'jack' 'resource.cpu_share' = '1000';
3. Modify the weight of the normal group of user jack
SET PROPERTY FOR 'jack' 'quota.normal' = '400';
4. Add import cluster for user jack
SET PROPERTY FOR 'jack'
'load_cluster.{cluster_name}.hadoop_palo_path' = '/user/palo/palo_path',
'load_cluster.{cluster_name}.hadoop_configs' = 'fs.default.name=hdfs://dpp.cluster.com:port;mapred.job.tracker=dpp.cluster.com:port;hadoop.job.ugi=user,password;mapred.job.queue.name=job_queue_name_in_hadoop;mapred.job.priority=HIGH;';
5. Delete the import cluster under user jack.
SET PROPERTY FOR 'jack' 'load_cluster.{cluster_name}' = '';
6. Modify user jack's default import cluster
SET PROPERTY FOR 'jack' 'default_load_cluster' = '{cluster_name}';
7. Modify the cluster priority of user jack to HIGH
SET PROPERTY FOR 'jack' 'load_cluster.{cluster_name}.priority' = 'HIGH';
## keyword
SET, PROPERTY