Doris could not resolve the logical defaultFS of an HDFS cluster with HA configuration, so it could not query Hive tables stored on HA HDFS. This is because there was no way to pass the HA configuration to the Hive external table.
Overview of changes:
Pass the HA configuration to the Hive external table through the CREATE TABLE properties.
Usage:
Example of creating a Hive external table with HA configuration properties:
CREATE TABLE region (
    r_regionkey integer NOT NULL,
    r_name char(25) NOT NULL,
    r_comment varchar(152)
) ENGINE = HIVE
PROPERTIES (
    "database" = "default",
    "table" = "region",
    "hive.metastore.uris" = "thrift://172.21.16.11:7004",
    "dfs.nameservices" = "hacluster",
    "dfs.ha.namenodes.hacluster" = "3,4",
    "dfs.namenode.rpc-address.hacluster.3" = "192.168.0.93:8020",
    "dfs.namenode.rpc-address.hacluster.4" = "172.21.16.11:8020",
    "dfs.client.failover.proxy.provider.hacluster" = "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider"
);