* [Bug] Filter out unavailable backends when getting scan range locations

  In the previous implementation, non-surviving BEs were eliminated in the Coordinator phase. The Spark and Flink Connectors have no such logic, so when a BE node is down, queries issued through the Connector fail.

* fix ut
* fix compile
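The fix amounts to dropping dead BE nodes before scan range locations are handed to a Connector. A minimal sketch of that filtering step, using hypothetical `Backend` and `filterAlive` names (not the actual FE classes):

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical minimal model of a BE node; field names are
// illustrative, not the real FE types.
class Backend {
    final long id;
    final boolean alive;
    Backend(long id, boolean alive) { this.id = id; this.alive = alive; }
}

public class ScanRangeFilter {
    // Keep only backends that are currently alive, so a scan range is
    // never assigned to a node that is down.
    static List<Backend> filterAlive(List<Backend> backends) {
        return backends.stream()
                .filter(b -> b.alive)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Backend> all = List.of(new Backend(1, true),
                                    new Backend(2, false),
                                    new Backend(3, true));
        // Only backends 1 and 3 survive the filter.
        System.out.println(filterAlive(all).size()); // prints 2
    }
}
```

Doing this at scan-range-location time, rather than only in the Coordinator, means every consumer of the locations (including external Connectors) benefits from the same liveness check.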
# fe-common

This module stores common classes shared by the other modules.

# spark-dpp

This module is the Spark DPP program, used by the Spark Load function.

Depends: fe-common

# fe-core

This module is the main process module of FE.

Depends: fe-common, spark-dpp