[feature](mv) add mv rewrite info to explain (#29153)

During query rewrite by materialized view, we may want to know information about the rewrite process,
such as which materialized views were tried for the rewrite, which materialized views were rewritten successfully,
and which materialized view was finally chosen by cost.

We can run the following SQL to see the summary of the materialized view rewrite process:
`explain <your_query_sql>`

The materialized view rewrite summary appears at the end of the plain explain output (the `Materialized View` section in the example below).
In the example, the materialized view named `mv2_3` is rewritten successfully and finally chosen,
while the materialized views named `mv2_4` and `mv1_3` are available but fail to rewrite.
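For illustration, a query of roughly the following shape (an assumption based on the `lineitem` and `orders` tables used in the materialized view definitions at the end of this description; the exact query is not shown here) could produce summary output like the one below:

```sql
explain
select l_linestatus, o_clerk
from (
    select * from lineitem where l_shipdate = '2023-12-01'
) t1
left join (
    select * from orders where o_orderdate = '2023-12-01'
) t2 on l_orderkey = o_orderkey
group by l_linestatus, o_clerk;
```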

Materialized View

MaterializedViewRewriteFail:

  Name: mv2_4
  FailSummary: The graph logic between query and view is not consistent

  Name: mv1_3
  FailSummary: Match mode is invalid

MaterializedViewRewriteSuccessButNotChose:
  Names: 

MaterializedViewRewriteSuccessAndChose:
  Names: mv2_3

`MaterializedViewRewriteFail`:
it means the attempt to use this materialized view to represent the query failed.
`Name` is the name of the MTMV (async materialized view).
`FailSummary` is the summary of the fail reason.

`MaterializedViewRewriteSuccessButNotChose`:
it means this materialized view represents the query successfully, but the CBO optimizer does not choose it in the end.

`MaterializedViewRewriteSuccessAndChose`:
it means this materialized view represents the query successfully and the CBO optimizer finally chooses it.


If we want to see more detail, we can also run the following SQL to see the detailed materialized view rewrite process info:

`explain memo plan <your_query_sql>`

The materialized view rewrite info is under the **MATERIALIZATIONS** tag.
In the example below, the materialized view named `mv2_3` is rewritten successfully (and finally chosen),
while the materialized views named `mv2_4` and `mv1_3` fail, together with their detailed fail reasons.
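For example, the same illustrative query as above can be inspected with the memo plan variant:

```sql
explain memo plan
select l_linestatus, o_clerk
from (
    select * from lineitem where l_shipdate = '2023-12-01'
) t1
left join (
    select * from orders where o_orderdate = '2023-12-01'
) t2 on l_orderkey = o_orderkey
group by l_linestatus, o_clerk;
```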

========== MATERIALIZATIONS ==========
materializationContexts:

MaterializationContext[mv1_3] ( rewriteSuccess=false, failReason=[
ObjectId : ObjectId#257.
Summary : Match mode is invalid.
Reason : matchMode is VIEW_PARTIAL.

ObjectId : ObjectId#260.
Summary : Match mode is invalid.
Reason : matchMode is VIEW_PARTIAL.

ObjectId : ObjectId#251.
Summary : Match mode is invalid.
Reason : matchMode is VIEW_PARTIAL.

ObjectId : ObjectId#254.
Summary : Match mode is invalid.
Reason : matchMode is VIEW_PARTIAL.

] )

MaterializationContext[mv2_4] ( rewriteSuccess=false, failReason=[
ObjectId : ObjectId#771.
Summary : The graph logic between query and view is not consistent.
Reason : graph logical is not equal
 query join edges is
 [<{0} --LEFT_OUTER_JOIN-- {1}>],
 view join edges is
 [<{0} --LEFT_OUTER_JOIN-- {1}>],
query filter edges
 is [<{0} --FILTER-- {}>, <{1} --FILTER-- {}>[[] , [<{0} --LEFT_OUTER_JOIN-- {1}>]]],
view filter edges
 is [<{0} --FILTER-- {}>, <{1} --FILTER-- {}>[[] , [<{0} --LEFT_OUTER_JOIN-- {1}>]]]
inferred edge with conditions
 {}
with error edge <{1} --FILTER-- {}>[[] , [<{0} --LEFT_OUTER_JOIN-- {1}>]]=[(o_orderdate#20 = 2023-12-01)].

ObjectId : ObjectId#762.
Summary : The graph logic between query and view is not consistent.
Reason : graph logical is not equal
 query join edges is
 [<{0} --LEFT_OUTER_JOIN-- {1}>],
 view join edges is
 [<{0} --LEFT_OUTER_JOIN-- {1}>],
query filter edges
 is [<{0} --FILTER-- {}>, <{1} --FILTER-- {}>[[] , [<{0} --LEFT_OUTER_JOIN-- {1}>]]],
view filter edges
 is [<{0} --FILTER-- {}>, <{1} --FILTER-- {}>[[] , [<{0} --LEFT_OUTER_JOIN-- {1}>]]]
inferred edge with conditions
 {}
with error edge <{1} --FILTER-- {}>[[] , [<{0} --LEFT_OUTER_JOIN-- {1}>]]=[(o_orderdate#20 = 2023-12-01)].
] )

 MaterializationContext[mv2_3] ( rewriteSuccess=true, failReason=[
] )

`ObjectId` is the id of the group expression.
`Summary` is the summary of the fail reason.
`Reason` is the detailed fail reason.

Take the info above for `mv2_4` as an example:

MaterializationContext[mv2_4] ( rewriteSuccess=false, failReason=[
ObjectId : ObjectId#762.
Summary : The graph logic between query and view is not consistent.
Reason : graph logical is not equal
 query join edges is
 [<{0} --LEFT_OUTER_JOIN-- {1}>],
 view join edges is
 [<{0} --LEFT_OUTER_JOIN-- {1}>],
query filter edges
 is [<{0} --FILTER-- {}>, <{1} --FILTER-- {}>[[] , [<{0} --LEFT_OUTER_JOIN-- {1}>]]],
view filter edges
 is [<{0} --FILTER-- {}>, <{1} --FILTER-- {}>[[] , [<{0} --LEFT_OUTER_JOIN-- {1}>]]]
inferred edge with conditions
 {}
with error edge <{1} --FILTER-- {}>[[] , [<{0} --LEFT_OUTER_JOIN-- {1}>]]=[(o_orderdate#20 = 2023-12-01)].
]

`0` represents the table lineitem.
`1` represents the table orders.
`[<{0} --LEFT_OUTER_JOIN-- {1}>]` means the edge where lineitem is left outer joined with orders.
`[<{0} --FILTER-- {}>, <{1} --FILTER-- {}>[[] , [<{0} --LEFT_OUTER_JOIN-- {1}>]]]` means there is a filter above orders that cannot be pulled up because of the edge `[<{0} --LEFT_OUTER_JOIN-- {1}>]`.
The query cannot be rewritten because the filter `[(o_orderdate#20 = 2023-12-01)]` in the query is not found in **mv2_4**.
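As a sketch of how the fail reason can guide query tuning: if the query's filters matched the predicates baked into **mv2_4** (see its definition below), the filter edge under the left outer join would be identical between query and view, so this particular graph check could pass. This is only a hypothetical variant; the rewrite would still have to pass the other checks described above (match mode, predicate compensation, partition validity).

```sql
explain
select l_linestatus, o_clerk
from (
    -- same range predicate as in the mv2_4 definition below
    select * from lineitem
    where l_shipdate >= '2023-12-01' and l_shipdate <= '2023-12-05'
) t1
left join (
    -- same range predicate as in the mv2_4 definition below
    select * from orders
    where o_orderdate >= '2023-12-01' and o_orderdate <= '2023-12-05'
) t2 on l_orderkey = o_orderkey
group by l_linestatus, o_clerk;
```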



The definition of **mv1_3** is as follows:
CREATE MATERIALIZED VIEW mv1_3
BUILD IMMEDIATE REFRESH auto ON SCHEDULE EVERY 1 hour
DISTRIBUTED BY RANDOM BUCKETS 12
PROPERTIES ('replication_num' = '1') as 
select 
  o_orderstatus, 
  o_clerk 
from 
  orders 
where 
  O_ORDERDATE = '2023-12-01'
group by 
  o_orderstatus, 
 o_clerk;

The definition of **mv2_3** is as follows:
CREATE MATERIALIZED VIEW mv2_3
BUILD IMMEDIATE REFRESH auto ON SCHEDULE EVERY 1 hour
DISTRIBUTED BY RANDOM BUCKETS 12
PROPERTIES ('replication_num' = '1') as 
select
  l_linestatus,
  o_clerk
from 
 (
   select 
     * 
   from 
     lineitem 
   where 
     l_shipdate = '2023-12-01'
 ) t1 
 left join (
   select 
     * 
   from 
     orders 
   where 
     o_orderdate = '2023-12-01'
 ) t2 on l_orderkey = o_orderkey 
group by 
 l_linestatus, 
 o_clerk;

The definition of **mv2_4** is as follows:
CREATE MATERIALIZED VIEW mv2_4
BUILD IMMEDIATE REFRESH auto ON SCHEDULE EVERY 1 hour
DISTRIBUTED BY RANDOM BUCKETS 12
PROPERTIES ('replication_num' = '1') as 
select
 l_linestatus,
 o_clerk
from 
 (
   select 
     * 
   from 
     lineitem 
   where 
     l_shipdate >= '2023-12-01' and l_shipdate <= '2023-12-05'
 ) t1 
 left join (
   select 
     * 
   from 
     orders 
   where 
     o_orderdate >= '2023-12-01' and o_orderdate <= '2023-12-05'
 ) t2 on l_orderkey = o_orderkey 
group by 
 l_linestatus, 
 o_clerk;
This commit is contained in:
seawinde
2024-01-04 23:01:55 +08:00
committed by GitHub
parent 35150bbc22
commit 6a836a53df
21 changed files with 406 additions and 173 deletions

View File

@ -22,6 +22,7 @@ import org.apache.doris.analysis.ExplainOptions;
import org.apache.doris.analysis.LiteralExpr;
import org.apache.doris.analysis.StatementBase;
import org.apache.doris.catalog.Column;
import org.apache.doris.catalog.MTMV;
import org.apache.doris.common.NereidsException;
import org.apache.doris.common.Pair;
import org.apache.doris.nereids.CascadesContext.Lock;
@ -42,12 +43,14 @@ import org.apache.doris.nereids.minidump.NereidsTracer;
import org.apache.doris.nereids.processor.post.PlanPostProcessors;
import org.apache.doris.nereids.processor.pre.PlanPreprocessors;
import org.apache.doris.nereids.properties.PhysicalProperties;
import org.apache.doris.nereids.rules.exploration.mv.MaterializationContext;
import org.apache.doris.nereids.trees.expressions.Expression;
import org.apache.doris.nereids.trees.expressions.NamedExpression;
import org.apache.doris.nereids.trees.expressions.literal.Literal;
import org.apache.doris.nereids.trees.plans.Plan;
import org.apache.doris.nereids.trees.plans.commands.ExplainCommand.ExplainLevel;
import org.apache.doris.nereids.trees.plans.logical.LogicalPlan;
import org.apache.doris.nereids.trees.plans.physical.PhysicalCatalogRelation;
import org.apache.doris.nereids.trees.plans.physical.PhysicalOneRowRelation;
import org.apache.doris.nereids.trees.plans.physical.PhysicalPlan;
import org.apache.doris.nereids.trees.plans.physical.PhysicalResultSink;
@ -408,7 +411,9 @@ public class NereidsPlanner extends Planner {
case MEMO_PLAN:
plan = cascadesContext.getMemo().toString()
+ "\n\n========== OPTIMIZED PLAN ==========\n"
+ optimizedPlan.treeString();
+ optimizedPlan.treeString()
+ "\n\n========== MATERIALIZATIONS ==========\n"
+ MaterializationContext.toString(cascadesContext.getMaterializationContexts());
break;
case ALL_PLAN:
plan = "========== PARSED PLAN ==========\n"
@ -421,7 +426,14 @@ public class NereidsPlanner extends Planner {
+ optimizedPlan.treeString();
break;
default:
plan = super.getExplainString(explainOptions);
List<MTMV> materializationListChosenByCbo = this.getPhysicalPlan()
.collectToList(node -> node instanceof PhysicalCatalogRelation
&& ((PhysicalCatalogRelation) node).getTable() instanceof MTMV).stream()
.map(node -> (MTMV) ((PhysicalCatalogRelation) node).getTable())
.collect(Collectors.toList());
plan = super.getExplainString(explainOptions)
+ MaterializationContext.toSummaryString(cascadesContext.getMaterializationContexts(),
materializationListChosenByCbo);
}
if (statementContext != null && !statementContext.getHints().isEmpty()) {
String hint = getHintExplainString(statementContext.getHints());

View File

@ -211,10 +211,22 @@ public abstract class Edge {
return getExpressions().get(i);
}
public String getTypeName() {
if (this instanceof FilterEdge) {
return "FILTER";
} else {
return ((JoinEdge) this).getJoinType().toString();
}
}
@Override
public String toString() {
return String.format("<%s - %s>", LongBitmap.toString(leftExtendedNodes), LongBitmap.toString(
rightExtendedNodes));
if (!leftRejectEdges.isEmpty() || !rightRejectEdges.isEmpty()) {
return String.format("<%s --%s-- %s>[%s , %s]", LongBitmap.toString(leftExtendedNodes),
this.getTypeName(), LongBitmap.toString(rightExtendedNodes), leftRejectEdges, rightRejectEdges);
}
return String.format("<%s --%s-- %s>", LongBitmap.toString(leftExtendedNodes),
this.getTypeName(), LongBitmap.toString(rightExtendedNodes));
}
}

View File

@ -21,6 +21,7 @@ import org.apache.doris.nereids.jobs.joinorder.hypergraph.HyperGraph;
import org.apache.doris.nereids.jobs.joinorder.hypergraph.edge.Edge;
import org.apache.doris.nereids.trees.plans.GroupPlan;
import org.apache.doris.nereids.trees.plans.Plan;
import org.apache.doris.nereids.util.Utils;
import com.google.common.collect.ImmutableList;
@ -67,4 +68,9 @@ public class StructInfoNode extends AbstractNode {
return graphs;
}
@Override
public String toString() {
return Utils.toSqlString("StructInfoNode[" + this.getName() + "]",
"plan", this.plan.treeString());
}
}

View File

@ -89,12 +89,16 @@ public abstract class AbstractMaterializedViewAggregateRule extends AbstractMate
// get view and query aggregate and top plan correspondingly
Pair<Plan, LogicalAggregate<Plan>> viewTopPlanAndAggPair = splitToTopPlanAndAggregate(viewStructInfo);
if (viewTopPlanAndAggPair == null) {
logger.warn(currentClassName + " split to view to top plan and agg fail so return null");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("Split view to top plan and agg fail",
String.format("view plan = %s\n", viewStructInfo.getOriginalPlan().treeString())));
return null;
}
Pair<Plan, LogicalAggregate<Plan>> queryTopPlanAndAggPair = splitToTopPlanAndAggregate(queryStructInfo);
if (queryTopPlanAndAggPair == null) {
logger.warn(currentClassName + " split to query to top plan and agg fail so return null");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("Split query to top plan and agg fail",
String.format("query plan = %s\n", queryStructInfo.getOriginalPlan().treeString())));
return null;
}
// Firstly, handle query group by expression rewrite
@ -123,7 +127,13 @@ public abstract class AbstractMaterializedViewAggregateRule extends AbstractMate
true);
if (rewrittenQueryGroupExpr.isEmpty()) {
// can not rewrite, bail out.
logger.debug(currentClassName + " can not rewrite expression when not need roll up");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("Can not rewrite expression when no roll up",
String.format("expressionToWrite = %s,\n mvExprToMvScanExprMapping = %s,\n"
+ "queryToViewSlotMapping = %s",
queryTopPlan.getExpressions(),
materializationContext.getMvExprToMvScanExprMapping(),
queryToViewSlotMapping)));
return null;
}
return new LogicalProject<>(
@ -138,14 +148,20 @@ public abstract class AbstractMaterializedViewAggregateRule extends AbstractMate
viewExpr -> viewExpr.anyMatch(expr -> expr instanceof AggregateFunction
&& ((AggregateFunction) expr).isDistinct()))) {
// if mv aggregate function contains distinct, can not roll up, bail out.
logger.debug(currentClassName + " view contains distinct function so can not roll up");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("View contains distinct function so can not roll up",
String.format("view plan = %s", viewAggregate.getOutputExpressions())));
return null;
}
// split the query top plan expressions to group expressions and functions, if can not, bail out.
Pair<Set<? extends Expression>, Set<? extends Expression>> queryGroupAndFunctionPair
= topPlanSplitToGroupAndFunction(queryTopPlanAndAggPair);
if (queryGroupAndFunctionPair == null) {
logger.warn(currentClassName + " query top plan split to group by and function fail so return null");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("Query top plan split to group by and function fail",
String.format("queryTopPlan = %s,\n agg = %s",
queryTopPlanAndAggPair.key().treeString(),
queryTopPlanAndAggPair.value().treeString())));
return null;
}
// Secondly, try to roll up the agg functions
@ -172,18 +188,27 @@ public abstract class AbstractMaterializedViewAggregateRule extends AbstractMate
Function rollupAggregateFunction = rollup(queryFunction, queryFunctionShuttled,
mvExprToMvScanExprQueryBased);
if (rollupAggregateFunction == null) {
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("Query function roll up fail",
String.format("queryFunction = %s,\n queryFunctionShuttled = %s,\n"
+ "mvExprToMvScanExprQueryBased = %s",
queryFunction, queryFunctionShuttled, mvExprToMvScanExprQueryBased)));
return null;
}
// key is query need roll up expr, value is mv scan based roll up expr
needRollupExprMap.put(queryFunctionShuttled, rollupAggregateFunction);
// rewrite query function expression by mv expression
ExpressionMapping needRollupExprMapping = new ExpressionMapping(needRollupExprMap);
Expression rewrittenFunctionExpression = rewriteExpression(topExpression,
queryTopPlan,
new ExpressionMapping(needRollupExprMap),
needRollupExprMapping,
queryToViewSlotMapping,
false);
if (rewrittenFunctionExpression == null) {
logger.debug(currentClassName + " roll up expression can not rewrite by view so return null");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("Roll up expression can not rewrite by view", String.format(
"topExpression = %s,\n needRollupExprMapping = %s,\n queryToViewSlotMapping = %s",
topExpression, needRollupExprMapping, queryToViewSlotMapping)));
return null;
}
finalAggregateExpressions.add((NamedExpression) rewrittenFunctionExpression);
@ -193,22 +218,28 @@ public abstract class AbstractMaterializedViewAggregateRule extends AbstractMate
ExpressionUtils.shuttleExpressionWithLineage(topExpression, queryTopPlan);
if (!mvExprToMvScanExprQueryBased.containsKey(queryGroupShuttledExpr)) {
// group expr can not rewrite by view
logger.debug(currentClassName
+ " view group expressions can not contains the query group by expression so return null");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("View dimensions doesn't not cover the query dimensions",
String.format("mvExprToMvScanExprQueryBased is %s,\n queryGroupShuttledExpr is %s",
mvExprToMvScanExprQueryBased, queryGroupShuttledExpr)));
return null;
}
groupRewrittenExprMap.put(queryGroupShuttledExpr,
mvExprToMvScanExprQueryBased.get(queryGroupShuttledExpr));
// rewrite query group expression by mv expression
ExpressionMapping groupRewrittenExprMapping = new ExpressionMapping(groupRewrittenExprMap);
Expression rewrittenGroupExpression = rewriteExpression(
topExpression,
queryTopPlan,
new ExpressionMapping(groupRewrittenExprMap),
groupRewrittenExprMapping,
queryToViewSlotMapping,
true);
if (rewrittenGroupExpression == null) {
logger.debug(currentClassName
+ " query top expression can not be rewritten by view so return null");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("Query dimensions can not be rewritten by view",
String.format("topExpression is %s,\n groupRewrittenExprMapping is %s,\n"
+ "queryToViewSlotMapping = %s",
topExpression, groupRewrittenExprMapping, queryToViewSlotMapping)));
return null;
}
finalAggregateExpressions.add((NamedExpression) rewrittenGroupExpression);

View File

@ -17,6 +17,7 @@
package org.apache.doris.nereids.rules.exploration.mv;
import org.apache.doris.common.Pair;
import org.apache.doris.nereids.jobs.joinorder.hypergraph.HyperGraph;
import org.apache.doris.nereids.jobs.joinorder.hypergraph.edge.JoinEdge;
import org.apache.doris.nereids.jobs.joinorder.hypergraph.node.AbstractNode;
@ -60,7 +61,13 @@ public abstract class AbstractMaterializedViewJoinRule extends AbstractMateriali
// Can not rewrite, bail out
if (expressionsRewritten.isEmpty()
|| expressionsRewritten.stream().anyMatch(expr -> !(expr instanceof NamedExpression))) {
logger.warn(currentClassName + " expression to rewrite is not named expr so return null");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("Rewrite expressions by view in join fail",
String.format("expressionToRewritten is %s,\n mvExprToMvScanExprMapping is %s,\n"
+ "queryToViewSlotMapping = %s",
queryStructInfo.getExpressions(),
materializationContext.getMvExprToMvScanExprMapping(),
queryToViewSlotMapping)));
return null;
}
// record the group id in materializationContext, and when rewrite again in

View File

@ -25,11 +25,13 @@ import org.apache.doris.catalog.PartitionItem;
import org.apache.doris.catalog.PartitionType;
import org.apache.doris.catalog.TableIf;
import org.apache.doris.common.AnalysisException;
import org.apache.doris.common.Pair;
import org.apache.doris.mtmv.BaseTableInfo;
import org.apache.doris.mtmv.MTMVPartitionInfo;
import org.apache.doris.mtmv.MTMVUtil;
import org.apache.doris.nereids.CascadesContext;
import org.apache.doris.nereids.jobs.executor.Rewriter;
import org.apache.doris.nereids.memo.GroupExpression;
import org.apache.doris.nereids.rules.exploration.ExplorationRuleFactory;
import org.apache.doris.nereids.rules.exploration.mv.Predicates.SplitPredicate;
import org.apache.doris.nereids.rules.exploration.mv.mapping.EquivalenceClassSetMapping;
@ -45,6 +47,7 @@ import org.apache.doris.nereids.trees.expressions.SlotReference;
import org.apache.doris.nereids.trees.expressions.literal.BooleanLiteral;
import org.apache.doris.nereids.trees.expressions.literal.Literal;
import org.apache.doris.nereids.trees.plans.JoinType;
import org.apache.doris.nereids.trees.plans.ObjectId;
import org.apache.doris.nereids.trees.plans.Plan;
import org.apache.doris.nereids.trees.plans.algebra.CatalogRelation;
import org.apache.doris.nereids.trees.plans.logical.LogicalFilter;
@ -70,8 +73,8 @@ import java.util.stream.Collectors;
* The abstract class for all materialized view rules
*/
public abstract class AbstractMaterializedViewRule implements ExplorationRuleFactory {
public static final HashSet<JoinType> SUPPORTED_JOIN_TYPE_SET =
Sets.newHashSet(JoinType.INNER_JOIN, JoinType.LEFT_OUTER_JOIN);
public static final HashSet<JoinType> SUPPORTED_JOIN_TYPE_SET = Sets.newHashSet(JoinType.INNER_JOIN,
JoinType.LEFT_OUTER_JOIN);
protected final String currentClassName = this.getClass().getSimpleName();
private final Logger logger = LogManager.getLogger(this.getClass());
@ -83,63 +86,67 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
List<MaterializationContext> materializationContexts = cascadesContext.getMaterializationContexts();
List<Plan> rewriteResults = new ArrayList<>();
if (materializationContexts.isEmpty()) {
logger.debug(currentClassName + " materializationContexts is empty so return");
return rewriteResults;
}
List<StructInfo> queryStructInfos = extractStructInfo(queryPlan, cascadesContext);
// TODO Just Check query queryPlan firstly, support multi later.
StructInfo queryStructInfo = queryStructInfos.get(0);
if (!checkPattern(queryStructInfo)) {
logger.debug(currentClassName + " queryStructInfo is not valid so return");
materializationContexts.forEach(ctx -> ctx.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("Query struct info is invalid",
String.format("queryPlan is %s", queryPlan.treeString()))));
return rewriteResults;
}
for (MaterializationContext materializationContext : materializationContexts) {
// already rewrite, bail out
if (queryPlan.getGroupExpression().isPresent()
&& materializationContext.alreadyRewrite(
if (queryPlan.getGroupExpression().isPresent() && materializationContext.alreadyRewrite(
queryPlan.getGroupExpression().get().getOwnerGroup().getGroupId())) {
logger.debug(currentClassName + " this group is already rewritten so skip");
continue;
}
List<StructInfo> viewStructInfos = extractStructInfo(materializationContext.getMvPlan(),
cascadesContext);
List<StructInfo> viewStructInfos = extractStructInfo(materializationContext.getMvPlan(), cascadesContext);
if (viewStructInfos.size() > 1) {
// view struct info should only have one
logger.warn(currentClassName + " the num of view struct info is more then one so return");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("The num of view struct info is more then one",
String.format("mv plan is %s", materializationContext.getMvPlan().treeString())));
return rewriteResults;
}
StructInfo viewStructInfo = viewStructInfos.get(0);
if (!checkPattern(viewStructInfo)) {
logger.debug(currentClassName + " viewStructInfo is not valid so return");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("View struct info is invalid",
String.format(", view plan is %s", viewStructInfo.getOriginalPlan().treeString())));
continue;
}
MatchMode matchMode = decideMatchMode(queryStructInfo.getRelations(), viewStructInfo.getRelations());
if (MatchMode.COMPLETE != matchMode) {
logger.debug(currentClassName + " match mode is not complete so return");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("Match mode is invalid", String.format("matchMode is %s", matchMode)));
continue;
}
List<RelationMapping> queryToViewTableMappings =
RelationMapping.generate(queryStructInfo.getRelations(), viewStructInfo.getRelations());
List<RelationMapping> queryToViewTableMappings = RelationMapping.generate(queryStructInfo.getRelations(),
viewStructInfo.getRelations());
// if any relation in query and view can not map, bail out.
if (queryToViewTableMappings == null) {
logger.warn(currentClassName + " query to view table mapping null so return");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("Query to view table mapping is null", ""));
return rewriteResults;
}
for (RelationMapping queryToViewTableMapping : queryToViewTableMappings) {
SlotMapping queryToViewSlotMapping = SlotMapping.generate(queryToViewTableMapping);
if (queryToViewSlotMapping == null) {
logger.warn(currentClassName + " query to view slot mapping null so continue");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("Query to view slot mapping is null", ""));
continue;
}
LogicalCompatibilityContext compatibilityContext =
LogicalCompatibilityContext.from(queryToViewTableMapping, queryToViewSlotMapping,
queryStructInfo, viewStructInfo);
LogicalCompatibilityContext compatibilityContext = LogicalCompatibilityContext.from(
queryToViewTableMapping, queryToViewSlotMapping, queryStructInfo, viewStructInfo);
ComparisonResult comparisonResult = StructInfo.isGraphLogicalEquals(queryStructInfo, viewStructInfo,
compatibilityContext);
if (comparisonResult.isInvalid()) {
logger.debug(currentClassName + " graph logical is not equals so continue");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("The graph logic between query and view is not consistent",
comparisonResult.getErrorMessage()));
continue;
}
// TODO: Use set of list? And consider view expr
@ -152,7 +159,14 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
queryToViewSlotMapping);
// Can not compensate, bail out
if (compensatePredicates.isEmpty()) {
logger.debug(currentClassName + " predicate compensate fail so continue");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("Predicate compensate fail",
String.format("query predicates = %s,\n query equivalenceClass = %s, \n"
+ "view predicates = %s,\n query equivalenceClass = %s\n",
queryStructInfo.getPredicates(),
queryStructInfo.getEquivalenceClass(),
viewStructInfo.getPredicates(),
viewStructInfo.getEquivalenceClass())));
continue;
}
Plan rewrittenPlan;
@ -161,54 +175,59 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
rewrittenPlan = mvScan;
} else {
// Try to rewrite compensate predicates by using mv scan
List<Expression> rewriteCompensatePredicates = rewriteExpression(
compensatePredicates.toList(),
queryPlan,
materializationContext.getMvExprToMvScanExprMapping(),
queryToViewSlotMapping,
List<Expression> rewriteCompensatePredicates = rewriteExpression(compensatePredicates.toList(),
queryPlan, materializationContext.getMvExprToMvScanExprMapping(), queryToViewSlotMapping,
true);
if (rewriteCompensatePredicates.isEmpty()) {
logger.debug(currentClassName + " compensate predicate rewrite by view fail so continue");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("Rewrite compensate predicate by view fail", String.format(
"compensatePredicates = %s,\n mvExprToMvScanExprMapping = %s,\n"
+ "queryToViewSlotMapping = %s",
compensatePredicates,
materializationContext.getMvExprToMvScanExprMapping(),
queryToViewSlotMapping)));
continue;
}
rewrittenPlan = new LogicalFilter<>(Sets.newHashSet(rewriteCompensatePredicates), mvScan);
}
// Rewrite query by view
rewrittenPlan = rewriteQueryByView(matchMode,
queryStructInfo,
viewStructInfo,
queryToViewSlotMapping,
rewrittenPlan,
materializationContext);
rewrittenPlan = rewriteQueryByView(matchMode, queryStructInfo, viewStructInfo, queryToViewSlotMapping,
rewrittenPlan, materializationContext);
if (rewrittenPlan == null) {
logger.debug(currentClassName + " rewrite query by view fail so continue");
continue;
}
if (!checkPartitionIsValid(queryStructInfo, materializationContext, cascadesContext)) {
logger.debug(currentClassName + " check partition validation fail so continue");
materializationContext.recordFailReason(queryStructInfo.getOriginalPlanId(),
Pair.of("Check partition validation fail",
"the partition used by query is invalid in materialized view"));
continue;
}
if (!checkOutput(queryPlan, rewrittenPlan)) {
logger.debug(currentClassName + " check output validation fail so continue");
if (!checkOutput(queryPlan, rewrittenPlan, materializationContext)) {
continue;
}
// run rbo job on mv rewritten plan
CascadesContext rewrittenPlanContext =
CascadesContext.initContext(cascadesContext.getStatementContext(), rewrittenPlan,
cascadesContext.getCurrentJobContext().getRequiredProperties());
CascadesContext rewrittenPlanContext = CascadesContext.initContext(
cascadesContext.getStatementContext(), rewrittenPlan,
cascadesContext.getCurrentJobContext().getRequiredProperties());
Rewriter.getWholeTreeRewriter(rewrittenPlanContext).execute();
rewrittenPlan = rewrittenPlanContext.getRewritePlan();
logger.debug(currentClassName + "rewrite by materialized view success");
materializationContext.setSuccess(true);
rewriteResults.add(rewrittenPlan);
}
}
return rewriteResults;
}
protected boolean checkOutput(Plan sourcePlan, Plan rewrittenPlan) {
if (sourcePlan.getGroupExpression().isPresent() && !rewrittenPlan.getLogicalProperties().equals(
sourcePlan.getGroupExpression().get().getOwnerGroup().getLogicalProperties())) {
logger.error("rewrittenPlan output logical properties is not same with target group");
protected boolean checkOutput(Plan sourcePlan, Plan rewrittenPlan, MaterializationContext materializationContext) {
if (sourcePlan.getGroupExpression().isPresent() && !rewrittenPlan.getLogicalProperties()
.equals(sourcePlan.getGroupExpression().get().getOwnerGroup().getLogicalProperties())) {
ObjectId planObjId = sourcePlan.getGroupExpression().map(GroupExpression::getId)
.orElseGet(() -> new ObjectId(-1));
materializationContext.recordFailReason(planObjId, Pair.of(
"RewrittenPlan output logical properties is different with target group",
String.format("planOutput logical properties = %s,\n"
+ "groupOutput logical properties = %s", rewrittenPlan.getLogicalProperties(),
sourcePlan.getGroupExpression().get().getOwnerGroup().getLogicalProperties())));
return false;
}
return true;
@ -220,9 +239,7 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
* Maybe only just some partitions is valid in materialized view, so we should check if the mv can
* offer the partitions which query used or not.
*/
protected boolean checkPartitionIsValid(
StructInfo queryInfo,
MaterializationContext materializationContext,
protected boolean checkPartitionIsValid(StructInfo queryInfo, MaterializationContext materializationContext,
CascadesContext cascadesContext) {
// check partition is valid or not
MTMV mtmv = materializationContext.getMTMV();
@ -240,8 +257,7 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
Optional<LogicalOlapScan> relatedTableRelation = queryInfo.getRelations().stream()
.filter(LogicalOlapScan.class::isInstance)
.filter(relation -> relatedPartitionTable.equals(new BaseTableInfo(relation.getTable())))
.map(LogicalOlapScan.class::cast)
.findFirst();
.map(LogicalOlapScan.class::cast).findFirst();
if (!relatedTableRelation.isPresent()) {
logger.warn("mv is partition update, but related table relation is null");
return false;
@ -263,37 +279,29 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
return false;
}
// get mv related table valid partitions
Set<Long> relatedTalbeValidSet = mvDataValidPartitions.stream()
.map(partition -> {
Set<Long> relatedBaseTablePartitions = mvToBasePartitionMap.get(partition.getId());
if (relatedBaseTablePartitions == null || relatedBaseTablePartitions.isEmpty()) {
return ImmutableList.of();
} else {
return relatedBaseTablePartitions;
}
})
.flatMap(Collection::stream)
.map(Long.class::cast)
.collect(Collectors.toSet());
Set<Long> relatedTalbeValidSet = mvDataValidPartitions.stream().map(partition -> {
Set<Long> relatedBaseTablePartitions = mvToBasePartitionMap.get(partition.getId());
if (relatedBaseTablePartitions == null || relatedBaseTablePartitions.isEmpty()) {
return ImmutableList.of();
} else {
return relatedBaseTablePartitions;
}
}).flatMap(Collection::stream).map(Long.class::cast).collect(Collectors.toSet());
// get query selected partitions to make the partitions is valid or not
Set<Long> relatedTableSelectedPartitionToCheck =
new HashSet<>(relatedTableRelation.get().getSelectedPartitionIds());
Set<Long> relatedTableSelectedPartitionToCheck = new HashSet<>(
relatedTableRelation.get().getSelectedPartitionIds());
if (relatedTableSelectedPartitionToCheck.isEmpty()) {
relatedTableSelectedPartitionToCheck.addAll(relatedTable.getPartitionIds());
}
return !relatedTalbeValidSet.isEmpty()
&& relatedTalbeValidSet.containsAll(relatedTableSelectedPartitionToCheck);
return !relatedTalbeValidSet.isEmpty() && relatedTalbeValidSet.containsAll(
relatedTableSelectedPartitionToCheck);
}
/**
* Rewrite query by view, for aggregate or join rewriting should be different inherit class implementation
*/
protected Plan rewriteQueryByView(MatchMode matchMode,
StructInfo queryStructInfo,
StructInfo viewStructInfo,
SlotMapping queryToViewSlotMapping,
Plan tempRewritedPlan,
MaterializationContext materializationContext) {
protected Plan rewriteQueryByView(MatchMode matchMode, StructInfo queryStructInfo, StructInfo viewStructInfo,
SlotMapping queryToViewSlotMapping, Plan tempRewritedPlan, MaterializationContext materializationContext) {
return tempRewritedPlan;
}
@ -306,11 +314,8 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
* the key expression in targetExpressionMapping should be shuttled. with the method
* ExpressionUtils.shuttleExpressionWithLineage.
*/
protected List<Expression> rewriteExpression(
List<? extends Expression> sourceExpressionsToWrite,
Plan sourcePlan,
ExpressionMapping targetExpressionMapping,
SlotMapping sourceToTargetMapping,
protected List<Expression> rewriteExpression(List<? extends Expression> sourceExpressionsToWrite, Plan sourcePlan,
ExpressionMapping targetExpressionMapping, SlotMapping sourceToTargetMapping,
boolean targetExpressionNeedSourceBased) {
// Firstly, rewrite the target expression using source with inverse mapping
// then try to use the target expression to represent the query. if any of source expressions
@ -325,13 +330,12 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
// project(slot 2, 1)
// target
// generate target to target replacement expression mapping, and change target expression to source based
List<? extends Expression> sourceShuttledExpressions =
ExpressionUtils.shuttleExpressionWithLineage(sourceExpressionsToWrite, sourcePlan);
List<? extends Expression> sourceShuttledExpressions = ExpressionUtils.shuttleExpressionWithLineage(
sourceExpressionsToWrite, sourcePlan);
ExpressionMapping expressionMappingKeySourceBased = targetExpressionNeedSourceBased
? targetExpressionMapping.keyPermute(sourceToTargetMapping.inverse()) : targetExpressionMapping;
// target to target replacement expression mapping, because mv is 1:1 so get first element
List<Map<Expression, Expression>> flattenExpressionMap =
expressionMappingKeySourceBased.flattenMap();
List<Map<Expression, Expression>> flattenExpressionMap = expressionMappingKeySourceBased.flattenMap();
Map<? extends Expression, ? extends Expression> targetToTargetReplacementMapping = flattenExpressionMap.get(0);
List<Expression> rewrittenExpressions = new ArrayList<>();
@ -341,8 +345,8 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
rewrittenExpressions.add(expressionToRewrite);
continue;
}
final Set<Object> slotsToRewrite =
expressionToRewrite.collectToSet(expression -> expression instanceof Slot);
final Set<Object> slotsToRewrite = expressionToRewrite.collectToSet(
expression -> expression instanceof Slot);
Expression replacedExpression = ExpressionUtils.replace(expressionToRewrite,
targetToTargetReplacementMapping);
if (replacedExpression.anyMatch(slotsToRewrite::contains)) {
@ -360,11 +364,8 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
return rewrittenExpressions;
}
protected Expression rewriteExpression(
Expression sourceExpressionsToWrite,
Plan sourcePlan,
ExpressionMapping targetExpressionMapping,
SlotMapping sourceToTargetMapping,
protected Expression rewriteExpression(Expression sourceExpressionsToWrite, Plan sourcePlan,
ExpressionMapping targetExpressionMapping, SlotMapping sourceToTargetMapping,
boolean targetExpressionNeedSourceBased) {
List<Expression> expressionToRewrite = new ArrayList<>();
expressionToRewrite.add(sourceExpressionsToWrite);
@ -382,11 +383,8 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
* For another example as following:
* predicate a = b in mv, and a = b and c = d in query, the compensatory predicate is c = d
*/
protected SplitPredicate predicatesCompensate(
StructInfo queryStructInfo,
StructInfo viewStructInfo,
SlotMapping queryToViewSlotMapping
) {
protected SplitPredicate predicatesCompensate(StructInfo queryStructInfo, StructInfo viewStructInfo,
SlotMapping queryToViewSlotMapping) {
EquivalenceClass queryEquivalenceClass = queryStructInfo.getEquivalenceClass();
EquivalenceClass viewEquivalenceClass = viewStructInfo.getEquivalenceClass();
// viewEquivalenceClass to query based
@ -394,24 +392,20 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
.toSlotReferenceMap();
EquivalenceClass viewEquivalenceClassQueryBased = viewEquivalenceClass.permute(viewToQuerySlotMapping);
if (viewEquivalenceClassQueryBased == null) {
logger.info(currentClassName + " permute view equivalence class by query fail so return empty");
return SplitPredicate.empty();
}
final List<Expression> equalCompensateConjunctions = new ArrayList<>();
if (queryEquivalenceClass.isEmpty() && viewEquivalenceClass.isEmpty()) {
equalCompensateConjunctions.add(BooleanLiteral.of(true));
}
if (queryEquivalenceClass.isEmpty()
&& !viewEquivalenceClass.isEmpty()) {
logger.info(currentClassName + " view has equivalence class but query not so return empty");
if (queryEquivalenceClass.isEmpty() && !viewEquivalenceClass.isEmpty()) {
return SplitPredicate.empty();
}
EquivalenceClassSetMapping queryToViewEquivalenceMapping =
EquivalenceClassSetMapping.generate(queryEquivalenceClass, viewEquivalenceClassQueryBased);
EquivalenceClassSetMapping queryToViewEquivalenceMapping = EquivalenceClassSetMapping.generate(
queryEquivalenceClass, viewEquivalenceClassQueryBased);
// can not map all target equivalence class, can not compensate
if (queryToViewEquivalenceMapping.getEquivalenceClassSetMap().size()
< viewEquivalenceClass.getEquivalenceSetList().size()) {
logger.info(currentClassName + " view has more equivalence than query so return empty");
return SplitPredicate.empty();
}
// do equal compensate
@ -449,17 +443,14 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
List<Expression> rangeCompensate = new ArrayList<>();
Expression queryRangePredicate = querySplitPredicate.getRangePredicate();
Expression viewRangePredicate = viewSplitPredicate.getRangePredicate();
Expression viewRangePredicateQueryBased =
ExpressionUtils.replace(viewRangePredicate, viewToQuerySlotMapping);
Expression viewRangePredicateQueryBased = ExpressionUtils.replace(viewRangePredicate, viewToQuerySlotMapping);
Set<Expression> queryRangeSet =
Sets.newHashSet(ExpressionUtils.extractConjunction(queryRangePredicate));
Set<Expression> viewRangeQueryBasedSet =
Sets.newHashSet(ExpressionUtils.extractConjunction(viewRangePredicateQueryBased));
Set<Expression> queryRangeSet = Sets.newHashSet(ExpressionUtils.extractConjunction(queryRangePredicate));
Set<Expression> viewRangeQueryBasedSet = Sets.newHashSet(
ExpressionUtils.extractConjunction(viewRangePredicateQueryBased));
// query range predicate can not contain all view range predicate when view have range predicate, bail out
if (!viewRangePredicateQueryBased.equals(BooleanLiteral.TRUE)
&& !queryRangeSet.containsAll(viewRangeQueryBasedSet)) {
logger.info(currentClassName + " query range predicate set can not contains all view range predicate");
if (!viewRangePredicateQueryBased.equals(BooleanLiteral.TRUE) && !queryRangeSet.containsAll(
viewRangeQueryBasedSet)) {
return SplitPredicate.empty();
}
queryRangeSet.removeAll(viewRangeQueryBasedSet);
@ -477,10 +468,8 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
Sets.newHashSet(ExpressionUtils.extractConjunction(viewResidualPredicateQueryBased));
// query residual predicate can not contain all view residual predicate when view have residual predicate,
// bail out
if (!viewResidualPredicateQueryBased.equals(BooleanLiteral.TRUE)
&& !queryResidualSet.containsAll(viewResidualQueryBasedSet)) {
logger.info(
currentClassName + " query residual predicate set can not contains all view residual predicate");
if (!viewResidualPredicateQueryBased.equals(BooleanLiteral.TRUE) && !queryResidualSet.containsAll(
viewResidualQueryBasedSet)) {
return SplitPredicate.empty();
}
queryResidualSet.removeAll(viewResidualQueryBasedSet);
@ -497,13 +486,9 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
* @see MatchMode
*/
private MatchMode decideMatchMode(List<CatalogRelation> queryRelations, List<CatalogRelation> viewRelations) {
List<TableIf> queryTableRefs = queryRelations
.stream()
.map(CatalogRelation::getTable)
List<TableIf> queryTableRefs = queryRelations.stream().map(CatalogRelation::getTable)
.collect(Collectors.toList());
List<TableIf> viewTableRefs = viewRelations
.stream()
.map(CatalogRelation::getTable)
List<TableIf> viewTableRefs = viewRelations.stream().map(CatalogRelation::getTable)
.collect(Collectors.toList());
boolean sizeSame = viewTableRefs.size() == queryTableRefs.size();
boolean queryPartial = viewTableRefs.containsAll(queryTableRefs);
@ -524,8 +509,8 @@ public abstract class AbstractMaterializedViewRule implements ExplorationRuleFac
* Extract struct info from plan, support to get struct info from logical plan or plan in group.
*/
public static List<StructInfo> extractStructInfo(Plan plan, CascadesContext cascadesContext) {
if (plan.getGroupExpression().isPresent()
&& !plan.getGroupExpression().get().getOwnerGroup().getStructInfos().isEmpty()) {
if (plan.getGroupExpression().isPresent() && !plan.getGroupExpression().get().getOwnerGroup().getStructInfos()
.isEmpty()) {
return plan.getGroupExpression().get().getOwnerGroup().getStructInfos();
} else {
// build struct info and add them to current group

View File

@ -20,6 +20,7 @@ package org.apache.doris.nereids.rules.exploration.mv;
import org.apache.doris.nereids.trees.expressions.Expression;
import org.apache.doris.nereids.trees.expressions.Slot;
import com.google.common.base.Preconditions;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableSet;
@ -31,24 +32,23 @@ import java.util.Set;
* comparison result of view and query
*/
public class ComparisonResult {
public static final ComparisonResult INVALID =
new ComparisonResult(ImmutableList.of(), ImmutableList.of(), ImmutableSet.of(), false);
private final boolean valid;
private final List<Expression> viewExpressions;
private final List<Expression> queryExpressions;
private final Set<Set<Slot>> viewNoNullableSlot;
public ComparisonResult(List<Expression> queryExpressions, List<Expression> viewExpressions,
Set<Set<Slot>> viewNoNullableSlot) {
this(queryExpressions, viewExpressions, viewNoNullableSlot, true);
}
private final String errorMessage;
ComparisonResult(List<Expression> queryExpressions, List<Expression> viewExpressions,
Set<Set<Slot>> viewNoNullableSlot, boolean valid) {
Set<Set<Slot>> viewNoNullableSlot, boolean valid, String message) {
this.viewExpressions = ImmutableList.copyOf(viewExpressions);
this.queryExpressions = ImmutableList.copyOf(queryExpressions);
this.viewNoNullableSlot = ImmutableSet.copyOf(viewNoNullableSlot);
this.valid = valid;
this.errorMessage = message;
}
public static ComparisonResult newInvalidResWithErrorMessage(String errorMessage) {
return new ComparisonResult(ImmutableList.of(), ImmutableList.of(), ImmutableSet.of(), false, errorMessage);
}
public List<Expression> getViewExpressions() {
@ -67,6 +67,10 @@ public class ComparisonResult {
return !valid;
}
public String getErrorMessage() {
return errorMessage;
}
/**
* Builder
*/
@ -109,11 +113,9 @@ public class ComparisonResult {
}
public ComparisonResult build() {
if (isInvalid()) {
return ComparisonResult.INVALID;
}
Preconditions.checkArgument(valid, "Comparison result must be valid");
return new ComparisonResult(queryBuilder.build(), viewBuilder.build(),
viewNoNullableSlotBuilder.build(), valid);
viewNoNullableSlotBuilder.build(), valid, "");
}
}

View File

@ -141,4 +141,9 @@ public class EquivalenceClass {
this.equivalenceSlotList = equivalenceSets;
return this.equivalenceSlotList;
}
@Override
public String toString() {
return "EquivalenceClass{" + "equivalenceSlotMap=" + equivalenceSlotMap + '}';
}
}

View File

@ -69,7 +69,7 @@ public class HyperGraphComparator {
private final Map<Edge, List<? extends Expression>> pullUpViewExprWithEdge = new HashMap<>();
private final LogicalCompatibilityContext logicalCompatibilityContext;
HyperGraphComparator(HyperGraph queryHyperGraph, HyperGraph viewHyperGraph,
public HyperGraphComparator(HyperGraph queryHyperGraph, HyperGraph viewHyperGraph,
LogicalCompatibilityContext logicalCompatibilityContext) {
this.queryHyperGraph = queryHyperGraph;
this.viewHyperGraph = viewHyperGraph;
@ -114,7 +114,7 @@ public class HyperGraphComparator {
.filter(expr -> !ExpressionUtils.isInferred(expr))
.collect(Collectors.toList());
if (!rawFilter.isEmpty() && !canPullUp(e.getKey())) {
return ComparisonResult.INVALID;
return ComparisonResult.newInvalidResWithErrorMessage(getErrorMessage() + "\nwith error edge " + e);
}
builder.addQueryExpressions(rawFilter);
}
@ -123,7 +123,7 @@ public class HyperGraphComparator {
.filter(expr -> !ExpressionUtils.isInferred(expr))
.collect(Collectors.toList());
if (!rawFilter.isEmpty() && !canPullUp(e.getKey())) {
return ComparisonResult.INVALID;
return ComparisonResult.newInvalidResWithErrorMessage(getErrorMessage() + "with error edge\n" + e);
}
builder.addViewExpressions(rawFilter);
}
@ -133,6 +133,21 @@ public class HyperGraphComparator {
return builder.build();
}
/**
* get error message
*/
public String getErrorMessage() {
return String.format(
"graph logical is not equal\n query join edges is\n %s,\n view join edges is\n %s,\n"
+ "query filter edges\n is %s,\nview filter edges\n is %s\n"
+ "inferred edge with conditions\n %s",
getQueryJoinEdges(),
getViewJoinEdges(),
getQueryFilterEdges(),
getViewFilterEdges(),
inferredViewEdgeMap);
}
private boolean canPullUp(Edge edge) {
// Only inner join and filter with none rejectNodes can be pull up
if (edge instanceof JoinEdge && !((JoinEdge) edge).getJoinType().isInnerJoin()) {

View File

@ -18,6 +18,7 @@
package org.apache.doris.nereids.rules.exploration.mv;
import org.apache.doris.nereids.jobs.joinorder.hypergraph.node.StructInfoNode;
import org.apache.doris.nereids.memo.GroupExpression;
import org.apache.doris.nereids.rules.exploration.mv.mapping.Mapping.MappedRelation;
import org.apache.doris.nereids.rules.exploration.mv.mapping.RelationMapping;
import org.apache.doris.nereids.rules.exploration.mv.mapping.SlotMapping;
@ -26,8 +27,10 @@ import org.apache.doris.nereids.trees.expressions.Expression;
import org.apache.doris.nereids.trees.expressions.NamedExpression;
import org.apache.doris.nereids.trees.expressions.SlotReference;
import org.apache.doris.nereids.trees.expressions.visitor.DefaultExpressionRewriter;
import org.apache.doris.nereids.trees.plans.ObjectId;
import org.apache.doris.nereids.trees.plans.RelationId;
import org.apache.doris.nereids.util.ExpressionUtils;
import org.apache.doris.nereids.util.Utils;
import com.google.common.collect.BiMap;
import com.google.common.collect.HashBiMap;
@ -42,12 +45,19 @@ public class LogicalCompatibilityContext {
private final BiMap<StructInfoNode, StructInfoNode> queryToViewNodeMapping;
private final BiMap<Expression, Expression> queryToViewEdgeExpressionMapping;
private final BiMap<Integer, Integer> queryToViewNodeIDMapping;
private final ObjectId planNodeId;
/**
* LogicalCompatibilityContext
*/
public LogicalCompatibilityContext(BiMap<StructInfoNode, StructInfoNode> queryToViewNodeMapping,
BiMap<Expression, Expression> queryToViewEdgeExpressionMapping) {
BiMap<Expression, Expression> queryToViewEdgeExpressionMapping,
StructInfo queryStructInfo) {
this.queryToViewNodeMapping = queryToViewNodeMapping;
this.queryToViewEdgeExpressionMapping = queryToViewEdgeExpressionMapping;
this.queryToViewNodeIDMapping = HashBiMap.create();
this.planNodeId = queryStructInfo.getOriginalPlan().getGroupExpression()
.map(GroupExpression::getId).orElseGet(() -> new ObjectId(-1));
queryToViewNodeMapping.forEach((k, v) -> queryToViewNodeIDMapping.put(k.getIndex(), v.getIndex()));
}
@ -63,6 +73,10 @@ public class LogicalCompatibilityContext {
return queryToViewEdgeExpressionMapping;
}
public ObjectId getPlanNodeId() {
return planNodeId;
}
/**
* generate logical compatibility context
*/
@ -105,7 +119,7 @@ public class LogicalCompatibilityContext {
queryToViewEdgeMapping.put(edge, viewExpr);
}
});
return new LogicalCompatibilityContext(queryToViewNodeMapping, queryToViewEdgeMapping);
return new LogicalCompatibilityContext(queryToViewNodeMapping, queryToViewEdgeMapping, queryStructInfo);
}
private static Expression orderSlotAsc(Expression expression) {
@ -130,4 +144,11 @@ public class LogicalCompatibilityContext {
}
}
}
@Override
public String toString() {
return Utils.toSqlString("LogicalCompatibilityContext",
"queryToViewNodeMapping", queryToViewNodeMapping.toString(),
"queryToViewEdgeExpressionMapping", queryToViewEdgeExpressionMapping.toString());
}
}

View File

@ -20,20 +20,26 @@ package org.apache.doris.nereids.rules.exploration.mv;
import org.apache.doris.catalog.MTMV;
import org.apache.doris.catalog.Table;
import org.apache.doris.common.AnalysisException;
import org.apache.doris.common.Pair;
import org.apache.doris.mtmv.MTMVCache;
import org.apache.doris.nereids.CascadesContext;
import org.apache.doris.nereids.memo.GroupId;
import org.apache.doris.nereids.rules.exploration.mv.mapping.ExpressionMapping;
import org.apache.doris.nereids.trees.plans.ObjectId;
import org.apache.doris.nereids.trees.plans.Plan;
import org.apache.doris.nereids.util.ExpressionUtils;
import org.apache.doris.nereids.util.Utils;
import com.google.common.collect.ImmutableList;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;
/**
* Maintain the context for query rewrite by materialized view
@ -53,6 +59,11 @@ public class MaterializationContext {
private boolean available = true;
// the mv plan from cache at present, record it to make sure query rewrite by mv is right when cache change.
private Plan mvPlan;
// mark rewrite success or not
private boolean success = false;
// if rewrite by mv fail, record the reason, if success the failReason should be empty.
// The key is the query belonged group expression objectId, the value is the fail reason
private final Map<ObjectId, Pair<String, String>> failReason = new HashMap<>();
/**
* MaterializationContext, this contains necessary info for query rewriting by mv
@ -127,6 +138,91 @@ public class MaterializationContext {
return mvPlan;
}
public Map<ObjectId, Pair<String, String>> getFailReason() {
return failReason;
}
public void setSuccess(boolean success) {
this.success = success;
this.failReason.clear();
}
/**
* recordFailReason
*/
public void recordFailReason(ObjectId objectId, Pair<String, String> summaryAndReason) {
// once success, do not record the fail reason
if (this.success) {
return;
}
this.success = false;
this.failReason.put(objectId, summaryAndReason);
}
public boolean isSuccess() {
return success;
}
@Override
public String toString() {
StringBuilder failReasonBuilder = new StringBuilder("[").append("\n");
for (Map.Entry<ObjectId, Pair<String, String>> reason : this.failReason.entrySet()) {
failReasonBuilder
.append("\n")
.append("ObjectId : ").append(reason.getKey()).append(".\n")
.append("Summary : ").append(reason.getValue().key()).append(".\n")
.append("Reason : ").append(reason.getValue().value()).append(".\n");
}
failReasonBuilder.append("\n").append("]");
return Utils.toSqlString("MaterializationContext[" + mtmv.getName() + "]",
"rewriteSuccess", this.success,
"failReason", failReasonBuilder.toString());
}
/**
* toString, this contains summary and detail info.
*/
public static String toString(List<MaterializationContext> materializationContexts) {
StringBuilder builder = new StringBuilder();
builder.append("materializationContexts:").append("\n");
for (MaterializationContext ctx : materializationContexts) {
builder.append("\n").append(ctx).append("\n");
}
return builder.toString();
}
/**
* toSummaryString, this contains only summary info.
*/
public static String toSummaryString(List<MaterializationContext> materializationContexts,
List<MTMV> chosenMaterializationNames) {
Set<String> materializationChosenNameSet = chosenMaterializationNames.stream()
.map(MTMV::getName)
.collect(Collectors.toSet());
StringBuilder builder = new StringBuilder();
builder.append("\n\nMaterializedView\n");
builder.append("\nMaterializedViewRewriteFail:");
for (MaterializationContext ctx : materializationContexts) {
if (!ctx.isSuccess()) {
Set<String> failReasonSet =
ctx.getFailReason().values().stream().map(Pair::key).collect(Collectors.toSet());
builder.append("\n\n")
.append(" Name: ").append(ctx.getMTMV().getName())
.append("\n")
.append(" FailSummary: ").append(String.join(", ", failReasonSet));
}
}
builder.append("\n\nMaterializedViewRewriteSuccessButNotChose:\n");
builder.append(" Names: ").append(materializationContexts.stream()
.filter(materializationContext -> materializationContext.isSuccess()
&& !materializationChosenNameSet.contains(materializationContext.getMTMV().getName()))
.map(materializationContext -> materializationContext.getMTMV().getName())
.collect(Collectors.joining(", ")));
builder.append("\n\nMaterializedViewRewriteSuccessAndChose:\n");
builder.append(" Names: ").append(String.join(", ", materializationChosenNameSet));
return builder.toString();
}
/**
* MaterializationContext fromMaterializedView
*/

View File

@ -20,6 +20,7 @@ package org.apache.doris.nereids.rules.exploration.mv;
import org.apache.doris.nereids.trees.expressions.Expression;
import org.apache.doris.nereids.trees.expressions.literal.BooleanLiteral;
import org.apache.doris.nereids.util.ExpressionUtils;
import org.apache.doris.nereids.util.Utils;
import com.google.common.collect.ImmutableList;
@ -71,6 +72,11 @@ public class Predicates {
return predicatesSplit.getSplitPredicate();
}
@Override
public String toString() {
return Utils.toSqlString("Predicates", "pulledUpPredicates", pulledUpPredicates);
}
/**
* The split different representation for predicate expression, such as equal, range and residual predicate.
*/
@ -139,5 +145,13 @@ public class Predicates {
&& ((BooleanLiteral) rangeExpr).getValue()
&& ((BooleanLiteral) residualExpr).getValue();
}
@Override
public String toString() {
return Utils.toSqlString("SplitPredicate",
"equalPredicate", equalPredicate,
"rangePredicate", rangePredicate,
"residualPredicate", residualPredicate);
}
}
}

View File

@ -20,12 +20,14 @@ package org.apache.doris.nereids.rules.exploration.mv;
import org.apache.doris.nereids.jobs.joinorder.hypergraph.HyperGraph;
import org.apache.doris.nereids.jobs.joinorder.hypergraph.node.StructInfoNode;
import org.apache.doris.nereids.memo.Group;
import org.apache.doris.nereids.memo.GroupExpression;
import org.apache.doris.nereids.rules.exploration.mv.Predicates.SplitPredicate;
import org.apache.doris.nereids.trees.expressions.EqualTo;
import org.apache.doris.nereids.trees.expressions.Expression;
import org.apache.doris.nereids.trees.expressions.SlotReference;
import org.apache.doris.nereids.trees.expressions.literal.Literal;
import org.apache.doris.nereids.trees.plans.JoinType;
import org.apache.doris.nereids.trees.plans.ObjectId;
import org.apache.doris.nereids.trees.plans.Plan;
import org.apache.doris.nereids.trees.plans.RelationId;
import org.apache.doris.nereids.trees.plans.algebra.CatalogRelation;
@ -65,6 +67,7 @@ public class StructInfo {
private static final PredicateCollector PREDICATE_COLLECTOR = new PredicateCollector();
// source data
private final Plan originalPlan;
private ObjectId originalPlanId;
private final HyperGraph hyperGraph;
private boolean valid = true;
// derived data following
@ -85,6 +88,8 @@ public class StructInfo {
private StructInfo(Plan originalPlan, @Nullable Plan topPlan, @Nullable Plan bottomPlan, HyperGraph hyperGraph) {
this.originalPlan = originalPlan;
this.originalPlanId = originalPlan.getGroupExpression()
.map(GroupExpression::getId).orElseGet(() -> new ObjectId(-1));
this.hyperGraph = hyperGraph;
this.topPlan = topPlan;
this.bottomPlan = bottomPlan;
@ -101,7 +106,6 @@ public class StructInfo {
}
collectStructInfoFromGraph();
initPredicates();
predicatesDerive();
}
public void addPredicates(List<Expression> canPulledUpExpressions) {
@ -156,6 +160,7 @@ public class StructInfo {
Set<Expression> topPlanPredicates = new HashSet<>();
topPlan.accept(PREDICATE_COLLECTOR, topPlanPredicates);
topPlanPredicates.forEach(this.predicates::addPredicate);
predicatesDerive();
}
// derive some useful predicate by predicates
@ -258,6 +263,10 @@ public class StructInfo {
? ((LogicalProject<Plan>) originalPlan).getProjects() : originalPlan.getOutput();
}
public ObjectId getOriginalPlanId() {
return originalPlanId;
}
/**
* Judge the source graph logical is whether the same as target
* For inner join should judge only the join tables,

View File

@ -20,6 +20,7 @@ package org.apache.doris.nereids.rules.exploration.mv.mapping;
import org.apache.doris.common.Pair;
import org.apache.doris.nereids.trees.expressions.Expression;
import org.apache.doris.nereids.util.ExpressionUtils;
import org.apache.doris.nereids.util.Utils;
import com.google.common.collect.ArrayListMultimap;
import com.google.common.collect.ImmutableMultimap;
@ -123,4 +124,9 @@ public class ExpressionMapping extends Mapping {
}
return new ExpressionMapping(foldedMappingBuilder.build());
}
@Override
public String toString() {
return Utils.toSqlString("ExpressionMapping", "expressionMapping", expressionMapping);
}
}

View File

@ -135,6 +135,11 @@ public abstract class Mapping {
public int hashCode() {
return Objects.hash(exprId);
}
@Override
public String toString() {
return "MappedSlot{" + "slot=" + slot + '}';
}
}
/** Chain fold tow mapping, such as this mapping is {[a -> b]}, the target mapping is

View File

@ -95,4 +95,9 @@ public class SlotMapping extends Mapping {
this.slotReferenceMap = slotReferenceSlotReferenceMap;
return this.slotReferenceMap;
}
@Override
public String toString() {
return "SlotMapping{" + "relationSlotMap=" + relationSlotMap + '}';
}
}