Draft
[WIP] Calcite upgrade 1.33 without SEARCH operators #5210
soumyakanti3578 wants to merge 42 commits into apache:master from soumyakanti3578:calcite-upgrade-1.33-no-sarg
Conversation
The problem as well as the root-cause analysis can be found under:
* https://issues.apache.org/jira/browse/CALCITE-5300
* https://issues.apache.org/jira/browse/LOG4J2-3609
The problem appears only during compilation, thus we don't need the value annotations at runtime. Additionally, we don't want to propagate the dependency to other projects, thus we declare it in provided scope.
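A minimal sketch of how such a compile-time-only dependency can be declared. The coordinates below are placeholders, not the actual annotation artifact (see the linked JIRA tickets for the real one):

```xml
<dependency>
  <!-- placeholder coordinates; the real annotation artifact is named in
       CALCITE-5300 / LOG4J2-3609 -->
  <groupId>org.example</groupId>
  <artifactId>value-annotations</artifactId>
  <version>${value.annotations.version}</version>
  <!-- needed only at compile time; provided scope keeps the jar off the runtime
       classpath and out of the transitive dependencies of downstream projects -->
  <scope>provided</scope>
</dependency>
```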
…erclass constructor Breaking change caused by CALCITE-4640.
RelRule.Config.EMPTY has been removed in CALCITE-4839, which is a known breaking change. Refactor HiveFilterJoinRule to use a Config interface instead of inheritance to parameterize the Hive variants. The refactored version does not use RelRule.Config.EMPTY, so it solves the compilation problem.
….EMPTY" This reverts commit 844a656b95f34607a61eaaa6c0900458c89c8d6c.
RelRule.Config.EMPTY has been removed in CALCITE-4839, which is a known breaking change. Add a new parameter to the constructor of HiveFilterJoinRule and leave the problem of initializing the configuration to the subclasses. There are other ways to deal with the problem (see the two previous commits) but this is the one with minimal changes to the code and the easiest to review.
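The configuration-over-inheritance idea behind these commits can be sketched in plain Java. This is a hypothetical, self-contained illustration; the names (FilterJoinRuleSketch, Config, smart) are illustrative and not the actual Hive/Calcite classes:

```java
// Sketch: instead of subclassing a rule (or relying on the removed
// RelRule.Config.EMPTY), rule variants are parameterized by an immutable
// Config object. Variants then differ only in config values.
public class FilterJoinRuleSketch {

  /** Carries the rule's parameters; "with" methods produce modified copies. */
  public interface Config {
    boolean smart();                   // illustrative parameter
    Config withSmart(boolean smart);

    static Config of(boolean smart) {
      return new Config() {
        @Override public boolean smart() { return smart; }
        @Override public Config withSmart(boolean s) { return Config.of(s); }
      };
    }
  }

  public static final class FilterJoinRule {
    final Config config;
    public FilterJoinRule(Config config) { this.config = config; }
    public String describe() { return "FilterJoinRule(smart=" + config.smart() + ")"; }
  }

  public static void main(String[] args) {
    // Two "variants" built from configs rather than from subclasses.
    FilterJoinRule plain = new FilterJoinRule(Config.of(false));
    FilterJoinRule smart = new FilterJoinRule(plain.config.withSmart(true));
    System.out.println(plain.describe());  // FilterJoinRule(smart=false)
    System.out.println(smart.describe());  // FilterJoinRule(smart=true)
  }
}
```

The same pattern is what Calcite moved to after removing Config.EMPTY: a rule's constructor takes a config, and each variant supplies its own default instance.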
…ass constructor The constructor was accidentally removed by CALCITE-4787 along with a bunch of other changes around Config.EMPTY. Adapt the rule by passing a config, starting off from JDBCExtractJoinFilterRule.
Documented breaking change CALCITE-4830
…Calcite Simplification becomes possible due to the Calcite upgrade.
… change The signature of the ProjectFactory#createProject method was changed in CALCITE-4199, adding upper-bounded wildcards (? extends) for fieldNames.
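The shape of that breaking change can be shown with a tiny self-contained example. The interface and method names below are illustrative stand-ins, not the real Calcite ProjectFactory:

```java
import java.util.Arrays;
import java.util.List;

// Widening a parameter from List<String> to List<? extends String> changes
// the signature that implementors must declare: existing overrides written
// against the old signature no longer compile until they are adapted.
public class WildcardSignatureSketch {

  interface ProjectFactoryLike {
    // Post-change signature: callers may pass any List of a String subtype
    // (e.g. an ImmutableList<String>); overrides must match it exactly.
    String createProject(List<? extends String> fieldNames);
  }

  public static void main(String[] args) {
    ProjectFactoryLike factory = fieldNames -> String.join(",", fieldNames);
    System.out.println(factory.createProject(Arrays.asList("a", "b")));  // prints a,b
  }
}
```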
…liDriver The error occurs because we are using an old version of Avatica which is incompatible with Calcite 1.32.0. java.lang.NoSuchFieldError: BACK_TICK_BACKSLASH at org.apache.calcite.config.Lex.<clinit>(Lex.java:38) at org.apache.calcite.config.CalciteConnectionProperty.<clinit>(CalciteConnectionProperty.java:80) at org.apache.calcite.tools.Frameworks.withPrepare(Frameworks.java:181) at org.apache.calcite.tools.Frameworks.withPlanner(Frameworks.java:135) at org.apache.hadoop.hive.ql.parse.CalcitePlanner.logicalPlan(CalcitePlanner.java:1340) at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:567) at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:12739) at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:460) at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:317) at org.apache.hadoop.hive.ql.Compiler.analyze(Compiler.java:224) at org.apache.hadoop.hive.ql.Compiler.compile(Compiler.java:106) at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:522) at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:474) at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:439) at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:433) at org.apache.hadoop.hive.ql.reexec.ReExecDriver.compileAndRespond(ReExecDriver.java:121) at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:227) at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:255) at org.apache.hadoop.hive.cli.CliDriver.processCmd1(CliDriver.java:200) at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:126) at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:421) at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:352) at org.apache.hadoop.hive.ql.QTestUtil.executeClientInternal(QTestUtil.java:727) at 
org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:697) at org.apache.hadoop.hive.cli.control.CoreCliDriver.runTest(CoreCliDriver.java:114) at org.apache.hadoop.hive.cli.control.CliAdapter.runTest(CliAdapter.java:157) at org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver(TestMiniLlapLocalCliDriver.java:62) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59) at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56) at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17) at org.apache.hadoop.hive.cli.control.CliAdapter$2$1.evaluate(CliAdapter.java:135) at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306) at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100) at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366) at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103) at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63) at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331) at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79) at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329) at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66) at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293) at org.junit.runners.ParentRunner.run(ParentRunner.java:413) at org.junit.runners.Suite.runChild(Suite.java:128) at org.junit.runners.Suite.runChild(Suite.java:27) at 
org.junit.runners.ParentRunner$4.run(ParentRunner.java:331) at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79) at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329) at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66) at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293) at org.apache.hadoop.hive.cli.control.CliAdapter$1$1.evaluate(CliAdapter.java:95) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306) at org.junit.runners.ParentRunner.run(ParentRunner.java:413) at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365) at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273) at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238) at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:377) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:138) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:465) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:451)
…hen running TestMiniLlapLocalCliDriver The error occurs because we are using an old version of Janino which is incompatible with Calcite 1.32.0. org.apache.hive.com.google.common.util.concurrent.ExecutionError: java.lang.NoSuchMethodError: org.codehaus.commons.compiler.CompilerFactoryFactory.getDefaultCompilerFactory(Ljava/lang/ClassLoader;)Lorg/codehaus/commons/compiler/ICompilerFactory; at org.apache.hive.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2205) at org.apache.hive.com.google.common.cache.LocalCache.get(LocalCache.java:3953) at org.apache.hive.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3957) at org.apache.hive.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4875) at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.revise(JaninoRelMetadataProvider.java:189) at org.apache.calcite.rel.metadata.RelMetadataQueryBase.revise(RelMetadataQueryBase.java:118) at org.apache.calcite.rel.metadata.RelMetadataQuery.isVisibleInExplain(RelMetadataQuery.java:885) at org.apache.calcite.rel.externalize.RelWriterImpl.explain_(RelWriterImpl.java:67) at org.apache.calcite.rel.externalize.RelWriterImpl.done(RelWriterImpl.java:151) at org.apache.calcite.rel.AbstractRelNode.explain(AbstractRelNode.java:252) at org.apache.calcite.plan.RelOptUtil.toString(RelOptUtil.java:2378) at org.apache.calcite.plan.RelOptUtil.toString(RelOptUtil.java:2361) at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1663) at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1588) at org.apache.calcite.tools.Frameworks.lambda$withPlanner$0(Frameworks.java:140) at org.apache.calcite.prepare.CalcitePrepareImpl.perform(CalcitePrepareImpl.java:933) at org.apache.calcite.tools.Frameworks.withPrepare(Frameworks.java:191) at org.apache.calcite.tools.Frameworks.withPlanner(Frameworks.java:135) at 
org.apache.hadoop.hive.ql.parse.CalcitePlanner.logicalPlan(CalcitePlanner.java:1340) at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:567) at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:12739) at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:460) at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:317) at org.apache.hadoop.hive.ql.Compiler.analyze(Compiler.java:224) at org.apache.hadoop.hive.ql.Compiler.compile(Compiler.java:106) at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:522) at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:474) at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:439) at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:433) at org.apache.hadoop.hive.ql.reexec.ReExecDriver.compileAndRespond(ReExecDriver.java:121) at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:227) at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:255) at org.apache.hadoop.hive.cli.CliDriver.processCmd1(CliDriver.java:200) at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:126) at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:421) at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:352) at org.apache.hadoop.hive.ql.QTestUtil.executeClientInternal(QTestUtil.java:727) at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:697) at org.apache.hadoop.hive.cli.control.CoreCliDriver.runTest(CoreCliDriver.java:114) at org.apache.hadoop.hive.cli.control.CliAdapter.runTest(CliAdapter.java:157) at org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver(TestMiniLlapLocalCliDriver.java:62) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59) at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56) at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17) at org.apache.hadoop.hive.cli.control.CliAdapter$2$1.evaluate(CliAdapter.java:135) at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306) at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100) at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366) at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103) at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63) at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331) at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79) at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329) at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66) at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293) at org.junit.runners.ParentRunner.run(ParentRunner.java:413) at org.junit.runners.Suite.runChild(Suite.java:128) at org.junit.runners.Suite.runChild(Suite.java:27) at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331) at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79) at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329) at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66) at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293) at org.apache.hadoop.hive.cli.control.CliAdapter$1$1.evaluate(CliAdapter.java:95) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306) at 
org.junit.runners.ParentRunner.run(ParentRunner.java:413) at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365) at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273) at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238) at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:377) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:138) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:465) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:451) Caused by: java.lang.NoSuchMethodError: org.codehaus.commons.compiler.CompilerFactoryFactory.getDefaultCompilerFactory(Ljava/lang/ClassLoader;)Lorg/codehaus/commons/compiler/ICompilerFactory; at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.compile(JaninoRelMetadataProvider.java:154) at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.generateCompileAndInstantiate(JaninoRelMetadataProvider.java:138) at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.lambda$static$0(JaninoRelMetadataProvider.java:72) at org.apache.hive.com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:149) at org.apache.hive.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3542) at org.apache.hive.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2323) at org.apache.hive.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2286) at org.apache.hive.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2201) ... 79 more
….EMPTY Documented breaking change CALCITE-4830. Add a base Hive rule config to use as an extension point for creating the prune-empty rule variants in Hive, since it is not possible (without ugly casts) to derive a config from the respective Calcite classes.
…eExpressionsRule.Config Documented breaking change CALCITE-4839 removing deprecated Config interfaces.
…n ProjectFactory interface Undocumented breaking change from CALCITE-5127. For the moment, just throw an exception if someone tries to set variables in Project, since Hive does not know how to handle them.
The query runs for a very long time and eventually fails with StackOverflowError. jstack (attached at the end) shows that the query is stuck in a very deep call to RelMdPredicates.getPredicates. Analysing the CBO logs, we can see that the only rule that fires is HivePreFilteringRule. The rule constantly adds a new HiveFilter till the query plan becomes huge and eventually crashes due to a very deep stack.

It turns out that a simplification added in CALCITE-5036, which turns OR expressions into SEARCH, is triggering the problem. Example: OR(=($76, 3), =($76, 1)) => SEARCH($76, Sarg[1, 3]). In HiveCalciteUtil#getPredsNotPushedAlready, we compare simplified with non-simplified predicates and thus fail to detect the equivalence between OR and SEARCH.

A quick workaround, not sure if it is going to be the final solution, is to also simplify the input to getPredsNotPushedAlready. This is sufficient to overcome the StackOverflowError reported here. Worth noting that HivePreFilteringRule seems buggy in general, so there might be more and better solutions to this problem.
java.lang.Thread.State: RUNNABLE at java.util.TreeMap.compare(TreeMap.java:1294) at java.util.TreeMap.put(TreeMap.java:538) at org.apache.hive.com.google.common.collect.TreeRangeSet.replaceRangeWithSameLowerBound(TreeRangeSet.java:272) at org.apache.hive.com.google.common.collect.TreeRangeSet.add(TreeRangeSet.java:222) at org.apache.hive.com.google.common.collect.RangeSet.addAll(RangeSet.java:225) at org.apache.hive.com.google.common.collect.AbstractRangeSet.addAll(AbstractRangeSet.java:64) at org.apache.hive.com.google.common.collect.TreeRangeSet.addAll(TreeRangeSet.java:41) at org.apache.calcite.rex.RexSimplify$RexSargBuilder.addSarg(RexSimplify.java:3056) at org.apache.calcite.rex.RexSimplify$SargCollector.accept2b(RexSimplify.java:2894) at org.apache.calcite.rex.RexSimplify$SargCollector.accept2(RexSimplify.java:2812) at org.apache.calcite.rex.RexSimplify$SargCollector.accept_(RexSimplify.java:2793) at org.apache.calcite.rex.RexSimplify$SargCollector.accept(RexSimplify.java:2778) at org.apache.calcite.rex.RexSimplify$SargCollector.access$400(RexSimplify.java:2761) at org.apache.calcite.rex.RexSimplify.lambda$simplifyAnd$3(RexSimplify.java:1488) at org.apache.calcite.rex.RexSimplify$$Lambda$1099/1247334493.accept(Unknown Source) at java.util.ArrayList.forEach(ArrayList.java:1259) at org.apache.calcite.rex.RexSimplify.simplifyAnd(RexSimplify.java:1488) at org.apache.calcite.rex.RexSimplify.simplify(RexSimplify.java:279) at org.apache.calcite.rex.RexSimplify.simplifyUnknownAs(RexSimplify.java:248) at org.apache.calcite.rex.RexSimplify.simplify(RexSimplify.java:223) at org.apache.calcite.rel.metadata.RelMdPredicates.getPredicates(RelMdPredicates.java:299) at org.apache.calcite.rel.metadata.janino.GeneratedMetadata_PredicatesHandler.getPredicates_$(Unknown Source) at org.apache.calcite.rel.metadata.janino.GeneratedMetadata_PredicatesHandler.getPredicates(Unknown Source) at 
org.apache.calcite.rel.metadata.RelMetadataQuery.getPulledUpPredicates(RelMetadataQuery.java:841) at org.apache.calcite.rel.metadata.RelMdPredicates.getPredicates(RelMdPredicates.java:292) ... at org.apache.calcite.rel.metadata.janino.GeneratedMetadata_PredicatesHandler.getPredicates_$(Unknown Source) at org.apache.calcite.rel.metadata.janino.GeneratedMetadata_PredicatesHandler.getPredicates(Unknown Source) at org.apache.calcite.rel.metadata.RelMetadataQuery.getPulledUpPredicates(RelMetadataQuery.java:841) at org.apache.calcite.rel.metadata.RelMdPredicates.getPredicates(RelMdPredicates.java:292)
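The failed equivalence check described above can be modeled in a few lines of plain Java (not the actual Calcite API): a disjunction of equalities and a SEARCH over the same points accept exactly the same values, so comparing their raw forms misses the equivalence, while comparing a canonical form (here, the sorted point set) detects it. That is the essence of simplifying both inputs before the comparison:

```java
import java.util.Arrays;
import java.util.List;
import java.util.SortedSet;
import java.util.TreeSet;

// Toy model: represent a point-lookup predicate by the list of points it
// accepts. OR(=($76, 3), =($76, 1)) and SEARCH($76, Sarg[1, 3]) are the
// same predicate, but their syntactic forms differ.
public class PredicateEquivalenceSketch {

  /** Canonical ("simplified") form: the set of accepted points. */
  public static SortedSet<Integer> canonicalize(List<Integer> points) {
    return new TreeSet<>(points);
  }

  public static void main(String[] args) {
    List<Integer> orForm = Arrays.asList(3, 1);    // OR(=($76, 3), =($76, 1))
    List<Integer> sargForm = Arrays.asList(1, 3);  // SEARCH($76, Sarg[1, 3])

    // A syntactic comparison of the raw forms misses the equivalence...
    System.out.println(orForm.equals(sargForm));                              // false
    // ...but canonicalizing both sides first detects it.
    System.out.println(canonicalize(orForm).equals(canonicalize(sargForm)));  // true
  }
}
```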
…y13.q due to the presence of Sarg literals The compilation fails in the HiveFilterSortPredicates rule since we are trying to cast a Sarg to an NlsString. Sarg literals were introduced in CALCITE-4173 (1.26.0) and there are various places in the code (both in Hive and Calcite) where we cannot deal with them. One way to overcome this, along with a few other problems related to Sargs, is to expand the SEARCH operator back into classic conjunctions and disjunctions. The HiveSearchExpandRule is a quick POC (definitely needs revisiting and polishing) covering some frequent cases. java.lang.ClassCastException: org.apache.calcite.util.Sarg cannot be cast to org.apache.calcite.util.NlsString at org.apache.calcite.rel.metadata.RelMdSize.typeValueSize(RelMdSize.java:392) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveFilterSortPredicates$RexFunctionCost.visitCall(HiveFilterSortPredicates.java:210) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveFilterSortPredicates$RexFunctionCost.visitCall(HiveFilterSortPredicates.java:189) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rex.RexCall.accept(RexCall.java:189) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveFilterSortPredicates$RexSortPredicatesShuttle.costPerTuple(HiveFilterSortPredicates.java:179) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveFilterSortPredicates$RexSortPredicatesShuttle.rankingAnd(HiveFilterSortPredicates.java:153) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveFilterSortPredicates$RexSortPredicatesShuttle.lambda$visitCall$0(HiveFilterSortPredicates.java:117) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[?:1.8.0_261] at 
java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948) ~[?:1.8.0_261] at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482) ~[?:1.8.0_261] at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472) ~[?:1.8.0_261] at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[?:1.8.0_261] at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:1.8.0_261] at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499) ~[?:1.8.0_261] at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveFilterSortPredicates$RexSortPredicatesShuttle.visitCall(HiveFilterSortPredicates.java:120) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveFilterSortPredicates$RexSortPredicatesShuttle.visitCall(HiveFilterSortPredicates.java:101) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rex.RexCall.accept(RexCall.java:189) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveFilterSortPredicates.rewriteFilter(HiveFilterSortPredicates.java:76) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveFilterSortPredicates.visit(HiveFilterSortPredicates.java:62) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.reloperators.HiveFilter.accept(HiveFilter.java:96) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visitChild(RelShuttleImpl.java:57) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visitChildren(RelShuttleImpl.java:71) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visit(RelShuttleImpl.java:141) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at 
org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveFilterSortPredicates.visit(HiveFilterSortPredicates.java:60) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.reloperators.HiveProject.accept(HiveProject.java:136) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visitChild(RelShuttleImpl.java:57) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visitChildren(RelShuttleImpl.java:71) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visit(RelShuttleImpl.java:141) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveFilterSortPredicates.visit(HiveFilterSortPredicates.java:60) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.reloperators.HiveJoin.accept(HiveJoin.java:231) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visitChild(RelShuttleImpl.java:57) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visitChildren(RelShuttleImpl.java:71) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visit(RelShuttleImpl.java:141) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveFilterSortPredicates.visit(HiveFilterSortPredicates.java:60) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.reloperators.HiveJoin.accept(HiveJoin.java:231) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visitChild(RelShuttleImpl.java:57) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visitChildren(RelShuttleImpl.java:71) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visit(RelShuttleImpl.java:141) 
~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveFilterSortPredicates.visit(HiveFilterSortPredicates.java:60) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.reloperators.HiveJoin.accept(HiveJoin.java:231) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visitChild(RelShuttleImpl.java:57) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visitChildren(RelShuttleImpl.java:71) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visit(RelShuttleImpl.java:141) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveFilterSortPredicates.visit(HiveFilterSortPredicates.java:60) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.reloperators.HiveAggregate.accept(HiveAggregate.java:137) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visitChild(RelShuttleImpl.java:57) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visitChildren(RelShuttleImpl.java:71) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.rel.RelShuttleImpl.visit(RelShuttleImpl.java:141) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveFilterSortPredicates.visit(HiveFilterSortPredicates.java:60) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.optimizer.calcite.reloperators.HiveProject.accept(HiveProject.java:136) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1742) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1590) 
~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.tools.Frameworks.lambda$withPlanner$0(Frameworks.java:140) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.prepare.CalcitePrepareImpl.perform(CalcitePrepareImpl.java:936) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.tools.Frameworks.withPrepare(Frameworks.java:191) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.calcite.tools.Frameworks.withPlanner(Frameworks.java:135) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.parse.CalcitePlanner.logicalPlan(CalcitePlanner.java:1342) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:569) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:12816) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:464) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:326) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:180) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:326) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.Compiler.analyze(Compiler.java:223) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.Compiler.compile(Compiler.java:106) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:518) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:470) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at 
org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:435) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:429) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.reexec.ReExecDriver.compileAndRespond(ReExecDriver.java:121) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:227) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
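For the simple point-Sarg case, the expansion the POC performs can be sketched as follows. This is a hypothetical string-level illustration, not the actual HiveSearchExpandRule, which operates on RexNode trees:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Sketch: rewrite SEARCH(col, Sarg[v1, ..., vn]) back into the classic
// disjunction OR(=(col, v1), ..., =(col, vn)) so downstream code that
// cannot handle Sarg literals (e.g. the NlsString cast in
// RelMdSize#typeValueSize above) never sees them.
public class SearchExpandSketch {

  public static String expandPointSarg(String column, List<Integer> points) {
    String disjuncts = points.stream()
        .map(v -> "=(" + column + ", " + v + ")")
        .collect(Collectors.joining(", "));
    // A single point needs no enclosing OR.
    return points.size() == 1 ? disjuncts : "OR(" + disjuncts + ")";
  }

  public static void main(String[] args) {
    // SEARCH($76, Sarg[1, 3]) expands to the classic disjunction:
    System.out.println(expandPointSarg("$76", Arrays.asList(1, 3)));
    // prints OR(=($76, 1), =($76, 3))
  }
}
```

Range endpoints in a Sarg would expand analogously into comparisons (>=, <=) joined by AND; the POC covers only some frequent cases.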
…SqlKind.IN anymore
If the point lookup optimizations, along with HiveSearchExpandRule and HiveReduceExpressionsRule, are part of the same program, we get into a recursion where HiveSearchExpandRule expands SEARCH operators and then HiveReduceExpressionsRule converts the operands back to SEARCH. Eventually we get a stack overflow. The idea of this patch is to break them up: first reduce all the expressions, and then, on the optimized basePlan, apply HiveSearchExpandRules and HivePointLookupOptimizerRules.
soumyakanti3578 force-pushed the calcite-upgrade-1.33-no-sarg branch from d694c4f to 4c6d786 on April 23, 2024 23:47
…R. However, the getKind() method for HiveIN will return SqlKind.IN
asf-ci-hive added the tests pending and tests failed labels and removed the tests failed and tests pending labels on Apr 30, 2024
Fixes:

org.apache.hadoop.hive.ql.parse.SemanticException: Line 0:-1 Invalid numerical constant 'Sarg[[100..102]]'
    at org.apache.hadoop.hive.ql.parse.type.TypeCheckProcFactory$NumExprProcessor.process(TypeCheckProcFactory.java:364)
    at org.apache.hadoop.hive.ql.lib.CostLessRuleDispatcher.dispatch(CostLessRuleDispatcher.java:66)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:105)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:89)
    at org.apache.hadoop.hive.ql.lib.ExpressionWalker.walk(ExpressionWalker.java:101)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:120)
    at org.apache.hadoop.hive.ql.parse.type.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:231)
    at org.apache.hadoop.hive.ql.parse.type.ExprNodeTypeCheck.genExprNode(ExprNodeTypeCheck.java:49)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genAllExprNodeDesc(SemanticAnalyzer.java:13614)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:13569)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:13537)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperatorChildren(SemanticAnalyzer.java:9727)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:10008)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:9935)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:10215)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:12434)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:12310)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:640)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:13178)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:467)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
    at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:180)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
    at org.apache.hadoop.hive.ql.Compiler.analyze(Compiler.java:224)
    at org.apache.hadoop.hive.ql.Compiler.compile(Compiler.java:107)

Tested with:

mvn test -pl itests/qtest -Pitests -Dtest=TestCliDriver -Dtest.output.overwrite=true -Dqfile="smb_mapjoin_46.q"
…ive.optimize.point.lookup

Fixes:

org.apache.hadoop.hive.ql.parse.SemanticException: Line 0:-1 Invalid function 'SEARCH'
    at org.apache.hadoop.hive.ql.parse.type.TypeCheckProcFactory$DefaultExprProcessor.getXpathOrFuncExprNodeDesc(TypeCheckProcFactory.java:950)
    at org.apache.hadoop.hive.ql.parse.type.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:1481)
    at org.apache.hadoop.hive.ql.lib.CostLessRuleDispatcher.dispatch(CostLessRuleDispatcher.java:66)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:105)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:89)
    at org.apache.hadoop.hive.ql.lib.ExpressionWalker.walk(ExpressionWalker.java:101)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:120)
    at org.apache.hadoop.hive.ql.parse.type.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:231)
    at org.apache.hadoop.hive.ql.parse.type.ExprNodeTypeCheck.genExprNode(ExprNodeTypeCheck.java:49)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genAllExprNodeDesc(SemanticAnalyzer.java:13614)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:13569)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:13537)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:3776)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:3756)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:11456)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:12444)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:12310)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:640)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:13178)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:467)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
    at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:180)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)

Tested with:

mvn test -pl itests/qtest -Pitests -Dtest=TestMiniLlapLocalCliDriver -Dtest.output.overwrite=true -Dqfile="in_typecheck_char.q"
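Background for both failures above: since Calcite 1.26, the planner folds range and IN predicates into an internal SEARCH(col, Sarg[...]) call, which Hive's TypeCheckProcFactory does not recognize, hence the "Invalid numerical constant 'Sarg[...]'" and "Invalid function 'SEARCH'" errors. Conceptually, undoing a Sarg means turning each range in its range set back into plain comparisons. The toy sketch below illustrates only the idea with JDK-only code; all names are illustrative, not the Hive or Calcite API (the real expansion lives in Calcite's RexUtil):

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of expanding SEARCH(col, Sarg[[100..102], [1..1]]) back into
// plain comparisons, which is what this PR does before handing expressions
// to Hive. Illustrative names only, not Hive/Calcite API.
public class SargExpansionSketch {

  /** A closed interval [lo..hi]; lo == hi models a single point. */
  record Range(long lo, long hi) {}

  /** Expands a list of ranges over one column into an OR of comparisons. */
  static String expand(String col, List<Range> ranges) {
    List<String> terms = new ArrayList<>();
    for (Range r : ranges) {
      if (r.lo() == r.hi()) {
        terms.add(col + " = " + r.lo()); // point range -> equality
      } else {
        terms.add("(" + col + " >= " + r.lo() + " AND " + col + " <= " + r.hi() + ")");
      }
    }
    return String.join(" OR ", terms);
  }

  public static void main(String[] args) {
    // Sarg[[100..102]] from the first stack trace above:
    System.out.println(expand("key", List.of(new Range(100, 102))));
    // A Sarg mixing a point and a range:
    System.out.println(expand("key", List.of(new Range(1, 1), new Range(10, 20))));
  }
}
```

In real Calcite terms, the analogous operation is RexUtil.expandSearch, which rewrites a SEARCH call into the equivalent AND/OR of comparisons.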
…otIdPredicate

This was failing due to a mismatch in the nullability of SNAPSHOT__ID between the filter condition and its input, the TableScan. HiveAugmentSnapshotMaterializationRule.snapshotIdType(TYPE_FACTORY) returns a NOT NULL type, whereas the createTableType method of TestRuleBase creates nullable virtual columns:

    for (VirtualColumn virtualColumn : virtualColumns) {
      columnNames.add(virtualColumn.getName());
      schema.add(TypeConverter.convert(virtualColumn.getTypeInfo(), TYPE_FACTORY));
    }

TypeConverter.convert(TypeInfo type, RelDataTypeFactory dtFactory) creates nullable types, so this can also be fixed by using the overloaded convert method:

    convert(TypeInfo type, boolean nullable, RelDataTypeFactory dtFactory)

Tested with:

mvn test -Dtest=org.apache.hadoop.hive.ql.optimizer.calcite.rules.views.TestHivePushdownSnapshotFilterRule#testFilterIsRemovedAndVersionIntervalFromIsSetWhenFilterHasSnapshotIdPredicate
In Calcite 1.25, Filter.java checked isValid only when DEBUG mode was enabled; in 1.33, isValid is always checked. This test case was incorrectly assigning an INT value to a VARCHAR field. Even after fixing that, it still failed because of nullability, as in the previous commit.

Tested with:

mvn test -Dtest=org.apache.hadoop.hive.ql.optimizer.calcite.rules.views.TestHivePushdownSnapshotFilterRule#testFilterLeftIntactWhenItDoesNotHaveSnapshotIdPredicate
It was failing because of Sarg with:

java.lang.AssertionError: cannot convert SARG literal to class java.lang.Long
    at org.apache.calcite.rex.RexLiteral.getValueAs(RexLiteral.java:1143)
    at org.apache.hadoop.hive.ql.optimizer.calcite.rules.views.HivePushdownSnapshotFilterRule$SnapshotIdShuttle.setSnapShotId(HivePushdownSnapshotFilterRule.java:121)
    at org.apache.hadoop.hive.ql.optimizer.calcite.rules.views.HivePushdownSnapshotFilterRule$SnapshotIdShuttle.visitCall(HivePushdownSnapshotFilterRule.java:104)
    at org.apache.hadoop.hive.ql.optimizer.calcite.rules.views.HivePushdownSnapshotFilterRule$SnapshotIdShuttle.visitCall(HivePushdownSnapshotFilterRule.java:88)
    at org.apache.calcite.rex.RexCall.accept(RexCall.java:189)
    at org.apache.hadoop.hive.ql.optimizer.calcite.rules.views.HivePushdownSnapshotFilterRule.onMatch(HivePushdownSnapshotFilterRule.java:84)
    at org.apache.calcite.plan.AbstractRelOptPlanner.fireRule(AbstractRelOptPlanner.java:337)

On my local machine it also complained about:

Exception in thread "main" java.lang.NoClassDefFoundError: org/locationtech/jts/geom/Geometry
    at org.apache.calcite.sql.type.JavaToSqlTypeConversionRules.<init>(JavaToSqlTypeConversionRules.java:74)
    at org.apache.calcite.sql.type.JavaToSqlTypeConversionRules.<clinit>(JavaToSqlTypeConversionRules.java:41)
    at org.apache.calcite.jdbc.JavaTypeFactoryImpl.createType(JavaTypeFactoryImpl.java:153)

which is similar to the init-metastore failures in upstream Ptests. I added org.locationtech.jts:jts-core:1.19.0 to ql/pom.xml, but I'm not sure if that's the best place for it.

Tested with:

mvn test -Dtest=org.apache.hadoop.hive.ql.txn.compactor.TestMaterializedViewRebuild#testSecondRebuildCanBeIncrementalAfterMajorCompaction
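The AssertionError above comes from calling RexLiteral.getValueAs(Long.class) on a literal that now wraps a Sarg instead of a plain number. When such a Sarg is a single point [x..x], the snapshot id is still recoverable from the range bounds. The JDK-only sketch below shows one way that guard could look; the types and names here are illustrative stand-ins, not the actual Calcite or Hive API, and this is not necessarily how the PR implements the fix:

```java
import java.util.Optional;

// Toy model of guarding a literal read once literals may wrap a Sarg:
// only treat the payload as a Long when it is a plain number or a
// single-point range [x..x]. Illustrative names only.
public class SnapshotIdGuardSketch {

  /** Minimal stand-in for a Sarg made of one closed range. */
  record PointRange(long lo, long hi) {}

  /** Returns the snapshot id only for shapes that denote a single value. */
  static Optional<Long> snapshotId(Object literalValue) {
    if (literalValue instanceof Number n) {
      return Optional.of(n.longValue());          // pre-1.33 shape: plain literal
    }
    if (literalValue instanceof PointRange r && r.lo() == r.hi()) {
      return Optional.of(r.lo());                 // Sarg-like shape: point [x..x]
    }
    return Optional.empty();                      // anything else: not a snapshot id
  }

  public static void main(String[] args) {
    System.out.println(snapshotId(42L));
    System.out.println(snapshotId(new PointRange(7, 7)));
    System.out.println(snapshotId(new PointRange(1, 5)));
  }
}
```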
Fixes:

java.lang.ClassCastException: org.apache.calcite.util.Sarg cannot be cast to java.lang.Number
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter.toExprNodeConstantDesc(ExprNodeConverter.java:332)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter.visitLiteral(ExprNodeConverter.java:246)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter.visitLiteral(ExprNodeConverter.java:101)
    at org.apache.calcite.rex.RexLiteral.accept(RexLiteral.java:1217)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter.visitCall(ExprNodeConverter.java:208)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter.visitCall(ExprNodeConverter.java:101)
    at org.apache.calcite.rex.RexCall.accept(RexCall.java:189)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter.visitCall(ExprNodeConverter.java:208)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter.visitCall(ExprNodeConverter.java:101)
    at org.apache.calcite.rex.RexCall.accept(RexCall.java:189)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.opconventer.HiveFilterVisitor.visit(HiveFilterVisitor.java:51)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.opconventer.HiveOpConverter.dispatch(HiveOpConverter.java:107)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.opconventer.HiveProjectVisitor.visit(HiveProjectVisitor.java:62)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.opconventer.HiveOpConverter.dispatch(HiveOpConverter.java:97)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.opconventer.HiveSortExchangeVisitor.visit(HiveSortExchangeVisitor.java:41)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.opconventer.HiveOpConverter.dispatch(HiveOpConverter.java:113)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.opconventer.JoinVisitor.visit(JoinVisitor.java:77)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.opconventer.HiveOpConverter.dispatch(HiveOpConverter.java:101)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.opconventer.HiveProjectVisitor.visit(HiveProjectVisitor.java:62)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.opconventer.HiveOpConverter.dispatch(HiveOpConverter.java:97)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.opconventer.HiveOpConverter.convert(HiveOpConverter.java:87)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.getOptimizedHiveOPDag(CalcitePlanner.java:1418)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:582)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:13178)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:467)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
    at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:180)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
    at org.apache.hadoop.hive.ql.Compiler.analyze(Compiler.java:224)

Tested with:

mvn test -pl itests/qtest -Pitests -Dtest=TestMiniLlapLocalCliDriver -Dtest.output.overwrite=true -Dqfile="cbo_rp_outer_join_ppr.q"
…ition in HivePartitionPruneRule

Fixes:

java.lang.ClassCastException: org.apache.calcite.util.Sarg cannot be cast to java.lang.Number
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter.toExprNodeConstantDesc(ExprNodeConverter.java:332)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter.visitLiteral(ExprNodeConverter.java:246)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter.visitLiteral(ExprNodeConverter.java:101)
    at org.apache.calcite.rex.RexLiteral.accept(RexLiteral.java:1217)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter.visitCall(ExprNodeConverter.java:208)
    at org.apache.hadoop.hive.ql.optimizer.calcite.translator.ExprNodeConverter.visitCall(ExprNodeConverter.java:101)
    at org.apache.calcite.rex.RexCall.accept(RexCall.java:189)
    at org.apache.hadoop.hive.ql.optimizer.calcite.RelOptHiveTable.computePartitionList(RelOptHiveTable.java:477)
    at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HivePartitionPruneRule.perform(HivePartitionPruneRule.java:63)
    at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HivePartitionPruneRule.onMatch(HivePartitionPruneRule.java:46)
    at org.apache.calcite.plan.volcano.VolcanoRuleCall.onMatch(VolcanoRuleCall.java:223)
    at org.apache.calcite.plan.volcano.IterativeRuleDriver.drive(IterativeRuleDriver.java:59)
    at org.apache.calcite.plan.volcano.VolcanoPlanner.findBestExp(VolcanoPlanner.java:523)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.applyMaterializedViewRewriting(CalcitePlanner.java:2083)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1717)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1574)
    at org.apache.calcite.tools.Frameworks.lambda$withPlanner$0(Frameworks.java:140)
    at org.apache.calcite.prepare.CalcitePrepareImpl.perform(CalcitePrepareImpl.java:936)

Tested with:

mvn test -pl itests/qtest -Pitests -Dtest=TestMiniLlapLocalCliDriver -Dtest.output.overwrite=true -Dqfile="materialized_view_partitioned_2.q"
…etastore-server and beeline
What changes were proposed in this pull request?
Why are the changes needed?
Does this PR introduce any user-facing change?
Is the change a dependency upgrade?
How was this patch tested?