Posted to issues@hive.apache.org by "Hive QA (Jira)" <ji...@apache.org> on 2020/05/27 18:55:00 UTC

[jira] [Commented] (HIVE-23556) Support hive.metastore.limit.partition.request for get_partitions_ps

    [ https://issues.apache.org/jira/browse/HIVE-23556?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17118016#comment-17118016 ] 

Hive QA commented on HIVE-23556:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/13004150/HIVE-23556.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/22651/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/22651/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-22651/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2020-05-27 18:40:28.224
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-22651/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2020-05-27 18:40:28.252
+ cd apache-github-source-source
+ git fetch origin
From https://github.com/apache/hive
   f49d257..a3a25eb  master     -> origin/master
+ git reset --hard HEAD
HEAD is now at f49d257 HIVE-23547 Enforce testconfiguration.properties file format and alphabetical order (Miklos Gergely, reviewed by Laszlo Bodor)
+ git clean -f -d
Removing standalone-metastore/metastore-server/src/gen/
+ git checkout master
Already on 'master'
Your branch is behind 'origin/master' by 1 commit, and can be fast-forwarded.
  (use "git pull" to update your local branch)
+ git reset --hard origin/master
HEAD is now at a3a25eb HIVE-23488 : Optimise PartitionManagementTask::Msck::repair (Rajesh Balamohan via Ashutosh Chauhan)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2020-05-27 18:40:40.790
+ rm -rf ../yetus_PreCommit-HIVE-Build-22651
+ mkdir ../yetus_PreCommit-HIVE-Build-22651
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-22651
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-22651/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
Trying to apply the patch with -p0
Going to apply patch with: git apply -p0
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc6550287692100694868.exe, --version]
libprotoc 2.6.1
protoc-jar: executing: [/tmp/protoc6550287692100694868.exe, -I/data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-common/src/main/protobuf/org/apache/hadoop/hive/metastore, --java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-common/target/generated-sources, /data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-common/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
protoc-jar: executing: [/tmp/protoc8384485657991550447.exe, --version]
libprotoc 2.6.1
ANTLR Parser Generator  Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-server/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-server/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
ANTLR Parser Generator  Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/parser/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java does not exist: must build /data/hiveptest/working/apache-github-source-source/parser/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveLexer.g
Output file /data/hiveptest/working/apache-github-source-source/parser/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/parser/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g
org/apache/hadoop/hive/ql/parse/HiveParser.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 43 classes.
Processing annotations
Annotations processed
Output file /data/hiveptest/working/apache-github-source-source/parser/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexerStandard.java does not exist: must build /data/hiveptest/working/apache-github-source-source/parser/src/java/org/apache/hadoop/hive/ql/parse/HiveLexerStandard.g
org/apache/hadoop/hive/ql/parse/HiveLexerStandard.g
Processing annotations
No elements to process
Output file /data/hiveptest/working/apache-github-source-source/parser/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HintParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/parser/src/java/org/apache/hadoop/hive/ql/parse/HintParser.g
org/apache/hadoop/hive/ql/parse/HintParser.g
Generating vector expression code
Generating vector expression test code
Processing annotations
Annotations processed
Processing annotations
No elements to process
Processing annotations
Annotations processed
Processing annotations
No elements to process
Processing annotations
Annotations processed
Processing annotations
No elements to process
May 27, 2020 6:49:39 PM org.apache.jasper.servlet.TldScanner scanJars
INFO: At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
+ [[ -d itests ]]
+ cd itests
+ mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
[ERROR] COMPILATION ERROR : 
[ERROR] /data/hiveptest/working/apache-github-source-source/itests/hcatalog-unit/src/test/java/org/apache/hive/hcatalog/listener/DummyRawStoreFailEvent.java:[116,8] org.apache.hive.hcatalog.listener.DummyRawStoreFailEvent is not abstract and does not override abstract method getNumPartitionsByPs(java.lang.String,java.lang.String,java.lang.String,java.util.List<java.lang.String>) in org.apache.hadoop.hive.metastore.RawStore
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.1:testCompile (default-testCompile) on project hive-hcatalog-it-unit: Compilation failure
[ERROR] /data/hiveptest/working/apache-github-source-source/itests/hcatalog-unit/src/test/java/org/apache/hive/hcatalog/listener/DummyRawStoreFailEvent.java:[116,8] org.apache.hive.hcatalog.listener.DummyRawStoreFailEvent is not abstract and does not override abstract method getNumPartitionsByPs(java.lang.String,java.lang.String,java.lang.String,java.util.List<java.lang.String>) in org.apache.hadoop.hive.metastore.RawStore
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :hive-hcatalog-it-unit
+ result=1
+ '[' 1 -ne 0 ']'
+ rm -rf yetus_PreCommit-HIVE-Build-22651
+ exit 1
'
{noformat}
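The failure above is a straightforward compile break: the patch adds a new abstract method getNumPartitionsByPs to RawStore, but the test double DummyRawStoreFailEvent in itests has not been updated to implement it. A minimal sketch of the missing override is shown below, assuming the class keeps its usual pattern of delegating to its wrapped ObjectStore; the return type, thrown exceptions, and field name are assumptions, since the compiler message only reports the method name and parameter types.

{code:java}
// Hypothetical override for DummyRawStoreFailEvent; delegates to the wrapped
// ObjectStore like the class's other RawStore overrides. Return type and thrown
// exceptions are assumed, as the compiler error only reports the signature.
@Override
public int getNumPartitionsByPs(String catName, String dbName, String tblName,
    List<String> partVals) throws MetaException, NoSuchObjectException {
  return objectStore.getNumPartitionsByPs(catName, dbName, tblName, partVals);
}
{code}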

This message is automatically generated.

ATTACHMENT ID: 13004150 - PreCommit-HIVE-Build

> Support hive.metastore.limit.partition.request for get_partitions_ps
> --------------------------------------------------------------------
>
>                 Key: HIVE-23556
>                 URL: https://issues.apache.org/jira/browse/HIVE-23556
>             Project: Hive
>          Issue Type: Improvement
>            Reporter: Toshihiko Uchida
>            Assignee: Toshihiko Uchida
>            Priority: Minor
>         Attachments: HIVE-23556.patch
>
>
> HIVE-13884 added the configuration hive.metastore.limit.partition.request to limit the number of partitions that can be requested.
> Currently, it takes effect for the following MetaStore APIs:
> * get_partitions,
> * get_partitions_with_auth,
> * get_partitions_by_filter,
> * get_partitions_spec_by_filter,
> * get_partitions_by_expr,
> but not for:
> * get_partitions_ps,
> * get_partitions_ps_with_auth.
> This issue proposes applying the configuration to get_partitions_ps and get_partitions_ps_with_auth as well.
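> A rough sketch of the intended check follows, modeled on the limit enforcement HIVE-13884 added for the other APIs; the config accessor, helper calls, and placement inside HiveMetaStore are assumptions for illustration, not the committed patch.
> {code:java}
> // Illustrative only: enforce hive.metastore.limit.partition.request before
> // materializing partitions in get_partitions_ps / get_partitions_ps_with_auth.
> int max = MetastoreConf.getIntVar(conf, MetastoreConf.ConfVars.LIMIT_PARTITION_REQUEST);
> if (max > -1) {
>   // Count the matching partitions via the new RawStore method instead of
>   // fetching them all and counting afterwards.
>   int count = getMS().getNumPartitionsByPs(catName, dbName, tblName, partVals);
>   if (count > max) {
>     throw new MetaException("Number of partitions scanned (=" + count
>         + ") exceeds the partition limit (=" + max + ")");
>   }
> }
> {code}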



--
This message was sent by Atlassian Jira
(v8.3.4#803005)