Posted to issues@hive.apache.org by "Hive QA (Jira)" <ji...@apache.org> on 2019/08/23 06:54:00 UTC

[jira] [Commented] (HIVE-16951) ACID Compactor, PartialScanTask, MergeFileTask, ColumnTruncateTask, HCatUtil don't close JobClient

    [ https://issues.apache.org/jira/browse/HIVE-16951?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16913996#comment-16913996 ] 

Hive QA commented on HIVE-16951:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12874746/HIVE-16951.1.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/18386/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/18386/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-18386/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2019-08-23 06:52:22.866
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-18386/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2019-08-23 06:52:22.869
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at db59ec6 HIVE-21580: Introduce ISO 8601 week numbering SQL:2016 formats (Karen Coppage via Marta Kuczora)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at db59ec6 HIVE-21580: Introduce ISO 8601 week numbering SQL:2016 formats (Karen Coppage via Marta Kuczora)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2019-08-23 06:52:23.504
+ rm -rf ../yetus_PreCommit-HIVE-Build-18386
+ mkdir ../yetus_PreCommit-HIVE-Build-18386
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-18386
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-18386/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: a/hcatalog/core/src/main/java/org/apache/hive/hcatalog/common/HCatUtil.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/merge/MergeFileTask.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/stats/PartialScanTask.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/truncate/ColumnTruncateTask.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/txn/compactor/CompactorMR.java: does not exist in index
error: ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/stats/PartialScanTask.java: does not exist in index
error: patch failed: ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/truncate/ColumnTruncateTask.java:219
Falling back to three-way merge...
Applied patch to 'ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/truncate/ColumnTruncateTask.java' cleanly.
error: patch failed: ql/src/java/org/apache/hadoop/hive/ql/txn/compactor/CompactorMR.java:280
Falling back to three-way merge...
Applied patch to 'ql/src/java/org/apache/hadoop/hive/ql/txn/compactor/CompactorMR.java' with conflicts.
error: core/src/main/java/org/apache/hive/hcatalog/common/HCatUtil.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/io/merge/MergeFileTask.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/io/rcfile/stats/PartialScanTask.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/io/rcfile/truncate/ColumnTruncateTask.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/txn/compactor/CompactorMR.java: does not exist in index
The patch does not appear to apply with p0, p1, or p2
+ result=1
+ '[' 1 -ne 0 ']'
+ rm -rf yetus_PreCommit-HIVE-Build-18386
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12874746 - PreCommit-HIVE-Build

> ACID Compactor, PartialScanTask, MergeFileTask, ColumnTruncateTask, HCatUtil don't close JobClient
> --------------------------------------------------------------------------------------------------
>
>                 Key: HIVE-16951
>                 URL: https://issues.apache.org/jira/browse/HIVE-16951
>             Project: Hive
>          Issue Type: Bug
>          Components: Transactions
>    Affects Versions: 1.2.2, 2.1.1
>            Reporter: Vaibhav Gumashta
>            Priority: Major
>         Attachments: HIVE-16951.1.patch
>
>
> When a compaction job is launched, we create a new JobClient every time we run the MR job:
> {code}
>   private void launchCompactionJob(JobConf job, Path baseDir, CompactionType compactionType,
>                                    StringableList dirsToSearch,
>                                    List<AcidUtils.ParsedDelta> parsedDeltas,
>                                    int curDirNumber, int obsoleteDirNumber, HiveConf hiveConf,
>                                    TxnStore txnHandler, long id, String jobName) throws IOException {
>     job.setBoolean(IS_MAJOR, compactionType == CompactionType.MAJOR);
>     if(dirsToSearch == null) {
>       dirsToSearch = new StringableList();
>     }
>     StringableList deltaDirs = new StringableList();
>     long minTxn = Long.MAX_VALUE;
>     long maxTxn = Long.MIN_VALUE;
>     for (AcidUtils.ParsedDelta delta : parsedDeltas) {
>       LOG.debug("Adding delta " + delta.getPath() + " to directories to search");
>       dirsToSearch.add(delta.getPath());
>       deltaDirs.add(delta.getPath());
>       minTxn = Math.min(minTxn, delta.getMinTransaction());
>       maxTxn = Math.max(maxTxn, delta.getMaxTransaction());
>     }
>     if (baseDir != null) job.set(BASE_DIR, baseDir.toString());
>     job.set(DELTA_DIRS, deltaDirs.toString());
>     job.set(DIRS_TO_SEARCH, dirsToSearch.toString());
>     job.setLong(MIN_TXN, minTxn);
>     job.setLong(MAX_TXN, maxTxn);
>     if (hiveConf.getBoolVar(HiveConf.ConfVars.HIVE_IN_TEST)) {
>       mrJob = job;
>     }
>     LOG.info("Submitting " + compactionType + " compaction job '" +
>       job.getJobName() + "' to " + job.getQueueName() + " queue.  " +
>       "(current delta dirs count=" + curDirNumber +
>       ", obsolete delta dirs count=" + obsoleteDirNumber + ". TxnIdRange[" + minTxn + "," + maxTxn + "]");
>     RunningJob rj = new JobClient(job).submitJob(job);
>     LOG.info("Submitted compaction job '" + job.getJobName() + "' with jobID=" + rj.getID() + " compaction ID=" + id);
>     txnHandler.setHadoopJobId(rj.getID().toString(), id);
>     rj.waitForCompletion();
>     if (!rj.isSuccessful()) {
>       throw new IOException(compactionType == CompactionType.MAJOR ? "Major" : "Minor" +
>           " compactor job failed for " + jobName + "! Hadoop JobId: " + rj.getID() );
>     }
>   }
> {code}
> We should close the JobClient to release the resources it holds (cached FileSystem objects, etc.).
> The same applies to the other classes listed in the summary.
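> A minimal sketch of the kind of change intended (assuming JobClient.close() releases the underlying cluster connection and cached FileSystem objects; the local variable name {{jobClient}} is only for illustration, and the final patch may look different) would be to hold the client in a variable and close it in a finally block once the job has finished:
> {code}
>     // Sketch only: create the client explicitly so it can be closed after the job completes.
>     JobClient jobClient = new JobClient(job);
>     try {
>       RunningJob rj = jobClient.submitJob(job);
>       LOG.info("Submitted compaction job '" + job.getJobName() + "' with jobID=" + rj.getID() + " compaction ID=" + id);
>       txnHandler.setHadoopJobId(rj.getID().toString(), id);
>       rj.waitForCompletion();
>       if (!rj.isSuccessful()) {
>         throw new IOException((compactionType == CompactionType.MAJOR ? "Major" : "Minor") +
>             " compactor job failed for " + jobName + "! Hadoop JobId: " + rj.getID());
>       }
>     } finally {
>       // Releases the JobClient's resources even if submission or the job itself fails.
>       jobClient.close();
>     }
> {code}
> If the Hadoop version in use exposes JobClient as AutoCloseable, try-with-resources would be an equivalent, slightly tidier alternative.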



--
This message was sent by Atlassian Jira
(v8.3.2#803003)