Posted to dev@hive.apache.org by Murli Varadachari <mv...@facebook.com> on 2008/12/31 03:32:56 UTC
*UNIT TEST FAILURE for apache HIVE* Hadoop.Version=0.17.1 based on SVN Rev# 730295.3
Compiling hiveopensource at /usr/local/continuous_builds/src/hiveopensource-0.17.1/hiveopensource_0_17_1
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Buildfile: build.xml
clean:
clean:
[echo] Cleaning: anttasks
[delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks
clean:
[echo] Cleaning: cli
[delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli
clean:
[echo] Cleaning: common
[delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common
clean:
[echo] Cleaning: metastore
[delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore
Overriding previous definition of reference to test.classpath
clean:
[echo] Cleaning: ql
[delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql
clean:
[echo] Cleaning: serde
[delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde
clean:
[delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service
[delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist
[delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
BUILD SUCCESSFUL
Total time: 20 seconds
Buildfile: build.xml
deploy:
init:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/classes
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/jexl/classes
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test/src
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test/classes
download-ivy:
init-ivy:
settings-ivy:
resolve:
[ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 :: http://ant.apache.org/ivy/ ::
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#common;working@devbuild001.snc1.facebook.com
[ivy:retrieve] confs: [default]
[ivy:retrieve] found hadoop#core;0.17.1 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 138ms :: artifacts dl 4ms
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 1 | 0 | 0 | 0 || 1 | 0 |
---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common
[ivy:retrieve] confs: [default]
[ivy:retrieve] 1 artifacts copied, 0 already retrieved (14096kB/321ms)
install-hadoopcore:
[untar] Expanding: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1.tar.gz into /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore
[touch] Creating /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1.installed
compile:
[echo] Compiling: common
[javac] Compiling 1 source file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/classes
jar:
[echo] Jar: common
[jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/hive_common.jar
deploy:
[echo] hive: common
[copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
init:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/classes
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test/src
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test/classes
dynamic-serde:
compile:
[echo] Compiling: serde
[javac] Compiling 128 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/classes
[javac] Note: Some input files use unchecked or unsafe operations.
[javac] Note: Recompile with -Xlint:unchecked for details.
jar:
[echo] Jar: serde
[jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/hive_serde.jar
deploy:
[echo] hive: serde
[copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
init:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/classes
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/src
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/classes
model-compile:
[javac] Compiling 8 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/classes
[copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/classes
core-compile:
[echo] Compiling:
[javac] Compiling 38 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/classes
[javac] Note: Some input files use unchecked or unsafe operations.
[javac] Note: Recompile with -Xlint:unchecked for details.
model-enhance:
[echo] Enhancing model classes with JPOX stuff....
[java] JPOX Enhancer (version 1.2.2) : Enhancement of classes
[java] JPOX Enhancer completed with success for 8 classes. Timings : input=239 ms, enhance=275 ms, total=514 ms. Consult the log for full details
compile:
jar:
[echo] Jar: metastore
[jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/hive_metastore.jar
deploy:
[echo] hive: metastore
[copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
Overriding previous definition of reference to test.classpath
init:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/classes
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/classes
ql-init:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/gen-java/org/apache/hadoop/hive/ql/parse
build-grammar:
[echo] Building Grammar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/java/org/apache/hadoop/hive/ql/parse/Hive.g ....
[java] ANTLR Parser Generator Version 3.0.1 (August 13, 2007) 1989-2007
compile-ant-tasks:
init:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/classes
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/test
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/test/src
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/test/classes
download-ivy:
init-ivy:
settings-ivy:
resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve] confs: [default]
[ivy:retrieve] :: resolution report :: resolve 10ms :: artifacts dl 0ms
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 0 | 0 | 0 | 0 || 0 | 0 |
---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve] confs: [default]
[ivy:retrieve] 0 artifacts copied, 0 already retrieved (0kB/2ms)
install-hadoopcore:
compile:
[echo] Compiling: anttasks
[javac] Compiling 2 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/classes
[javac] Note: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
deploy-ant-tasks:
init:
download-ivy:
init-ivy:
settings-ivy:
resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve] confs: [default]
[ivy:retrieve] :: resolution report :: resolve 10ms :: artifacts dl 0ms
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 0 | 0 | 0 | 0 || 0 | 0 |
---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve] confs: [default]
[ivy:retrieve] 0 artifacts copied, 0 already retrieved (0kB/2ms)
install-hadoopcore:
compile:
[echo] Compiling: anttasks
jar:
[copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/classes/org/apache/hadoop/hive/ant
[jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/hive_anttasks.jar
deploy:
[echo] hive: anttasks
[copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
configure:
[copy] Copying 239 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/java
compile:
[echo] Compiling: ql
[javac] Compiling 241 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/classes
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[javac] Note: Some input files use unchecked or unsafe operations.
[javac] Note: Recompile with -Xlint:unchecked for details.
jar:
[echo] Jar: ql
[unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/lib/commons-jexl-1.1.jar into /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/jexl/classes
[unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/lib/libthrift.jar into /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/thrift/classes
[unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/lib/commons-lang-2.4.jar into /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/commons-lang/classes
[jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/hive_exec.jar
deploy:
[echo] hive: ql
[copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
init:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/classes
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test/src
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test/classes
download-ivy:
init-ivy:
settings-ivy:
resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#cli;working@devbuild001.snc1.facebook.com
[ivy:retrieve] confs: [default]
[ivy:retrieve] found hadoop#core;0.17.1 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 60ms :: artifacts dl 4ms
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 1 | 0 | 0 | 0 || 1 | 0 |
---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#cli
[ivy:retrieve] confs: [default]
[ivy:retrieve] 0 artifacts copied, 1 already retrieved (0kB/5ms)
install-hadoopcore:
compile:
[echo] Compiling: cli
[javac] Compiling 5 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/classes
[javac] Note: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/cli/src/java/org/apache/hadoop/hive/cli/OptionsProcessor.java uses unchecked or unsafe operations.
[javac] Note: Recompile with -Xlint:unchecked for details.
jar:
[echo] Jar: cli
[jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/hive_cli.jar
deploy:
[echo] hive: cli
[copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
init:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/classes
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test/src
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test/classes
core-compile:
[javac] Compiling 6 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/classes
compile:
jar:
[echo] Jar: service
[jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/hive_service.jar
deploy:
[echo] hive: service
[copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
package:
[echo] Deploying Hive jars to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/conf
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/bin
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/examples
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/examples/files
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/examples/queries
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib/py
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib/php
[copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/bin
[copy] Copying 5 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/bin/ext
[copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/bin
[copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/conf
[copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/conf
[copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/conf
[copy] Copying 6 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib/php
[copy] Copying 12 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib/py
[copy] Copied 3 empty directories to 1 empty directory under /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib/py
[copy] Copying 35 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib
[copy] Copying 16 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/examples/files
[copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist
[copy] Copying 41 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/examples/queries
BUILD SUCCESSFUL
Total time: 42 seconds
RUNNING TEST FOR HIVE OPENSOURCE - ant test
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Buildfile: build.xml
clean-test:
clean-test:
[delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test
clean-test:
[delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test
clean-test:
[delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test
Overriding previous definition of reference to test.classpath
clean-test:
[delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test
clean-test:
[delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test
clean-test:
[delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test
BUILD SUCCESSFUL
Total time: 1 second
Buildfile: build.xml
clean-test:
clean-test:
clean-test:
clean-test:
Overriding previous definition of reference to test.classpath
clean-test:
clean-test:
clean-test:
deploy:
init:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test/src
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test/classes
download-ivy:
init-ivy:
settings-ivy:
resolve:
[ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 :: http://ant.apache.org/ivy/ ::
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#common;working@devbuild001.snc1.facebook.com
[ivy:retrieve] confs: [default]
[ivy:retrieve] found hadoop#core;0.17.1 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 90ms :: artifacts dl 3ms
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 1 | 0 | 0 | 0 || 1 | 0 |
---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common
[ivy:retrieve] confs: [default]
[ivy:retrieve] 0 artifacts copied, 1 already retrieved (0kB/7ms)
install-hadoopcore:
compile:
[echo] Compiling: common
jar:
[echo] Jar: common
deploy:
[echo] hive: common
init:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test/src
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test/classes
dynamic-serde:
compile:
[echo] Compiling: serde
jar:
[echo] Jar: serde
deploy:
[echo] hive: serde
init:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/src
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/classes
model-compile:
core-compile:
[echo] Compiling:
model-enhance:
compile:
jar:
[echo] Jar: metastore
deploy:
[echo] hive: metastore
Overriding previous definition of reference to test.classpath
init:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/classes
ql-init:
build-grammar:
compile-ant-tasks:
init:
download-ivy:
init-ivy:
settings-ivy:
resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve] confs: [default]
[ivy:retrieve] :: resolution report :: resolve 8ms :: artifacts dl 0ms
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 0 | 0 | 0 | 0 || 0 | 0 |
---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve] confs: [default]
[ivy:retrieve] 0 artifacts copied, 0 already retrieved (0kB/3ms)
install-hadoopcore:
compile:
[echo] Compiling: anttasks
deploy-ant-tasks:
init:
download-ivy:
init-ivy:
settings-ivy:
resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve] confs: [default]
[ivy:retrieve] :: resolution report :: resolve 8ms :: artifacts dl 0ms
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 0 | 0 | 0 | 0 || 0 | 0 |
---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve] confs: [default]
[ivy:retrieve] 0 artifacts copied, 0 already retrieved (0kB/2ms)
install-hadoopcore:
compile:
[echo] Compiling: anttasks
jar:
deploy:
[echo] hive: anttasks
configure:
compile:
[echo] Compiling: ql
[javac] Compiling 8 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/classes
jar:
[echo] Jar: ql
[unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/lib/commons-jexl-1.1.jar into /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/jexl/classes
[unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/lib/libthrift.jar into /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/thrift/classes
[unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/lib/commons-lang-2.4.jar into /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/commons-lang/classes
deploy:
[echo] hive: ql
init:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test/src
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test/classes
download-ivy:
init-ivy:
settings-ivy:
resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#cli;working@devbuild001.snc1.facebook.com
[ivy:retrieve] confs: [default]
[ivy:retrieve] found hadoop#core;0.17.1 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 37ms :: artifacts dl 2ms
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 1 | 0 | 0 | 0 || 1 | 0 |
---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#cli
[ivy:retrieve] confs: [default]
[ivy:retrieve] 0 artifacts copied, 1 already retrieved (0kB/3ms)
install-hadoopcore:
compile:
[echo] Compiling: cli
jar:
[echo] Jar: cli
deploy:
[echo] hive: cli
init:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test/src
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test/classes
core-compile:
compile:
jar:
[echo] Jar: service
deploy:
[echo] hive: service
test:
test:
[echo] Nothing to do!
test:
[echo] Nothing to do!
test-conditions:
gen-test:
init:
model-compile:
core-compile:
[echo] Compiling:
model-enhance:
compile:
compile-test:
[javac] Compiling 14 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/classes
test-jar:
test-init:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/data
[copy] Copying 18 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/data
[copy] Copied 5 empty directories to 3 empty directories under /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/data
test:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/logs
[junit] Running org.apache.hadoop.hive.metastore.TestAlter
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.061 sec
[junit] Running org.apache.hadoop.hive.metastore.TestCreateDB
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.044 sec
[junit] Running org.apache.hadoop.hive.metastore.TestDBGetName
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.051 sec
[junit] Running org.apache.hadoop.hive.metastore.TestDrop
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.067 sec
[junit] Running org.apache.hadoop.hive.metastore.TestGetDBs
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.047 sec
[junit] Running org.apache.hadoop.hive.metastore.TestGetSchema
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.087 sec
[junit] Running org.apache.hadoop.hive.metastore.TestGetTable
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.057 sec
[junit] Running org.apache.hadoop.hive.metastore.TestGetTables
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.071 sec
[junit] Running org.apache.hadoop.hive.metastore.TestHiveMetaStore
[junit] Tests run: 7, Failures: 0, Errors: 0, Time elapsed: 8.798 sec
[junit] Running org.apache.hadoop.hive.metastore.TestPartitions
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.063 sec
[junit] Running org.apache.hadoop.hive.metastore.TestTableExists
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.059 sec
[junit] Running org.apache.hadoop.hive.metastore.TestTablePath
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.033 sec
[junit] Running org.apache.hadoop.hive.metastore.TestTruncate
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.082 sec
Overriding previous definition of reference to test.classpath
test-conditions:
init:
compile-ant-tasks:
init:
download-ivy:
init-ivy:
settings-ivy:
resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve] confs: [default]
[ivy:retrieve] :: resolution report :: resolve 9ms :: artifacts dl 0ms
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 0 | 0 | 0 | 0 || 0 | 0 |
---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve] confs: [default]
[ivy:retrieve] 0 artifacts copied, 0 already retrieved (0kB/2ms)
install-hadoopcore:
compile:
[echo] Compiling: anttasks
deploy-ant-tasks:
init:
download-ivy:
init-ivy:
settings-ivy:
resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve] confs: [default]
[ivy:retrieve] :: resolution report :: resolve 7ms :: artifacts dl 1ms
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 0 | 0 | 0 | 0 || 0 | 0 |
---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve] confs: [default]
[ivy:retrieve] 0 artifacts copied, 0 already retrieved (0kB/2ms)
install-hadoopcore:
compile:
[echo] Compiling: anttasks
jar:
deploy:
[echo] hive: anttasks
gen-test:
[qtestgen] Template Path:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates
[qtestgen] Dec 30, 2008 6:22:06 PM org.apache.velocity.runtime.log.JdkLogChute log
[qtestgen] INFO: FileResourceLoader : adding path '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates'
[qtestgen] Generated /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src/org/apache/hadoop/hive/ql/parse/TestParse.java from template TestParse.vm
[qtestgen] Template Path:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates
[qtestgen] Dec 30, 2008 6:22:06 PM org.apache.velocity.runtime.log.JdkLogChute log
[qtestgen] INFO: FileResourceLoader : adding path '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates'
[qtestgen] Generated /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src/org/apache/hadoop/hive/ql/parse/TestParseNegative.java from template TestParseNegative.vm
[qtestgen] Template Path:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates
[qtestgen] Dec 30, 2008 6:22:07 PM org.apache.velocity.runtime.log.JdkLogChute log
[qtestgen] INFO: FileResourceLoader : adding path '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates'
[qtestgen] Generated /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src/org/apache/hadoop/hive/cli/TestCliDriver.java from template TestCliDriver.vm
[qtestgen] Template Path:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates
[qtestgen] Dec 30, 2008 6:22:07 PM org.apache.velocity.runtime.log.JdkLogChute log
[qtestgen] INFO: FileResourceLoader : adding path '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates'
[qtestgen] Generated /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src/org/apache/hadoop/hive/cli/TestNegativeCliDriver.java from template TestNegativeCliDriver.vm
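[Editor's note] The [qtestgen] step above generates one JUnit driver class per template (TestCliDriver.vm etc.), with a test method per .q query file. Hive does this with Apache Velocity; the stand-in below uses plain string building only to show the shape of the generated output, and the method/class names are illustrative, not Hive's actual template contents.

```java
// Sketch only: mimics qtestgen's "one test method per query file" output.
// Hive's real generator merges a Velocity template; this is not that code.
import java.util.List;

public class QTestGenSketch {
    static String generate(List<String> queryFiles) {
        StringBuilder sb = new StringBuilder("public class TestCliDriver {\n");
        for (String q : queryFiles) {
            // mapreduce1.q -> testCliDriver_mapreduce1, as seen in the log
            String name = q.replace(".q", "").replace('-', '_');
            sb.append("  public void testCliDriver_").append(name).append("() {\n")
              .append("    // run ").append(q).append(" and diff against expected output\n")
              .append("  }\n");
        }
        return sb.append("}\n").toString();
    }

    public static void main(String[] args) {
        System.out.print(generate(List.of("mapreduce1.q", "alter1.q")));
    }
}
```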
ql-init:
build-grammar:
configure:
compile:
[echo] Compiling: ql
[javac] Compiling 8 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/classes
compile-test:
[javac] Compiling 15 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/classes
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[javac] Note: Some input files use unchecked or unsafe operations.
[javac] Note: Recompile with -Xlint:unchecked for details.
[javac] Compiling 4 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/classes
test-jar:
[jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar
test-init:
[copy] Copying 18 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data
[copy] Copied 4 empty directories to 2 empty directories under /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data
test:
[mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/logs
[junit] Running org.apache.hadoop.hive.cli.TestCliDriver
[junit] Begin query: mapreduce1.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] plan = /tmp/plan46097.xml
[junit] 08/12/30 18:22:22 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:22:22 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:22:22 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:22:22 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:22:22 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:22:23 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:22:23 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:22:23 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:22:23 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:22:23 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:22:23 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:22:23 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:22:23 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:22:23 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:22:23 INFO exec.ScriptOperator: Initializing Self
[junit] 08/12/30 18:22:23 INFO exec.ScriptOperator: Initializing children:
[junit] 08/12/30 18:22:23 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:22:23 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:22:23 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:22:23 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:22:23 INFO exec.ScriptOperator: Initialization Done
[junit] 08/12/30 18:22:23 INFO exec.ScriptOperator: Executing [/bin/cat]
[junit] 08/12/30 18:22:23 INFO exec.ScriptOperator: tablename=src
[junit] 08/12/30 18:22:23 INFO exec.ScriptOperator: partname={}
[junit] 08/12/30 18:22:23 INFO exec.ScriptOperator: alias=src
[junit] 08/12/30 18:22:23 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:22:23 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:22:23 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:22:23 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
[junit] 08/12/30 18:22:23 INFO exec.ScriptOperator: StreamThread OutputProcessor done
[junit] 08/12/30 18:22:23 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
[junit] 08/12/30 18:22:23 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:22:23 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:22:23 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:22:23 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp945985308
[junit] 08/12/30 18:22:23 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:22:23 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:22:23 INFO exec.ExtractOperator: Initializing Self
[junit] 08/12/30 18:22:23 INFO exec.ExtractOperator: Initializing children:
[junit] 08/12/30 18:22:23 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:22:23 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:22:23 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:22:23 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:22:23 INFO exec.ExtractOperator: Initialization Done
[junit] 08/12/30 18:22:23 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:22:23 INFO mapred.TaskRunner: Task 'reduce_n0a18c' done.
[junit] 08/12/30 18:22:23 INFO mapred.TaskRunner: Saved output of task 'reduce_n0a18c' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp945985308
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:22:24 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:22:24 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/mapreduce1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/mapreduce1.q.out
[junit] Done query: mapreduce1.q
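[Editor's note] A query passes when the `diff -I \(file:\)\|\(/tmp/.*\)` invocation above finds no differences between the produced .q.out log and the checked-in golden file, with machine-specific lines (file: URIs, /tmp paths) ignored. The sketch below approximates that comparison; note it drops every matching line outright, which is a simplification of diff's -I semantics (diff only ignores hunks whose changed lines all match).

```java
// Approximation of the harness's golden-file check: compare actual vs.
// expected output after filtering out lines matching volatile patterns.
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class GoldenDiff {
    // Same regex the log shows being passed to diff -I
    static final Pattern VOLATILE = Pattern.compile("(file:)|(/tmp/.*)");

    static List<String> stripVolatile(List<String> lines) {
        return lines.stream()
                    .filter(l -> !VOLATILE.matcher(l).find())
                    .collect(Collectors.toList());
    }

    static boolean matches(List<String> actual, List<String> expected) {
        return stripVolatile(actual).equals(stripVolatile(expected));
    }

    public static void main(String[] args) {
        List<String> actual   = List.of("OK", "loaded from file:/tmp/x", "500 rows");
        List<String> expected = List.of("OK", "loaded from file:/other", "500 rows");
        System.out.println(matches(actual, expected)); // prints true
    }
}
```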
[junit] Begin query: mapreduce3.q
[junit] plan = /tmp/plan46098.xml
[junit] 08/12/30 18:22:28 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:22:29 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:22:29 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:22:29 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:22:29 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:22:29 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:22:29 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:22:29 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:22:29 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:22:29 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:22:29 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:22:29 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:22:29 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:22:29 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:22:29 INFO exec.ScriptOperator: Initializing Self
[junit] 08/12/30 18:22:29 INFO exec.ScriptOperator: Initializing children:
[junit] 08/12/30 18:22:29 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:22:29 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:22:29 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:22:29 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:22:29 INFO exec.ScriptOperator: Initialization Done
[junit] 08/12/30 18:22:29 INFO exec.ScriptOperator: Executing [/bin/cat]
[junit] 08/12/30 18:22:29 INFO exec.ScriptOperator: tablename=src
[junit] 08/12/30 18:22:29 INFO exec.ScriptOperator: partname={}
[junit] 08/12/30 18:22:29 INFO exec.ScriptOperator: alias=src
[junit] 08/12/30 18:22:29 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:22:29 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:22:30 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:22:30 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
[junit] 08/12/30 18:22:30 INFO exec.ScriptOperator: StreamThread OutputProcessor done
[junit] 08/12/30 18:22:30 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
[junit] 08/12/30 18:22:30 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:22:30 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:22:30 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:22:30 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1353893075
[junit] 08/12/30 18:22:30 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:22:30 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:22:30 INFO exec.ExtractOperator: Initializing Self
[junit] 08/12/30 18:22:30 INFO exec.ExtractOperator: Initializing children:
[junit] 08/12/30 18:22:30 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:22:30 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:22:30 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:22:30 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:22:30 INFO exec.ExtractOperator: Initialization Done
[junit] 08/12/30 18:22:30 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:22:30 INFO mapred.TaskRunner: Task 'reduce_tt39x9' done.
[junit] 08/12/30 18:22:30 INFO mapred.TaskRunner: Saved output of task 'reduce_tt39x9' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1353893075
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:22:30 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:22:30 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/mapreduce3.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/mapreduce3.q.out
[junit] Done query: mapreduce3.q
[junit] Begin query: alter1.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/alter1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/alter1.q.out
[junit] Done query: alter1.q
[junit] Begin query: showparts.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/showparts.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/showparts.q.out
[junit] Done query: showparts.q
[junit] Begin query: mapreduce5.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_mapreduce5(TestCliDriver.java:403)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
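[Editor's note] The failures from here on share one root symptom: Warehouse.deleteDir cannot remove the test warehouse directory during QTestUtil.cleanUp, so every subsequent test's setup fails the same way. The log does not show why the delete fails (permissions, or leftover files from the earlier sample2.q load failure, are both plausible). As a general illustration of this failure class, not Hive's code: a plain delete of a non-empty directory fails, and a recursive delete is required.

```java
// Minimal illustration (not Hive's Warehouse.deleteDir): java.io.File.delete()
// returns false on a non-empty directory; deleting the tree needs recursion.
import java.io.File;

public class DeleteDirDemo {
    // Depth-first delete; returns false as soon as any deletion fails.
    static boolean deleteRecursive(File f) {
        File[] children = f.listFiles(); // null if f is a plain file
        if (children != null) {
            for (File c : children) {
                if (!deleteRecursive(c)) {
                    return false;
                }
            }
        }
        return f.delete();
    }

    public static void main(String[] args) {
        File dir = new File(System.getProperty("java.io.tmpdir"), "warehouse_demo");
        new File(dir, "part").mkdirs(); // dir now has a child, like a loaded table

        System.out.println("plain delete: " + dir.delete());          // false: non-empty
        System.out.println("recursive:    " + deleteRecursive(dir));  // true
    }
}
```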
[junit] Begin query: subq2.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_subq2(TestCliDriver.java:428)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input_limit.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_limit.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_limit.q.out
[junit] Done query: input_limit.q
[junit] Begin query: input11_limit.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input11_limit(TestCliDriver.java:478)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input20.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input20(TestCliDriver.java:503)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input14_limit.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input14_limit(TestCliDriver.java:528)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: sample2.q
[junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
[junit] at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample2(TestCliDriver.java:553)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
[junit] Begin query: inputddl1.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_inputddl1(TestCliDriver.java:578)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: sample4.q
[junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
[junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
[junit] at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample4(TestCliDriver.java:603)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Begin query: inputddl3.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_inputddl3(TestCliDriver.java:628)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: groupby2_map.q
[junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='11') failed with exit code= 9
[junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='11') failed with exit code= 9
[junit] at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby2_map(TestCliDriver.java:653)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Begin query: sample6.q
[junit] plan = /tmp/plan46099.xml
[junit] 08/12/30 18:22:43 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:22:43 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt
[junit] 08/12/30 18:22:43 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:22:43 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:22:43 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:22:44 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:22:44 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:22:44 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:22:44 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt
[junit] 08/12/30 18:22:44 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:22:44 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:22:44 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:22:44 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:22:44 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:22:44 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:22:44 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:22:44 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:22:44 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:22:44 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:22:44 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:22:44 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:22:44 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:22:44 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:22:44 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:22:44 INFO exec.FilterOperator: PASSED:118
[junit] 08/12/30 18:22:44 INFO exec.FilterOperator: FILTERED:382
[junit] 08/12/30 18:22:44 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt:0+5812
[junit] 08/12/30 18:22:44 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:22:44 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1410453330
[junit] 08/12/30 18:22:45 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:22:45 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample6.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/sample6.q.out
[junit] Done query: sample6.q
[junit] Begin query: groupby4_map.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby4_map(TestCliDriver.java:703)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: inputddl5.q
[junit] plan = /tmp/plan46100.xml
[junit] 08/12/30 18:22:48 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:22:48 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/inputddl5
[junit] 08/12/30 18:22:48 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:22:48 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:22:48 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:22:48 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:22:48 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:22:48 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:22:48 INFO exec.MapOperator: Adding alias inputddl5 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/inputddl5/kv4.txt
[junit] 08/12/30 18:22:48 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:22:48 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:22:48 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:22:48 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:22:48 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:22:48 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:22:48 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:22:48 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:22:48 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:22:48 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/inputddl5/kv4.txt:0+6
[junit] 08/12/30 18:22:48 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:22:48 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-391140214
[junit] map = 100%, reduce =0%
[junit] 08/12/30 18:22:49 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:22:49 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46101.xml
[junit] 08/12/30 18:22:51 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:22:51 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/inputddl5
[junit] 08/12/30 18:22:51 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:22:51 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:22:51 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:22:51 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:22:51 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:22:51 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:22:51 INFO exec.MapOperator: Adding alias inputddl5 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/inputddl5/kv4.txt
[junit] 08/12/30 18:22:51 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:22:51 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:22:51 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:22:51 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:22:51 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:22:51 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:22:51 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:22:51 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:22:51 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:22:52 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:22:52 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:22:52 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:22:52 INFO exec.FilterOperator: PASSED:1
[junit] 08/12/30 18:22:52 INFO exec.FilterOperator: FILTERED:0
[junit] 08/12/30 18:22:52 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/inputddl5/kv4.txt:0+6
[junit] 08/12/30 18:22:52 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:22:52 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp877478071
[junit] 08/12/30 18:22:52 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:22:52 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:22:52 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:22:52 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:22:52 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:22:52 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:22:52 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:22:52 INFO mapred.TaskRunner: Task 'reduce_8bhe5r' done.
[junit] 08/12/30 18:22:52 INFO mapred.TaskRunner: Saved output of task 'reduce_8bhe5r' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp877478071
[junit] 08/12/30 18:22:52 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:22:52 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] plan = /tmp/plan46102.xml
[junit] 08/12/30 18:22:54 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:22:54 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/66797295/172745450.10002
[junit] 08/12/30 18:22:54 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:22:54 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:22:54 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:22:54 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:22:54 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:22:55 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:22:55 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/66797295/172745450.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/66797295/172745450.10002/reduce_8bhe5r
[junit] 08/12/30 18:22:55 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:22:55 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:22:55 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:22:55 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:22:55 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:22:55 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:22:55 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/66797295/172745450.10002/reduce_8bhe5r:0+124
[junit] 08/12/30 18:22:55 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:22:55 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp56015581
[junit] 08/12/30 18:22:55 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:22:55 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:22:55 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:22:55 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:22:55 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:22:55 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:22:55 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:22:55 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:22:55 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:22:55 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:22:55 INFO mapred.TaskRunner: Task 'reduce_uarhen' done.
[junit] 08/12/30 18:22:55 INFO mapred.TaskRunner: Saved output of task 'reduce_uarhen' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp56015581
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:22:55 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:22:55 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] Exception: Client Execution failed with error code = 9
[junit] junit.framework.AssertionFailedError: Client Execution failed with error code = 9
[junit] at junit.framework.Assert.fail(Assert.java:47)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_inputddl5(TestCliDriver.java:731)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Begin query: sample8.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample8(TestCliDriver.java:753)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: inputddl7.q
[junit] plan = /tmp/plan46103.xml
[junit] 08/12/30 18:22:59 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:22:59 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t1
[junit] 08/12/30 18:22:59 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:22:59 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:22:59 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:22:59 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:22:59 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:23:00 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:23:00 INFO exec.MapOperator: Adding alias t1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t1/kv1.txt
[junit] 08/12/30 18:23:00 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:23:00 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:23:00 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:23:00 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:00 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:00 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:23:00 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:23:00 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:00 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:00 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:00 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:23:00 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:23:00 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t1/kv1.txt:0+5812
[junit] 08/12/30 18:23:00 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:23:00 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-285160883
[junit] 08/12/30 18:23:00 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:00 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:00 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:23:00 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:23:00 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:23:00 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:23:00 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:23:00 INFO mapred.TaskRunner: Task 'reduce_iuyptm' done.
[junit] 08/12/30 18:23:00 INFO mapred.TaskRunner: Saved output of task 'reduce_iuyptm' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-285160883
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:23:00 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:23:00 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46104.xml
[junit] 08/12/30 18:23:02 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:23:02 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/301529399/515143997.10002
[junit] 08/12/30 18:23:02 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:23:02 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:23:02 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:03 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:03 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:23:03 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:23:03 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/301529399/515143997.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/301529399/515143997.10002/reduce_iuyptm
[junit] 08/12/30 18:23:03 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:23:03 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:23:03 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:23:03 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:03 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:03 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:23:03 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/301529399/515143997.10002/reduce_iuyptm:0+124
[junit] 08/12/30 18:23:03 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:23:03 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1799313570
[junit] 08/12/30 18:23:03 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:03 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:03 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:23:03 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:23:03 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:03 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:03 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:23:03 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:03 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:23:03 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:23:03 INFO mapred.TaskRunner: Task 'reduce_jslde4' done.
[junit] 08/12/30 18:23:03 INFO mapred.TaskRunner: Saved output of task 'reduce_jslde4' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1799313570
[junit] map = 100%, reduce =100%
[junit] 08/12/30 18:23:04 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:23:04 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46105.xml
[junit] 08/12/30 18:23:05 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:23:05 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t2
[junit] 08/12/30 18:23:05 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:23:05 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:23:06 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:06 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:06 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:23:06 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:23:06 INFO exec.MapOperator: Adding alias t2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t2/kv1.seq
[junit] 08/12/30 18:23:06 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:23:06 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:23:06 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:23:06 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:06 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:06 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:23:06 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:23:06 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:06 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:06 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:06 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:23:06 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:23:06 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t2/kv1.seq:0+10508
[junit] 08/12/30 18:23:06 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:23:06 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1706136056
[junit] 08/12/30 18:23:06 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:06 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:06 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:23:06 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:23:06 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:23:06 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:23:06 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:23:06 INFO mapred.TaskRunner: Task 'reduce_f3b9ee' done.
[junit] 08/12/30 18:23:06 INFO mapred.TaskRunner: Saved output of task 'reduce_f3b9ee' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1706136056
[junit] 08/12/30 18:23:07 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:23:07 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] plan = /tmp/plan46106.xml
[junit] 08/12/30 18:23:08 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:23:08 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/527239131/1495342797.10002
[junit] 08/12/30 18:23:08 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:23:08 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:23:09 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:09 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:09 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:23:09 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:23:09 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/527239131/1495342797.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/527239131/1495342797.10002/reduce_f3b9ee
[junit] 08/12/30 18:23:09 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:23:09 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:23:09 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:23:09 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:09 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:09 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:23:09 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/527239131/1495342797.10002/reduce_f3b9ee:0+124
[junit] 08/12/30 18:23:09 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:23:09 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-394965812
[junit] 08/12/30 18:23:09 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:09 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:09 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:23:09 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:23:09 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:09 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:09 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:23:09 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:09 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:23:09 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:23:09 INFO mapred.TaskRunner: Task 'reduce_glaldm' done.
[junit] 08/12/30 18:23:09 INFO mapred.TaskRunner: Saved output of task 'reduce_glaldm' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-394965812
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:23:10 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:23:10 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46107.xml
[junit] 08/12/30 18:23:12 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:23:12 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t3/ds=2008-04-09
[junit] 08/12/30 18:23:12 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:23:12 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:23:12 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:23:12 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:12 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:23:12 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:23:12 INFO exec.MapOperator: Adding alias t3 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t3/ds=2008-04-09/kv1.txt
[junit] 08/12/30 18:23:13 INFO exec.MapOperator: Got partitions: ds
[junit] 08/12/30 18:23:13 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:23:13 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:23:13 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:13 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:13 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:23:13 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:23:13 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:23:13 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:23:13 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:13 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:13 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:23:13 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:13 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:23:13 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:23:13 INFO exec.FilterOperator: PASSED:500
[junit] 08/12/30 18:23:13 INFO exec.FilterOperator: FILTERED:0
[junit] 08/12/30 18:23:13 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t3/ds=2008-04-09/kv1.txt:0+5812
[junit] 08/12/30 18:23:13 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:23:13 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-2074770468
[junit] 08/12/30 18:23:13 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:13 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:13 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:23:13 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:23:13 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:23:13 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:23:13 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:23:13 INFO mapred.TaskRunner: Task 'reduce_7z6qy8' done.
[junit] 08/12/30 18:23:13 INFO mapred.TaskRunner: Saved output of task 'reduce_7z6qy8' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-2074770468
[junit] 08/12/30 18:23:13 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:23:13 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] plan = /tmp/plan46108.xml
[junit] 08/12/30 18:23:15 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:23:15 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/5303531/67269899.10002
[junit] 08/12/30 18:23:15 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:23:15 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:23:15 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:15 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:15 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:23:16 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:23:16 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/5303531/67269899.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/5303531/67269899.10002/reduce_7z6qy8
[junit] 08/12/30 18:23:16 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:23:16 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:23:16 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:23:16 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:16 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:16 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:23:16 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/5303531/67269899.10002/reduce_7z6qy8:0+124
[junit] 08/12/30 18:23:16 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:23:16 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1650994783
[junit] 08/12/30 18:23:16 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:16 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:16 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:23:16 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:23:16 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:16 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:16 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:23:16 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:16 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:23:16 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:23:16 INFO mapred.TaskRunner: Task 'reduce_jrtl37' done.
[junit] 08/12/30 18:23:16 INFO mapred.TaskRunner: Saved output of task 'reduce_jrtl37' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1650994783
[junit] map = 100%, reduce =100%
[junit] 08/12/30 18:23:16 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:23:16 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46109.xml
[junit] 08/12/30 18:23:18 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:23:18 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t4/ds=2008-04-09
[junit] 08/12/30 18:23:18 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:23:18 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:23:19 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:19 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:19 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:23:19 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:23:19 INFO exec.MapOperator: Adding alias t4 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t4/ds=2008-04-09/kv1.seq
[junit] 08/12/30 18:23:19 INFO exec.MapOperator: Got partitions: ds
[junit] 08/12/30 18:23:19 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:23:19 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:23:19 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:19 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:19 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:23:19 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:23:19 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:23:19 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:23:19 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:19 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:19 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:23:19 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:19 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:23:20 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:23:20 INFO exec.FilterOperator: PASSED:500
[junit] 08/12/30 18:23:20 INFO exec.FilterOperator: FILTERED:0
[junit] 08/12/30 18:23:20 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t4/ds=2008-04-09/kv1.seq:0+10508
[junit] 08/12/30 18:23:20 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:23:20 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1271723440
[junit] 08/12/30 18:23:20 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:20 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:20 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:23:20 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:23:20 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:23:20 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:23:20 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:23:20 INFO mapred.TaskRunner: Task 'reduce_d7rxf3' done.
[junit] 08/12/30 18:23:20 INFO mapred.TaskRunner: Saved output of task 'reduce_d7rxf3' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1271723440
[junit] 08/12/30 18:23:20 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:23:20 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46110.xml
[junit] 08/12/30 18:23:22 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:23:22 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/172031259/44731815.10002
[junit] 08/12/30 18:23:22 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:23:22 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:23:22 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:22 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:22 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:23:22 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:23:22 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/172031259/44731815.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/172031259/44731815.10002/reduce_d7rxf3
[junit] 08/12/30 18:23:22 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:23:22 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:23:22 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:23:22 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:22 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:22 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:23:22 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/172031259/44731815.10002/reduce_d7rxf3:0+124
[junit] 08/12/30 18:23:22 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:23:23 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp2011654655
[junit] 08/12/30 18:23:23 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:23 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:23:23 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:23:23 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:23:23 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:23 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:23 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:23:23 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:23 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:23:23 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:23:23 INFO mapred.TaskRunner: Task 'reduce_l8z986' done.
[junit] 08/12/30 18:23:23 INFO mapred.TaskRunner: Saved output of task 'reduce_l8z986' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp2011654655
[junit] map = 100%, reduce =100%
[junit] 08/12/30 18:23:23 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:23:23 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/inputddl7.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/inputddl7.q.out
[junit] Done query: inputddl7.q
[junit] Begin query: notable_alias1.q
[junit] plan = /tmp/plan46111.xml
[junit] 08/12/30 18:23:27 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:23:27 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:23:27 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:23:27 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:23:27 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:27 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:27 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:23:28 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:23:28 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:23:28 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:23:28 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:23:28 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:23:28 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:28 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:28 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:23:28 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:23:28 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:23:28 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:23:28 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:28 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:28 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:23:28 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:28 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:23:28 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:23:28 INFO exec.FilterOperator: PASSED:84
[junit] 08/12/30 18:23:28 INFO exec.FilterOperator: FILTERED:416
[junit] 08/12/30 18:23:28 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:23:28 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:23:28 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1888561167
[junit] 08/12/30 18:23:28 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:28 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:28 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:23:28 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:23:28 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:23:28 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:23:28 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:23:28 INFO mapred.TaskRunner: Task 'reduce_f9lps8' done.
[junit] 08/12/30 18:23:28 INFO mapred.TaskRunner: Saved output of task 'reduce_f9lps8' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1888561167
[junit] map = 100%, reduce =100%
[junit] 08/12/30 18:23:28 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:23:28 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46112.xml
[junit] 08/12/30 18:23:30 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:23:31 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/863962675/553964635.10001
[junit] 08/12/30 18:23:31 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:23:31 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:23:31 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:31 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:31 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:23:32 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:23:32 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/863962675/553964635.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/863962675/553964635.10001/reduce_f9lps8
[junit] 08/12/30 18:23:32 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:23:32 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:23:32 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:23:32 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:32 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:32 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:23:32 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/863962675/553964635.10001/reduce_f9lps8:0+2219
[junit] 08/12/30 18:23:32 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:23:32 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-883114474
[junit] 08/12/30 18:23:32 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:32 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:32 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:23:32 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:23:32 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:32 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:32 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:32 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:32 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:23:32 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:32 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:32 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:23:32 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:23:32 INFO mapred.TaskRunner: Task 'reduce_43sipf' done.
[junit] 08/12/30 18:23:32 INFO mapred.TaskRunner: Saved output of task 'reduce_43sipf' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-883114474
[junit] map = 100%, reduce =100%
[junit] 08/12/30 18:23:32 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:23:32 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/notable_alias1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/notable_alias1.q.out
[junit] Done query: notable_alias1.q
[junit] Begin query: input0.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input0(TestCliDriver.java:828)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: join1.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join1(TestCliDriver.java:853)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input2.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input2.q.out
[junit] Done query: input2.q
[junit] Begin query: join3.q
[junit] plan = /tmp/plan46113.xml
[junit] 08/12/30 18:23:38 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:23:38 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:23:38 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:23:38 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:23:38 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:38 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:39 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:23:39 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:23:39 INFO exec.MapOperator: Adding alias src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:23:39 INFO exec.MapOperator: Adding alias src3 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:23:39 INFO exec.MapOperator: Adding alias src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:23:39 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:23:39 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:23:39 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:23:39 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:39 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:39 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:23:39 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:23:39 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:39 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:39 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:39 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:23:39 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:23:39 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:23:39 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:23:39 INFO exec.ReduceSinkOperator: Using tag = 2
[junit] 08/12/30 18:23:39 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:39 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:39 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:23:39 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:23:39 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:23:39 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:39 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:39 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:23:39 INFO exec.ReduceSinkOperator: Using tag = 0
[junit] 08/12/30 18:23:39 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:39 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:39 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:39 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:23:39 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:23:39 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:23:39 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:23:39 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp729824504
[junit] 08/12/30 18:23:39 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:39 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:39 INFO exec.JoinOperator: Initializing Self
[junit] 08/12/30 18:23:39 INFO exec.JoinOperator: Initializing children:
[junit] 08/12/30 18:23:39 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:39 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:39 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:39 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:39 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:23:39 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:39 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:39 INFO exec.JoinOperator: Initialization Done
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:23:39 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:23:39 INFO mapred.TaskRunner: Task 'reduce_w8zyud' done.
[junit] 08/12/30 18:23:39 INFO mapred.TaskRunner: Saved output of task 'reduce_w8zyud' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp729824504
[junit] 08/12/30 18:23:39 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:23:39 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join3.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join3.q.out
[junit] Done query: join3.q
[junit] Begin query: input4.q
[junit] plan = /tmp/plan46114.xml
[junit] 08/12/30 18:23:43 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:23:43 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/input4
[junit] 08/12/30 18:23:43 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:23:43 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:23:43 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:23:44 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:44 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:23:44 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:23:44 INFO exec.MapOperator: Adding alias input4 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/input4/kv1.txt
[junit] 08/12/30 18:23:44 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:23:44 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:23:44 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:23:44 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:44 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:44 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:23:44 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:44 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:23:44 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:23:44 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/input4/kv1.txt:0+5812
[junit] 08/12/30 18:23:44 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:23:44 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1245128494
[junit] map = 100%, reduce =0%
[junit] 08/12/30 18:23:45 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:23:45 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] Exception: Client Execution failed with error code = 9
[junit] junit.framework.AssertionFailedError: Client Execution failed with error code = 9
[junit] at junit.framework.Assert.fail(Assert.java:47)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input4(TestCliDriver.java:931)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Begin query: describe_xpath.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_describe_xpath(TestCliDriver.java:953)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: join5.q
[junit] plan = /tmp/plan46115.xml
[junit] 08/12/30 18:23:48 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:23:48 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:23:49 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:23:49 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:23:49 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:49 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:49 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:23:49 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:23:49 INFO exec.MapOperator: Adding alias c:a:src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:23:49 INFO exec.MapOperator: Adding alias c:b:src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:23:49 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:23:49 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:23:49 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:23:49 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:23:49 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:23:49 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:49 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:49 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:23:49 INFO exec.ReduceSinkOperator: Using tag = 0
[junit] 08/12/30 18:23:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:49 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:49 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:23:49 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:23:49 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:23:49 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:23:49 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:23:49 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:23:49 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:49 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:49 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:23:49 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:23:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:49 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:49 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:23:49 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:23:49 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:23:49 INFO exec.FilterOperator: FILTERED:491
[junit] 08/12/30 18:23:49 INFO exec.FilterOperator: PASSED:9
[junit] 08/12/30 18:23:49 INFO exec.FilterOperator: FILTERED:493
[junit] 08/12/30 18:23:49 INFO exec.FilterOperator: PASSED:7
[junit] 08/12/30 18:23:49 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:23:49 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:23:49 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp304128015
[junit] 08/12/30 18:23:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:23:49 INFO exec.JoinOperator: Initializing Self
[junit] 08/12/30 18:23:49 INFO exec.JoinOperator: Initializing children:
[junit] 08/12/30 18:23:49 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:49 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:49 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:49 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:49 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:49 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:49 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:23:49 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:49 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:49 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:49 INFO exec.JoinOperator: Initialization Done
[junit] 08/12/30 18:23:49 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:23:49 INFO mapred.TaskRunner: Task 'reduce_h4qxa2' done.
[junit] 08/12/30 18:23:49 INFO mapred.TaskRunner: Saved output of task 'reduce_h4qxa2' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp304128015
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:23:50 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:23:50 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join5.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join5.q.out
[junit] Done query: join5.q
[junit] Begin query: input_testxpath2.q
[junit] plan = /tmp/plan46116.xml
[junit] 08/12/30 18:23:53 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:23:54 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src_thrift
[junit] 08/12/30 18:23:54 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:23:54 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:23:54 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:54 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:54 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:23:54 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:23:54 INFO exec.MapOperator: Adding alias src_thrift to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src_thrift/complex.seq
[junit] 08/12/30 18:23:54 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:23:54 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:23:54 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:23:54 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:54 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:54 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:23:54 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:23:54 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:23:54 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:23:54 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:23:54 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:54 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:23:54 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:23:54 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:23:54 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:23:54 INFO exec.FilterOperator: PASSED:10
[junit] 08/12/30 18:23:54 INFO exec.FilterOperator: FILTERED:0
[junit] 08/12/30 18:23:54 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src_thrift/complex.seq:0+1491
[junit] 08/12/30 18:23:54 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:23:54 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1500084748
[junit] 08/12/30 18:23:55 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:23:55 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_testxpath2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_testxpath2.q.out
[junit] Done query: input_testxpath2.q
[junit] Begin query: input6.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input6(TestCliDriver.java:1028)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
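The stack trace above shows `QTestUtil.cleanUp` failing because `Warehouse.deleteDir` could not remove the test warehouse directory. Hive's actual code goes through Hadoop's FileSystem API, so this is only a hedged POSIX analogy, but it illustrates the basic failure mode a cleanup step can hit: a non-recursive removal of a directory that still contains data (here, a leftover file from a prior test) fails, while a recursive removal succeeds:

```shell
# Hypothetical stand-in for the test warehouse dir, with a leftover file.
dir=$(mktemp -d)
touch "$dir/kv1.txt"

# Non-recursive removal fails while the directory is non-empty.
rmdir "$dir" 2>/dev/null && status="removed" || status="not empty"
echo "non-recursive rmdir: $status"

# Recursive removal is what a cleanup step needs.
rm -r "$dir"
```

Whatever the root cause here (stale files or handles from the preceding local job is one plausible culprit), the symptom is the same: the directory cannot be deleted, so every subsequent test that drops the `src` table fails the same way.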
[junit] Begin query: join7.q
[junit] plan = /tmp/plan46117.xml
[junit] 08/12/30 18:23:59 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:23:59 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:23:59 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:23:59 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:23:59 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:59 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:23:59 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:24:00 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.MapOperator: Adding alias c:a:src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:24:00 INFO exec.MapOperator: Adding alias c:b:src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:24:00 INFO exec.MapOperator: Adding alias c:c:src3 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:24:00 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:24:00 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:00 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:00 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.ReduceSinkOperator: Using tag = 0
[junit] 08/12/30 18:24:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:00 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:24:00 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:00 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:00 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:00 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:24:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:00 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:24:00 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:00 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:00 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:00 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.ReduceSinkOperator: Using tag = 2
[junit] 08/12/30 18:24:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:00 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:24:00 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:00 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:00 INFO exec.FilterOperator: PASSED:9
[junit] 08/12/30 18:24:00 INFO exec.FilterOperator: FILTERED:491
[junit] 08/12/30 18:24:00 INFO exec.FilterOperator: PASSED:7
[junit] 08/12/30 18:24:00 INFO exec.FilterOperator: FILTERED:493
[junit] 08/12/30 18:24:00 INFO exec.FilterOperator: PASSED:2
[junit] 08/12/30 18:24:00 INFO exec.FilterOperator: FILTERED:498
[junit] 08/12/30 18:24:00 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:24:00 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:24:00 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp843231967
[junit] 08/12/30 18:24:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:00 INFO exec.JoinOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.JoinOperator: Initializing children:
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:00 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:00 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:00 INFO exec.JoinOperator: Initialization Done
[junit] 08/12/30 18:24:00 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:24:00 INFO mapred.TaskRunner: Task 'reduce_p1c0hp' done.
[junit] 08/12/30 18:24:00 INFO mapred.TaskRunner: Saved output of task 'reduce_p1c0hp' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp843231967
[junit] 08/12/30 18:24:00 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:24:00 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join7.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join7.q.out
[junit] Done query: join7.q
[junit] Begin query: input8.q
[junit] plan = /tmp/plan46118.xml
[junit] 08/12/30 18:24:04 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:24:04 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1
[junit] 08/12/30 18:24:04 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:24:04 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:24:04 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:04 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:04 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:24:04 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:24:04 INFO exec.MapOperator: Adding alias src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1/kv3.txt
[junit] 08/12/30 18:24:04 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:24:04 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:04 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:04 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:04 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:04 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:04 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:04 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:04 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:04 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:24:05 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:05 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:05 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:05 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:05 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:05 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1/kv3.txt:0+216
[junit] 08/12/30 18:24:05 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:24:05 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1491192848
[junit] 08/12/30 18:24:05 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:24:05 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input8.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input8.q.out
[junit] Done query: input8.q
[junit] Begin query: union.q
[junit] plan = /tmp/plan46119.xml
[junit] 08/12/30 18:24:08 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:24:09 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:24:09 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:24:09 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:24:09 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:09 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:09 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:24:09 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:24:09 INFO exec.MapOperator: Adding alias null-subquery1:unioninput-subquery1:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:24:09 INFO exec.MapOperator: Adding alias null-subquery2:unioninput-subquery2:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:24:09 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:24:09 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:09 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:09 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:24:09 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:24:09 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:09 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:09 INFO exec.ForwardOperator: Initializing Self
[junit] 08/12/30 18:24:09 INFO exec.ForwardOperator: Initializing children:
[junit] 08/12/30 18:24:09 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:09 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:09 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:24:09 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:09 INFO exec.ForwardOperator: Initialization Done
[junit] 08/12/30 18:24:09 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:09 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:24:09 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:09 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:09 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:09 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:24:09 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:24:09 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:09 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:09 INFO exec.ForwardOperator: Initializing Self
[junit] 08/12/30 18:24:09 INFO exec.ForwardOperator: Initializing children:
[junit] 08/12/30 18:24:09 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:09 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:09 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:24:09 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:09 INFO exec.ForwardOperator: Initialization Done
[junit] 08/12/30 18:24:09 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:09 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:24:09 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:09 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:09 INFO exec.FilterOperator: PASSED:84
[junit] 08/12/30 18:24:09 INFO exec.FilterOperator: FILTERED:416
[junit] 08/12/30 18:24:09 INFO exec.FilterOperator: PASSED:414
[junit] 08/12/30 18:24:09 INFO exec.FilterOperator: FILTERED:86
[junit] 08/12/30 18:24:09 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:24:09 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:24:09 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1038857479
[junit] 08/12/30 18:24:10 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:24:10 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/union.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/union.q.out
[junit] Done query: union.q
[junit] Begin query: join9.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join9(TestCliDriver.java:1128)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: udf2.q
[junit] plan = /tmp/plan46120.xml
[junit] 08/12/30 18:24:13 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:24:13 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:24:13 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:24:13 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:24:13 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:14 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:14 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:24:14 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:24:14 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:24:14 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:24:14 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:14 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:14 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:14 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:14 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:24:14 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:24:14 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:14 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:14 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:24:14 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:14 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:24:14 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:14 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:14 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:14 INFO exec.FilterOperator: FILTERED:499
[junit] 08/12/30 18:24:14 INFO exec.FilterOperator: PASSED:1
[junit] 08/12/30 18:24:14 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:24:14 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:24:14 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1761640489
[junit] 08/12/30 18:24:15 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:24:15 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] plan = /tmp/plan46121.xml
[junit] 08/12/30 18:24:16 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:24:16 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1
[junit] 08/12/30 18:24:16 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:24:16 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:24:16 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:24:17 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:17 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:24:17 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:24:17 INFO exec.MapOperator: Adding alias dest1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1/job_local_1_map_0000
[junit] 08/12/30 18:24:17 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:24:17 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:17 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:17 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:17 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:17 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:24:17 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:17 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:17 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:17 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1/job_local_1_map_0000:0+8
[junit] 08/12/30 18:24:17 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:24:17 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1821140866
[junit] 08/12/30 18:24:18 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:24:18 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/udf2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/udf2.q.out
[junit] Done query: udf2.q
[junit] Begin query: input10.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input10(TestCliDriver.java:1178)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: join11.q
[junit] plan = /tmp/plan46122.xml
[junit] 08/12/30 18:24:21 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:24:21 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:24:21 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:24:21 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:24:21 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:21 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:21 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:24:21 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:24:21 INFO exec.MapOperator: Adding alias src2:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:24:21 INFO exec.MapOperator: Adding alias src1:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:24:21 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:24:21 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:21 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:21 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:21 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:21 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:24:21 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:24:22 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:22 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:22 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:22 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:22 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:22 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:22 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:22 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:22 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:24:22 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:24:22 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:24:22 INFO exec.ReduceSinkOperator: Using tag = 0
[junit] 08/12/30 18:24:22 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:22 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:22 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:24:22 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:22 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:22 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:22 INFO exec.FilterOperator: FILTERED:416
[junit] 08/12/30 18:24:22 INFO exec.FilterOperator: PASSED:84
[junit] 08/12/30 18:24:22 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:24:22 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:24:22 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1318688638
[junit] 08/12/30 18:24:22 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:22 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:22 INFO exec.JoinOperator: Initializing Self
[junit] 08/12/30 18:24:22 INFO exec.JoinOperator: Initializing children:
[junit] 08/12/30 18:24:22 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:22 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:22 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:24:22 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:22 INFO exec.JoinOperator: Initialization Done
[junit] 08/12/30 18:24:22 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:24:22 INFO mapred.TaskRunner: Task 'reduce_igx6c6' done.
[junit] 08/12/30 18:24:22 INFO mapred.TaskRunner: Saved output of task 'reduce_igx6c6' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1318688638
[junit] 08/12/30 18:24:22 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:24:22 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join11.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join11.q.out
[junit] Done query: join11.q
[junit] Begin query: input4_cb_delim.q
[junit] plan = /tmp/plan46123.xml
[junit] 08/12/30 18:24:25 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:24:25 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/input4_cb
[junit] 08/12/30 18:24:25 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:24:25 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:24:25 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:26 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:26 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:24:26 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:24:26 INFO exec.MapOperator: Adding alias input4_cb to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/input4_cb/kv1_cb.txt
[junit] 08/12/30 18:24:26 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:24:26 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:26 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:26 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:26 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:26 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:24:26 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:26 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:26 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:26 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/input4_cb/kv1_cb.txt:0+5812
[junit] 08/12/30 18:24:26 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:24:26 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp2137490136
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:24:27 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:24:27 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] Exception: Client Execution failed with error code = 9
[junit] junit.framework.AssertionFailedError: Client Execution failed with error code = 9
[junit] at junit.framework.Assert.fail(Assert.java:47)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input4_cb_delim(TestCliDriver.java:1231)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Begin query: udf4.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_udf4(TestCliDriver.java:1253)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input12.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input12(TestCliDriver.java:1278)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: join13.q
[junit] plan = /tmp/plan46124.xml
[junit] 08/12/30 18:24:30 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:24:30 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:24:30 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:24:30 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:24:30 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:30 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:30 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:24:31 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:24:31 INFO exec.MapOperator: Adding alias src2:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:24:31 INFO exec.MapOperator: Adding alias src1:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:24:31 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:24:31 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:31 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:31 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:31 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:31 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:24:31 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:24:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:31 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:31 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:31 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:31 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:31 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:31 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:31 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:24:31 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:24:31 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:24:31 INFO exec.ReduceSinkOperator: Using tag = 0
[junit] 08/12/30 18:24:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:31 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:24:31 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:31 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:31 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:31 INFO exec.FilterOperator: PASSED:84
[junit] 08/12/30 18:24:31 INFO exec.FilterOperator: FILTERED:416
[junit] 08/12/30 18:24:31 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:24:31 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:24:31 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-304192273
[junit] 08/12/30 18:24:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:31 INFO exec.JoinOperator: Initializing Self
[junit] 08/12/30 18:24:31 INFO exec.JoinOperator: Initializing children:
[junit] 08/12/30 18:24:31 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:24:31 INFO exec.JoinOperator: Initialization Done
[junit] 08/12/30 18:24:31 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:24:31 INFO mapred.TaskRunner: Task 'reduce_8b5fw5' done.
[junit] 08/12/30 18:24:31 INFO mapred.TaskRunner: Saved output of task 'reduce_8b5fw5' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-304192273
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:24:31 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:24:31 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46125.xml
[junit] 08/12/30 18:24:33 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:24:33 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/29437697/675658525.10002
[junit] 08/12/30 18:24:33 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:24:33 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:24:33 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:24:34 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:24:34 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:34 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:34 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:24:34 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:24:34 INFO exec.MapOperator: Adding alias $INTNAME to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/29437697/675658525.10002/reduce_8b5fw5
[junit] 08/12/30 18:24:34 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:24:34 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:24:34 INFO exec.ReduceSinkOperator: Using tag = 0
[junit] 08/12/30 18:24:34 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:34 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:34 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:34 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/29437697/675658525.10002/reduce_8b5fw5:0+9116
[junit] 08/12/30 18:24:34 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:24:34 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1100889546
[junit] 08/12/30 18:24:34 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:24:34 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:24:34 INFO exec.MapOperator: Adding alias src3:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:24:34 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:24:34 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:34 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:34 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:34 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:34 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:24:34 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:24:34 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:24:34 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:24:34 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:34 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:34 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:24:34 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:34 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:35 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:35 INFO exec.FilterOperator: FILTERED:311
[junit] 08/12/30 18:24:35 INFO exec.FilterOperator: PASSED:189
[junit] 08/12/30 18:24:35 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:24:35 INFO mapred.TaskRunner: Task 'job_local_1_map_0001' done.
[junit] 08/12/30 18:24:35 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0001' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1100889546
[junit] 08/12/30 18:24:35 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:35 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:35 INFO exec.JoinOperator: Initializing Self
[junit] 08/12/30 18:24:35 INFO exec.JoinOperator: Initializing children:
[junit] 08/12/30 18:24:35 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:35 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:35 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:24:35 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:35 INFO exec.JoinOperator: Initialization Done
[junit] 08/12/30 18:24:35 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:24:35 INFO mapred.TaskRunner: Task 'reduce_o16lc0' done.
[junit] 08/12/30 18:24:35 INFO mapred.TaskRunner: Saved output of task 'reduce_o16lc0' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1100889546
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:24:35 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:24:35 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join13.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join13.q.out
[junit] Done query: join13.q
[junit] Begin query: input14.q
[junit] plan = /tmp/plan46126.xml
[junit] 08/12/30 18:24:38 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:24:38 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:24:38 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:24:38 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:24:38 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:38 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:39 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:24:39 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:24:39 INFO exec.MapOperator: Adding alias tmap:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:24:39 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:24:39 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:39 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:39 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:39 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:39 INFO exec.ScriptOperator: Initializing Self
[junit] 08/12/30 18:24:39 INFO exec.ScriptOperator: Initializing children:
[junit] 08/12/30 18:24:39 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:24:39 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:24:39 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:39 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:39 INFO exec.ScriptOperator: Initialization Done
[junit] 08/12/30 18:24:39 INFO exec.ScriptOperator: Executing [/bin/cat]
[junit] 08/12/30 18:24:39 INFO exec.ScriptOperator: tablename=src
[junit] 08/12/30 18:24:39 INFO exec.ScriptOperator: partname={}
[junit] 08/12/30 18:24:39 INFO exec.ScriptOperator: alias=tmap:src
[junit] 08/12/30 18:24:39 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:39 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:39 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:39 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
[junit] 08/12/30 18:24:39 INFO exec.ScriptOperator: StreamThread OutputProcessor done
[junit] 08/12/30 18:24:39 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:39 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:39 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:24:39 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:24:39 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1850066293
[junit] 08/12/30 18:24:39 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:39 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:39 INFO exec.ExtractOperator: Initializing Self
[junit] 08/12/30 18:24:39 INFO exec.ExtractOperator: Initializing children:
[junit] 08/12/30 18:24:39 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:24:39 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:24:39 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:39 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:39 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:39 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:39 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:24:39 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:39 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:39 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:24:39 INFO exec.ExtractOperator: Initialization Done
[junit] 08/12/30 18:24:39 INFO exec.FilterOperator: FILTERED:416
[junit] 08/12/30 18:24:39 INFO exec.FilterOperator: PASSED:84
[junit] 08/12/30 18:24:39 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:24:39 INFO mapred.TaskRunner: Task 'reduce_9xor9n' done.
[junit] 08/12/30 18:24:39 INFO mapred.TaskRunner: Saved output of task 'reduce_9xor9n' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1850066293
[junit] 08/12/30 18:24:39 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:24:39 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input14.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input14.q.out
[junit] Done query: input14.q
[junit] Begin query: join15.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join15(TestCliDriver.java:1353)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input16.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input16(TestCliDriver.java:1378)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input_part1.q
[junit] plan = /tmp/plan46127.xml
[junit] 08/12/30 18:24:43 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:24:43 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12
[junit] 08/12/30 18:24:43 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:24:43 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:24:43 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:24:43 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:43 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:24:43 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:24:43 INFO exec.MapOperator: Adding alias srcpart to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt
[junit] 08/12/30 18:24:43 INFO exec.MapOperator: Got partitions: ds/hr
[junit] 08/12/30 18:24:43 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:43 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:43 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:24:43 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:24:43 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:43 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:43 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:43 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:43 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:24:44 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:44 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:44 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:24:44 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:44 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:44 INFO exec.FilterOperator: PASSED:84
[junit] 08/12/30 18:24:44 INFO exec.FilterOperator: FILTERED:416
[junit] 08/12/30 18:24:44 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt:0+5812
[junit] 08/12/30 18:24:44 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:24:44 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1689648872
[junit] 08/12/30 18:24:44 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] map = 100%, reduce =0%
[junit] 08/12/30 18:24:44 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_part1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_part1.q.out
[junit] Done query: input_part1.q
[junit] Begin query: join17.q
[junit] plan = /tmp/plan46128.xml
[junit] 08/12/30 18:24:48 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:24:48 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:24:48 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:24:48 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:24:48 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:24:48 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:48 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:24:48 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:24:48 INFO exec.MapOperator: Adding alias src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:24:48 INFO exec.MapOperator: Adding alias src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:24:49 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:24:49 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:49 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:49 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:24:49 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:24:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:49 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:49 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:49 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:49 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:24:49 INFO exec.ReduceSinkOperator: Using tag = 0
[junit] 08/12/30 18:24:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:49 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:49 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:49 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:24:49 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:24:49 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1770948783
[junit] 08/12/30 18:24:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:49 INFO exec.JoinOperator: Initializing Self
[junit] 08/12/30 18:24:49 INFO exec.JoinOperator: Initializing children:
[junit] 08/12/30 18:24:49 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:49 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:49 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:49 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:49 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:24:49 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:49 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:49 INFO exec.JoinOperator: Initialization Done
[junit] 08/12/30 18:24:49 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:24:49 INFO mapred.TaskRunner: Task 'reduce_spmoip' done.
[junit] 08/12/30 18:24:49 INFO mapred.TaskRunner: Saved output of task 'reduce_spmoip' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1770948783
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:24:49 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:24:49 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join17.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join17.q.out
[junit] Done query: join17.q
[junit] Begin query: input18.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input18(TestCliDriver.java:1453)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input_part3.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_part3(TestCliDriver.java:1478)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: groupby2.q
[junit] plan = /tmp/plan46129.xml
[junit] 08/12/30 18:24:53 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:24:53 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:24:53 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:24:54 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:24:54 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:54 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:54 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:24:54 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:24:54 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:24:54 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:24:54 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:24:54 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:24:54 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:24:54 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:24:54 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:24:54 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:24:54 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:24:55 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:55 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:24:55 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:24:55 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-2050431705
[junit] 08/12/30 18:24:55 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:24:55 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:24:55 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:24:55 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:24:55 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:24:55 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:24:55 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:24:55 INFO mapred.TaskRunner: Task 'reduce_mez3y6' done.
[junit] 08/12/30 18:24:55 INFO mapred.TaskRunner: Saved output of task 'reduce_mez3y6' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-2050431705
[junit] 08/12/30 18:24:55 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:24:55 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] plan = /tmp/plan46130.xml
[junit] 08/12/30 18:24:57 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:24:57 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/441857879/554106932.10001
[junit] 08/12/30 18:24:57 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:24:57 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:24:57 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:57 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:24:57 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:24:57 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:24:57 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/441857879/554106932.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/441857879/554106932.10001/reduce_mez3y6
[junit] 08/12/30 18:24:57 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:24:57 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:24:57 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:24:57 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:57 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:57 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:24:57 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/441857879/554106932.10001/reduce_mez3y6:0+566
[junit] 08/12/30 18:24:58 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:24:58 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1822339927
[junit] 08/12/30 18:24:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:24:58 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:24:58 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:24:58 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:58 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:58 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:24:58 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:24:58 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:24:58 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:58 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:24:58 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:24:58 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:24:58 INFO mapred.TaskRunner: Task 'reduce_8wioe0' done.
[junit] 08/12/30 18:24:58 INFO mapred.TaskRunner: Saved output of task 'reduce_8wioe0' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1822339927
[junit] map = 100%, reduce =100%
[junit] 08/12/30 18:24:58 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:24:58 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/groupby2.q.out
[junit] Done query: groupby2.q
[junit] Begin query: show_tables.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/show_tables.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/show_tables.q.out
[junit] Done query: show_tables.q
[junit] Begin query: input_part5.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_part5(TestCliDriver.java:1553)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: groupby4.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby4(TestCliDriver.java:1578)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: groupby6.q
[junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
[junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
[junit] at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby6(TestCliDriver.java:1603)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Begin query: input1_limit.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input1_limit(TestCliDriver.java:1628)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: groupby8.q
[junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
[junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
[junit] at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby8(TestCliDriver.java:1653)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Begin query: input2_limit.q
[junit] plan = /tmp/plan46131.xml
[junit] 08/12/30 18:25:03 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:25:03 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:25:03 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:25:03 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:25:04 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:25:04 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:25:04 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:25:04 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:25:04 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:25:04 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:25:04 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:25:04 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:25:04 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:25:04 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:25:04 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:04 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:04 INFO exec.LimitOperator: Initializing Self
[junit] 08/12/30 18:25:04 INFO exec.LimitOperator: Initializing children:
[junit] 08/12/30 18:25:04 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:25:04 INFO exec.LimitOperator: Initialization Done
[junit] 08/12/30 18:25:04 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:04 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:25:04 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:25:04 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:25:04 INFO exec.FilterOperator: PASSED:8
[junit] 08/12/30 18:25:04 INFO exec.FilterOperator: FILTERED:3
[junit] 08/12/30 18:25:04 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:25:04 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:25:04 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1297189940
[junit] 08/12/30 18:25:05 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:25:05 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input2_limit.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input2_limit.q.out
[junit] Done query: input2_limit.q
[junit] Begin query: input3_limit.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input3_limit(TestCliDriver.java:1703)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: create_1.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/create_1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/create_1.q.out
[junit] Done query: create_1.q
[junit] Begin query: scriptfile1.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_scriptfile1(TestCliDriver.java:1753)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: case_sensitivity.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_case_sensitivity(TestCliDriver.java:1778)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: sort.q
[junit] plan = /tmp/plan46132.xml
[junit] 08/12/30 18:25:09 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:25:09 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:25:09 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:25:09 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:25:09 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:25:10 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:25:10 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:25:10 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:25:10 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:25:10 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:25:10 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:25:10 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:25:10 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:10 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:10 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:25:10 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:25:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:10 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:10 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:25:10 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:25:10 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:25:10 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:25:10 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1353976538
[junit] 08/12/30 18:25:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:10 INFO exec.ExtractOperator: Initializing Self
[junit] 08/12/30 18:25:10 INFO exec.ExtractOperator: Initializing children:
[junit] 08/12/30 18:25:10 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:25:10 INFO exec.ExtractOperator: Initialization Done
[junit] 08/12/30 18:25:10 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:25:10 INFO mapred.TaskRunner: Task 'reduce_uxo0jk' done.
[junit] 08/12/30 18:25:10 INFO mapred.TaskRunner: Saved output of task 'reduce_uxo0jk' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1353976538
[junit] 08/12/30 18:25:11 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:25:11 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sort.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/sort.q.out
[junit] Done query: sort.q
[junit] Begin query: mapreduce2.q
[junit] plan = /tmp/plan46133.xml
[junit] 08/12/30 18:25:13 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:25:14 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:25:14 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:25:14 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:25:14 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:25:14 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:25:14 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:25:14 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:25:14 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:25:14 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:25:14 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:25:14 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:25:14 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:14 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:14 INFO exec.ScriptOperator: Initializing Self
[junit] 08/12/30 18:25:14 INFO exec.ScriptOperator: Initializing children:
[junit] 08/12/30 18:25:14 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:25:14 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:25:14 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:25:14 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:25:14 INFO exec.ScriptOperator: Initialization Done
[junit] 08/12/30 18:25:14 INFO exec.ScriptOperator: Executing [/bin/cat]
[junit] 08/12/30 18:25:14 INFO exec.ScriptOperator: tablename=src
[junit] 08/12/30 18:25:14 INFO exec.ScriptOperator: partname={}
[junit] 08/12/30 18:25:14 INFO exec.ScriptOperator: alias=src
[junit] 08/12/30 18:25:14 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:14 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:25:15 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:25:15 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
[junit] 08/12/30 18:25:15 INFO exec.ScriptOperator: StreamThread OutputProcessor done
[junit] 08/12/30 18:25:15 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
[junit] 08/12/30 18:25:15 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:25:15 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:25:15 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:25:15 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1523513773
[junit] 08/12/30 18:25:15 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:25:15 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:25:15 INFO exec.ExtractOperator: Initializing Self
[junit] 08/12/30 18:25:15 INFO exec.ExtractOperator: Initializing children:
[junit] 08/12/30 18:25:15 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:15 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:15 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:25:15 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:15 INFO exec.ExtractOperator: Initialization Done
[junit] 08/12/30 18:25:15 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:25:15 INFO mapred.TaskRunner: Task 'reduce_91wtpd' done.
[junit] 08/12/30 18:25:15 INFO mapred.TaskRunner: Saved output of task 'reduce_91wtpd' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1523513773
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:25:15 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:25:15 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46134.xml
[junit] 08/12/30 18:25:16 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:25:17 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1
[junit] 08/12/30 18:25:17 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:25:17 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:25:17 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:25:17 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:25:17 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:25:17 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:25:17 INFO exec.MapOperator: Adding alias t:dest1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1/reduce_91wtpd
[junit] 08/12/30 18:25:17 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:25:17 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:25:17 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:25:17 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:17 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:17 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:25:17 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:25:17 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
[junit] 08/12/30 18:25:17 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
[junit] 08/12/30 18:25:17 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:17 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:25:17 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:25:18 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1/reduce_91wtpd:0+8228
[junit] 08/12/30 18:25:18 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:25:18 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp290299346
[junit] 08/12/30 18:25:18 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
[junit] 08/12/30 18:25:18 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
[junit] 08/12/30 18:25:18 INFO exec.ExtractOperator: Initializing Self
[junit] 08/12/30 18:25:18 INFO exec.ExtractOperator: Initializing children:
[junit] 08/12/30 18:25:18 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:18 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:18 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:25:18 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:18 INFO exec.ExtractOperator: Initialization Done
[junit] 08/12/30 18:25:18 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:25:18 INFO mapred.TaskRunner: Task 'reduce_wo0ljj' done.
[junit] 08/12/30 18:25:18 INFO mapred.TaskRunner: Saved output of task 'reduce_wo0ljj' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp290299346
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:25:18 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:25:18 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/mapreduce2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/mapreduce2.q.out
[junit] Done query: mapreduce2.q
[junit] Begin query: mapreduce4.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_mapreduce4(TestCliDriver.java:1853)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: nullinput.q
[junit] plan = /tmp/plan46135.xml
[junit] 08/12/30 18:25:21 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:25:21 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/tstnullinut
[junit] 08/12/30 18:25:21 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] Job need not be submitted: no output: Success
[junit] 08/12/30 18:25:21 INFO exec.ExecDriver: Job need not be submitted: no output: Success
[junit] plan = /tmp/plan46136.xml
[junit] 08/12/30 18:25:22 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] Job need not be submitted: no output: Success
[junit] 08/12/30 18:25:22 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/339448325/69467753.10002
[junit] 08/12/30 18:25:22 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:25:22 INFO exec.ExecDriver: Job need not be submitted: no output: Success
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/nullinput.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/nullinput.q.out
[junit] Done query: nullinput.q
[junit] Begin query: mapreduce6.q
[junit] plan = /tmp/plan46137.xml
[junit] 08/12/30 18:25:25 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:25:25 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:25:25 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:25:25 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:25:25 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:25:26 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:25:26 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:25:26 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:25:26 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:25:26 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:25:26 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:25:26 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:25:26 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:26 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:26 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:25:26 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:25:26 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
[junit] 08/12/30 18:25:26 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
[junit] 08/12/30 18:25:26 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:26 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:25:26 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:25:26 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:25:26 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:25:26 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1611580891
[junit] 08/12/30 18:25:26 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
[junit] 08/12/30 18:25:26 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
[junit] 08/12/30 18:25:26 INFO exec.ExtractOperator: Initializing Self
[junit] 08/12/30 18:25:26 INFO exec.ExtractOperator: Initializing children:
[junit] 08/12/30 18:25:26 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:26 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:26 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:25:26 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:26 INFO exec.ExtractOperator: Initialization Done
[junit] 08/12/30 18:25:26 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:25:26 INFO mapred.TaskRunner: Task 'reduce_ziiome' done.
[junit] 08/12/30 18:25:26 INFO mapred.TaskRunner: Saved output of task 'reduce_ziiome' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1611580891
[junit] map = 100%, reduce =100%
[junit] 08/12/30 18:25:27 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:25:27 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/mapreduce6.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/mapreduce6.q.out
[junit] Done query: mapreduce6.q
[junit] Begin query: sample1.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample1(TestCliDriver.java:1928)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: sample3.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample3(TestCliDriver.java:1953)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: groupby1_map.q
[junit] plan = /tmp/plan46138.xml
[junit] 08/12/30 18:25:30 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:25:30 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:25:30 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:25:30 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:25:30 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:25:30 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:25:30 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:25:30 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:25:30 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:25:30 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:25:30 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:25:30 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:25:30 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:25:30 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:25:30 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:25:30 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:25:30 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:30 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:30 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:25:30 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:25:31 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:25:31 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:25:31 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:25:31 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-156306050
[junit] 08/12/30 18:25:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:31 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:25:31 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:25:31 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:25:31 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:25:31 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:25:31 INFO mapred.TaskRunner: Task 'reduce_d30q5z' done.
[junit] 08/12/30 18:25:31 INFO mapred.TaskRunner: Saved output of task 'reduce_d30q5z' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-156306050
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:25:31 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:25:31 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46139.xml
[junit] 08/12/30 18:25:32 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:25:33 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/413813196/19244402.10001
[junit] 08/12/30 18:25:33 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:25:33 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:25:33 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:25:33 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:25:33 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:25:33 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:25:33 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/413813196/19244402.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/413813196/19244402.10001/reduce_d30q5z
[junit] 08/12/30 18:25:33 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:25:33 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:25:33 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:25:33 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:33 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:33 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:25:34 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/413813196/19244402.10001/reduce_d30q5z:0+11875
[junit] 08/12/30 18:25:34 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:25:34 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1037926450
[junit] 08/12/30 18:25:34 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:34 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:34 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:25:34 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:25:34 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:34 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:34 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:34 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:34 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:25:34 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:34 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:34 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:25:34 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:25:34 INFO mapred.TaskRunner: Task 'reduce_j7msa' done.
[junit] 08/12/30 18:25:34 INFO mapred.TaskRunner: Saved output of task 'reduce_j7msa' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1037926450
[junit] map = 100%, reduce =100%
[junit] 08/12/30 18:25:34 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:25:34 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby1_map.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/groupby1_map.q.out
[junit] Done query: groupby1_map.q
[junit] Begin query: inputddl2.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/inputddl2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/inputddl2.q.out
[junit] Done query: inputddl2.q
[junit] Begin query: groupby1_limit.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby1_limit(TestCliDriver.java:2028)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: sample5.q
[junit] plan = /tmp/plan46140.xml
[junit] 08/12/30 18:25:38 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:25:38 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket
[junit] 08/12/30 18:25:38 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:25:38 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:25:38 INFO mapred.FileInputFormat: Total input paths to process : 2
[junit] 08/12/30 18:25:39 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:25:39 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:25:39 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:25:39 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt
[junit] 08/12/30 18:25:39 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:25:39 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:25:39 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:25:39 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:25:39 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:25:39 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:39 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:39 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:39 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:39 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:25:39 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:39 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:39 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:25:39 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:25:39 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:25:39 INFO exec.FilterOperator: PASSED:98
[junit] 08/12/30 18:25:39 INFO exec.FilterOperator: FILTERED:402
[junit] 08/12/30 18:25:39 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt:0+5812
[junit] 08/12/30 18:25:39 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:25:39 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1467829610
[junit] 08/12/30 18:25:39 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:25:39 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:25:39 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv2.txt
[junit] 08/12/30 18:25:39 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:25:39 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:25:39 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:25:39 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:25:39 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:25:39 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:39 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:39 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:39 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:39 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:25:39 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:39 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:39 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:25:39 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:25:39 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:25:39 INFO exec.FilterOperator: PASSED:207
[junit] 08/12/30 18:25:39 INFO exec.FilterOperator: FILTERED:793
[junit] 08/12/30 18:25:39 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv2.txt:0+5791
[junit] 08/12/30 18:25:39 INFO mapred.TaskRunner: Task 'job_local_1_map_0001' done.
[junit] 08/12/30 18:25:39 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0001' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1467829610
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:25:40 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:25:40 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample5.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/sample5.q.out
[junit] Done query: sample5.q
[junit] Begin query: groupby3_map.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby3_map(TestCliDriver.java:2078)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] Begin query: groupby2_limit.q
[junit] plan = /tmp/plan46141.xml
[junit] 08/12/30 18:25:43 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 31
[junit] 08/12/30 18:25:43 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:25:43 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:25:43 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:25:43 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:25:43 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:25:44 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:25:44 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:25:44 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:25:44 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:25:44 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:25:44 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:25:44 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:25:44 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:25:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:44 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:25:44 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:25:44 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:25:44 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:25:44 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-398452428
[junit] 08/12/30 18:25:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:44 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:25:44 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:25:44 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:25:44 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:25:44 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:25:44 INFO mapred.TaskRunner: Task 'reduce_g79kdk' done.
[junit] 08/12/30 18:25:44 INFO mapred.TaskRunner: Saved output of task 'reduce_g79kdk' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-398452428
[junit] 08/12/30 18:25:44 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:25:44 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] plan = /tmp/plan46142.xml
[junit] 08/12/30 18:25:46 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 31
[junit] 08/12/30 18:25:46 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/137916517/39993323.10002
[junit] 08/12/30 18:25:46 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:25:46 WARN exec.ExecDriver: Number of reduce tasks inferred based on input size to : 1
[junit] 08/12/30 18:25:46 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:25:46 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:25:47 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:25:47 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:25:47 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:25:47 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/137916517/39993323.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/137916517/39993323.10002/reduce_g79kdk
[junit] 08/12/30 18:25:47 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:25:47 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:25:47 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:25:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:47 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:25:47 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/137916517/39993323.10002/reduce_g79kdk:0+11875
[junit] 08/12/30 18:25:47 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:25:47 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1494460046
[junit] 08/12/30 18:25:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:25:47 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:25:47 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:25:47 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:47 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:47 INFO exec.LimitOperator: Initializing Self
[junit] 08/12/30 18:25:47 INFO exec.LimitOperator: Initializing children:
[junit] 08/12/30 18:25:47 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:25:47 INFO exec.LimitOperator: Initialization Done
[junit] 08/12/30 18:25:47 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:47 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:25:47 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:25:47 INFO mapred.TaskRunner: Task 'reduce_xxgisq' done.
[junit] 08/12/30 18:25:47 INFO mapred.TaskRunner: Saved output of task 'reduce_xxgisq' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1494460046
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:25:48 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:25:48 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby2_limit.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/groupby2_limit.q.out
[junit] Done query: groupby2_limit.q
[junit] Begin query: inputddl4.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/inputddl4.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/inputddl4.q.out
[junit] Done query: inputddl4.q
[junit] Begin query: sample7.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample7(TestCliDriver.java:2153)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: groupby5_map.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby5_map(TestCliDriver.java:2178)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: inputddl6.q
[junit] Exception: Client Execution failed with error code = 9
[junit] junit.framework.AssertionFailedError: Client Execution failed with error code = 9
[junit] at junit.framework.Assert.fail(Assert.java:47)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_inputddl6(TestCliDriver.java:2206)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Begin query: input16_cc.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input16_cc(TestCliDriver.java:2228)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: cast1.q
[junit] plan = /tmp/plan46143.xml
[junit] 08/12/30 18:25:55 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:25:55 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:25:55 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:25:55 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:25:55 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:25:55 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:25:55 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:25:55 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:25:55 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:25:56 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:25:56 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:25:56 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:25:56 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:56 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:56 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:25:56 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:25:56 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:56 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:56 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:25:56 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:25:56 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:25:56 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:56 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:56 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:25:56 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:25:56 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:25:56 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:25:56 INFO exec.FilterOperator: FILTERED:499
[junit] 08/12/30 18:25:56 INFO exec.FilterOperator: PASSED:1
[junit] 08/12/30 18:25:56 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:25:56 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:25:56 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1048820941
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:25:56 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:25:56 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/cast1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/cast1.q.out
[junit] Done query: cast1.q
[junit] Begin query: inputddl8.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/inputddl8.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/inputddl8.q.out
[junit] Done query: inputddl8.q
[junit] Begin query: quote1.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_quote1(TestCliDriver.java:2303)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: join0.q
[junit] plan = /tmp/plan46144.xml
[junit] 08/12/30 18:26:01 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:26:01 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:26:01 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:26:01 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:26:02 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:26:02 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:02 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:26:02 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:26:02 INFO exec.MapOperator: Adding alias src2:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:26:02 INFO exec.MapOperator: Adding alias src1:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:26:02 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:26:02 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:26:02 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:26:02 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:26:02 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:26:02 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:02 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:02 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:26:02 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:26:02 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:26:02 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:26:02 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:02 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:26:02 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:26:02 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:26:02 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:26:02 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:26:02 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:26:02 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:02 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:02 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:26:02 INFO exec.ReduceSinkOperator: Using tag = 0
[junit] 08/12/30 18:26:02 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:26:02 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:26:02 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:02 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:26:02 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:26:02 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:26:02 INFO exec.FilterOperator: PASSED:10
[junit] 08/12/30 18:26:02 INFO exec.FilterOperator: FILTERED:490
[junit] 08/12/30 18:26:02 INFO exec.FilterOperator: PASSED:10
[junit] 08/12/30 18:26:02 INFO exec.FilterOperator: FILTERED:490
[junit] 08/12/30 18:26:02 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:26:02 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:26:02 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp820362062
[junit] 08/12/30 18:26:02 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:26:02 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:26:02 INFO exec.JoinOperator: Initializing Self
[junit] 08/12/30 18:26:02 INFO exec.JoinOperator: Initializing children:
[junit] 08/12/30 18:26:02 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:02 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:02 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:26:02 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:02 INFO exec.JoinOperator: Initialization Done
[junit] 08/12/30 18:26:02 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:26:02 INFO mapred.TaskRunner: Task 'reduce_gw16th' done.
[junit] 08/12/30 18:26:02 INFO mapred.TaskRunner: Saved output of task 'reduce_gw16th' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp820362062
[junit] map = 100%, reduce =100%
[junit] 08/12/30 18:26:03 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:26:03 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46145.xml
[junit] 08/12/30 18:26:04 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:26:05 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/495826568/550220418.10002
[junit] 08/12/30 18:26:05 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:26:05 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:26:05 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:26:05 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:05 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:26:05 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:26:05 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/495826568/550220418.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/495826568/550220418.10002/reduce_gw16th
[junit] 08/12/30 18:26:05 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:26:05 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:26:05 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:26:05 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
[junit] 08/12/30 18:26:05 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
[junit] 08/12/30 18:26:05 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:26:05 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/495826568/550220418.10002/reduce_gw16th:0+5836
[junit] 08/12/30 18:26:05 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:26:05 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp149212334
[junit] 08/12/30 18:26:05 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
[junit] 08/12/30 18:26:05 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
[junit] 08/12/30 18:26:06 INFO exec.ExtractOperator: Initializing Self
[junit] 08/12/30 18:26:06 INFO exec.ExtractOperator: Initializing children:
[junit] 08/12/30 18:26:06 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:26:06 INFO exec.ExtractOperator: Initialization Done
[junit] 08/12/30 18:26:06 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:26:06 INFO mapred.TaskRunner: Task 'reduce_4i7vao' done.
[junit] 08/12/30 18:26:06 INFO mapred.TaskRunner: Saved output of task 'reduce_4i7vao' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp149212334
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:26:06 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:26:06 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join0.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join0.q.out
[junit] Done query: join0.q
[junit] Begin query: notable_alias2.q
[junit] plan = /tmp/plan46146.xml
[junit] 08/12/30 18:26:09 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:26:10 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:26:10 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:26:10 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:26:10 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:10 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:10 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:26:10 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:26:10 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:26:10 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:26:10 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:26:10 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:26:10 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:10 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:10 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:26:10 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:26:10 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:26:10 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:26:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:10 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:26:10 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:10 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:26:10 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:26:10 INFO exec.FilterOperator: FILTERED:416
[junit] 08/12/30 18:26:10 INFO exec.FilterOperator: PASSED:84
[junit] 08/12/30 18:26:10 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:26:10 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:26:10 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp45824710
[junit] 08/12/30 18:26:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:10 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:26:10 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:26:10 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:26:10 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:26:10 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:26:10 INFO mapred.TaskRunner: Task 'reduce_egwgbg' done.
[junit] 08/12/30 18:26:10 INFO mapred.TaskRunner: Saved output of task 'reduce_egwgbg' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp45824710
[junit] 08/12/30 18:26:11 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:26:11 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] plan = /tmp/plan46147.xml
[junit] 08/12/30 18:26:12 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:26:13 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/264875941/445589740.10001
[junit] 08/12/30 18:26:13 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:26:13 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:26:13 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:13 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:13 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:26:13 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:26:13 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/264875941/445589740.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/264875941/445589740.10001/reduce_egwgbg
[junit] 08/12/30 18:26:13 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:26:13 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:26:13 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:26:13 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:13 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:13 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:26:13 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/264875941/445589740.10001/reduce_egwgbg:0+2219
[junit] 08/12/30 18:26:13 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:26:13 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp427966293
[junit] 08/12/30 18:26:14 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:14 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:14 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:26:14 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:26:14 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:14 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:14 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:14 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:14 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:26:14 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:14 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:14 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:26:14 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:26:14 INFO mapred.TaskRunner: Task 'reduce_edn2ov' done.
[junit] 08/12/30 18:26:14 INFO mapred.TaskRunner: Saved output of task 'reduce_edn2ov' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp427966293
[junit] 08/12/30 18:26:14 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:26:14 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/notable_alias2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/notable_alias2.q.out
[junit] Done query: notable_alias2.q
[junit] Begin query: input1.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input1.q.out
[junit] Done query: input1.q
[junit] Begin query: cluster.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_cluster(TestCliDriver.java:2403)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: join2.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join2(TestCliDriver.java:2428)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input3.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input3.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input3.q.out
[junit] 6c6
[junit] < inputddl6 src src1 src_sequencefile src_thrift srcbucket srcpart test3a test3b
[junit] ---
[junit] > src src1 src_sequencefile src_thrift srcbucket srcpart test3a test3b
[junit] 45c45
[junit] < inputddl6 src src1 src_sequencefile src_thrift srcbucket srcpart test3a test3c
[junit] ---
[junit] > src src1 src_sequencefile src_thrift srcbucket srcpart test3a test3c
[junit] Exception: Client execution results dailed with error code = 1
[junit] junit.framework.AssertionFailedError: Client execution results dailed with error code = 1
[junit] at junit.framework.Assert.fail(Assert.java:47)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input3(TestCliDriver.java:2461)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Begin query: join4.q
[junit] plan = /tmp/plan46148.xml
[junit] 08/12/30 18:26:21 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:26:21 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:26:21 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:26:21 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:26:22 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:22 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:22 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:26:22 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:26:22 INFO exec.MapOperator: Adding alias c:a:src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:26:22 INFO exec.MapOperator: Adding alias c:b:src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:26:22 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:26:22 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:26:22 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:26:22 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:26:22 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:26:22 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:22 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:22 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:26:22 INFO exec.ReduceSinkOperator: Using tag = 0
[junit] 08/12/30 18:26:22 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:22 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:22 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:22 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:26:22 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:26:22 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:26:22 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:26:22 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:26:22 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:26:22 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:22 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:22 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:26:22 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:26:22 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:22 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:22 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:22 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:26:22 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:26:22 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:26:22 INFO exec.FilterOperator: FILTERED:491
[junit] 08/12/30 18:26:22 INFO exec.FilterOperator: PASSED:9
[junit] 08/12/30 18:26:22 INFO exec.FilterOperator: FILTERED:493
[junit] 08/12/30 18:26:22 INFO exec.FilterOperator: PASSED:7
[junit] 08/12/30 18:26:22 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:26:22 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:26:22 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-528585012
[junit] 08/12/30 18:26:22 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:22 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:22 INFO exec.JoinOperator: Initializing Self
[junit] 08/12/30 18:26:22 INFO exec.JoinOperator: Initializing children:
[junit] 08/12/30 18:26:22 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:22 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:22 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:22 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:22 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:22 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:22 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:26:22 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:22 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:22 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:22 INFO exec.JoinOperator: Initialization Done
[junit] 08/12/30 18:26:22 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:26:22 INFO mapred.TaskRunner: Task 'reduce_yxhdvo' done.
[junit] 08/12/30 18:26:22 INFO mapred.TaskRunner: Saved output of task 'reduce_yxhdvo' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-528585012
[junit] 08/12/30 18:26:23 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:26:23 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join4.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join4.q.out
[junit] Done query: join4.q
[junit] Begin query: input5.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input5(TestCliDriver.java:2503)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: join6.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join6(TestCliDriver.java:2528)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input_testxpath3.q
[junit] plan = /tmp/plan46149.xml
[junit] 08/12/30 18:26:28 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:26:28 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src_thrift
[junit] 08/12/30 18:26:28 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:26:28 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:26:28 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:29 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:29 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:26:29 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:26:29 INFO exec.MapOperator: Adding alias src_thrift to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src_thrift/complex.seq
[junit] 08/12/30 18:26:29 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:26:29 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:26:29 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:26:29 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:29 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:29 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:29 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:29 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:26:29 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:29 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:29 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:26:29 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:26:29 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src_thrift/complex.seq:0+1491
[junit] 08/12/30 18:26:29 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:26:29 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1340654191
[junit] 08/12/30 18:26:30 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:26:30 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_testxpath3.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_testxpath3.q.out
[junit] Done query: input_testxpath3.q
[junit] Begin query: input_dynamicserde.q
[junit] plan = /tmp/plan46150.xml
[junit] 08/12/30 18:26:34 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:26:34 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src_thrift
[junit] 08/12/30 18:26:34 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:26:34 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:26:35 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:26:35 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:35 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:26:35 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:26:35 INFO exec.MapOperator: Adding alias src_thrift to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src_thrift/complex.seq
[junit] 08/12/30 18:26:35 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:26:35 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:26:35 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:26:35 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:35 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:35 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:35 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:35 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:26:35 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:35 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:35 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:26:35 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:26:35 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src_thrift/complex.seq:0+1491
[junit] 08/12/30 18:26:35 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:26:35 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp165631019
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:26:36 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:26:36 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46151.xml
[junit] 08/12/30 18:26:37 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:26:38 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1
[junit] 08/12/30 18:26:38 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:26:38 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:26:38 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:38 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:38 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:26:38 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:26:38 INFO exec.MapOperator: Adding alias dest1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1/job_local_1_map_0000
[junit] 08/12/30 18:26:38 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:26:38 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:26:38 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:26:38 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:38 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:38 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:26:38 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:38 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:26:38 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:26:38 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1/job_local_1_map_0000:0+532
[junit] 08/12/30 18:26:38 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:26:38 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp11526998
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:26:39 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:26:39 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_dynamicserde.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_dynamicserde.q.out
[junit] Done query: input_dynamicserde.q
[junit] Begin query: input7.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input7(TestCliDriver.java:2603)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
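The recurring failure above comes from table cleanup: the metastore's Warehouse.deleteDir reports it could not remove the warehouse `src` directory, and QTestUtil.cleanUp aborts the test before the query runs. As a rough illustration of the failure mode (an assumption, not the actual Hive code): Hadoop-style delete calls return false on partial failure, e.g. when another process still holds a file in the tree, so one lingering file is enough to surface as this MetaException. A minimal local-filesystem sketch:

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

// Hypothetical sketch of a Warehouse.deleteDir-style recursive delete.
// Like Hadoop's FileSystem.delete(path, true), it signals failure by
// returning false rather than throwing; the caller (here, the metastore)
// would then raise "Unable to delete directory: ...".
public class DeleteDirSketch {

    // Delete dir and everything under it; false if any entry survives.
    static boolean deleteDir(File dir) {
        File[] children = dir.listFiles();
        if (children != null) {
            for (File child : children) {
                if (!deleteDir(child)) {
                    return false;  // stop at the first undeletable entry
                }
            }
        }
        return dir.delete();
    }

    public static void main(String[] args) throws IOException {
        // Build a tiny stand-in for build/ql/test/data/warehouse/src
        File root = Files.createTempDirectory("warehouse-src").toFile();
        Files.createFile(new File(root, "kv1.txt").toPath());
        boolean ok = deleteDir(root);
        System.out.println(ok && !root.exists());
    }
}
```

Under this model, an EBUSY/permission problem on any file in the tree makes deleteDir return false, which matches the intermittent per-test pattern seen in this run (some tests clean up fine, others hit the same directory and fail).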
[junit] Begin query: input_testsequencefile.q
[junit] plan = /tmp/plan46152.xml
[junit] 08/12/30 18:26:44 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:26:44 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:26:44 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:26:44 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:26:44 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:44 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:44 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:26:44 INFO util.NativeCodeLoader: Loaded the native-hadoop library
[junit] 08/12/30 18:26:44 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
[junit] 08/12/30 18:26:44 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:26:44 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:26:44 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:26:44 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:26:44 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:26:44 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:44 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:44 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:44 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:44 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:26:44 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:44 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:44 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:26:45 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:26:45 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:26:45 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:26:45 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1451888627
[junit] map = 100%, reduce =0%
[junit] 08/12/30 18:26:45 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:26:45 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_testsequencefile.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_testsequencefile.q.out
[junit] Done query: input_testsequencefile.q
[junit] Begin query: join8.q
[junit] plan = /tmp/plan46153.xml
[junit] 08/12/30 18:26:50 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:26:50 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:26:50 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:26:50 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:26:50 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:26:50 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:26:50 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:26:51 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:26:51 INFO exec.MapOperator: Adding alias c:a:src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:26:51 INFO exec.MapOperator: Adding alias c:b:src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:26:51 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:26:51 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:26:51 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:26:51 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:26:51 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:26:51 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:51 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:51 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:26:51 INFO exec.ReduceSinkOperator: Using tag = 0
[junit] 08/12/30 18:26:51 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:51 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:51 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:51 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:26:51 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:26:51 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:26:51 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:26:51 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:26:51 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:26:51 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:51 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:51 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:26:51 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:26:51 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:51 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:51 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:51 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:26:51 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:26:51 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:26:51 INFO exec.FilterOperator: PASSED:9
[junit] 08/12/30 18:26:51 INFO exec.FilterOperator: FILTERED:491
[junit] 08/12/30 18:26:51 INFO exec.FilterOperator: PASSED:7
[junit] 08/12/30 18:26:51 INFO exec.FilterOperator: FILTERED:493
[junit] 08/12/30 18:26:51 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:26:51 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:26:51 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1983302254
[junit] 08/12/30 18:26:51 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:51 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:26:51 INFO exec.JoinOperator: Initializing Self
[junit] 08/12/30 18:26:51 INFO exec.JoinOperator: Initializing children:
[junit] 08/12/30 18:26:51 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:51 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:51 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:26:51 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:26:51 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:51 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:51 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:26:51 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:26:51 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:26:51 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:51 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:51 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:26:51 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:26:51 INFO exec.JoinOperator: Initialization Done
[junit] 08/12/30 18:26:51 INFO exec.FilterOperator: PASSED:5
[junit] 08/12/30 18:26:51 INFO exec.FilterOperator: FILTERED:6
[junit] 08/12/30 18:26:51 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:26:51 INFO mapred.TaskRunner: Task 'reduce_68fkmo' done.
[junit] 08/12/30 18:26:51 INFO mapred.TaskRunner: Saved output of task 'reduce_68fkmo' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1983302254
[junit] 08/12/30 18:26:51 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:26:51 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join8.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join8.q.out
[junit] Done query: join8.q
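Each passing query above ends with a `diff -I '\(file:\)\|\(/tmp/.*\)'` between the fresh `.q.out` log and the checked-in golden result: the `-I` regex masks machine-specific `file:` URIs and `/tmp` plan paths so only real result changes fail the test. A simplified line-by-line model of that comparison (a sketch: real `diff -I` ignores whole hunks whose changed lines all match the regex, not individual lines):

```java
import java.util.List;
import java.util.regex.Pattern;

// Simplified model of the harness's golden-file check: lines carrying
// volatile content (file: URIs, /tmp plan paths) compare equal
// regardless of their actual text; everything else must match exactly.
public class GoldenCompare {
    static final Pattern VOLATILE = Pattern.compile("(file:)|(/tmp/.*)");

    static boolean matches(List<String> actual, List<String> expected) {
        if (actual.size() != expected.size()) {
            return false;
        }
        for (int i = 0; i < actual.size(); i++) {
            String a = actual.get(i);
            String e = expected.get(i);
            // Both lines volatile: treat as equal (masked by -I).
            if (VOLATILE.matcher(a).find() && VOLATILE.matcher(e).find()) {
                continue;
            }
            if (!a.equals(e)) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        List<String> out  = List.of("500 rows", "plan = /tmp/plan46153.xml");
        List<String> gold = List.of("500 rows", "plan = /tmp/plan99999.xml");
        System.out.println(matches(out, gold));
    }
}
```

This is why the per-run `/tmp/planNNNNN.xml` names and absolute build paths in the log never cause a diff failure; only a genuine change in query output does.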
[junit] Begin query: input9.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input9(TestCliDriver.java:2678)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input_dfs.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_dfs.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_dfs.q.out
[junit] Done query: input_dfs.q
[junit] Begin query: input.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input.q.out
[junit] Done query: input.q
[junit] Begin query: udf1.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_udf1(TestCliDriver.java:2753)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: join10.q
[junit] plan = /tmp/plan46154.xml
[junit] 08/12/30 18:27:02 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:27:02 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:27:02 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:27:02 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:27:02 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:27:02 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:27:02 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:27:02 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:27:02 INFO exec.MapOperator: Adding alias x:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:27:02 INFO exec.MapOperator: Adding alias y:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:27:02 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:27:02 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:27:02 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:27:02 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:02 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:02 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:02 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:02 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:27:02 INFO exec.ReduceSinkOperator: Using tag = 0
[junit] 08/12/30 18:27:02 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:02 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:02 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:02 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:02 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:27:02 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:27:02 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:27:02 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:02 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:02 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:27:02 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:27:02 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:02 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:02 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:02 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:27:03 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:27:03 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:27:03 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:27:03 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1695772186
[junit] 08/12/30 18:27:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:03 INFO exec.JoinOperator: Initializing Self
[junit] 08/12/30 18:27:03 INFO exec.JoinOperator: Initializing children:
[junit] 08/12/30 18:27:03 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:03 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:03 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:27:03 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:03 INFO exec.JoinOperator: Initialization Done
[junit] 08/12/30 18:27:03 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] map = 100%, reduce =0%
[junit] 08/12/30 18:27:03 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:27:03 INFO mapred.TaskRunner: Task 'reduce_c8abu2' done.
[junit] 08/12/30 18:27:03 INFO mapred.TaskRunner: Saved output of task 'reduce_c8abu2' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1695772186
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:27:04 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:27:04 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join10.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join10.q.out
[junit] Done query: join10.q
[junit] Begin query: input11.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input11(TestCliDriver.java:2803)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: noalias_subq1.q
[junit] plan = /tmp/plan46155.xml
[junit] 08/12/30 18:27:09 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:27:09 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:27:09 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:27:09 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:27:09 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:27:09 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:27:09 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:27:09 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:27:09 INFO exec.MapOperator: Adding alias x:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:27:09 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:27:09 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:27:09 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:27:09 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:09 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:09 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:27:09 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:27:09 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:09 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:09 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:27:09 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:09 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:27:09 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:09 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:27:10 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:27:10 INFO exec.FilterOperator: PASSED:84
[junit] 08/12/30 18:27:10 INFO exec.FilterOperator: FILTERED:416
[junit] 08/12/30 18:27:10 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:27:10 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:27:10 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1271076448
[junit] 08/12/30 18:27:10 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:27:10 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/noalias_subq1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/noalias_subq1.q.out
[junit] Done query: noalias_subq1.q
[junit] Begin query: udf3.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_udf3(TestCliDriver.java:2853)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: join12.q
[junit] plan = /tmp/plan46156.xml
[junit] 08/12/30 18:27:15 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:27:15 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:27:15 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:27:15 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:27:16 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:27:16 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:27:16 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:27:16 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:27:16 INFO exec.MapOperator: Adding alias src2:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:27:16 INFO exec.MapOperator: Adding alias src1:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:27:16 INFO exec.MapOperator: Adding alias src3:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:27:16 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:27:16 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:27:16 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:27:16 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:16 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:16 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:27:16 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:27:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:16 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:16 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:27:16 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:27:16 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:27:16 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:16 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:16 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:27:16 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:27:16 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:27:16 INFO exec.ReduceSinkOperator: Using tag = 0
[junit] 08/12/30 18:27:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:16 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:27:16 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:16 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:27:16 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:27:16 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:27:16 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:16 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:16 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:27:16 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:27:16 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:27:16 INFO exec.ReduceSinkOperator: Using tag = 2
[junit] 08/12/30 18:27:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:16 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:27:16 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:16 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:27:16 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:27:16 INFO exec.FilterOperator: PASSED:84
[junit] 08/12/30 18:27:16 INFO exec.FilterOperator: FILTERED:416
[junit] 08/12/30 18:27:16 INFO exec.FilterOperator: PASSED:64
[junit] 08/12/30 18:27:16 INFO exec.FilterOperator: FILTERED:436
[junit] 08/12/30 18:27:16 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:27:16 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:27:16 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-959783140
[junit] 08/12/30 18:27:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:16 INFO exec.JoinOperator: Initializing Self
[junit] 08/12/30 18:27:16 INFO exec.JoinOperator: Initializing children:
[junit] 08/12/30 18:27:16 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:16 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:16 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:27:17 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:17 INFO exec.JoinOperator: Initialization Done
[junit] 08/12/30 18:27:17 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:27:17 INFO mapred.TaskRunner: Task 'reduce_koqjg9' done.
[junit] 08/12/30 18:27:17 INFO mapred.TaskRunner: Saved output of task 'reduce_koqjg9' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-959783140
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:27:17 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:27:17 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join12.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join12.q.out
[junit] Done query: join12.q
[junit] Begin query: input_testxpath.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_testxpath(TestCliDriver.java:2903)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input13.q
[junit] plan = /tmp/plan46157.xml
[junit] 08/12/30 18:27:22 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:27:22 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:27:22 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:27:22 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:27:22 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:27:23 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:27:23 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:27:23 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:27:23 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:27:23 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:23 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:23 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:23 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:23 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:27:23 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:27:23 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:27:23 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: PASSED:84
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: FILTERED:416
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: PASSED:105
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: FILTERED:395
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: PASSED:103
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: FILTERED:397
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: PASSED:208
[junit] 08/12/30 18:27:23 INFO exec.FilterOperator: FILTERED:292
[junit] 08/12/30 18:27:23 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:27:23 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:27:23 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp386172647
[junit] 08/12/30 18:27:24 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:27:24 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input13.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input13.q.out
[junit] Done query: input13.q
[junit] Begin query: udf5.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_udf5(TestCliDriver.java:2953)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] Begin query: join14.q
[junit] plan = /tmp/plan46158.xml
[junit] 08/12/30 18:27:30 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:27:30 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11
[junit] 08/12/30 18:27:30 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12
[junit] 08/12/30 18:27:30 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:27:30 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:27:30 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:27:30 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:27:30 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:27:30 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:27:30 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:27:30 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:27:31 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:27:31 INFO exec.MapOperator: Adding alias srcpart to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt
[junit] 08/12/30 18:27:31 INFO exec.MapOperator: Got partitions: ds/hr
[junit] 08/12/30 18:27:31 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:27:31 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:27:31 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:31 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:31 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:27:31 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:27:31 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:27:31 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:27:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:31 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:27:31 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:31 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:27:31 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:27:31 INFO exec.FilterOperator: FILTERED:0
[junit] 08/12/30 18:27:31 INFO exec.FilterOperator: PASSED:500
[junit] 08/12/30 18:27:31 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt:0+5812
[junit] 08/12/30 18:27:31 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:27:31 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1198761913
[junit] 08/12/30 18:27:31 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:27:31 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:27:31 INFO exec.MapOperator: Adding alias srcpart to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt
[junit] 08/12/30 18:27:31 INFO exec.MapOperator: Got partitions: ds/hr
[junit] 08/12/30 18:27:31 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:27:31 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:27:31 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:31 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:31 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:27:31 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:27:31 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:27:31 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:27:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:31 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:27:31 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:31 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:27:31 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:27:31 INFO exec.FilterOperator: FILTERED:0
[junit] 08/12/30 18:27:31 INFO exec.FilterOperator: PASSED:1000
[junit] 08/12/30 18:27:31 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt:0+5812
[junit] 08/12/30 18:27:31 INFO mapred.TaskRunner: Task 'job_local_1_map_0001' done.
[junit] 08/12/30 18:27:31 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0001' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1198761913
[junit] 08/12/30 18:27:31 INFO mapred.MapTask: numReduceTasks: 1
[junit] map = 100%, reduce =0%
[junit] 08/12/30 18:27:31 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:27:31 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:27:31 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:27:31 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:27:31 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:27:31 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:27:31 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:31 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:31 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:27:31 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:27:31 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:27:31 INFO exec.ReduceSinkOperator: Using tag = 0
[junit] 08/12/30 18:27:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:31 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:27:31 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:31 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:27:31 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:27:31 INFO exec.FilterOperator: FILTERED:86
[junit] 08/12/30 18:27:31 INFO exec.FilterOperator: PASSED:414
[junit] 08/12/30 18:27:31 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:27:31 INFO mapred.TaskRunner: Task 'job_local_1_map_0002' done.
[junit] 08/12/30 18:27:31 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0002' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1198761913
[junit] 08/12/30 18:27:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:32 INFO exec.JoinOperator: Initializing Self
[junit] 08/12/30 18:27:32 INFO exec.JoinOperator: Initializing children:
[junit] 08/12/30 18:27:32 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:32 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:32 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:32 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:32 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:27:32 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:32 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:32 INFO exec.JoinOperator: Initialization Done
[junit] 08/12/30 18:27:32 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:27:32 INFO mapred.TaskRunner: Task 'reduce_z1qirf' done.
[junit] 08/12/30 18:27:32 INFO mapred.TaskRunner: Saved output of task 'reduce_z1qirf' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1198761913
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:27:32 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:27:32 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join14.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join14.q.out
[junit] Done query: join14.q
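
[A note on the `diff` invocations above: the test harness validates each query by diffing the fresh `.q.out` log against the checked-in golden file, using `-I` to ignore lines that contain transient paths. A minimal sketch of how `diff -I` suppresses such differences, with hypothetical file contents:]

```shell
# Two query outputs that differ only in a transient /tmp scratch path
# (hypothetical contents, standing in for the .q.out files above).
printf 'result: 42\n/tmp/plan123.xml\n' > expected.txt
printf 'result: 42\n/tmp/plan999.xml\n' > actual.txt

# -I PATTERN makes diff ignore hunks whose changed lines all match PATTERN,
# so run-specific paths do not cause a spurious test failure.
if diff -I '/tmp/.*' expected.txt actual.txt > /dev/null; then
  echo "outputs match"
fi
```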
[junit] Begin query: input_part0.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_part0(TestCliDriver.java:3003)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
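
[The `MetaException: Unable to delete directory` above comes from `Warehouse.deleteDir` failing during the pre-test `dropTable` cleanup, which is consistent with a stale file being left under the warehouse directory by an earlier run. A small shell analogy of that failure mode, using hypothetical paths (not the harness's actual cleanup code):]

```shell
# A scratch "warehouse" directory holding a leftover data file, mimicking
# state left behind by a previous test run (hypothetical paths).
mkdir -p warehouse/src
touch warehouse/src/kv1.txt

# A non-recursive directory delete fails while the leftover file exists --
# the same class of failure the stack trace above reports.
if ! rmdir warehouse/src 2>/dev/null; then
  echo "unable to delete directory"
fi

# Only a recursive delete clears it.
rm -r warehouse/src && echo "deleted"
```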
[junit] Begin query: input15.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input15.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input15.q.out
[junit] Done query: input15.q
[junit] Begin query: join16.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join16.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join16.q.out
[junit] Done query: join16.q
[junit] Begin query: input_part2.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_part2(TestCliDriver.java:3078)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input17.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input17(TestCliDriver.java:3103)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: groupby1.q
[junit] plan = /tmp/plan46159.xml
[junit] 08/12/30 18:27:44 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:27:44 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:27:44 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:27:44 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:27:44 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:27:45 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:27:45 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:27:45 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:27:45 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:27:45 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:27:45 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:27:45 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:27:45 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:27:45 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:27:45 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:45 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:45 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:27:45 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:27:45 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:27:45 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:27:45 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp698729237
[junit] 08/12/30 18:27:45 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:45 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:45 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:27:45 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:27:45 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:27:45 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:27:45 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:27:45 INFO mapred.TaskRunner: Task 'reduce_eil0uv' done.
[junit] 08/12/30 18:27:45 INFO mapred.TaskRunner: Saved output of task 'reduce_eil0uv' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp698729237
[junit] map = 100%, reduce =100%
[junit] 08/12/30 18:27:46 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:27:46 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46160.xml
[junit] 08/12/30 18:27:47 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:27:47 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/49797106/485324406.10001
[junit] 08/12/30 18:27:47 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:27:47 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:27:47 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:27:48 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:27:48 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:27:48 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:27:48 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/49797106/485324406.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/49797106/485324406.10001/reduce_eil0uv
[junit] 08/12/30 18:27:48 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:27:48 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:27:48 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:27:48 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:48 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:48 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:27:48 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/49797106/485324406.10001/reduce_eil0uv:0+11875
[junit] 08/12/30 18:27:48 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:27:48 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-422154870
[junit] 08/12/30 18:27:48 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:48 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:48 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:27:48 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:27:48 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:48 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:48 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:48 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:48 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:27:48 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:48 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:48 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:27:48 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:27:48 INFO mapred.TaskRunner: Task 'reduce_pn0b49' done.
[junit] 08/12/30 18:27:48 INFO mapred.TaskRunner: Saved output of task 'reduce_pn0b49' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-422154870
[junit] 08/12/30 18:27:49 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:27:49 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/groupby1.q.out
[junit] Done query: groupby1.q
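
[The groupby1.q run above executes as two chained local MapReduce jobs: a ReduceSink shuffle feeding a GroupByOperator for partial aggregation, then a second job for the final merge. Purely as an illustration of that shuffle-then-aggregate shape (hypothetical data, not Hive's actual execution):]

```shell
# Toy input: one grouping key per line.
printf 'a\nb\na\nc\nb\na\n' > keys.txt

# "Shuffle" (sort by key) followed by "reduce" (count per group) -- the same
# shape as the ReduceSink -> GroupBy pipeline in the log above.
sort keys.txt | uniq -c | awk '{print $2, $1}'
```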
[junit] Begin query: join18.q
[junit] plan = /tmp/plan46161.xml
[junit] 08/12/30 18:27:51 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:27:51 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1
[junit] 08/12/30 18:27:51 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:27:51 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:27:52 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:27:52 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:27:52 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:27:52 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:27:52 INFO exec.MapOperator: Adding alias b:src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1/kv3.txt
[junit] 08/12/30 18:27:52 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:27:52 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:27:52 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:27:52 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:27:52 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:27:52 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:27:52 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:27:52 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:27:52 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:27:52 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1/kv3.txt:0+216
[junit] 08/12/30 18:27:52 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:27:52 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1934003619
[junit] 08/12/30 18:27:52 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:27:52 INFO thrift.TBinarySortableProtocol: Sort order is "++"
[junit] 08/12/30 18:27:52 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:27:52 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:27:52 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:27:52 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:27:52 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:27:52 INFO mapred.TaskRunner: Task 'reduce_4bdwy5' done.
[junit] 08/12/30 18:27:52 INFO mapred.TaskRunner: Saved output of task 'reduce_4bdwy5' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1934003619
[junit] 08/12/30 18:27:53 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:27:53 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] plan = /tmp/plan46162.xml
[junit] 08/12/30 18:27:54 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:27:54 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:27:54 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:27:54 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:27:55 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:27:55 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:27:55 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:27:55 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:27:55 INFO exec.MapOperator: Adding alias a:src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:27:55 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:27:55 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:27:55 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:27:55 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:27:55 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:27:55 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:55 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:55 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:27:55 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:27:55 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:27:55 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:27:55 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-537596821
[junit] 08/12/30 18:27:55 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:55 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:55 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:27:55 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:27:55 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:27:55 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:27:55 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:27:55 INFO mapred.TaskRunner: Task 'reduce_umz5vk' done.
[junit] 08/12/30 18:27:55 INFO mapred.TaskRunner: Saved output of task 'reduce_umz5vk' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-537596821
[junit] map = 100%, reduce =100%
[junit] 08/12/30 18:27:56 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:27:56 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] Ended Job = job_local_1
[junit] plan = /tmp/plan46163.xml
[junit] 08/12/30 18:27:57 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:27:57 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/154575109/806142414.10002
[junit] 08/12/30 18:27:57 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:27:57 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:27:57 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:27:58 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:27:58 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:27:58 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:27:58 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/154575109/806142414.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/154575109/806142414.10002/reduce_4bdwy5
[junit] 08/12/30 18:27:58 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:27:58 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:27:58 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:27:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:58 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:27:58 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/154575109/806142414.10002/reduce_4bdwy5:0+699
[junit] 08/12/30 18:27:58 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:27:58 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1097515447
[junit] 08/12/30 18:27:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:27:58 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:27:58 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:27:58 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:27:58 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:27:58 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:27:58 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:27:58 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:27:58 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:27:58 INFO mapred.TaskRunner: Task 'reduce_2n8s1i' done.
[junit] 08/12/30 18:27:58 INFO mapred.TaskRunner: Saved output of task 'reduce_2n8s1i' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1097515447
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:27:59 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:27:59 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46164.xml
[junit] 08/12/30 18:28:00 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:28:00 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/154575109/806142414.10004
[junit] 08/12/30 18:28:00 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:28:00 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:28:01 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:01 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:01 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:28:01 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:01 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/154575109/806142414.10004 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/154575109/806142414.10004/reduce_umz5vk
[junit] 08/12/30 18:28:01 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:28:01 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:28:01 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:28:01 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:28:01 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:28:01 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:01 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/154575109/806142414.10004/reduce_umz5vk:0+11875
[junit] 08/12/30 18:28:01 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:28:01 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1074562762
[junit] 08/12/30 18:28:01 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:28:01 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:28:01 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:28:01 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:28:01 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:01 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:01 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:01 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:01 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:28:01 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:28:01 INFO mapred.TaskRunner: Task 'reduce_4pirz9' done.
[junit] 08/12/30 18:28:01 INFO mapred.TaskRunner: Saved output of task 'reduce_4pirz9' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1074562762
[junit] 08/12/30 18:28:02 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:28:02 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] plan = /tmp/plan46165.xml
[junit] 08/12/30 18:28:03 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:28:04 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/154575109/806142414.10003
[junit] 08/12/30 18:28:04 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/154575109/806142414.10005
[junit] 08/12/30 18:28:04 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:28:04 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:28:04 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:28:04 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:04 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:04 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:28:04 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:04 INFO exec.MapOperator: Adding alias $INTNAME to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/154575109/806142414.10003/reduce_2n8s1i
[junit] 08/12/30 18:28:04 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:28:04 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:28:04 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:28:04 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:28:04 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:28:04 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:04 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/154575109/806142414.10003/reduce_2n8s1i:0+699
[junit] 08/12/30 18:28:04 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:28:04 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1636797687
[junit] 08/12/30 18:28:04 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:28:05 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:05 INFO exec.MapOperator: Adding alias $INTNAME1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/154575109/806142414.10005/reduce_4pirz9
[junit] 08/12/30 18:28:05 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:28:05 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:28:05 INFO exec.ReduceSinkOperator: Initializing children:
[junit] 08/12/30 18:28:05 INFO exec.JoinOperator: Initializing Self
[junit] 08/12/30 18:28:05 INFO exec.JoinOperator: Initializing children:
[junit] 08/12/30 18:28:05 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:05 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:05 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:05 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:05 INFO exec.JoinOperator: Initialization Done
[junit] 08/12/30 18:28:05 INFO exec.ReduceSinkOperator: Initialization Done
[junit] 08/12/30 18:28:05 INFO exec.ReduceSinkOperator: Using tag = 0
[junit] 08/12/30 18:28:05 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:28:05 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:28:05 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:05 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/154575109/806142414.10005/reduce_4pirz9:0+11875
[junit] 08/12/30 18:28:05 INFO mapred.TaskRunner: Task 'job_local_1_map_0001' done.
[junit] 08/12/30 18:28:05 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0001' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1636797687
[junit] 08/12/30 18:28:05 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:28:05 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:28:05 INFO exec.JoinOperator: Initializing Self
[junit] 08/12/30 18:28:05 INFO exec.JoinOperator: Initializing children:
[junit] 08/12/30 18:28:05 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:05 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:05 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:05 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:05 INFO exec.JoinOperator: Initialization Done
[junit] 08/12/30 18:28:05 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:28:05 INFO mapred.TaskRunner: Task 'reduce_m66zzs' done.
[junit] 08/12/30 18:28:05 INFO mapred.TaskRunner: Saved output of task 'reduce_m66zzs' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1636797687
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:28:05 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:28:05 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join18.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join18.q.out
[junit] Done query: join18.q
[junit] Begin query: input_part4.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_part4(TestCliDriver.java:3178)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input19.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input19.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input19.q.out
[junit] Done query: input19.q
[junit] Begin query: groupby3.q
[junit] plan = /tmp/plan46166.xml
[junit] 08/12/30 18:28:09 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:28:10 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:28:10 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:28:10 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:28:10 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:10 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:10 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:28:10 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:10 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:28:10 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:28:10 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:28:10 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:28:10 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:10 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:10 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:28:10 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:28:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:28:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:28:10 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:10 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:28:10 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:10 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:28:10 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:28:10 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1574389060
[junit] 08/12/30 18:28:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:28:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:28:10 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:28:10 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:28:10 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:10 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:28:11 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:28:11 INFO mapred.TaskRunner: Task 'reduce_6f5tue' done.
[junit] 08/12/30 18:28:11 INFO mapred.TaskRunner: Saved output of task 'reduce_6f5tue' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1574389060
[junit] 08/12/30 18:28:11 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:28:11 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] plan = /tmp/plan46167.xml
[junit] 08/12/30 18:28:12 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:28:13 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1122573307/90699982.10001
[junit] 08/12/30 18:28:13 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:28:13 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:28:13 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:28:13 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:13 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:28:13 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:13 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1122573307/90699982.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1122573307/90699982.10001/reduce_6f5tue
[junit] 08/12/30 18:28:13 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:28:13 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:28:13 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:28:13 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:28:13 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:28:13 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:13 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1122573307/90699982.10001/reduce_6f5tue:0+183
[junit] 08/12/30 18:28:13 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:28:13 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-329019279
[junit] 08/12/30 18:28:13 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:28:13 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:28:14 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:28:14 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:28:14 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:14 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:14 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:14 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:14 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:28:14 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:28:14 INFO mapred.TaskRunner: Task 'reduce_cqblig' done.
[junit] 08/12/30 18:28:14 INFO mapred.TaskRunner: Saved output of task 'reduce_cqblig' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-329019279
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:28:14 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:28:14 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby3.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/groupby3.q.out
[junit] Done query: groupby3.q
[junit] Begin query: subq.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_subq(TestCliDriver.java:3253)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: union2.q
[junit] plan = /tmp/plan46168.xml
[junit] 08/12/30 18:28:17 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:28:17 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:28:17 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:28:17 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:28:17 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:28:18 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:18 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:28:18 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:18 INFO exec.MapOperator: Adding alias null-subquery1:unionsrc-subquery1:s1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:28:18 INFO exec.MapOperator: Adding alias null-subquery2:unionsrc-subquery2:s2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:28:18 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:28:18 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:28:18 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:28:18 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:18 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:18 INFO exec.ForwardOperator: Initializing Self
[junit] 08/12/30 18:28:18 INFO exec.ForwardOperator: Initializing children:
[junit] 08/12/30 18:28:18 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:28:18 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:28:18 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:28:18 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:28:18 INFO exec.ForwardOperator: Initialization Done
[junit] 08/12/30 18:28:18 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:18 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:28:18 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:28:18 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:28:18 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:18 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:18 INFO exec.ForwardOperator: Initializing Self
[junit] 08/12/30 18:28:18 INFO exec.ForwardOperator: Initializing children:
[junit] 08/12/30 18:28:18 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:28:18 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:28:18 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:28:18 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:28:18 INFO exec.ForwardOperator: Initialization Done
[junit] 08/12/30 18:28:18 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:18 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:28:18 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:18 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:28:18 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:28:18 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1739215502
[junit] 08/12/30 18:28:18 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:28:18 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:28:18 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:28:18 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:28:18 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:18 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:28:18 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:28:18 INFO mapred.TaskRunner: Task 'reduce_777wr9' done.
[junit] 08/12/30 18:28:18 INFO mapred.TaskRunner: Saved output of task 'reduce_777wr9' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1739215502
[junit] map = 100%, reduce =100%
[junit] 08/12/30 18:28:19 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:28:19 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] Ended Job = job_local_1
[junit] plan = /tmp/plan46169.xml
[junit] 08/12/30 18:28:20 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
[junit] 08/12/30 18:28:20 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/15086749/75907823.10002
[junit] 08/12/30 18:28:20 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:28:20 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:28:20 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:28:21 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:21 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:28:21 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:21 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/15086749/75907823.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/15086749/75907823.10002/reduce_777wr9
[junit] 08/12/30 18:28:21 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:28:21 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:28:21 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:28:21 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:28:21 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:28:21 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:21 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/15086749/75907823.10002/reduce_777wr9:0+124
[junit] 08/12/30 18:28:21 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:28:21 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp126336592
[junit] 08/12/30 18:28:21 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:28:21 INFO thrift.TBinarySortableProtocol: Sort order is ""
[junit] 08/12/30 18:28:21 INFO exec.GroupByOperator: Initializing Self
[junit] 08/12/30 18:28:21 INFO exec.GroupByOperator: Initializing children:
[junit] 08/12/30 18:28:21 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:21 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:21 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:21 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:21 INFO exec.GroupByOperator: Initialization Done
[junit] 08/12/30 18:28:21 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:28:21 INFO mapred.TaskRunner: Task 'reduce_1sftji' done.
[junit] 08/12/30 18:28:21 INFO mapred.TaskRunner: Saved output of task 'reduce_1sftji' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp126336592
[junit] 08/12/30 18:28:22 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:28:22 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/union2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/union2.q.out
[junit] Done query: union2.q
[junit] Begin query: input_part6.q
[junit] plan = /tmp/plan46170.xml
[junit] 08/12/30 18:28:24 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:28:25 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11
[junit] 08/12/30 18:28:25 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12
[junit] 08/12/30 18:28:25 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11
[junit] 08/12/30 18:28:25 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12
[junit] 08/12/30 18:28:25 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:28:25 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:28:25 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:28:25 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:28:25 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:28:25 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:28:25 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:25 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: Got partitions: ds/hr
[junit] 08/12/30 18:28:25 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.LimitOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.LimitOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.LimitOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: FILTERED:500
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: PASSED:0
[junit] 08/12/30 18:28:25 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt:0+5812
[junit] 08/12/30 18:28:25 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:28:25 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp638841855
[junit] 08/12/30 18:28:25 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: Got partitions: ds/hr
[junit] 08/12/30 18:28:25 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.LimitOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.LimitOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.LimitOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: FILTERED:1000
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: PASSED:0
[junit] 08/12/30 18:28:25 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt:0+5812
[junit] 08/12/30 18:28:25 INFO mapred.TaskRunner: Task 'job_local_1_map_0001' done.
[junit] 08/12/30 18:28:25 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0001' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp638841855
[junit] 08/12/30 18:28:25 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11/kv1.txt
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: Got partitions: ds/hr
[junit] 08/12/30 18:28:25 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.LimitOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.LimitOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.LimitOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: FILTERED:1500
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: PASSED:0
[junit] 08/12/30 18:28:25 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11/kv1.txt:0+5812
[junit] 08/12/30 18:28:25 INFO mapred.TaskRunner: Task 'job_local_1_map_0002' done.
[junit] 08/12/30 18:28:25 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0002' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp638841855
[junit] 08/12/30 18:28:25 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12/kv1.txt
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: Got partitions: ds/hr
[junit] 08/12/30 18:28:25 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.LimitOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.LimitOperator: Initializing children:
[junit] 08/12/30 18:28:25 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:25 INFO exec.LimitOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:28:25 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: FILTERED:2000
[junit] 08/12/30 18:28:25 INFO exec.FilterOperator: PASSED:0
[junit] 08/12/30 18:28:25 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12/kv1.txt:0+5812
[junit] 08/12/30 18:28:25 INFO mapred.TaskRunner: Task 'job_local_1_map_0003' done.
[junit] 08/12/30 18:28:25 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0003' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp638841855
[junit] 08/12/30 18:28:26 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:28:26 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_part6.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_part6.q.out
[junit] Done query: input_part6.q
[junit] Begin query: groupby5.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby5(TestCliDriver.java:3328)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: groupby7.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby7(TestCliDriver.java:3353)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: udf_testlength.q
[junit] plan = /tmp/plan46171.xml
[junit] 08/12/30 18:28:29 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:28:29 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:28:29 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:28:29 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:28:29 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:28:29 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:29 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:28:29 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:29 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:28:29 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:28:29 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:28:29 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:28:29 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:29 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:29 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:29 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:29 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:29 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:29 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:29 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:28:29 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:29 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:28:29 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:28:29 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp94094287
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:28:30 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:28:30 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/udf_testlength.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/udf_testlength.q.out
[junit] Done query: udf_testlength.q
[junit] Begin query: fileformat_text.q
[junit] plan = /tmp/plan46172.xml
[junit] 08/12/30 18:28:34 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:28:34 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:28:34 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:28:34 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:28:35 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:35 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:35 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:28:35 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:35 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:28:35 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:28:35 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:28:35 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:28:35 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:28:35 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:28:35 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:35 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:35 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:35 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:35 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:35 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:35 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:35 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:28:35 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:28:35 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:35 INFO exec.FilterOperator: PASSED:10
[junit] 08/12/30 18:28:35 INFO exec.FilterOperator: FILTERED:490
[junit] 08/12/30 18:28:35 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:28:35 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:28:35 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1856161807
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:28:36 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:28:36 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/fileformat_text.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/fileformat_text.q.out
[junit] Done query: fileformat_text.q
[junit] Begin query: fileformat_sequencefile.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_fileformat_sequencefile(TestCliDriver.java:3428)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: udf_round.q
[junit] plan = /tmp/plan46173.xml
[junit] 08/12/30 18:28:41 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:28:42 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:28:42 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:28:42 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:28:42 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:28:42 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:42 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:28:42 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:42 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:28:42 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:28:42 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:28:42 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:28:42 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:42 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:42 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:42 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:42 INFO exec.LimitOperator: Initializing Self
[junit] 08/12/30 18:28:42 INFO exec.LimitOperator: Initializing children:
[junit] 08/12/30 18:28:42 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:42 INFO exec.LimitOperator: Initialization Done
[junit] 08/12/30 18:28:42 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:42 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:42 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:28:42 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:42 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:28:42 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:28:42 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-132306851
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:28:43 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:28:43 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46174.xml
[junit] 08/12/30 18:28:45 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:28:45 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:28:45 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:28:45 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:28:45 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:28:45 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:45 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:28:45 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:45 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:28:45 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:28:45 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:28:45 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:28:45 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:45 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:45 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:45 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:45 INFO exec.LimitOperator: Initializing Self
[junit] 08/12/30 18:28:45 INFO exec.LimitOperator: Initializing children:
[junit] 08/12/30 18:28:45 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:45 INFO exec.LimitOperator: Initialization Done
[junit] 08/12/30 18:28:45 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:45 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:45 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:28:45 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:45 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:28:45 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:28:45 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-2004280139
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:28:46 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:28:46 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46175.xml
[junit] 08/12/30 18:28:48 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:28:48 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:28:48 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:28:48 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:28:48 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:48 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:49 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:28:49 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:49 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:28:49 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:28:49 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:28:49 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:28:49 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:49 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:49 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:49 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:49 INFO exec.LimitOperator: Initializing Self
[junit] 08/12/30 18:28:49 INFO exec.LimitOperator: Initializing children:
[junit] 08/12/30 18:28:49 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:49 INFO exec.LimitOperator: Initialization Done
[junit] 08/12/30 18:28:49 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:49 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:49 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:28:49 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:49 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:28:49 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:28:49 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp630041177
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:28:49 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:28:49 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] plan = /tmp/plan46176.xml
[junit] 08/12/30 18:28:51 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:28:52 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:28:52 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:28:52 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:28:52 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:28:52 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:52 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:28:52 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:52 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:28:52 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:28:52 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:28:52 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:28:52 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:52 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:52 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:52 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:52 INFO exec.LimitOperator: Initializing Self
[junit] 08/12/30 18:28:52 INFO exec.LimitOperator: Initializing children:
[junit] 08/12/30 18:28:52 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:52 INFO exec.LimitOperator: Initialization Done
[junit] 08/12/30 18:28:52 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:52 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:52 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:28:52 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:52 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:28:52 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:28:52 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1272988232
[junit] 08/12/30 18:28:53 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:28:53 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] plan = /tmp/plan46177.xml
[junit] 08/12/30 18:28:54 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
[junit] 08/12/30 18:28:54 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:28:54 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:28:54 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:28:55 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:28:55 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:28:55 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:28:55 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:28:55 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:28:55 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:28:55 INFO exec.TableScanOperator: Initializing Self
[junit] 08/12/30 18:28:55 INFO exec.TableScanOperator: Initializing children:
[junit] 08/12/30 18:28:55 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:55 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:55 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:28:55 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:28:55 INFO exec.LimitOperator: Initializing Self
[junit] 08/12/30 18:28:55 INFO exec.LimitOperator: Initializing children:
[junit] 08/12/30 18:28:55 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:28:55 INFO exec.LimitOperator: Initialization Done
[junit] 08/12/30 18:28:55 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:55 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:28:55 INFO exec.TableScanOperator: Initialization Done
[junit] 08/12/30 18:28:55 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:28:55 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:28:55 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:28:55 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1837315064
[junit] 08/12/30 18:28:56 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:28:56 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/udf_round.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/udf_round.q.out
[junit] Done query: udf_round.q
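As the log shows, the test harness validates each query by diffing the actual `.q.out` against the checked-in expected file, using `diff -I '\(file:\)\|\(/tmp/.*\)'` so that volatile tokens (local `file:` URIs, `/tmp` plan paths) don't cause spurious failures. A minimal standalone sketch of that comparison (file names here are hypothetical, not from the harness):

```shell
# Two outputs that differ only in a volatile /tmp path should compare equal.
printf 'result: 42\nplan = /tmp/plan123.xml\n' > actual.out
printf 'result: 42\nplan = /tmp/plan999.xml\n' > expected.out

# diff -I ignores hunks whose changed lines all match the regex, which is
# how the harness masks run-specific paths when judging pass/fail.
if diff -I '/tmp/.*' actual.out expected.out > /dev/null; then
  echo "outputs match"
else
  echo "outputs differ"
fi
```

Note that `-I` only suppresses a hunk when every inserted and deleted line in it matches the pattern; a real result difference on the same line as a masked path would still be reported.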
[junit] Begin query: fileformat_void.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/fileformat_void.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/fileformat_void.q.out
[junit] Done query: fileformat_void.q
[junit] Tests run: 128, Failures: 65, Errors: 0, Time elapsed: 407.176 sec
[junit] Test org.apache.hadoop.hive.cli.TestCliDriver FAILED
[junit] Running org.apache.hadoop.hive.cli.TestNegativeCliDriver
[junit] Begin query: input1.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientnegative/input1.q.out
[junit] Done query: input1.q
[junit] Begin query: notable_alias3.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias3(TestNegativeCliDriver.java:117)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
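The failure above is the first of a cascade: `QTestUtil.cleanUp` drops the source tables before each query, and once `Warehouse.deleteDir` cannot remove a warehouse directory, every later test fails the same way in `cliInit` before its query even runs. A quick way to reproduce the underlying condition — a directory tree the current user cannot remove — is sketched below (the probe path is hypothetical; the real directory is the `build/ql/test/data/warehouse/...` path from the log):

```shell
# Simulate the cleanup step: build a small warehouse-like tree, then try
# to remove it the way dropTable's directory cleanup effectively does.
WAREHOUSE=./warehouse-probe
mkdir -p "$WAREHOUSE/src"
touch "$WAREHOUSE/src/kv1.txt"

if rm -r "$WAREHOUSE" 2>/dev/null; then
  echo "deletable"
else
  # Typical culprits: files owned by another user (e.g. a previous CI run),
  # or a parent directory without write/execute permission.
  echo "not deletable: inspect with ls -ld on the path and its parents"
fi
```

When this check fails on a build host, stale files left by an earlier run under a different user account are a common cause; wiping the warehouse directory out-of-band before the build restores a clean slate.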
[junit] Begin query: notable_alias4.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias4(TestNegativeCliDriver.java:142)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input2.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientnegative/input2.q.out
[junit] Done query: input2.q
[junit] Begin query: bad_sample_clause.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_bad_sample_clause(TestNegativeCliDriver.java:192)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: input_testxpath4.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input_testxpath4(TestNegativeCliDriver.java:217)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: invalid_tbl_name.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_invalid_tbl_name(TestNegativeCliDriver.java:242)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: union.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/union.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientnegative/union.q.out
[junit] Done query: union.q
[junit] Begin query: joinneg.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_joinneg(TestNegativeCliDriver.java:292)
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: invalid_create_tbl1.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_invalid_create_tbl1(TestNegativeCliDriver.java:317)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: invalid_create_tbl2.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_invalid_create_tbl2(TestNegativeCliDriver.java:342)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: subq_insert.q
[junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
[junit] at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_subq_insert(TestNegativeCliDriver.java:367)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
[junit] Begin query: load_wrong_fileformat.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_load_wrong_fileformat(TestNegativeCliDriver.java:392)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: describe_xpath1.q
[junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='11') failed with exit code= 9
[junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='11') failed with exit code= 9
[junit] at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_describe_xpath1(TestNegativeCliDriver.java:417)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Begin query: clusterbydistributeby.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clusterbydistributeby(TestNegativeCliDriver.java:442)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: describe_xpath2.q
[junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
[junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
[junit] at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_describe_xpath2(TestNegativeCliDriver.java:467)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Begin query: describe_xpath3.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_describe_xpath3(TestNegativeCliDriver.java:492)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: describe_xpath4.q
[junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
[junit] at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_describe_xpath4(TestNegativeCliDriver.java:517)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Begin query: strict_pruning.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_strict_pruning(TestNegativeCliDriver.java:542)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: clusterbysortby.q
[junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='11') failed with exit code= 9
[junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='11') failed with exit code= 9
[junit] at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clusterbysortby(TestNegativeCliDriver.java:567)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Begin query: fileformat_bad_class.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_fileformat_bad_class(TestNegativeCliDriver.java:592)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 21 more
[junit] Begin query: fileformat_void_input.q
[junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='11') failed with exit code= 9
[junit] at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
[junit] at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_fileformat_void_input(TestNegativeCliDriver.java:617)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='11') failed with exit code= 9
[junit] Begin query: fileformat_void_output.q
[junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/fileformat_void_output.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientnegative/fileformat_void_output.q.out
[junit] Done query: fileformat_void_output.q
[junit] Tests run: 23, Failures: 19, Errors: 0, Time elapsed: 26.031 sec
[junit] Test org.apache.hadoop.hive.cli.TestNegativeCliDriver FAILED
[junit] Running org.apache.hadoop.hive.ql.exec.TestExecDriver
[junit] Beginning testMapPlan1
[junit] Generating plan file /tmp/plan60751.xml
[junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60751.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F1611379972 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F485905420
[junit] plan = /tmp/plan60751.xml
[junit] 08/12/30 18:29:32 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 0
[junit] 08/12/30 18:29:32 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:29:32 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:29:32 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:29:32 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:29:32 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:29:32 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:29:32 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:29:32 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:29:32 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:29:32 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:29:32 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:29:32 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:29:32 INFO util.NativeCodeLoader: Loaded the native-hadoop library
[junit] 08/12/30 18:29:32 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
[junit] 08/12/30 18:29:32 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:29:32 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:29:32 INFO exec.FilterOperator: FILTERED:416
[junit] 08/12/30 18:29:32 INFO exec.FilterOperator: PASSED:84
[junit] 08/12/30 18:29:32 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:29:32 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:29:32 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1786282559
[junit] map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:29:33 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:29:33 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] testMapPlan1 execution completed successfully
[junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
[junit] Beginning testMapPlan2
[junit] Generating plan file /tmp/plan60752.xml
[junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60752.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F1705351660 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F-983891071
[junit] plan = /tmp/plan60752.xml
[junit] 08/12/30 18:29:34 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 0
[junit] 08/12/30 18:29:35 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:29:35 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:29:35 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:29:35 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:29:35 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:29:35 INFO mapred.MapTask: numReduceTasks: 0
[junit] 08/12/30 18:29:35 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:29:35 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:29:35 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:29:35 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:29:35 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:29:35 INFO exec.ScriptOperator: Initializing Self
[junit] 08/12/30 18:29:35 INFO exec.ScriptOperator: Initializing children:
[junit] 08/12/30 18:29:35 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:29:35 INFO exec.ScriptOperator: Initialization Done
[junit] 08/12/30 18:29:35 INFO exec.ScriptOperator: Executing [/bin/cat]
[junit] 08/12/30 18:29:35 INFO exec.ScriptOperator: tablename=src
[junit] 08/12/30 18:29:35 INFO exec.ScriptOperator: partname=null
[junit] 08/12/30 18:29:35 INFO exec.ScriptOperator: alias=a
[junit] 08/12/30 18:29:35 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:29:35 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:29:35 INFO exec.FilterOperator: FILTERED:416
[junit] 08/12/30 18:29:35 INFO exec.FilterOperator: PASSED:84
[junit] 08/12/30 18:29:35 INFO exec.ScriptOperator: StreamThread OutputProcessor done
[junit] 08/12/30 18:29:35 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
[junit] 08/12/30 18:29:35 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
[junit] 08/12/30 18:29:35 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:29:35 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:29:35 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:29:35 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-998788316
[junit] map = 100%, reduce =0%
[junit] 08/12/30 18:29:36 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:29:36 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] testMapPlan2 execution completed successfully
[junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
[junit] Beginning testMapRedPlan1
[junit] Generating plan file /tmp/plan60753.xml
[junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60753.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fda
ta%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.outp
ut=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F1804997670 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F-1567492587
[junit] plan = /tmp/plan60753.xml
[junit] 08/12/30 18:29:37 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
[junit] 08/12/30 18:29:37 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:29:37 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:29:37 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:29:37 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:29:38 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:29:38 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:29:38 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:29:38 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:29:38 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:29:38 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:29:38 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:29:38 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:38 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:38 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:29:38 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:29:38 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:29:38 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1825176447
[junit] 08/12/30 18:29:38 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:38 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:38 INFO exec.ExtractOperator: Initializing Self
[junit] 08/12/30 18:29:38 INFO exec.ExtractOperator: Initializing children:
[junit] 08/12/30 18:29:38 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:29:38 INFO exec.ExtractOperator: Initialization Done
[junit] 08/12/30 18:29:38 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:29:38 INFO mapred.TaskRunner: Task 'reduce_bolpcb' done.
[junit] 08/12/30 18:29:38 INFO mapred.TaskRunner: Saved output of task 'reduce_bolpcb' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1825176447
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:29:39 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:29:39 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] testMapRedPlan1 execution completed successfully
[junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
[junit] Beginning testMapRedPlan2
[junit] Generating plan file /tmp/plan60754.xml
[junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60754.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fda
ta%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.outp
ut=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F-812139891 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F-73677864
[junit] plan = /tmp/plan60754.xml
[junit] 08/12/30 18:29:40 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
[junit] 08/12/30 18:29:40 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:29:40 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:29:40 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:29:40 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:29:40 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:29:41 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:29:41 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:29:41 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:29:41 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:29:41 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:29:41 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:29:41 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:41 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:41 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:29:41 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:29:41 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:29:41 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp439231093
[junit] 08/12/30 18:29:41 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:41 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:41 INFO exec.ExtractOperator: Initializing Self
[junit] 08/12/30 18:29:41 INFO exec.ExtractOperator: Initializing children:
[junit] 08/12/30 18:29:41 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:29:41 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:29:41 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:29:41 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:29:41 INFO exec.ExtractOperator: Initialization Done
[junit] 08/12/30 18:29:41 INFO exec.FilterOperator: FILTERED:416
[junit] 08/12/30 18:29:41 INFO exec.FilterOperator: PASSED:84
[junit] 08/12/30 18:29:41 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:29:41 INFO mapred.TaskRunner: Task 'reduce_7wdtir' done.
[junit] 08/12/30 18:29:41 INFO mapred.TaskRunner: Saved output of task 'reduce_7wdtir' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp439231093
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:29:41 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:29:41 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] testMapRedPlan2 execution completed successfully
[junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
[junit] Beginning testMapRedPlan3
[junit] Generating plan file /tmp/plan60755.xml
[junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60755.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fda
ta%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.outp
ut=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F-823551349 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F-964933484
[junit] plan = /tmp/plan60755.xml
[junit] 08/12/30 18:29:43 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 5
[junit] 08/12/30 18:29:43 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:29:43 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src2
[junit] 08/12/30 18:29:43 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:29:43 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:29:43 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:29:43 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:29:43 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:29:43 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:29:44 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:29:44 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:29:44 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:29:44 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:29:44 INFO exec.ReduceSinkOperator: Using tag = 0
[junit] 08/12/30 18:29:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:44 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:29:44 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:29:44 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:29:44 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1871522028
[junit] 08/12/30 18:29:44 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:29:44 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:29:44 INFO exec.MapOperator: Adding alias b to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src2/kv2.txt
[junit] 08/12/30 18:29:44 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:29:44 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:29:44 INFO exec.ReduceSinkOperator: Using tag = 1
[junit] 08/12/30 18:29:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:44 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:29:44 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src2/kv2.txt:0+5791
[junit] 08/12/30 18:29:44 INFO mapred.TaskRunner: Task 'job_local_1_map_0001' done.
[junit] 08/12/30 18:29:44 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0001' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1871522028
[junit] 08/12/30 18:29:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:44 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:29:44 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:29:44 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:29:44 INFO exec.SelectOperator: Initialization Done
[junit] map = 100%, reduce =0%
[junit] 08/12/30 18:29:44 INFO exec.ExecDriver: map = 100%, reduce =0%
[junit] 08/12/30 18:29:44 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:29:44 INFO mapred.TaskRunner: Task 'reduce_xny8n7' done.
[junit] 08/12/30 18:29:44 INFO mapred.TaskRunner: Saved output of task 'reduce_xny8n7' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1871522028
[junit] map = 100%, reduce =100%
[junit] 08/12/30 18:29:45 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:29:45 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] testMapRedPlan3 execution completed successfully
[junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
[junit] Beginning testMapRedPlan4
[junit] Generating plan file /tmp/plan60756.xml
[junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60756.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fda
ta%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.outp
ut=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F2033749460 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F169152504
[junit] plan = /tmp/plan60756.xml
[junit] 08/12/30 18:29:47 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
[junit] 08/12/30 18:29:47 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:29:47 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:29:47 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:29:47 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] 08/12/30 18:29:47 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:29:47 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:29:47 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:29:47 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:29:47 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:29:47 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:29:47 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:29:47 INFO exec.ScriptOperator: Initializing Self
[junit] 08/12/30 18:29:47 INFO exec.ScriptOperator: Initializing children:
[junit] 08/12/30 18:29:47 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:29:47 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:29:48 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:48 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:48 INFO exec.ScriptOperator: Initialization Done
[junit] 08/12/30 18:29:48 INFO exec.ScriptOperator: Executing [/bin/cat]
[junit] 08/12/30 18:29:48 INFO exec.ScriptOperator: tablename=src
[junit] 08/12/30 18:29:48 INFO exec.ScriptOperator: partname=null
[junit] 08/12/30 18:29:48 INFO exec.ScriptOperator: alias=a
[junit] 08/12/30 18:29:48 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:29:48 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:29:48 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
[junit] 08/12/30 18:29:48 INFO exec.ScriptOperator: StreamThread OutputProcessor done
[junit] 08/12/30 18:29:48 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:29:48 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
[junit] 08/12/30 18:29:48 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:29:48 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:29:48 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1637045755
[junit] 08/12/30 18:29:48 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:48 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:48 INFO exec.ExtractOperator: Initializing Self
[junit] 08/12/30 18:29:48 INFO exec.ExtractOperator: Initializing children:
[junit] 08/12/30 18:29:48 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:29:48 INFO exec.ExtractOperator: Initialization Done
[junit] 08/12/30 18:29:48 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:29:48 INFO mapred.TaskRunner: Task 'reduce_7d2n8q' done.
[junit] 08/12/30 18:29:48 INFO mapred.TaskRunner: Saved output of task 'reduce_7d2n8q' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1637045755
[junit] 08/12/30 18:29:48 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:29:48 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] testMapRedPlan4 execution completed successfully
[junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
[junit] Beginning testMapRedPlan5
[junit] Generating plan file /tmp/plan60757.xml
[junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60757.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fda
ta%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.outp
ut=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F188257322 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F-1953420987
[junit] plan = /tmp/plan60757.xml
[junit] 08/12/30 18:29:50 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
[junit] 08/12/30 18:29:50 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:29:50 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:29:50 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:29:50 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:29:50 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:29:50 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:29:50 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:29:50 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:29:50 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:29:50 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:29:50 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:29:50 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:29:50 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:29:51 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:51 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:51 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:29:51 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:29:51 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:29:51 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:29:51 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1583094982
[junit] 08/12/30 18:29:51 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:51 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:51 INFO exec.ExtractOperator: Initializing Self
[junit] 08/12/30 18:29:51 INFO exec.ExtractOperator: Initializing children:
[junit] 08/12/30 18:29:51 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:29:51 INFO exec.ExtractOperator: Initialization Done
[junit] 08/12/30 18:29:51 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:29:51 INFO mapred.TaskRunner: Task 'reduce_31lz4s' done.
[junit] 08/12/30 18:29:51 INFO mapred.TaskRunner: Saved output of task 'reduce_31lz4s' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1583094982
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:29:51 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:29:51 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] testMapRedPlan5 execution completed successfully
[junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
[junit] Beginning testMapPlan6
[junit] Generating plan file /tmp/plan60758.xml
[junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60758.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fda
ta%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.outp
ut=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F1501059191 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F1340066862
[junit] plan = /tmp/plan60758.xml
[junit] 08/12/30 18:29:53 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
[junit] 08/12/30 18:29:53 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
[junit] 08/12/30 18:29:53 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
[junit] 08/12/30 18:29:53 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
[junit] 08/12/30 18:29:53 INFO mapred.FileInputFormat: Total input paths to process : 1
[junit] Job running in-process (local Hadoop)
[junit] 08/12/30 18:29:53 INFO exec.ExecDriver: Job running in-process (local Hadoop)
[junit] 08/12/30 18:29:53 INFO mapred.MapTask: numReduceTasks: 1
[junit] 08/12/30 18:29:53 INFO exec.MapOperator: Initializing Self
[junit] 08/12/30 18:29:53 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
[junit] 08/12/30 18:29:53 INFO exec.MapOperator: Got partitions: null
[junit] 08/12/30 18:29:53 INFO exec.SelectOperator: Initializing Self
[junit] 08/12/30 18:29:53 INFO exec.SelectOperator: Initializing children:
[junit] 08/12/30 18:29:53 INFO exec.ScriptOperator: Initializing Self
[junit] 08/12/30 18:29:53 INFO exec.ScriptOperator: Initializing children:
[junit] 08/12/30 18:29:53 INFO exec.ReduceSinkOperator: Initializing Self
[junit] 08/12/30 18:29:53 INFO exec.ReduceSinkOperator: Using tag = -1
[junit] 08/12/30 18:29:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:54 INFO exec.ScriptOperator: Initialization Done
[junit] 08/12/30 18:29:54 INFO exec.ScriptOperator: Executing [/bin/cat]
[junit] 08/12/30 18:29:54 INFO exec.ScriptOperator: tablename=src
[junit] 08/12/30 18:29:54 INFO exec.ScriptOperator: partname=null
[junit] 08/12/30 18:29:54 INFO exec.ScriptOperator: alias=a
[junit] 08/12/30 18:29:54 INFO exec.SelectOperator: Initialization Done
[junit] 08/12/30 18:29:54 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:29:54 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
[junit] 08/12/30 18:29:54 INFO exec.ScriptOperator: StreamThread OutputProcessor done
[junit] 08/12/30 18:29:54 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
[junit] 08/12/30 18:29:54 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
[junit] 08/12/30 18:29:54 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
[junit] 08/12/30 18:29:54 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
[junit] 08/12/30 18:29:54 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp561011439
[junit] 08/12/30 18:29:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
[junit] 08/12/30 18:29:54 INFO exec.ExtractOperator: Initializing Self
[junit] 08/12/30 18:29:54 INFO exec.ExtractOperator: Initializing children:
[junit] 08/12/30 18:29:54 INFO exec.FilterOperator: Initializing Self
[junit] 08/12/30 18:29:54 INFO exec.FilterOperator: Initializing children:
[junit] 08/12/30 18:29:54 INFO exec.FileSinkOperator: Initializing Self
[junit] 08/12/30 18:29:54 INFO exec.FilterOperator: Initialization Done
[junit] 08/12/30 18:29:54 INFO exec.ExtractOperator: Initialization Done
[junit] 08/12/30 18:29:54 INFO exec.FilterOperator: FILTERED:416
[junit] 08/12/30 18:29:54 INFO exec.FilterOperator: PASSED:84
[junit] 08/12/30 18:29:54 INFO mapred.LocalJobRunner: reduce > reduce
[junit] 08/12/30 18:29:54 INFO mapred.TaskRunner: Task 'reduce_nsnhps' done.
[junit] 08/12/30 18:29:54 INFO mapred.TaskRunner: Saved output of task 'reduce_nsnhps' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp561011439
[junit] map = 100%, reduce =100%
[junit] Ended Job = job_local_1
[junit] 08/12/30 18:29:54 INFO exec.ExecDriver: map = 100%, reduce =100%
[junit] 08/12/30 18:29:54 INFO exec.ExecDriver: Ended Job = job_local_1
[junit] testMapRedPlan6 execution completed successfully
[junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
[junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 24.049 sec
[junit] Running org.apache.hadoop.hive.ql.exec.TestExpressionEvaluator
[junit] ExprNodeColumnEvaluator ok
[junit] ExprNodeFuncEvaluator ok
[junit] testExprNodeConversionEvaluator ok
[junit] Evaluating 1 + 2 for 10000000 times
[junit] Evaluation finished: 0.601 seconds, 0.060 seconds/million call.
[junit] Evaluating 1 + 2 - 3 for 10000000 times
[junit] Evaluation finished: 1.582 seconds, 0.158 seconds/million call.
[junit] Evaluating 1 + 2 - 3 + 4 for 10000000 times
[junit] Evaluation finished: 2.003 seconds, 0.200 seconds/million call.
[junit] Evaluating concat("1", "2") for 10000000 times
[junit] Evaluation finished: 1.773 seconds, 0.177 seconds/million call.
[junit] Evaluating concat(concat("1", "2"), "3") for 10000000 times
[junit] Evaluation finished: 3.326 seconds, 0.333 seconds/million call.
[junit] Evaluating concat(concat(concat("1", "2"), "3"), "4") for 10000000 times
[junit] Evaluation finished: 5.121 seconds, 0.512 seconds/million call.
[junit] Evaluating concat(col1[1], cola[1]) for 1000000 times
[junit] Evaluation finished: 0.294 seconds, 0.294 seconds/million call.
[junit] Evaluating concat(concat(col1[1], cola[1]), col1[2]) for 1000000 times
[junit] Evaluation finished: 0.497 seconds, 0.497 seconds/million call.
[junit] Evaluating concat(concat(concat(col1[1], cola[1]), col1[2]), cola[2]) for 1000000 times
[junit] Evaluation finished: 0.733 seconds, 0.733 seconds/million call.
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 16.333 sec
[junit] Running org.apache.hadoop.hive.ql.exec.TestJEXL
[junit] JEXL library test ok
[junit] Evaluating 1 + 2 for 10000000 times
[junit] Evaluation finished: 0.711 seconds, 0.071 seconds/million call.
[junit] Evaluating __udf__concat.evaluate("1", "2") for 1000000 times
[junit] Evaluation finished: 1.195 seconds, 1.195 seconds/million call.
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 2.375 sec
[junit] Running org.apache.hadoop.hive.ql.exec.TestOperators
[junit] Testing Filter Operator
[junit] filtered = 4
[junit] passed = 1
[junit] Filter Operator ok
[junit] Testing FileSink Operator
[junit] FileSink Operator ok
[junit] Testing Script Operator
[junit] [0] io.o=[1, 01]
[junit] [0] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@16f8f7db
[junit] [1] io.o=[2, 11]
[junit] [1] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@16f8f7db
[junit] [2] io.o=[3, 21]
[junit] [2] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@16f8f7db
[junit] [3] io.o=[4, 31]
[junit] [3] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@16f8f7db
[junit] [4] io.o=[5, 41]
[junit] [4] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@16f8f7db
[junit] Script Operator ok
[junit] Testing Map Operator
[junit] io1.o.toString() = [[0, 1, 2]]
[junit] io2.o.toString() = [[0, 1, 2]]
[junit] answer.toString() = [[0, 1, 2]]
[junit] io1.o.toString() = [[1, 2, 3]]
[junit] io2.o.toString() = [[1, 2, 3]]
[junit] answer.toString() = [[1, 2, 3]]
[junit] io1.o.toString() = [[2, 3, 4]]
[junit] io2.o.toString() = [[2, 3, 4]]
[junit] answer.toString() = [[2, 3, 4]]
[junit] io1.o.toString() = [[3, 4, 5]]
[junit] io2.o.toString() = [[3, 4, 5]]
[junit] answer.toString() = [[3, 4, 5]]
[junit] io1.o.toString() = [[4, 5, 6]]
[junit] io2.o.toString() = [[4, 5, 6]]
[junit] answer.toString() = [[4, 5, 6]]
[junit] Map Operator ok
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 1.64 sec
[junit] Running org.apache.hadoop.hive.ql.exec.TestPlan
[junit] Serialization/Deserialization of plan successful
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 1.138 sec
[junit] Running org.apache.hadoop.hive.ql.io.TestFlatFileInputFormat
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 1.008 sec
[junit] Running org.apache.hadoop.hive.ql.metadata.TestHive
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 8.327 sec
[junit] Running org.apache.hadoop.hive.ql.metadata.TestPartition
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.313 sec
[junit] Running org.apache.hadoop.hive.ql.parse.TestParse
[junit] Begin query: case_sensitivity.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/case_sensitivity.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/case_sensitivity.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/case_sensitivity.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/case_sensitivity.q.xml
[junit] Done query: case_sensitivity.q
[junit] Begin query: input20.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input20.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input20.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input20.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input20.q.xml
[junit] Done query: input20.q
[junit] Begin query: sample1.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample1(TestParse.java:178)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] Begin query: sample2.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/sample2.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample2.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/sample2.q.xml
[junit] Done query: sample2.q
[junit] Begin query: sample3.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample3(TestParse.java:230)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: sample4.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample4(TestParse.java:256)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: sample5.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample5(TestParse.java:282)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: sample6.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample6.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/sample6.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample6.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/sample6.q.xml
[junit] Done query: sample6.q
[junit] Begin query: sample7.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample7(TestParse.java:334)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] Begin query: cast1.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/cast1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/cast1.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/cast1.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/cast1.q.xml
[junit] Done query: cast1.q
[junit] Begin query: join1.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/join1.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join1.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/join1.q.xml
[junit] Done query: join1.q
[junit] Begin query: input1.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input1.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input1.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input1.q.xml
[junit] Done query: input1.q
[junit] Begin query: join2.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_join2(TestParse.java:438)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: input2.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_input2(TestParse.java:464)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: join3.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join3.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/join3.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join3.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/join3.q.xml
[junit] Done query: join3.q
[junit] Begin query: input3.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input3.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input3.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input3.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input3.q.xml
[junit] Done query: input3.q
[junit] Begin query: input4.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input4.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input4.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input4.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input4.q.xml
[junit] Done query: input4.q
[junit] Begin query: join4.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_join4(TestParse.java:568)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] Begin query: input5.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_input5(TestParse.java:594)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] Begin query: join5.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join5.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/join5.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join5.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/join5.q.xml
[junit] Done query: join5.q
[junit] Begin query: input6.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_input6(TestParse.java:646)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] Begin query: input_testxpath2.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_testxpath2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input_testxpath2.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_testxpath2.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input_testxpath2.q.xml
[junit] Done query: input_testxpath2.q
[junit] Begin query: join6.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join6.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/join6.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join6.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/join6.q.xml
[junit] Done query: join6.q
[junit] Begin query: input7.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_input7(TestParse.java:724)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] Begin query: join7.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_join7(TestParse.java:750)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: input8.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input8.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input8.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input8.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input8.q.xml
[junit] Done query: input8.q
[junit] Begin query: input_testsequencefile.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_testsequencefile.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input_testsequencefile.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_testsequencefile.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input_testsequencefile.q.xml
[junit] Done query: input_testsequencefile.q
[junit] Begin query: join8.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_join8(TestParse.java:828)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: union.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_union(TestParse.java:854)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: input9.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input9.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input9.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input9.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input9.q.xml
[junit] Done query: input9.q
[junit] Begin query: udf1.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/udf1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/udf1.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/udf1.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/udf1.q.xml
[junit] Done query: udf1.q
[junit] Begin query: udf4.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf4(TestParse.java:932)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: input_testxpath.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_testxpath.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input_testxpath.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_testxpath.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input_testxpath.q.xml
[junit] Done query: input_testxpath.q
[junit] Begin query: input_part1.q
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_input_part1(TestParse.java:984)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] Begin query: groupby1.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/groupby1.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby1.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/groupby1.q.xml
[junit] Done query: groupby1.q
[junit] Begin query: groupby2.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_groupby2(TestParse.java:1036)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: groupby3.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby3.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/groupby3.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby3.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/groupby3.q.xml
[junit] Done query: groupby3.q
[junit] Begin query: subq.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/subq.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/subq.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/subq.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/subq.q.xml
[junit] Done query: subq.q
[junit] Begin query: groupby4.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_groupby4(TestParse.java:1114)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: groupby5.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby5.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/groupby5.q.out
[junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby5.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/groupby5.q.xml
[junit] Done query: groupby5.q
[junit] Begin query: groupby6.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParse.testParse_groupby6(TestParse.java:1166)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Tests run: 41, Failures: 19, Errors: 0, Time elapsed: 90.037 sec
[junit] Test org.apache.hadoop.hive.ql.parse.TestParse FAILED
[junit] Running org.apache.hadoop.hive.ql.parse.TestParseNegative
[junit] Begin query: insert_wrong_number_columns.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/insert_wrong_number_columns.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/insert_wrong_number_columns.q.out
[junit] Done query: insert_wrong_number_columns.q
[junit] Begin query: duplicate_alias.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/duplicate_alias.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/duplicate_alias.q.out
[junit] Done query: duplicate_alias.q
[junit] Begin query: unknown_function1.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_function1(TestParseNegative.java:164)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: unknown_function2.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_function2(TestParseNegative.java:195)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: unknown_table1.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_table1(TestParseNegative.java:226)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: unknown_function3.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/unknown_function3.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/unknown_function3.q.out
[junit] Done query: unknown_function3.q
[junit] Begin query: quoted_string.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_quoted_string(TestParseNegative.java:288)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: unknown_table2.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/unknown_table2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/unknown_table2.q.out
[junit] Done query: unknown_table2.q
[junit] Begin query: unknown_function4.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/unknown_function4.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/unknown_function4.q.out
[junit] Done query: unknown_function4.q
[junit] Begin query: garbage.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/garbage.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/garbage.q.out
[junit] Done query: garbage.q
[junit] Begin query: unknown_function5.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/unknown_function5.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/unknown_function5.q.out
[junit] Done query: unknown_function5.q
[junit] Begin query: invalid_list_index2.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/invalid_list_index2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/invalid_list_index2.q.out
[junit] Done query: invalid_list_index2.q
[junit] Begin query: invalid_dot.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_invalid_dot(TestParseNegative.java:474)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: invalid_function_param1.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/invalid_function_param1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/invalid_function_param1.q.out
[junit] Done query: invalid_function_param1.q
[junit] Begin query: invalid_map_index2.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_invalid_map_index2(TestParseNegative.java:536)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: unknown_column1.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_column1(TestParseNegative.java:567)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: invalid_function_param2.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/invalid_function_param2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/invalid_function_param2.q.out
[junit] Done query: invalid_function_param2.q
[junit] Begin query: unknown_column2.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_column2(TestParseNegative.java:629)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: unknown_column3.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_column3(TestParseNegative.java:660)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: unknown_column4.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_column4(TestParseNegative.java:691)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: unknown_column5.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
[junit] Failed with exception checkPaths: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12/kv1.txt already exists
[junit] FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask
[junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
[junit] at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:330)
[junit] at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_column5(TestParseNegative.java:722)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Begin query: unknown_column6.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_column6(TestParseNegative.java:753)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: invalid_list_index.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
[junit] Failed with exception checkPaths: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt already exists
[junit] FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask
[junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
[junit] at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:330)
[junit] at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_invalid_list_index(TestParseNegative.java:784)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Begin query: nonkey_groupby.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/nonkey_groupby.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/nonkey_groupby.q.out
[junit] Done query: nonkey_groupby.q
[junit] Begin query: invalid_map_index.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/invalid_map_index.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/invalid_map_index.q.out
[junit] Done query: invalid_map_index.q
[junit] Begin query: invalid_index.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/invalid_index.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/invalid_index.q.out
[junit] Done query: invalid_index.q
[junit] Begin query: wrong_distinct1.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_wrong_distinct1(TestParseNegative.java:908)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Begin query: missing_overwrite.q
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
[junit] OK
[junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table srcbucket
[junit] OK
[junit] Loading data to table src
[junit] OK
[junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/missing_overwrite.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/missing_overwrite.q.out
[junit] Done query: missing_overwrite.q
[junit] Begin query: wrong_distinct2.q
[junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
[junit] at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
[junit] at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_wrong_distinct2(TestParseNegative.java:970)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:164)
[junit] at junit.framework.TestCase.runBare(TestCase.java:130)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:120)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:230)
[junit] at junit.framework.TestSuite.run(TestSuite.java:225)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
[junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
[junit] at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
[junit] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
[junit] at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
[junit] ... 20 more
[junit] Tests run: 29, Failures: 15, Errors: 0, Time elapsed: 49.235 sec
[junit] Test org.apache.hadoop.hive.ql.parse.TestParseNegative FAILED
[junit] Running org.apache.hadoop.hive.ql.tool.TestLineageInfo
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.535 sec
BUILD FAILED
/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build.xml:104: The following error occurred while executing this line:
/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build-common.xml:261: Tests failed!
Total time: 11 minutes 14 seconds
EXIT VALUE IS 1 for runtests
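The "already exists" failures in the log above appear to cascade from the earlier cleanup errors: when QTestUtil's cleanUp() cannot delete the warehouse directory, the old kv1.txt is still in the partition path, so the next non-OVERWRITE load fails its checkPaths check. A minimal, hypothetical Python sketch of that cascade (no Hive code is used here; load_kv1 and the directory layout are illustrative stand-ins):

```python
import os
import shutil
import tempfile

def load_kv1(partition_dir):
    """Stand-in for Hive's checkPaths: refuse to load over an existing file."""
    target = os.path.join(partition_dir, "kv1.txt")
    if os.path.exists(target):
        raise FileExistsError("checkPaths: %s already exists" % target)
    open(target, "w").close()

# Build a throwaway "warehouse" with one partition directory.
warehouse = tempfile.mkdtemp()
part = os.path.join(warehouse, "srcpart", "ds=2008-04-08", "hr=12")
os.makedirs(part)

load_kv1(part)            # first load succeeds
# Cleanup is deliberately skipped here, standing in for the failed
# Warehouse.deleteDir() call seen in the stack traces above.
try:
    load_kv1(part)        # second load trips over the leftover file
    outcome = "ok"
except FileExistsError as e:
    outcome = str(e)

print(outcome)
shutil.rmtree(warehouse)  # tidy up the temp directory
```

Under these assumptions, fixing whatever prevents the directory delete (stale file handles, permissions on the build host) would also clear the downstream load failures.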