Posted to dev@hive.apache.org by Ashish Thusoo <at...@facebook.com> on 2009/01/07 13:46:01 UTC

Problem with running test on NFS - was RE: *UNIT TEST FAILURE for apache HIVE* Hadoop.Version=0.17.1 based on SVN Rev# 730302.4

I have been able to verify that the files in warehouse/src are open for reads from the spawned map/reduce jvm while they are being deleted by the test jvm. This seems totally counterintuitive, because we wait
for the spawned map/reduce jvm to complete before hitting the file deletion code in the test jvm. So the only other explanation, it seems, is that the test jvm is running with multiple threads.
However, jstack does not indicate that; it shows the unit test cases all being run in a single thread.

The other possible explanation (though unlikely) is that close() on the record readers in the map/reduce jvm is not getting called. However, that close does happen in the finally
block of TaskTracker.java:run(), and clearly that code should run before the jvm exits.
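
For reference, the close-in-finally pattern being described looks roughly like the sketch below. This is illustrative only, written against the old org.apache.hadoop.mapred API that 0.17 uses; it is not the actual TaskTracker.java code, and the names are placeholders.

    // Sketch of the close-in-finally pattern (illustrative, not TaskTracker code).
    import java.io.IOException;
    import org.apache.hadoop.mapred.InputFormat;
    import org.apache.hadoop.mapred.InputSplit;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.RecordReader;
    import org.apache.hadoop.mapred.Reporter;

    public class CloseInFinallySketch {
        static void runTask(InputFormat<?, ?> fmt, InputSplit split, JobConf job)
                throws IOException {
            RecordReader<?, ?> reader = null;
            try {
                reader = fmt.getRecordReader(split, job, Reporter.NULL);
                // ... consume records here ...
            } finally {
                if (reader != null) {
                    reader.close(); // frees the file handle even if the body throws
                }
            }
        }
    }

If that finally block really runs before the child jvm exits, the handles should be gone by the time the test jvm attempts the delete.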

So how is this possible at all??

Puzzled...
Ashish

PS: The reason I say that the two jvms have these files open is that the lsof output on that directory does show the two pids...
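
One mechanism that can produce exactly this symptom when the warehouse directory is NFS-mounted (which the filer discussion below suggests) is the NFS client's "silly rename": unlinking a file that some process still holds open makes the client rename it to a hidden .nfsXXXX placeholder in the same directory, and the directory itself cannot be removed until that handle is closed. A minimal sketch of checking for such leftovers follows; the path is just an example, not taken from the build config.

    // Sketch: list NFS "silly rename" placeholders (.nfs*) left behind when a
    // file is deleted while still open. Example path, not from the build config.
    import java.io.File;
    import java.io.FilenameFilter;

    public class NfsLeftoverCheck {
        public static void main(String[] args) {
            File dir = new File("build/ql/test/data/warehouse/src");
            File[] leftovers = dir.listFiles(new FilenameFilter() {
                public boolean accept(File d, String name) {
                    return name.startsWith(".nfs");
                }
            });
            if (leftovers != null) {
                for (File f : leftovers) {
                    System.out.println("silly-rename leftover: " + f.getName());
                }
            }
        }
    }

If files like that show up right after the map/reduce jvm exits, the delete failure is the filer's close-to-unlink lag rather than a second thread in the test jvm.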

________________________________________
From: Ashish Thusoo [athusoo@facebook.com]
Sent: Wednesday, December 31, 2008 11:11 AM
To: hive-dev@hadoop.apache.org
Subject: RE: *UNIT TEST FAILURE for apache HIVE* Hadoop.Version=0.17.1 based on SVN Rev# 730302.4

Actually, I just verified that this is also happening without the /tmp fix.

An example stack is below. Note that the path is not even in build/ql/tmp, so even moving things to /tmp is not going to help.

Ashish

    [junit] Exception: MetaException(message:Unable to delete directory: file:/home/athusoo/apacheprojects/hive_ws2/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/home/athusoo/apacheprojects/hive_ws2/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_fileformat_sequencefile(TestCliDriver.java:3426)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:154)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:127)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:118)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:208)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:203)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/home/athusoo/apacheprojects/hive_ws2/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: cluster.q
    [junit] plan = /tmp/plan28487.xml

________________________________________
From: Ashish Thusoo [athusoo@facebook.com]
Sent: Wednesday, December 31, 2008 10:47 AM
To: hive-dev@hadoop.apache.org
Subject: RE: *UNIT TEST FAILURE for apache HIVE* Hadoop.Version=0.17.1 based on SVN Rev# 730302.4

I actually wanted to stop using /tmp altogether and put everything in build, so that everything is contained in one place and a clean build would get rid of all these files. But /tmp/<pid> is certainly a backup option if we can't get this working with the filer.
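
For what it's worth, a per-process scratch directory along those lines can be derived from inside the jvm. A sketch follows; note that parsing the pid out of RuntimeMXBean.getName() relies on the common "pid@host" format, which is a jvm implementation detail rather than a guaranteed API, and the directory name is made up for the example.

    // Sketch: build a /tmp/<pid>-style scratch dir unique to this jvm.
    import java.io.File;
    import java.lang.management.ManagementFactory;

    public class PerPidScratchDir {
        public static void main(String[] args) {
            // Typically "12345@hostname" on mainstream jvms (implementation detail).
            String jvmName = ManagementFactory.getRuntimeMXBean().getName();
            String pid = jvmName.split("@")[0];
            File scratch = new File("/tmp", "hive-test-" + pid); // illustrative name
            if (scratch.mkdirs()) {
                System.out.println("scratch dir: " + scratch.getAbsolutePath());
            }
        }
    }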

I think the problem is being caused by the fact that we run junit with fork="yes", though there does not seem to be a direct reason linking the two.

Ashish
________________________________________
From: Dhruba Borthakur [dhruba@gmail.com]
Sent: Wednesday, December 31, 2008 10:33 AM
To: hive-dev@hadoop.apache.org
Subject: Re: *UNIT TEST FAILURE for apache HIVE* Hadoop.Version=0.17.1 based on SVN Rev# 730302.4

Maybe we can use /tmp.<pid> to make the directory unique per test run.

dhruba

On Wed, Dec 31, 2008 at 10:20 AM, Ashish Thusoo <at...@facebook.com> wrote:

> Sure, let me look into this a bit more and see if I can track down the filer
> problem. If not, I will revert back to /tmp.
>
> Ashish
> ________________________________________
> From: Murli Varadachari
> Sent: Wednesday, December 31, 2008 10:10 AM
> To: Ashish Thusoo; hive-dev@hadoop.apache.org
> Cc: Murli Varadachari
> Subject: Re: *UNIT TEST FAILURE for apache HIVE* Hadoop.Version=0.17.1
> based on SVN Rev# 730302.4
>
> Ashish,
>
> This build host doesn't have any local disk space left; the build
> framework itself takes up most of it [I have ordered new disks]. Is it
> possible for you to revert your changes for now so that it starts
> writing back to /tmp/ql as before? I will build / run a single hadoop
> configuration so that it doesn't clash with multiple hadoop tests.
>
> Meanwhile, I can hold off on building hadoop so that you guys don't get
> deluged with error messages. I am looking at some other alternatives too.
>
> Stay tuned.
>
> Cheers
> murli
>
>
> On 12/31/08 9:49 AM, "Ashish Thusoo" <at...@facebook.com> wrote:
>
> That explains it. Something is wrong with our filer configuration then;
> I do see intermittent problems when our build tries to delete directories.
> Now that the tmp directory is in build/ql/tmp, we have started hitting this
> in tests. Originally it used to happen as part of test cleanups when we
> tried to delete build. No idea why this is happening. Is it possible to run
> this on the local disk for now, while we try to figure out why this is
> happening with the filer?
>
> Ashish
> ________________________________________
> From: Murli Varadachari
> Sent: Wednesday, December 31, 2008 9:38 AM
> To: hive-dev@hadoop.apache.org; Ashish Thusoo
> Cc: Murli Varadachari
> Subject: Re: *UNIT TEST FAILURE for apache HIVE* Hadoop.Version=0.17.1
> based on SVN Rev# 730302.4
>
> It is a filer! Local disk space on build hosts is rather limited.
>
> Cheers
> murli
>
>
> On 12/31/08 9:03 AM, "Ashish Thusoo" <at...@facebook.com> wrote:
>
> I have seen this happen on a filer (when the build is on a filer as
> opposed to a local disk). Can you verify that
>  /usr/local/continuous_builds/src/hiveopensource-0.17.1/hiveopensource_0_17_1
> is not on a filer...
>
> Thanks,
> Ashish
> ________________________________________
> From: Murli Varadachari [mvaradachari@facebook.com]
> Sent: Tuesday, December 30, 2008 7:51 PM
> To: hive-dev@hadoop.apache.org
> Subject: *UNIT TEST FAILURE for apache HIVE* Hadoop.Version=0.17.1 based on
> SVN Rev# 730302.4
>
> Compiling hiveopensource at
> /usr/local/continuous_builds/src/hiveopensource-0.17.1/hiveopensource_0_17_1
> +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
> Buildfile: build.xml
>
> clean:
>
> clean:
>     [echo] Cleaning: anttasks
>   [delete] Deleting directory
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks
>
> clean:
>     [echo] Cleaning: cli
>   [delete] Deleting directory
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli
>
> clean:
>     [echo] Cleaning: common
>   [delete] Deleting directory
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common
>
> clean:
>     [echo] Cleaning: metastore
>   [delete] Deleting directory
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore
>
> Overriding previous definition of reference to test.classpath
>
> clean:
>     [echo] Cleaning: ql
>   [delete] Deleting directory
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql
>
> clean:
>     [echo] Cleaning: serde
>   [delete] Deleting directory
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde
>
> clean:
>   [delete] Deleting directory
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service
>   [delete] Deleting directory
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist
>   [delete] Deleting directory
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
>
> BUILD SUCCESSFUL
> Total time: 20 seconds
> Buildfile: build.xml
>
> deploy:
>
> init:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/classes
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/jexl/classes
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test/src
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test/classes
>
> download-ivy:
>
> init-ivy:
>
> settings-ivy:
>
> resolve:
> [ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 ::
> http://ant.apache.org/ivy/ ::
> :: loading settings :: file =
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
>
> [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#
> common;working@devbuild001.snc1.facebook.com
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve]  found hadoop#core;0.17.1 in hadoop-resolver
> [ivy:retrieve] :: resolution report :: resolve 132ms :: artifacts dl 5ms
>
>  ---------------------------------------------------------------------
>        |                  |            modules            ||   artifacts   |
>        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>  ---------------------------------------------------------------------
>        |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
>  ---------------------------------------------------------------------
> [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve]  1 artifacts copied, 0 already retrieved (14096kB/368ms)
>
> install-hadoopcore:
>    [untar] Expanding:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1.tar.gz
> into
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore
>
>    [touch] Creating
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1.installed
>
> compile:
>     [echo] Compiling: common
>    [javac] Compiling 1 source file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/classes
>
> jar:
>     [echo] Jar: common
>      [jar] Building jar:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/hive_common.jar
>
> deploy:
>     [echo] hive: common
>     [copy] Copying 1 file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
>
> init:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/classes
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test/src
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test/classes
>
> dynamic-serde:
>
> compile:
>     [echo] Compiling: serde
>    [javac] Compiling 128 source files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/classes
>
>    [javac] Note: Some input files use unchecked or unsafe operations.
>    [javac] Note: Recompile with -Xlint:unchecked for details.
>
> jar:
>     [echo] Jar: serde
>      [jar] Building jar:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/hive_serde.jar
>
> deploy:
>     [echo] hive: serde
>     [copy] Copying 1 file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
>
> init:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/classes
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/src
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/classes
>
> model-compile:
>    [javac] Compiling 8 source files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/classes
>
>     [copy] Copying 1 file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/classes
>
> core-compile:
>     [echo] Compiling:
>    [javac] Compiling 38 source files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/classes
>
>    [javac] Note: Some input files use unchecked or unsafe operations.
>    [javac] Note: Recompile with -Xlint:unchecked for details.
>
> model-enhance:
>     [echo] Enhancing model classes with JPOX stuff....
>     [java] JPOX Enhancer (version 1.2.2) : Enhancement of classes
>
>     [java] JPOX Enhancer completed with success for 8 classes. Timings :
> input=170 ms, enhance=180 ms, total=350 ms. Consult the log for full details
>
> compile:
>
> jar:
>     [echo] Jar: metastore
>      [jar] Building jar:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/hive_metastore.jar
>
> deploy:
>     [echo] hive: metastore
>     [copy] Copying 1 file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
> Overriding previous definition of reference to test.classpath
>
> init:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/classes
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/classes
>
> ql-init:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/gen-java/org/apache/hadoop/hive/ql/parse
>
> build-grammar:
>     [echo] Building Grammar
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/java/org/apache/hadoop/hive/ql/parse/Hive.g
>  ....
>
>     [java] ANTLR Parser Generator  Version 3.0.1 (August 13, 2007)
>  1989-2007
>
> compile-ant-tasks:
>
> init:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/classes
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/test
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/test/src
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/test/classes
>
> download-ivy:
>
> init-ivy:
>
> settings-ivy:
>
> resolve:
> :: loading settings :: file =
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
>
> [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#
> ant;working@devbuild001.snc1.facebook.com
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve] :: resolution report :: resolve 10ms :: artifacts dl 0ms
>
>  ---------------------------------------------------------------------
>        |                  |            modules            ||   artifacts   |
>        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>  ---------------------------------------------------------------------
>        |      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
>  ---------------------------------------------------------------------
> [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve]  0 artifacts copied, 0 already retrieved (0kB/13ms)
>
> install-hadoopcore:
>
> compile:
>     [echo] Compiling: anttasks
>    [javac] Compiling 2 source files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/classes
>
>    [javac] Note:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java
> uses or overrides a deprecated API.
>
>    [javac] Note: Recompile with -Xlint:deprecation for details.
>
> deploy-ant-tasks:
>
> init:
>
> download-ivy:
>
> init-ivy:
>
> settings-ivy:
>
> resolve:
> :: loading settings :: file =
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
>
> [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#
> ant;working@devbuild001.snc1.facebook.com
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve] :: resolution report :: resolve 16ms :: artifacts dl 0ms
>
>  ---------------------------------------------------------------------
>        |                  |            modules            ||   artifacts   |
>        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>  ---------------------------------------------------------------------
>        |      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
>  ---------------------------------------------------------------------
> [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve]  0 artifacts copied, 0 already retrieved (0kB/2ms)
>
> install-hadoopcore:
>
> compile:
>     [echo] Compiling: anttasks
>
> jar:
>     [copy] Copying 1 file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/classes/org/apache/hadoop/hive/ant
>
>      [jar] Building jar:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/hive_anttasks.jar
>
> deploy:
>     [echo] hive: anttasks
>     [copy] Copying 1 file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
>
> configure:
>     [copy] Copying 239 files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/java
>
> compile:
>     [echo] Compiling: ql
>    [javac] Compiling 241 source files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/classes
>
>    [javac] Note: Some input files use or override a deprecated API.
>    [javac] Note: Recompile with -Xlint:deprecation for details.
>    [javac] Note: Some input files use unchecked or unsafe operations.
>    [javac] Note: Recompile with -Xlint:unchecked for details.
>
> jar:
>     [echo] Jar: ql
>    [unzip] Expanding:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/lib/commons-jexl-1.1.jar
> into
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/jexl/classes
>
>    [unzip] Expanding:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/lib/libthrift.jar
> into
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/thrift/classes
>
>    [unzip] Expanding:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/lib/commons-lang-2.4.jar
> into
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/commons-lang/classes
>
>      [jar] Building jar:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/hive_exec.jar
>
> deploy:
>     [echo] hive: ql
>     [copy] Copying 1 file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
>
> init:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/classes
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test/src
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test/classes
>
> download-ivy:
>
> init-ivy:
>
> settings-ivy:
>
> resolve:
> :: loading settings :: file =
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
>
> [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#
> cli;working@devbuild001.snc1.facebook.com
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve]  found hadoop#core;0.17.1 in hadoop-resolver
> [ivy:retrieve] :: resolution report :: resolve 43ms :: artifacts dl 2ms
>
>  ---------------------------------------------------------------------
>        |                  |            modules            ||   artifacts   |
>        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>  ---------------------------------------------------------------------
>        |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
>  ---------------------------------------------------------------------
> [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#cli
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/4ms)
>
> install-hadoopcore:
>
> compile:
>     [echo] Compiling: cli
>    [javac] Compiling 5 source files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/classes
>
>    [javac] Note:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/cli/src/java/org/apache/hadoop/hive/cli/OptionsProcessor.java
> uses unchecked or unsafe operations.
>
>    [javac] Note: Recompile with -Xlint:unchecked for details.
>
> jar:
>     [echo] Jar: cli
>      [jar] Building jar:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/hive_cli.jar
>
> deploy:
>     [echo] hive: cli
>     [copy] Copying 1 file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
>
> init:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/classes
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test/src
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test/classes
>
> core-compile:
>    [javac] Compiling 6 source files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/classes
>
> compile:
>
> jar:
>     [echo] Jar: service
>      [jar] Building jar:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/hive_service.jar
>
> deploy:
>     [echo] hive: service
>     [copy] Copying 1 file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
>
> package:
>     [echo] Deploying Hive jars to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/conf
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/bin
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/examples
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/examples/files
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/examples/queries
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib/py
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib/php
>     [copy] Copying 1 file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/bin
>     [copy] Copying 5 files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/bin/ext
>
>     [copy] Copying 1 file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/bin
>     [copy] Copying 1 file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/conf
>
>     [copy] Copying 1 file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/conf
>
>     [copy] Copying 1 file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/conf
>
>     [copy] Copying 6 files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib/php
>
>     [copy] Copying 12 files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib/py
>
>     [copy] Copied 3 empty directories to 1 empty directory under
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib/py
>
>     [copy] Copying 35 files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib
>
>     [copy] Copying 16 files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/examples/files
>
>     [copy] Copying 1 file to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist
>     [copy] Copying 41 files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/examples/queries
>
> BUILD SUCCESSFUL
> Total time: 40 seconds
> RUNNING TEST FOR HIVE OPENSOURCE - ant test
> +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
> Buildfile: build.xml
>
> clean-test:
>
> clean-test:
>   [delete] Deleting directory
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test
>
> clean-test:
>   [delete] Deleting directory
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test
>
> clean-test:
>   [delete] Deleting directory
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test
>
> Overriding previous definition of reference to test.classpath
>
> clean-test:
>   [delete] Deleting directory
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test
>
> clean-test:
>   [delete] Deleting directory
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test
>
> clean-test:
>   [delete] Deleting directory
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test
>
> BUILD SUCCESSFUL
> Total time: 1 second
> Buildfile: build.xml
>
> clean-test:
>
> clean-test:
>
> clean-test:
>
> clean-test:
> Overriding previous definition of reference to test.classpath
>
> clean-test:
>
> clean-test:
>
> clean-test:
>
> deploy:
>
> init:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test/src
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test/classes
>
> download-ivy:
>
> init-ivy:
>
> settings-ivy:
>
> resolve:
> [ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 ::
> http://ant.apache.org/ivy/ ::
> :: loading settings :: file =
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
>
> [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#
> common;working@devbuild001.snc1.facebook.com
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve]  found hadoop#core;0.17.1 in hadoop-resolver
> [ivy:retrieve] :: resolution report :: resolve 96ms :: artifacts dl 3ms
>
>  ---------------------------------------------------------------------
>        |                  |            modules            ||   artifacts   |
>        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>  ---------------------------------------------------------------------
>        |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
>  ---------------------------------------------------------------------
> [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/5ms)
>
> install-hadoopcore:
>
> compile:
>     [echo] Compiling: common
>
> jar:
>     [echo] Jar: common
>
> deploy:
>     [echo] hive: common
>
> init:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test/src
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test/classes
>
> dynamic-serde:
>
> compile:
>     [echo] Compiling: serde
>
> jar:
>     [echo] Jar: serde
>
> deploy:
>     [echo] hive: serde
>
> init:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/src
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/classes
>
> model-compile:
>
> core-compile:
>     [echo] Compiling:
>
> model-enhance:
>
> compile:
>
> jar:
>     [echo] Jar: metastore
>
> deploy:
>     [echo] hive: metastore
> Overriding previous definition of reference to test.classpath
>
> init:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/classes
>
> ql-init:
>
> build-grammar:
>
> compile-ant-tasks:
>
> init:
>
> download-ivy:
>
> init-ivy:
>
> settings-ivy:
>
> resolve:
> :: loading settings :: file =
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
>
> [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#
> ant;working@devbuild001.snc1.facebook.com
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve] :: resolution report :: resolve 37ms :: artifacts dl 0ms
>
>  ---------------------------------------------------------------------
>        |                  |            modules            ||   artifacts   |
>        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>  ---------------------------------------------------------------------
>        |      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
>  ---------------------------------------------------------------------
> [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve]  0 artifacts copied, 0 already retrieved (0kB/12ms)
>
> install-hadoopcore:
>
> compile:
>     [echo] Compiling: anttasks
>
> deploy-ant-tasks:
>
> init:
>
> download-ivy:
>
> init-ivy:
>
> settings-ivy:
>
> resolve:
> :: loading settings :: file =
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
>
> [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#
> ant;working@devbuild001.snc1.facebook.com
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve] :: resolution report :: resolve 9ms :: artifacts dl 0ms
>
>  ---------------------------------------------------------------------
>        |                  |            modules            ||   artifacts   |
>        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>  ---------------------------------------------------------------------
>        |      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
>  ---------------------------------------------------------------------
> [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve]  0 artifacts copied, 0 already retrieved (0kB/3ms)
>
> install-hadoopcore:
>
> compile:
>     [echo] Compiling: anttasks
>
> jar:
>
> deploy:
>     [echo] hive: anttasks
>
> configure:
>
> compile:
>     [echo] Compiling: ql
>    [javac] Compiling 8 source files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/classes
>
> jar:
>     [echo] Jar: ql
>    [unzip] Expanding:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/lib/commons-jexl-1.1.jar
> into
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/jexl/classes
>
>    [unzip] Expanding:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/lib/libthrift.jar
> into
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/thrift/classes
>
>    [unzip] Expanding:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/lib/commons-lang-2.4.jar
> into
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/commons-lang/classes
>
> deploy:
>     [echo] hive: ql
>
> init:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test/src
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test/classes
>
> download-ivy:
>
> init-ivy:
>
> settings-ivy:
>
> resolve:
> :: loading settings :: file =
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
>
> [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#
> cli;working@devbuild001.snc1.facebook.com
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve]  found hadoop#core;0.17.1 in hadoop-resolver
> [ivy:retrieve] :: resolution report :: resolve 29ms :: artifacts dl 2ms
>
>  ---------------------------------------------------------------------
>        |                  |            modules            ||   artifacts   |
>        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>  ---------------------------------------------------------------------
>        |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
>  ---------------------------------------------------------------------
> [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#cli
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/4ms)
>
> install-hadoopcore:
>
> compile:
>     [echo] Compiling: cli
>
> jar:
>     [echo] Jar: cli
>
> deploy:
>     [echo] hive: cli
>
> init:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test/src
>
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test/classes
>
> core-compile:
>
> compile:
>
> jar:
>     [echo] Jar: service
>
> deploy:
>     [echo] hive: service
>
> test:
>
> test:
>     [echo] Nothing to do!
>
> test:
>     [echo] Nothing to do!
>
> test-conditions:
>
> gen-test:
>
> init:
>
> model-compile:
>
> core-compile:
>     [echo] Compiling:
>
> model-enhance:
>
> compile:
>
> compile-test:
>    [javac] Compiling 14 source files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/classes
>
> test-jar:
>
> test-init:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/data
>
>     [copy] Copying 18 files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/data
>
>     [copy] Copied 5 empty directories to 3 empty directories under
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/data
>
> test:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/logs
>
>    [junit] Running org.apache.hadoop.hive.metastore.TestAlter
>    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.069 sec
>    [junit] Running org.apache.hadoop.hive.metastore.TestCreateDB
>    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.041 sec
>    [junit] Running org.apache.hadoop.hive.metastore.TestDBGetName
>    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.033 sec
>    [junit] Running org.apache.hadoop.hive.metastore.TestDrop
>    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.083 sec
>    [junit] Running org.apache.hadoop.hive.metastore.TestGetDBs
>    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.043 sec
>    [junit] Running org.apache.hadoop.hive.metastore.TestGetSchema
>    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.069 sec
>    [junit] Running org.apache.hadoop.hive.metastore.TestGetTable
>    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.063 sec
>    [junit] Running org.apache.hadoop.hive.metastore.TestGetTables
>    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.072 sec
>    [junit] Running org.apache.hadoop.hive.metastore.TestHiveMetaStore
>    [junit] Tests run: 7, Failures: 0, Errors: 0, Time elapsed: 9.318 sec
>    [junit] Running org.apache.hadoop.hive.metastore.TestPartitions
>    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.066 sec
>    [junit] Running org.apache.hadoop.hive.metastore.TestTableExists
>    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.055 sec
>    [junit] Running org.apache.hadoop.hive.metastore.TestTablePath
>    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.061 sec
>    [junit] Running org.apache.hadoop.hive.metastore.TestTruncate
>    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.074 sec
> Overriding previous definition of reference to test.classpath
>
> test-conditions:
>
> init:
>
> compile-ant-tasks:
>
> init:
>
> download-ivy:
>
> init-ivy:
>
> settings-ivy:
>
> resolve:
> :: loading settings :: file =
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
>
> [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#
> ant;working@devbuild001.snc1.facebook.com
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve] :: resolution report :: resolve 31ms :: artifacts dl 0ms
>
>  ---------------------------------------------------------------------
>        |                  |            modules            ||   artifacts   |
>        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>  ---------------------------------------------------------------------
>        |      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
>  ---------------------------------------------------------------------
> [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve]  0 artifacts copied, 0 already retrieved (0kB/3ms)
>
> install-hadoopcore:
>
> compile:
>     [echo] Compiling: anttasks
>
> deploy-ant-tasks:
>
> init:
>
> download-ivy:
>
> init-ivy:
>
> settings-ivy:
>
> resolve:
> :: loading settings :: file =
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
>
> [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#
> ant;working@devbuild001.snc1.facebook.com
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve] :: resolution report :: resolve 9ms :: artifacts dl 0ms
>
>  ---------------------------------------------------------------------
>        |                  |            modules            ||   artifacts   |
>        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>  ---------------------------------------------------------------------
>        |      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
>  ---------------------------------------------------------------------
> [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve]  0 artifacts copied, 0 already retrieved (0kB/2ms)
>
> install-hadoopcore:
>
> compile:
>     [echo] Compiling: anttasks
>
> jar:
>
> deploy:
>     [echo] hive: anttasks
>
> gen-test:
>  [qtestgen] Template
> Path:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates
>
>  [qtestgen] Dec 30, 2008 7:41:10 PM
> org.apache.velocity.runtime.log.JdkLogChute log
>  [qtestgen] INFO: FileResourceLoader : adding path
> '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates'
>
>  [qtestgen] Generated
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src/org/apache/hadoop/hive/ql/parse/TestParse.java
> from template TestParse.vm
>
>  [qtestgen] Template
> Path:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates
>
>  [qtestgen] Dec 30, 2008 7:41:10 PM
> org.apache.velocity.runtime.log.JdkLogChute log
>  [qtestgen] INFO: FileResourceLoader : adding path
> '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates'
>
>  [qtestgen] Generated
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src/org/apache/hadoop/hive/ql/parse/TestParseNegative.java
> from template TestParseNegative.vm
>
>  [qtestgen] Template
> Path:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates
>
>  [qtestgen] Dec 30, 2008 7:41:10 PM
> org.apache.velocity.runtime.log.JdkLogChute log
>  [qtestgen] INFO: FileResourceLoader : adding path
> '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates'
>
>  [qtestgen] Generated
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src/org/apache/hadoop/hive/cli/TestCliDriver.java
> from template TestCliDriver.vm
>
>  [qtestgen] Template
> Path:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates
>
>  [qtestgen] Dec 30, 2008 7:41:10 PM
> org.apache.velocity.runtime.log.JdkLogChute log
>  [qtestgen] INFO: FileResourceLoader : adding path
> '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates'
>
>  [qtestgen] Generated
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src/org/apache/hadoop/hive/cli/TestNegativeCliDriver.java
> from template TestNegativeCliDriver.vm
>
> ql-init:
>
> build-grammar:
>
> configure:
>
> compile:
>     [echo] Compiling: ql
>    [javac] Compiling 8 source files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/classes
>
> compile-test:
>    [javac] Compiling 15 source files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/classes
>
>    [javac] Note: Some input files use or override a deprecated API.
>    [javac] Note: Recompile with -Xlint:deprecation for details.
>    [javac] Note: Some input files use unchecked or unsafe operations.
>    [javac] Note: Recompile with -Xlint:unchecked for details.
>    [javac] Compiling 4 source files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/classes
>
> test-jar:
>      [jar] Building jar:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar
>
> test-init:
>     [copy] Copying 18 files to
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data
>
>     [copy] Copied 4 empty directories to 2 empty directories under
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data
>
> test:
>    [mkdir] Created dir:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/logs
>    [junit] Running org.apache.hadoop.hive.cli.TestCliDriver
>    [junit] Begin query: mapreduce1.q
>    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>    [junit] OK
>    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>    [junit] OK
>    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>    [junit] OK
>    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>    [junit] OK
>    [junit] Loading data to table srcbucket
>    [junit] OK
>    [junit] Loading data to table srcbucket
>    [junit] OK
>    [junit] Loading data to table src
>    [junit] OK
>    [junit] plan = /tmp/plan60431.xml
>    [junit] 08/12/30 19:41:25 WARN exec.ExecDriver: Number of reduce tasks
> not specified. Defaulting to jobconf value of: 1
>
>    [junit] 08/12/30 19:41:26 INFO exec.ExecDriver: Adding input file
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
>
>    [junit] 08/12/30 19:41:26 INFO exec.ExecDriver: adding libjars:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
>
>    [junit] 08/12/30 19:41:26 INFO jvm.JvmMetrics: Initializing JVM Metrics
> with processName=JobTracker, sessionId=
>    [junit] 08/12/30 19:41:26 INFO mapred.FileInputFormat: Total input paths
> to process : 1
>    [junit] Job running in-process (local Hadoop)
>    [junit] 08/12/30 19:41:26 INFO exec.ExecDriver: Job running in-process
> (local Hadoop)
>    [junit] 08/12/30 19:41:26 INFO mapred.MapTask: numReduceTasks: 1
>    [junit] 08/12/30 19:41:26 INFO exec.MapOperator: Initializing Self
>    [junit] 08/12/30 19:41:26 INFO exec.MapOperator: Adding alias src to
> work list for file
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
>
>    [junit] 08/12/30 19:41:26 INFO exec.MapOperator: Got partitions: null
>    [junit] 08/12/30 19:41:26 INFO exec.TableScanOperator: Initializing Self
>    [junit] 08/12/30 19:41:26 INFO exec.TableScanOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:26 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:41:26 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: Initializing Self
>    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:26 INFO exec.ReduceSinkOperator: Initializing
> Self
>    [junit] 08/12/30 19:41:26 INFO exec.ReduceSinkOperator: Using tag = -1
>    [junit] 08/12/30 19:41:26 INFO thrift.TBinarySortableProtocol: Sort
> order is "++"
>    [junit] 08/12/30 19:41:26 INFO thrift.TBinarySortableProtocol: Sort
> order is "++"
>    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: Initialization Done
>    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: Executing [/bin/cat]
>    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: tablename=src
>    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: partname={}
>    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: alias=src
>    [junit] 08/12/30 19:41:26 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:41:26 INFO exec.TableScanOperator: Initialization
> Done
>    [junit] 08/12/30 19:41:26 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: StreamThread
> ErrorProcessor done
>    [junit] 08/12/30 19:41:27 INFO exec.ScriptOperator: StreamThread
> OutputProcessor done
>    [junit] 08/12/30 19:41:27 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:27 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:27 INFO mapred.LocalJobRunner:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
>
>    [junit] 08/12/30 19:41:27 INFO mapred.TaskRunner: Task
> 'job_local_1_map_0000' done.
>    [junit] 08/12/30 19:41:27 INFO mapred.TaskRunner: Saved output of task
> 'job_local_1_map_0000' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-834949849
>
>    [junit] 08/12/30 19:41:27 INFO thrift.TBinarySortableProtocol: Sort
> order is "++"
>    [junit] 08/12/30 19:41:27 INFO thrift.TBinarySortableProtocol: Sort
> order is "++"
>    [junit] 08/12/30 19:41:27 INFO exec.ExtractOperator: Initializing Self
>    [junit] 08/12/30 19:41:27 INFO exec.ExtractOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:27 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:41:27 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:27 INFO exec.FileSinkOperator: Initializing Self
>    [junit] 08/12/30 19:41:27 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:41:27 INFO exec.ExtractOperator: Initialization Done
>    [junit] 08/12/30 19:41:27 INFO mapred.LocalJobRunner: reduce > reduce
>    [junit] 08/12/30 19:41:27 INFO mapred.TaskRunner: Task 'reduce_cy2095'
> done.
>    [junit] 08/12/30 19:41:27 INFO mapred.TaskRunner: Saved output of task
> 'reduce_cy2095' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-834949849
>
>    [junit]  map = 100%,  reduce =100%
>    [junit] 08/12/30 19:41:27 INFO exec.ExecDriver:  map = 100%,  reduce
> =100%
>    [junit] Ended Job = job_local_1
>    [junit] 08/12/30 19:41:27 INFO exec.ExecDriver: Ended Job = job_local_1
>    [junit] diff -I \(file:\)\|\(/tmp/.*\)
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/mapreduce1.q.out
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/mapreduce1.q.out
>
>    [junit] Done query: mapreduce1.q
>    [junit] Begin query: mapreduce3.q
>    [junit] plan = /tmp/plan60432.xml
>    [junit] 08/12/30 19:41:31 WARN exec.ExecDriver: Number of reduce tasks
> not specified. Defaulting to jobconf value of: 1
>
>    [junit] 08/12/30 19:41:31 INFO exec.ExecDriver: Adding input file
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
>
>    [junit] 08/12/30 19:41:31 INFO exec.ExecDriver: adding libjars:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
>
>    [junit] 08/12/30 19:41:31 INFO jvm.JvmMetrics: Initializing JVM Metrics
> with processName=JobTracker, sessionId=
>    [junit] 08/12/30 19:41:32 INFO mapred.FileInputFormat: Total input paths
> to process : 1
>    [junit] Job running in-process (local Hadoop)
>    [junit] 08/12/30 19:41:32 INFO exec.ExecDriver: Job running in-process
> (local Hadoop)
>    [junit] 08/12/30 19:41:32 INFO mapred.MapTask: numReduceTasks: 1
>    [junit] 08/12/30 19:41:32 INFO exec.MapOperator: Initializing Self
>    [junit] 08/12/30 19:41:32 INFO exec.MapOperator: Adding alias src to
> work list for file
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
>
>    [junit] 08/12/30 19:41:32 INFO exec.MapOperator: Got partitions: null
>    [junit] 08/12/30 19:41:32 INFO exec.TableScanOperator: Initializing Self
>    [junit] 08/12/30 19:41:32 INFO exec.TableScanOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:32 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:41:32 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: Initializing Self
>    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:32 INFO exec.ReduceSinkOperator: Initializing
> Self
>    [junit] 08/12/30 19:41:32 INFO exec.ReduceSinkOperator: Using tag = -1
>    [junit] 08/12/30 19:41:32 INFO thrift.TBinarySortableProtocol: Sort
> order is "++"
>    [junit] 08/12/30 19:41:32 INFO thrift.TBinarySortableProtocol: Sort
> order is "++"
>    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: Initialization Done
>    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: Executing [/bin/cat]
>    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: tablename=src
>    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: partname={}
>    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: alias=src
>    [junit] 08/12/30 19:41:32 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:41:32 INFO exec.TableScanOperator: Initialization
> Done
>    [junit] 08/12/30 19:41:32 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: StreamThread
> ErrorProcessor done
>    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: StreamThread
> OutputProcessor done
>    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:32 INFO mapred.LocalJobRunner:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
>
>    [junit] 08/12/30 19:41:32 INFO mapred.TaskRunner: Task
> 'job_local_1_map_0000' done.
>    [junit] 08/12/30 19:41:32 INFO mapred.TaskRunner: Saved output of task
> 'job_local_1_map_0000' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-728768704
>
>    [junit] 08/12/30 19:41:32 INFO thrift.TBinarySortableProtocol: Sort
> order is "++"
>    [junit] 08/12/30 19:41:32 INFO thrift.TBinarySortableProtocol: Sort
> order is "++"
>    [junit] 08/12/30 19:41:33 INFO exec.ExtractOperator: Initializing Self
>    [junit] 08/12/30 19:41:33 INFO exec.ExtractOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:33 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:41:33 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:33 INFO exec.FileSinkOperator: Initializing Self
>    [junit] 08/12/30 19:41:33 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:41:33 INFO exec.ExtractOperator: Initialization Done
>    [junit] 08/12/30 19:41:33 INFO mapred.LocalJobRunner: reduce > reduce
>    [junit] 08/12/30 19:41:33 INFO mapred.TaskRunner: Task 'reduce_1za0qz'
> done.
>    [junit]  map = 100%,  reduce =100%
>    [junit] 08/12/30 19:41:33 INFO exec.ExecDriver:  map = 100%,  reduce
> =100%
>    [junit] 08/12/30 19:41:33 INFO mapred.TaskRunner: Saved output of task
> 'reduce_1za0qz' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-728768704
>
>    [junit] Ended Job = job_local_1
>    [junit] 08/12/30 19:41:34 INFO exec.ExecDriver: Ended Job = job_local_1
>    [junit] diff -I \(file:\)\|\(/tmp/.*\)
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/mapreduce3.q.out
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/mapreduce3.q.out
>
>    [junit] Done query: mapreduce3.q
>    [junit] Begin query: alter1.q
>    [junit] diff -I \(file:\)\|\(/tmp/.*\)
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/alter1.q.out
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/alter1.q.out
>
>    [junit] Done query: alter1.q
>    [junit] Begin query: showparts.q
>    [junit] diff -I \(file:\)\|\(/tmp/.*\)
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/showparts.q.out
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/showparts.q.out
>
>    [junit] Done query: showparts.q
>    [junit] Begin query: mapreduce5.q
>    [junit] org.apache.hadoop.hive.ql.metadata.HiveException:
> MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
>
>    [junit] Exception: MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
>
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
>    [junit]     at
> org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_mapreduce5(TestCliDriver.java:405)
>    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
>    [junit]     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>    [junit]     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
>    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
>    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
>    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
>    [junit]     at
> junit.framework.TestResult.runProtected(TestResult.java:124)
>    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
>    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
>    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
>    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
>    [junit] Caused by: MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
>
>    [junit]     at
> org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
>    [junit]     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
>    [junit]     at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
>    [junit]     ... 21 more
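
Judging from the "Caused by" frames above, Warehouse.deleteDir appears to treat a false return from Hadoop's FileSystem.delete as fatal and wraps it in the MetaException we see. A minimal sketch of that failure path (illustrative path and error handling only, not the actual metastore code; assumes the two-argument recursive delete in this Hadoop line):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class DeleteDirCheck {
      public static void main(String[] args) throws Exception {
        // Illustrative path; substitute the warehouse directory from the trace.
        Path dir = new Path("file:///tmp/warehouse/src");
        FileSystem fs = dir.getFileSystem(new Configuration());
        // FileSystem.delete returns false rather than throwing when the
        // directory cannot be removed, e.g. when something inside it
        // still cannot be unlinked.
        boolean deleted = fs.delete(dir, true); // recursive delete
        if (!deleted) {
          // Warehouse.deleteDir apparently raises MetaException at this point.
          System.err.println("Unable to delete directory: " + dir);
        }
      }
    }

The relevant point is that delete returns false instead of throwing, so the only evidence left behind is the undeletable directory itself.
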
>    [junit] Begin query: subq2.q
>    [junit] plan = /tmp/plan60433.xml
>    [junit] 08/12/30 19:41:43 WARN exec.ExecDriver: Number of reduce tasks
> not specified. Defaulting to jobconf value of: 1
>
>    [junit] 08/12/30 19:41:43 INFO exec.ExecDriver: Adding input file
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
>
>    [junit] 08/12/30 19:41:43 INFO exec.ExecDriver: adding libjars:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
>
>    [junit] 08/12/30 19:41:43 INFO jvm.JvmMetrics: Initializing JVM Metrics
> with processName=JobTracker, sessionId=
>    [junit] 08/12/30 19:41:43 INFO mapred.FileInputFormat: Total input paths
> to process : 1
>    [junit] Job running in-process (local Hadoop)
>    [junit] 08/12/30 19:41:43 INFO exec.ExecDriver: Job running in-process
> (local Hadoop)
>    [junit] 08/12/30 19:41:43 INFO mapred.MapTask: numReduceTasks: 1
>    [junit] 08/12/30 19:41:43 INFO exec.MapOperator: Initializing Self
>    [junit] 08/12/30 19:41:43 INFO exec.MapOperator: Adding alias a:b to
> work list for file
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
>
>    [junit] 08/12/30 19:41:43 INFO exec.MapOperator: Got partitions: null
>    [junit] 08/12/30 19:41:43 INFO exec.TableScanOperator: Initializing Self
>    [junit] 08/12/30 19:41:43 INFO exec.TableScanOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:43 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:41:43 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:43 INFO exec.ReduceSinkOperator: Initializing
> Self
>    [junit] 08/12/30 19:41:43 INFO exec.ReduceSinkOperator: Using tag = -1
>    [junit] 08/12/30 19:41:43 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:43 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:44 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:41:44 INFO exec.TableScanOperator: Initialization
> Done
>    [junit] 08/12/30 19:41:44 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:44 INFO mapred.LocalJobRunner:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
>
>    [junit] 08/12/30 19:41:44 INFO mapred.TaskRunner: Task
> 'job_local_1_map_0000' done.
>    [junit] 08/12/30 19:41:44 INFO mapred.TaskRunner: Saved output of task
> 'job_local_1_map_0000' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp172943633
>
>    [junit] 08/12/30 19:41:44 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:44 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:44 INFO exec.GroupByOperator: Initializing Self
>    [junit] 08/12/30 19:41:44 INFO exec.GroupByOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:44 INFO exec.FileSinkOperator: Initializing Self
>    [junit] 08/12/30 19:41:44 INFO exec.GroupByOperator: Initialization Done
>    [junit] 08/12/30 19:41:44 INFO mapred.LocalJobRunner: reduce > reduce
>    [junit] 08/12/30 19:41:44 INFO mapred.TaskRunner: Task 'reduce_eqe5uc'
> done.
>    [junit] 08/12/30 19:41:44 INFO mapred.TaskRunner: Saved output of task
> 'reduce_eqe5uc' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp172943633
>
>    [junit]  map = 100%,  reduce =100%
>    [junit] Ended Job = job_local_1
>    [junit] 08/12/30 19:41:44 INFO exec.ExecDriver:  map = 100%,  reduce
> =100%
>    [junit] 08/12/30 19:41:44 INFO exec.ExecDriver: Ended Job = job_local_1
>    [junit] plan = /tmp/plan60434.xml
>    [junit] 08/12/30 19:41:46 WARN exec.ExecDriver: Number of reduce tasks
> not specified. Defaulting to jobconf value of: 1
>
>    [junit] 08/12/30 19:41:46 INFO exec.ExecDriver: Adding input file
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1282612442/823007644.10002
>
>    [junit] 08/12/30 19:41:46 INFO exec.ExecDriver: adding libjars:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
>
>    [junit] 08/12/30 19:41:46 INFO jvm.JvmMetrics: Initializing JVM Metrics
> with processName=JobTracker, sessionId=
>    [junit] 08/12/30 19:41:47 INFO mapred.FileInputFormat: Total input paths
> to process : 1
>    [junit] 08/12/30 19:41:47 INFO exec.ExecDriver: Job running in-process
> (local Hadoop)
>    [junit] Job running in-process (local Hadoop)
>    [junit] 08/12/30 19:41:47 INFO mapred.MapTask: numReduceTasks: 1
>    [junit] 08/12/30 19:41:47 INFO exec.MapOperator: Initializing Self
>    [junit] 08/12/30 19:41:47 INFO exec.MapOperator: Adding alias
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1282612442/823007644.10002
> to work list for file
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1282612442/823007644.10002/reduce_eqe5uc
>
>    [junit] 08/12/30 19:41:47 INFO exec.MapOperator: Got partitions: null
>    [junit] 08/12/30 19:41:47 INFO exec.ReduceSinkOperator: Initializing
> Self
>    [junit] 08/12/30 19:41:47 INFO exec.ReduceSinkOperator: Using tag = -1
>    [junit] 08/12/30 19:41:47 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:47 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:47 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:47 INFO mapred.LocalJobRunner:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1282612442/823007644.10002/reduce_eqe5uc:0+11875
>
>    [junit] 08/12/30 19:41:47 INFO mapred.TaskRunner: Task
> 'job_local_1_map_0000' done.
>    [junit] 08/12/30 19:41:47 INFO mapred.TaskRunner: Saved output of task
> 'job_local_1_map_0000' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-359312783
>
>    [junit] 08/12/30 19:41:47 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:47 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:47 INFO exec.GroupByOperator: Initializing Self
>    [junit] 08/12/30 19:41:47 INFO exec.GroupByOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:47 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:41:47 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:47 INFO exec.FilterOperator: Initializing Self
>    [junit] 08/12/30 19:41:47 INFO exec.FilterOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:47 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:41:47 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:47 INFO exec.FileSinkOperator: Initializing Self
>    [junit] 08/12/30 19:41:47 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:41:47 INFO exec.FilterOperator: Initialization Done
>    [junit] 08/12/30 19:41:47 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:41:47 INFO exec.GroupByOperator: Initialization Done
>    [junit] 08/12/30 19:41:47 INFO exec.FilterOperator: PASSED:258
>    [junit] 08/12/30 19:41:47 INFO exec.FilterOperator: FILTERED:51
>    [junit] 08/12/30 19:41:47 INFO mapred.LocalJobRunner: reduce > reduce
>    [junit] 08/12/30 19:41:47 INFO mapred.TaskRunner: Task 'reduce_x1fq4d'
> done.
>    [junit] 08/12/30 19:41:47 INFO mapred.TaskRunner: Saved output of task
> 'reduce_x1fq4d' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-359312783
>
>    [junit]  map = 100%,  reduce =100%
>    [junit] Ended Job = job_local_1
>    [junit] 08/12/30 19:41:48 INFO exec.ExecDriver:  map = 100%,  reduce
> =100%
>    [junit] 08/12/30 19:41:48 INFO exec.ExecDriver: Ended Job = job_local_1
>    [junit] diff -I \(file:\)\|\(/tmp/.*\)
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/subq2.q.out
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/subq2.q.out
>
>    [junit] Done query: subq2.q
>    [junit] Begin query: input_limit.q
>    [junit] Exception: MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
>
>    [junit] org.apache.hadoop.hive.ql.metadata.HiveException:
> MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
>
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
>    [junit]     at
> org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_limit(TestCliDriver.java:455)
>    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
>    [junit]     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>    [junit]     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
>    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
>    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
>    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
>    [junit]     at
> junit.framework.TestResult.runProtected(TestResult.java:124)
>    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
>    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
>    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
>    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
>    [junit] Caused by: MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
>
>    [junit]     at
> org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
>    [junit]     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
>    [junit]     at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
>    [junit]     ... 21 more
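
For what it's worth, this failure mode is consistent with NFS silly-rename: unlinking a file that another process still holds open makes the NFS client rename it to a hidden .nfs* file, so the parent directory is never actually empty when the recursive delete runs. A standalone probe along these lines (hypothetical mount point, pass the real one as args[0]) can confirm whether the build directory behaves that way:

    import java.io.File;
    import java.io.FileInputStream;

    public class NfsSillyRenameCheck {
      public static void main(String[] args) throws Exception {
        // Hypothetical NFS-mounted directory.
        File dir = new File(args.length > 0 ? args[0] : "/mnt/nfs/warehouse-src");
        File probe = new File(dir, "probe.txt");
        probe.createNewFile();
        // Hold the file open, simulating a reader in another process.
        FileInputStream in = new FileInputStream(probe);
        probe.delete(); // unlink while still open
        String[] left = dir.list();
        if (left != null) {
          for (String name : left) {
            // On NFS, expect a lingering .nfsXXXXXXXX entry until 'in' closes.
            System.out.println(name);
          }
        }
        in.close();
      }
    }

If a .nfs entry shows up, the directory delete can only succeed after the last reader has closed the file.
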
>    [junit] Begin query: input11_limit.q
>    [junit] Exception: MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
>
>    [junit] org.apache.hadoop.hive.ql.metadata.HiveException:
> MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
>
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
>    [junit]     at
> org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input11_limit(TestCliDriver.java:480)
>    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
>    [junit]     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>    [junit]     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
>    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
>    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
>    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
>    [junit]     at
> junit.framework.TestResult.runProtected(TestResult.java:124)
>    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
>    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
>    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
>    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
>    [junit] Caused by: MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
>
>    [junit]     at
> org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
>    [junit]     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
>    [junit]     at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
>    [junit]     ... 21 more
>    [junit] Begin query: input20.q
>    [junit] plan = /tmp/plan60435.xml
>    [junit] 08/12/30 19:41:51 WARN exec.ExecDriver: Number of reduce tasks
> not specified. Defaulting to jobconf value of: 1
>
>    [junit] 08/12/30 19:41:52 INFO exec.ExecDriver: Adding input file
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
>
>    [junit] 08/12/30 19:41:52 INFO exec.ExecDriver: adding libjars:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
>
>    [junit] 08/12/30 19:41:52 INFO jvm.JvmMetrics: Initializing JVM Metrics
> with processName=JobTracker, sessionId=
>    [junit] 08/12/30 19:41:52 INFO mapred.FileInputFormat: Total input paths
> to process : 1
>    [junit] Job running in-process (local Hadoop)
>    [junit] 08/12/30 19:41:52 INFO exec.ExecDriver: Job running in-process
> (local Hadoop)
>    [junit] 08/12/30 19:41:52 INFO mapred.MapTask: numReduceTasks: 1
>    [junit] 08/12/30 19:41:52 INFO exec.MapOperator: Initializing Self
>    [junit] 08/12/30 19:41:52 INFO exec.MapOperator: Adding alias tmap:src
> to work list for file
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
>
>    [junit] 08/12/30 19:41:52 INFO exec.MapOperator: Got partitions: null
>    [junit] 08/12/30 19:41:52 INFO exec.TableScanOperator: Initializing Self
>    [junit] 08/12/30 19:41:52 INFO exec.TableScanOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:52 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:41:52 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:52 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:41:52 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:52 INFO exec.ScriptOperator: Initializing Self
>    [junit] 08/12/30 19:41:52 INFO exec.ScriptOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:52 INFO exec.ReduceSinkOperator: Initializing
> Self
>    [junit] 08/12/30 19:41:52 INFO exec.ReduceSinkOperator: Using tag = -1
>    [junit] 08/12/30 19:41:52 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:52 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:52 INFO exec.ScriptOperator: Initialization Done
>    [junit] 08/12/30 19:41:52 INFO exec.ScriptOperator: Executing [/bin/cat]
>    [junit] 08/12/30 19:41:52 INFO exec.ScriptOperator: tablename=src
>    [junit] 08/12/30 19:41:52 INFO exec.ScriptOperator: partname={}
>    [junit] 08/12/30 19:41:52 INFO exec.ScriptOperator: alias=tmap:src
>    [junit] 08/12/30 19:41:52 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:41:52 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:41:52 INFO exec.TableScanOperator: Initialization
> Done
>    [junit] 08/12/30 19:41:53 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: StreamThread
> ErrorProcessor done
>    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: StreamThread
> OutputProcessor done
>    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:53 INFO mapred.LocalJobRunner:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
>
>    [junit] 08/12/30 19:41:53 INFO mapred.TaskRunner: Task
> 'job_local_1_map_0000' done.
>    [junit] 08/12/30 19:41:53 INFO mapred.TaskRunner: Saved output of task
> 'job_local_1_map_0000' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1155092949
>
>    [junit] 08/12/30 19:41:53 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:53 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:53 INFO exec.ExtractOperator: Initializing Self
>    [junit] 08/12/30 19:41:53 INFO exec.ExtractOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:53 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:41:53 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: Initializing Self
>    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:53 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:41:53 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:53 INFO exec.FileSinkOperator: Initializing Self
>    [junit] 08/12/30 19:41:53 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: Initialization Done
>    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: Executing
> [/usr/bin/uniq, -c, |, sed, s@^ *@@, |, sed, s@\t@_@, |, sed, s@ @\t@]
>
>    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: tablename=null
>    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: partname=null
>    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: alias=null
>    [junit] 08/12/30 19:41:53 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:41:53 INFO exec.ExtractOperator: Initialization Done
>    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: StreamThread
> OutputProcessor done
>    [junit] /usr/bin/uniq: extra operand `s@^ *@@'
>    [junit] Try `/usr/bin/uniq --help' for more information.
>    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: StreamThread
> ErrorProcessor done
>    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:53 INFO mapred.LocalJobRunner: reduce > reduce
>    [junit] 08/12/30 19:41:53 INFO mapred.TaskRunner: Task 'reduce_a3xv04'
> done.
>    [junit] 08/12/30 19:41:53 INFO mapred.TaskRunner: Saved output of task
> 'reduce_a3xv04' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1155092949
>
>    [junit]  map = 100%,  reduce =100%
>    [junit] Ended Job = job_local_1
>    [junit] 08/12/30 19:41:53 INFO exec.ExecDriver:  map = 100%,  reduce
> =100%
>    [junit] 08/12/30 19:41:53 INFO exec.ExecDriver: Ended Job = job_local_1
>    [junit] diff -I \(file:\)\|\(/tmp/.*\)
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input20.q.out
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input20.q.out
>
>    [junit] Done query: input20.q
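
Separately, the /usr/bin/uniq "extra operand" complaint in the input20.q run above is its own, platform-dependent wrinkle: the "Executing [/usr/bin/uniq, -c, |, sed, ...]" line suggests the script command is tokenized and exec'd directly rather than run through a shell, so the pipe characters reach uniq as plain operands. A small illustration of the difference using ProcessBuilder (not Hive's ScriptOperator code, just the underlying behavior; inheritIO needs Java 7+):

    import java.util.Arrays;

    public class ShellVsExec {
      public static void main(String[] args) throws Exception {
        // Exec'd directly: uniq receives "|", "sed" and "s@^ *@@" as
        // operands, reproducing the "extra operand" message in the log.
        new ProcessBuilder(Arrays.asList(
            "/usr/bin/uniq", "-c", "|", "sed", "s@^ *@@"))
            .inheritIO().start().waitFor();

        // Run through a shell instead: the pipe becomes a real pipeline.
        new ProcessBuilder("sh", "-c",
            "printf 'a\\na\\nb\\n' | uniq -c | sed 's@^ *@@'")
            .inheritIO().start().waitFor();
      }
    }
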
>    [junit] Begin query: input14_limit.q
>    [junit] plan = /tmp/plan60436.xml
>    [junit] 08/12/30 19:41:57 WARN exec.ExecDriver: Number of reduce tasks
> not specified. Defaulting to jobconf value of: 1
>
>    [junit] 08/12/30 19:41:57 INFO exec.ExecDriver: Adding input file
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
>
>    [junit] 08/12/30 19:41:57 INFO exec.ExecDriver: adding libjars:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
>
>    [junit] 08/12/30 19:41:57 INFO jvm.JvmMetrics: Initializing JVM Metrics
> with processName=JobTracker, sessionId=
>    [junit] 08/12/30 19:41:57 INFO mapred.FileInputFormat: Total input paths
> to process : 1
>    [junit] Job running in-process (local Hadoop)
>    [junit] 08/12/30 19:41:57 INFO exec.ExecDriver: Job running in-process
> (local Hadoop)
>    [junit] 08/12/30 19:41:57 INFO mapred.MapTask: numReduceTasks: 1
>    [junit] 08/12/30 19:41:58 INFO exec.MapOperator: Initializing Self
>    [junit] 08/12/30 19:41:58 INFO exec.MapOperator: Adding alias tmap:src
> to work list for file
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
>
>    [junit] 08/12/30 19:41:58 INFO exec.MapOperator: Got partitions: null
>    [junit] 08/12/30 19:41:58 INFO exec.TableScanOperator: Initializing Self
>    [junit] 08/12/30 19:41:58 INFO exec.TableScanOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:58 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:41:58 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: Initializing Self
>    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:58 INFO exec.ReduceSinkOperator: Initializing
> Self
>    [junit] 08/12/30 19:41:58 INFO exec.ReduceSinkOperator: Using tag = -1
>    [junit] 08/12/30 19:41:58 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:58 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: Initialization Done
>    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: Executing [/bin/cat]
>    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: tablename=src
>    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: partname={}
>    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: alias=tmap:src
>    [junit] 08/12/30 19:41:58 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:41:58 INFO exec.TableScanOperator: Initialization
> Done
>    [junit] 08/12/30 19:41:58 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: StreamThread
> ErrorProcessor done
>    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: StreamThread
> OutputProcessor done
>    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:41:58 INFO mapred.LocalJobRunner:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
>
>    [junit] 08/12/30 19:41:58 INFO mapred.TaskRunner: Task
> 'job_local_1_map_0000' done.
>    [junit] 08/12/30 19:41:58 INFO mapred.TaskRunner: Saved output of task
> 'job_local_1_map_0000' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-581691282
>
>    [junit] 08/12/30 19:41:58 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:58 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:41:58 INFO exec.ExtractOperator: Initializing Self
>    [junit] 08/12/30 19:41:58 INFO exec.ExtractOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:58 INFO exec.LimitOperator: Initializing Self
>    [junit] 08/12/30 19:41:58 INFO exec.LimitOperator: Initializing
> children:
>    [junit] 08/12/30 19:41:58 INFO exec.FileSinkOperator: Initializing Self
>    [junit] 08/12/30 19:41:58 INFO exec.LimitOperator: Initialization Done
>    [junit] 08/12/30 19:41:58 INFO exec.ExtractOperator: Initialization Done
>    [junit] 08/12/30 19:41:58 INFO mapred.LocalJobRunner: reduce > reduce
>    [junit] 08/12/30 19:41:58 INFO mapred.TaskRunner: Task 'reduce_n004sn'
> done.
>    [junit] 08/12/30 19:41:58 INFO mapred.TaskRunner: Saved output of task
> 'reduce_n004sn' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-581691282
>
>    [junit]  map = 100%,  reduce =100%
>    [junit] Ended Job = job_local_1
>    [junit] 08/12/30 19:41:58 INFO exec.ExecDriver:  map = 100%,  reduce
> =100%
>    [junit] 08/12/30 19:41:58 INFO exec.ExecDriver: Ended Job = job_local_1
>    [junit] plan = /tmp/plan60437.xml
>    [junit] 08/12/30 19:42:00 INFO exec.ExecDriver: Number of reduce tasks
> determined at compile : 1
>    [junit] 08/12/30 19:42:00 INFO exec.ExecDriver: Adding input file
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/806332980/254283961.10001
>
>    [junit] 08/12/30 19:42:00 INFO exec.ExecDriver: adding libjars:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
>
>    [junit] 08/12/30 19:42:00 INFO jvm.JvmMetrics: Initializing JVM Metrics
> with processName=JobTracker, sessionId=
>    [junit] 08/12/30 19:42:01 INFO mapred.FileInputFormat: Total input paths
> to process : 1
>    [junit] Job running in-process (local Hadoop)
>    [junit] 08/12/30 19:42:01 INFO exec.ExecDriver: Job running in-process
> (local Hadoop)
>    [junit] 08/12/30 19:42:01 INFO mapred.MapTask: numReduceTasks: 1
>    [junit] 08/12/30 19:42:01 INFO exec.MapOperator: Initializing Self
>    [junit] 08/12/30 19:42:01 INFO exec.MapOperator: Adding alias
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/806332980/254283961.10001
> to work list for file
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/806332980/254283961.10001/reduce_n004sn
>
>    [junit] 08/12/30 19:42:01 INFO exec.MapOperator: Got partitions: null
>    [junit] 08/12/30 19:42:01 INFO exec.ReduceSinkOperator: Initializing
> Self
>    [junit] 08/12/30 19:42:01 INFO exec.ReduceSinkOperator: Using tag = -1
>    [junit] 08/12/30 19:42:01 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:42:01 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:42:01 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:42:01 INFO mapred.LocalJobRunner:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/806332980/254283961.10001/reduce_n004sn:0+900
>
>    [junit] 08/12/30 19:42:01 INFO mapred.TaskRunner: Task
> 'job_local_1_map_0000' done.
>    [junit] 08/12/30 19:42:01 INFO mapred.TaskRunner: Saved output of task
> 'job_local_1_map_0000' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-916040072
>
>    [junit] 08/12/30 19:42:01 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:42:01 INFO thrift.TBinarySortableProtocol: Sort
> order is "+"
>    [junit] 08/12/30 19:42:01 INFO exec.ExtractOperator: Initializing Self
>    [junit] 08/12/30 19:42:01 INFO exec.ExtractOperator: Initializing
> children:
>    [junit] 08/12/30 19:42:01 INFO exec.LimitOperator: Initializing Self
>    [junit] 08/12/30 19:42:01 INFO exec.LimitOperator: Initializing
> children:
>    [junit] 08/12/30 19:42:01 INFO exec.FilterOperator: Initializing Self
>    [junit] 08/12/30 19:42:01 INFO exec.FilterOperator: Initializing
> children:
>    [junit] 08/12/30 19:42:01 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:42:01 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:42:01 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:42:01 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:42:01 INFO exec.FileSinkOperator: Initializing Self
>    [junit] 08/12/30 19:42:01 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:42:01 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:42:01 INFO exec.FilterOperator: Initialization Done
>    [junit] 08/12/30 19:42:01 INFO exec.LimitOperator: Initialization Done
>    [junit] 08/12/30 19:42:01 INFO exec.ExtractOperator: Initialization Done
>    [junit] 08/12/30 19:42:01 INFO exec.FilterOperator: PASSED:5
>    [junit] 08/12/30 19:42:01 INFO exec.FilterOperator: FILTERED:15
>    [junit] 08/12/30 19:42:01 INFO mapred.LocalJobRunner: reduce > reduce
>    [junit] 08/12/30 19:42:01 INFO mapred.TaskRunner: Task 'reduce_fvz6nf'
> done.
>    [junit] 08/12/30 19:42:01 INFO mapred.TaskRunner: Saved output of task
> 'reduce_fvz6nf' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-916040072
>
>    [junit]  map = 100%,  reduce =100%
>    [junit] Ended Job = job_local_1
>    [junit] 08/12/30 19:42:02 INFO exec.ExecDriver:  map = 100%,  reduce
> =100%
>    [junit] 08/12/30 19:42:02 INFO exec.ExecDriver: Ended Job = job_local_1
>    [junit] diff -I \(file:\)\|\(/tmp/.*\)
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input14_limit.q.out
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input14_limit.q.out
>
>    [junit] Done query: input14_limit.q
>    [junit] Begin query: sample2.q
>    [junit] plan = /tmp/plan60438.xml
>    [junit] 08/12/30 19:42:05 WARN exec.ExecDriver: Number of reduce tasks
> not specified. Defaulting to 0 since there's no reduce operator
>
>    [junit] 08/12/30 19:42:05 INFO exec.ExecDriver: Adding input file
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt
>
>    [junit] 08/12/30 19:42:05 INFO exec.ExecDriver: adding libjars:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
>
>    [junit] 08/12/30 19:42:05 INFO jvm.JvmMetrics: Initializing JVM Metrics
> with processName=JobTracker, sessionId=
>    [junit] 08/12/30 19:42:05 INFO mapred.FileInputFormat: Total input paths
> to process : 1
>    [junit] Job running in-process (local Hadoop)
>    [junit] 08/12/30 19:42:06 INFO exec.ExecDriver: Job running in-process
> (local Hadoop)
>    [junit] 08/12/30 19:42:06 INFO mapred.MapTask: numReduceTasks: 0
>    [junit] 08/12/30 19:42:06 INFO exec.MapOperator: Initializing Self
>    [junit] 08/12/30 19:42:06 INFO exec.MapOperator: Adding alias s to work
> list for file
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt
>
>    [junit] 08/12/30 19:42:06 INFO exec.MapOperator: Got partitions: null
>    [junit] 08/12/30 19:42:06 INFO exec.TableScanOperator: Initializing Self
>    [junit] 08/12/30 19:42:06 INFO exec.TableScanOperator: Initializing
> children:
>    [junit] 08/12/30 19:42:06 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:42:06 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:42:06 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:42:06 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:42:06 INFO exec.FileSinkOperator: Initializing Self
>    [junit] 08/12/30 19:42:06 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:42:06 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:42:06 INFO exec.TableScanOperator: Initialization
> Done
>    [junit] 08/12/30 19:42:06 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:42:06 INFO mapred.LocalJobRunner:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt:0+5812
>
>    [junit] 08/12/30 19:42:06 INFO mapred.TaskRunner: Task
> 'job_local_1_map_0000' done.
>    [junit] 08/12/30 19:42:06 INFO mapred.TaskRunner: Saved output of task
> 'job_local_1_map_0000' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1347919543
>
>    [junit]  map = 100%,  reduce =0%
>    [junit] Ended Job = job_local_1
>    [junit] 08/12/30 19:42:07 INFO exec.ExecDriver:  map = 100%,  reduce =0%
>    [junit] 08/12/30 19:42:07 INFO exec.ExecDriver: Ended Job = job_local_1
>    [junit] diff -I \(file:\)\|\(/tmp/.*\)
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample2.q.out
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/sample2.q.out
>
>    [junit] Done query: sample2.q
>    [junit] Begin query: inputddl1.q
>    [junit] Exception: MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
>
>    [junit] org.apache.hadoop.hive.ql.metadata.HiveException:
> MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
>
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
>    [junit]     at
> org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_inputddl1(TestCliDriver.java:580)
>    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
>    [junit]     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>    [junit]     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
>    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
>    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
>    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
>    [junit]     at
> junit.framework.TestResult.runProtected(TestResult.java:124)
>    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
>    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
>    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
>    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
>    [junit] Caused by: MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
>
>    [junit]     at
> org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
>    [junit]     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
>    [junit]     at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
>    [junit]     ... 21 more
>    [junit] Begin query: sample4.q
>    [junit] Exception: MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
>
>    [junit] org.apache.hadoop.hive.ql.metadata.HiveException:
> MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
>
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
>    [junit]     at
> org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample4(TestCliDriver.java:605)
>    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
>    [junit]     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>    [junit]     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
>    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
>    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
>    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
>    [junit]     at
> junit.framework.TestResult.runProtected(TestResult.java:124)
>    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
>    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
>    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
>    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
>    [junit] Caused by: MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
>
>    [junit]     at
> org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
>    [junit]     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
>    [junit]     at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
>    [junit]     ... 21 more
>    [junit] Begin query: inputddl3.q
>    [junit] diff -I \(file:\)\|\(/tmp/.*\)
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/inputddl3.q.out
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/inputddl3.q.out
>
>    [junit] Done query: inputddl3.q
>    [junit] Begin query: groupby2_map.q
>    [junit] Exception: MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
>
>    [junit] org.apache.hadoop.hive.ql.metadata.HiveException:
> MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
>
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
>    [junit]     at
> org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
>    [junit]     at
> org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby2_map(TestCliDriver.java:655)
>    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
>    [junit]     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>    [junit]     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
>    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
>    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
>    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
>    [junit]     at
> junit.framework.TestResult.runProtected(TestResult.java:124)
>    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
>    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
>    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
>    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
>    [junit]     at
> org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
>    [junit] Caused by: MetaException(message:Unable to delete directory:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
>
>    [junit]     at
> org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
>    [junit]     at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
>    [junit]     at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
>    [junit]     at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
>    [junit]     ... 21 more
>    [junit] Begin query: sample6.q
>    [junit] plan = /tmp/plan60439.xml
>    [junit] 08/12/30 19:42:12 WARN exec.ExecDriver: Number of reduce tasks
> not specified. Defaulting to 0 since there's no reduce operator
>
>    [junit] 08/12/30 19:42:12 INFO exec.ExecDriver: Adding input file
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt
>
>    [junit] 08/12/30 19:42:12 INFO exec.ExecDriver: adding libjars:
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
>
>    [junit] 08/12/30 19:42:12 INFO jvm.JvmMetrics: Initializing JVM Metrics
> with processName=JobTracker, sessionId=
>    [junit] 08/12/30 19:42:12 INFO mapred.FileInputFormat: Total input paths
> to process : 1
>    [junit] Job running in-process (local Hadoop)
>    [junit] 08/12/30 19:42:12 INFO exec.ExecDriver: Job running in-process
> (local Hadoop)
>    [junit] 08/12/30 19:42:13 INFO mapred.MapTask: numReduceTasks: 0
>    [junit] 08/12/30 19:42:13 INFO exec.MapOperator: Initializing Self
>    [junit] 08/12/30 19:42:13 INFO exec.MapOperator: Adding alias s to work
> list for file
> /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt
>
>    [junit] 08/12/30 19:42:13 INFO exec.MapOperator: Got partitions: null
>    [junit] 08/12/30 19:42:13 INFO exec.TableScanOperator: Initializing Self
>    [junit] 08/12/30 19:42:13 INFO exec.TableScanOperator: Initializing
> children:
>    [junit] 08/12/30 19:42:13 INFO exec.FilterOperator: Initializing Self
>    [junit] 08/12/30 19:42:13 INFO exec.FilterOperator: Initializing
> children:
>    [junit] 08/12/30 19:42:13 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:42:13 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:42:13 INFO exec.SelectOperator: Initializing Self
>    [junit] 08/12/30 19:42:13 INFO exec.SelectOperator: Initializing
> children:
>    [junit] 08/12/30 19:42:13 INFO exec.FileSinkOperator: Initializing Self
>    [junit] 08/12/30 19:42:13 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:42:13 INFO exec.SelectOperator: Initialization Done
>    [junit] 08/12/30 19:42:13 INFO exec.FilterOperator: Initialization Done
>    [junit] 08/12/30 19:42:13 INFO exec.TableScanOperator: Initialization
> Done
>    [junit] 08/12/30 19:42:13 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
>    [junit] 08/12/30 19:42:13 INFO exec.FilterOperator: PASSED:118
>    [junit] 08/12/30 19:42:13 INFO exec.FilterOperator: FILTERED:382
>    [junit] 08/12/30 19:42:13 INFO mapred.LocalJobRunner:
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt:0+5812
>
>    [junit] 08/12/30 19:42:13 INFO mapred.TaskRunner: Task
> 'job_local_1_map_0000' done.
>    [junit] 08/12/30 19:42:13 INFO mapred.TaskRunner: Saved output of task
> 'job_local_1_map_0000' to
> file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-692599239
>
>    [junit]  map = 100%,  reduce =0%
>    [junit] Ended Job = job_local_1
>    [junit] 08/12/30 19:42:13 INFO exec.ExecDriver:  map = 100%,  reduce =0%
>    [junit] 08/12/30 19:42:13 INFO exec.ExecDriver: Ended Job = job_local_1
>    [junit] diff -I \(file:\)\|\
> ...
>
> [Message clipped]